Sample records for semi-automatic bubble counting

  1. Semi-automatic image analysis methodology for the segmentation of bubbles and drops in complex dispersions occurring in bioreactors

    NASA Astrophysics Data System (ADS)

    Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.

    2006-09-01

    Characterization of multiphase systems occurring in fermentation processes is time-consuming and tedious when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of the diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm, which was tested in two-, three- and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no statistically significant differences. The method reduced the total processing time for the measurement of bubbles and drops in the different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
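
    The authors' improved Hough transform is not reproduced here, but the core idea of circle detection by Hough voting can be sketched in a few lines. The following is a minimal, illustrative implementation for a single known radius on synthetic edge points; all names and values are hypothetical, not taken from the paper.

```python
from collections import Counter
import math

def hough_circles(edge_points, radius, n_theta=72):
    """Classic circular Hough voting: every edge point votes for all
    centres that would place it on a circle of the given radius."""
    acc = Counter()
    for (x, y) in edge_points:
        for k in range(n_theta):
            t = 2 * math.pi * k / n_theta
            cx = round(x - radius * math.cos(t))
            cy = round(y - radius * math.sin(t))
            acc[(cx, cy)] += 1
    return acc

# Synthetic edge points on a circle centred at (50, 40) with radius 10
pts = [(round(50 + 10 * math.cos(2 * math.pi * i / 36)),
        round(40 + 10 * math.sin(2 * math.pi * i / 36))) for i in range(36)]
acc = hough_circles(pts, radius=10)
centre, votes = acc.most_common(1)[0]   # accumulator peak near (50, 40)
```

    In practice the radius is also unknown, so the accumulator gains a third dimension (or radii are scanned), and the vote image is usually smoothed before peak detection.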

  2. Density estimation in aerial images of large crowds for automatic people counting

    NASA Astrophysics Data System (ADS)

    Herrmann, Christian; Metzler, Juergen

    2013-05-01

    Counting people is a common topic in the area of visual surveillance and crowd analysis. While many image-based solutions are designed to count only a few persons at a time, such as pedestrians entering a shop or watching an advertisement, there is hardly any solution for counting large crowds of several hundred persons or more. We addressed this problem previously by designing a semi-automatic system able to count crowds of hundreds or thousands of people in aerial images of demonstrations or similar events. That system requires major user interaction to segment the image. Our principal aim is to reduce this manual interaction. To achieve this, we propose a new, automatic system. Besides counting the people in large crowds, the system yields the positions of people, allowing a plausibility check by a human operator. To automate the people counting system, we use crowd density estimation. The determination of crowd density is based on several features, such as edge intensity and spatial frequency, which indicate the density and discriminate between a crowd and other image regions such as buildings, bushes or trees. We compare the performance of our automatic system to the previous semi-automatic system and to manual counting in images. We measure the performance gain of the new system on a test set of aerial images showing large crowds containing up to 12,000 people. By improving on our previous system, we increase the benefit of an image-based solution for counting people in large crowds.
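
    As a rough illustration of the kind of feature the abstract mentions, edge intensity can be computed as the mean gradient magnitude of a patch: textured crowd regions score higher than smooth regions. This is a minimal sketch, not the authors' implementation, and the sample patches are synthetic.

```python
def edge_density(img):
    """Mean gradient magnitude of a grayscale patch: a simple
    edge-intensity feature for separating textured crowd regions
    from smoother regions such as roads or rooftops."""
    rows, cols = len(img), len(img[0])
    total = 0.0
    for r in range(rows - 1):
        for c in range(cols - 1):
            gx = img[r][c + 1] - img[r][c]   # horizontal difference
            gy = img[r + 1][c] - img[r][c]   # vertical difference
            total += (gx * gx + gy * gy) ** 0.5
    return total / ((rows - 1) * (cols - 1))

flat = [[5] * 8 for _ in range(8)]                                  # smooth patch
textured = [[(r * 7 + c * 13) % 9 for c in range(8)] for r in range(8)]
```

    A real system would compute such features over a sliding window and calibrate a regression from feature values to local person density.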

  3. Image-based red cell counting for wild animals blood.

    PubMed

    Mauricio, Claudio R M; Schneider, Fabio K; Dos Santos, Leonilda Correia

    2010-01-01

    An image-based red blood cell (RBC) automatic counting system is presented for wild animal blood analysis. Images with 2048×1536-pixel resolution acquired on an optical microscope using Neubauer chambers are used to evaluate RBC counting for three animal species (Leopardus pardalis, Cebus apella and Nasua nasua), and the error found using the proposed method is similar to that obtained with the inter-observer visual counting method, i.e., around 10%. Smaller errors (e.g., 3%) can be obtained in regions with fewer grid artifacts. These promising results allow the use of the proposed method either as a complete automatic counting tool in laboratories for wild animal blood analysis or as a first counting stage in a semi-automatic counting tool.
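
    The abstract does not detail the counting algorithm, but the counting stage of such a system typically reduces to labelling connected components in a thresholded (binary) image. A minimal, stdlib-only sketch on a toy binary grid:

```python
def count_cells(mask):
    """Count 4-connected foreground components in a binary grid."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]            # flood-fill this component
                while stack:
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols and mask[i][j] and not seen[i][j]:
                        seen[i][j] = True
                        stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return count

# Toy mask with two separate blobs
grid = [[0, 1, 1, 0, 0],
        [0, 1, 0, 0, 1],
        [0, 0, 0, 1, 1]]
n_cells = count_cells(grid)
```

    A full pipeline would precede this with illumination correction and grid-artifact suppression, and follow it with size filtering to split touching cells.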

  4. Semi-automatic, octave-spanning optical frequency counter.

    PubMed

    Liu, Tze-An; Shu, Ren-Huei; Peng, Jin-Long

    2008-07-07

    This work presents and demonstrates a semi-automatic optical frequency counter with octave-spanning counting capability using two fiber laser combs operated at different repetition rates. Monochromators are utilized to provide an approximate frequency of the laser under measurement to determine the mode number difference between the two laser combs. The exact mode number of the beating comb line is obtained from the mode number difference and the measured beat frequencies. The entire measurement process, except for the frequency stabilization of the laser combs and the optimization of the beat signal-to-noise ratio, is computer controlled, yielding a semi-automatic optical frequency counter.

  5. Semi-automatic assessment of skin capillary density: proof of principle and validation.

    PubMed

    Gronenschild, E H B M; Muris, D M J; Schram, M T; Karaca, U; Stehouwer, C D A; Houben, A J H M

    2013-11-01

    Skin capillary density and recruitment have been proven to be relevant measures of microvascular function. Unfortunately, the assessment of skin capillary density from movie files is very time-consuming, since this is done manually. This impedes the use of this technique in large-scale studies. We aimed to develop a (semi-)automated assessment of skin capillary density. CapiAna (Capillary Analysis) is a newly developed semi-automatic image analysis application. The technique involves four steps: 1) movement correction, 2) selection of the frame range and positioning of the region of interest (ROI), 3) automatic detection of capillaries, and 4) manual correction of detected capillaries. To gain insight into the performance of the technique, skin capillary density was measured in twenty participants (ten women; mean age 56.2 [42-72] years). To investigate the agreement between CapiAna and the classic manual counting procedure, we used weighted Deming regression and Bland-Altman analyses. In addition, intra- and inter-observer coefficients of variation (CVs) and differences in analysis time were assessed. We found a good agreement between CapiAna and the classic manual method, with a Pearson's correlation coefficient (r) of 0.95 (P<0.001) and a Deming regression coefficient of 1.01 (95%CI: 0.91; 1.10). In addition, we found no significant differences between the two methods, with an intercept of the Deming regression of 1.75 (-6.04; 9.54), while the Bland-Altman analysis showed a mean difference (bias) of 2.0 (-13.5; 18.4) capillaries/mm². The intra- and inter-observer CVs of CapiAna were 2.5% and 5.6%, respectively, while for the classic manual counting procedure these were 3.2% and 7.2%, respectively. Finally, the analysis time for CapiAna ranged between 25 and 35 min versus 80 and 95 min for the manual counting procedure. 
We have developed a semi-automatic image analysis application (CapiAna) for the assessment of skin capillary density, which agrees well with the classic manual counting procedure, saves time, and is more reproducible than manual counting. As a result, the use of skin capillaroscopy is feasible in large-scale studies, which importantly extends the possibilities for microcirculation research in humans. © 2013.
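
    The Bland-Altman analysis used above is easy to reproduce: the bias is the mean difference between the two methods, and the 95% limits of agreement are bias ± 1.96 SD of the differences. A minimal sketch on hypothetical capillary-density values (not the study's data):

```python
def bland_altman(x, y):
    """Bland-Altman bias and 95% limits of agreement between two methods."""
    n = len(x)
    diffs = [a - b for a, b in zip(x, y)]
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical capillary densities (capillaries/mm^2) from two methods
manual  = [52, 61, 48, 70, 55, 63]
capiana = [50, 63, 47, 72, 53, 66]
bias, lo, hi = bland_altman(capiana, manual)
```

    A Bland-Altman plot then draws the per-subject differences against the per-subject means, with horizontal lines at the bias and the two limits.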

  6. AutoTag and AutoSnap: Standardized, semi-automatic capture of regions of interest from whole slide images

    PubMed Central

    Marien, Koen M.; Andries, Luc; De Schepper, Stefanie; Kockx, Mark M.; De Meyer, Guido R.Y.

    2015-01-01

    Tumor angiogenesis is measured by counting microvessels in tissue sections at high power magnification as a potential prognostic or predictive biomarker. Until now, regions of interest (ROIs) were selected by manual operations within a tumor using a systematic uniform random sampling (SURS) approach. Although SURS is the most reliable sampling method, it implies a high workload. However, SURS can be semi-automated and in this way contribute to the development of a validated quantification method for microvessel counting in the clinical setting. Here, we report a method to use semi-automated SURS for microvessel counting:
    • Whole slide imaging with Pannoramic SCAN (3DHISTECH)
    • Computer-assisted sampling in Pannoramic Viewer (3DHISTECH), extended by two self-written AutoHotkey applications (AutoTag and AutoSnap)
    • The use of digital grids in Photoshop® and Bridge® (Adobe Systems)
    This rapid procedure allows the traceability essential for high-throughput protein analysis of immunohistochemically stained tissue. PMID:26150998

  7. Assessment of ICount software, a precise and fast egg counting tool for the mosquito vector Aedes aegypti.

    PubMed

    Gaburro, Julie; Duchemin, Jean-Bernard; Paradkar, Prasad N; Nahavandi, Saeid; Bhatti, Asim

    2016-11-18

    Widespread in the tropics, the mosquito Aedes aegypti is an important vector of many viruses, posing a significant threat to human health. Vector monitoring often requires fecundity estimation by counting eggs laid by female mosquitoes. Traditionally, manual counting has been used, but this requires considerable effort and is prone to error. An easy tool to assess the number of eggs laid would facilitate experimentation and vector control operations. This study introduces software called ICount that allows automatic egg counting for the mosquito vector Aedes aegypti. Egg counts estimated by ICount are statistically equivalent to manual counts, making the software effective for automatic and semi-automatic data analysis. The technique is also much faster than manual methods. Finally, the software has been used to assess p-cresol oviposition choices under laboratory conditions in order to test the system with different egg densities. ICount is a powerful tool for fast and precise egg count analysis, freeing experimenters from manual data processing. Software access is free and its user-friendly interface allows easy use by non-experts. Its efficiency has been tested in our laboratory with oviposition dual choices of Aedes aegypti females. The next step will be the development of a mobile application, based on the ICount platform, for vector monitoring surveys in the field.

  8. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. The method uses the free software ImageJ, developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated, and modal abundances can be deduced where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in the physical characteristics of some examples of fragmental impactites.
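
    Among the measured parameters, the box-counting dimension is simple to state: count the boxes N(s) of side s that contain part of the object at several box sizes, then take the slope of log N(s) against log(1/s). A minimal sketch (independent of ImageJ) on a synthetic point set:

```python
import math

def box_count(points, box):
    """Number of boxes of side `box` that contain at least one point."""
    return len({(int(x // box), int(y // box)) for (x, y) in points})

def box_dimension(points, boxes=(1, 2, 4, 8, 16)):
    """Least-squares slope of log N(box) versus log(1/box)."""
    xs = [math.log(1.0 / b) for b in boxes]
    ys = [math.log(box_count(points, b)) for b in boxes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# A densely sampled straight line should give a dimension near 1
line = [(i * 0.5, i * 0.5) for i in range(256)]
dim = box_dimension(line)
```

    For real clast outlines the points would come from the segmented boundary pixels, and the usable range of box sizes is limited by image resolution.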

  9. A boundary element model of the transport of a semi-infinite bubble through a microvessel bifurcation

    NASA Astrophysics Data System (ADS)

    Calderon, Andres J.; Eshpuniyani, Brijesh; Fowlkes, J. Brian; Bull, Joseph L.

    2010-06-01

    Motivated by a developmental gas embolotherapy technique for selective occlusion of blood flow to tumors, we examined the transport of a pressure-driven semi-infinite bubble through a liquid-filled bifurcating channel. Homogeneity of bubble splitting as the bubble passes through a vessel bifurcation affects the degree to which the vascular network near the tumor can be uniformly occluded. The homogeneity of bubble splitting was found to increase with bubble driving pressure and to decrease with increased bifurcation angle. Viscous losses at the bifurcation were observed to affect the bubble speed significantly. The potential for oscillating bubble interfaces to induce flow recirculation and impart high stresses on the vessel endothelium was also observed.

  10. Web platform using digital image processing and geographic information system tools: a Brazilian case study on dengue.

    PubMed

    Brasil, Lourdes M; Gomes, Marília M F; Miosso, Cristiano J; da Silva, Marlete M; Amvame-Nze, Georges D

    2015-07-16

    Dengue fever is endemic in Asia, the Americas, the East of the Mediterranean and the Western Pacific. According to the World Health Organization, it is one of the diseases of greatest impact on health, affecting millions of people each year worldwide. Fast detection of increases in populations of the transmitting vector, the Aedes aegypti mosquito, is essential to avoid dengue outbreaks. Unfortunately, in several countries, such as Brazil, the current methods for detecting population changes and disseminating this information are too slow to allow efficient allocation of resources to fight outbreaks. To reduce the delay in providing information on A. aegypti population changes, we propose, develop, and evaluate a system for counting the eggs found in special traps and providing the collected data through a web structure with geographical location resources. One of the most useful tools for the detection and surveillance of arthropods is the ovitrap, a special trap built to collect mosquito eggs. This allows for an egg counting process, which is still usually performed manually in countries such as Brazil. We implement and evaluate a novel system for automatically counting the eggs found on the ovitraps' cardboards. The proposed system is based on digital image processing (DIP) techniques, as well as a Web-based Semi-Automatic Counting System (SCSA-WEB). All data collected are geographically referenced in a geographic information system (GIS) and made available on a Web platform. The work was developed in Gama's administrative region, in Brasília/Brazil, with the aid of the Environmental Surveillance Directory (DIVAL-Gama) and Brasília's Board of Health (SSDF), in partnership with the University of Brasília (UnB). The system was built based on a field survey carried out over three months with health professionals, who provided 84 cardboards from 84 ovitraps, sized 15 × 5 cm. 
    In developing the system, we conducted the following steps:
    i. Obtain images of the eggs on an ovitrap's cardboard with a microscope.
    ii. Apply the proposed image-processing-based semi-automatic counting system. The system uses the Java programming language and the Java Server Faces technology, a framework suite for web application development. This approach allows a simple migration to any operating system platform and future applications on mobile devices.
    iii. Collect and store all data in a database (DB) and georeference them in a GIS. The database management system used to develop the DB is based on PostgreSQL. The GIS assists in the visualization and spatial analysis of digital maps, allowing dengue outbreaks in the region of study to be located. It also facilitates the planning, analysis, and evaluation of temporal and spatial epidemiology, as required by the Brazilian Health Care Control Center.
    iv. Deploy the SCSA-WEB, DB and GIS on a single Web platform.
    The statistical results obtained by DIP were satisfactory when compared with the SCSA-WEB's semi-automated egg counts. The results also indicate that the time spent on manual counting is considerably reduced when using our fully automated DIP algorithm and the semi-automated SCSA-WEB. The georeferencing Web platform proved to be of great support for future visualization with statistical and trend analysis of the disease. The analyses suggest that our automatic egg-counting algorithm is efficient, expediting the work of the laboratory technician and considerably reducing counting time and error rates. We believe that this kind of integrated platform and tools can simplify the decision-making process of the Brazilian Health Care Control Center.

  11. The Influence of Endmember Selection Method in Extracting Impervious Surface from Airborne Hyperspectral Imagery

    NASA Astrophysics Data System (ADS)

    Wang, J.; Feng, B.

    2016-12-01

    Impervious surface area (ISA) has long been studied as an important input to moisture flux models. In general, ISA impedes groundwater recharge, increases stormflow/flood frequency, and alters in-stream and riparian habitats. Urban areas are among the richest ISA environments, and urban ISA mapping assists flood prevention and urban planning. Hyperspectral imagery (HI), with its ability to detect subtle spectral signatures, is an ideal candidate for urban ISA mapping. Mapping ISA from HI involves endmember (EM) selection. The high degree of spatial and spectral heterogeneity of the urban environment makes this task difficult: a compromise must be found between the degree of automation and the representativeness of the method. This study tested one manual and two semi-automatic EM selection strategies. The manual and the first semi-automatic methods have been widely used in EM selection. The second semi-automatic EM selection method is rather new and has previously been proposed only for moderate-spatial-resolution satellite imagery. The manual method visually selected the EM candidates from eight landcover types in the original image. The first semi-automatic method chose the EM candidates using a threshold over the pixel purity index (PPI) map. The second semi-automatic method used the triangular shape of the HI scatter plot in the n-dimensional visualizer to identify the V-I-S (vegetation-impervious surface-soil) EM candidates: the pixels located at the triangle vertices. The initial EM candidates from the three methods were further refined by three indexes (EM average RMSE, minimum average spectral angle, and count-based EM selection), generating three spectral libraries, which were used to classify the test image. The spectral angle mapper was applied, and accuracy reports for the classification results were generated. The overall accuracies are 85% for the manual method, 81% for the PPI method, and 87% for the V-I-S method. 
The V-I-S EM selection method performed best in this study. This demonstrates the value of the V-I-S EM selection method not only for moderate-spatial-resolution satellite images but also for increasingly accessible high-spatial-resolution airborne images. This semi-automatic EM selection method can be adopted for a wide range of remote sensing images and provide ISA maps for hydrological analysis.
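
    The spectral angle mapper used for classification assigns each pixel to the endmember whose spectrum subtends the smallest angle with the pixel spectrum, making the measure insensitive to overall brightness. A minimal sketch with hypothetical 4-band spectra (illustrative values only, not the study's library):

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two spectra, the SAM similarity measure."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def classify(pixel, endmembers):
    """Assign the pixel to the endmember with the smallest spectral angle."""
    return min(endmembers, key=lambda name: spectral_angle(pixel, endmembers[name]))

# Hypothetical 4-band reflectance spectra for three V-I-S endmembers
ems = {
    "vegetation": [0.05, 0.08, 0.06, 0.50],
    "impervious": [0.20, 0.22, 0.24, 0.26],
    "soil":       [0.10, 0.15, 0.20, 0.30],
}
label = classify([0.21, 0.23, 0.25, 0.27], ems)
```

    Real HI pixels have hundreds of bands, but the angle computation is identical; a rejection threshold on the minimum angle leaves ambiguous pixels unclassified.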

  12. Informative frame detection from wireless capsule video endoscopic images

    NASA Astrophysics Data System (ADS)

    Bashar, Md. Khayrul; Mori, Kensaku; Suenaga, Yasuhito; Kitasaka, Takayuki; Mekada, Yoshito

    2008-03-01

    Wireless capsule endoscopy (WCE) is a new clinical technology permitting the visualization of the small bowel, the most difficult segment of the digestive tract. The major drawback of this technology is the large amount of time required for video diagnosis. In this study, we propose a method for informative frame detection by isolating useless frames that are substantially covered by turbid fluids or contaminated with other materials, e.g., faecal matter or semi-processed and unabsorbed food. Such materials and fluids present a wide range of colors, from brown to yellow, and/or bubble-like texture patterns. The detection scheme therefore consists of two stages: highly contaminated non-bubbled (HCN) frame detection and significantly bubbled (SB) frame detection. Local color moments in the Ohta color space are used to characterize HCN frames, which are isolated by a Support Vector Machine (SVM) classifier in Stage 1. The remaining frames go to Stage 2, where Laguerre-Gauss Circular Harmonic Functions (LG-CHFs) extract the characteristics of the bubble structures in a multi-resolution framework. An automatic segmentation method is designed to extract the bubbled regions based on local absolute energies of the CHF responses, derived from the grayscale version of the original color image. Final detection of the informative frames is obtained by thresholding the extracted regions. An experiment with 20,558 frames from three videos shows excellent average detection accuracy (96.75%) for the proposed method, compared with Gabor-based (74.29%) and discrete-wavelet-based (62.21%) features.
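
    The local color moments used to characterize HCN frames are typically the first three statistical moments of each channel over a block. A minimal sketch for one channel (the paper's exact feature set may differ):

```python
import math

def color_moments(channel):
    """Mean, standard deviation and signed cube-root skewness of one
    colour channel: the kind of local feature fed to an SVM classifier."""
    n = len(channel)
    mean = sum(channel) / n
    var = sum((v - mean) ** 2 for v in channel) / n
    third = sum((v - mean) ** 3 for v in channel) / n
    skew = math.copysign(abs(third) ** (1.0 / 3.0), third)
    return mean, var ** 0.5, skew

# Symmetric toy data: skewness should vanish
m, s, k = color_moments([1, 2, 3, 4, 5])
```

    Per frame, these three numbers are computed for each Ohta colour component over local blocks and concatenated into the SVM feature vector.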

  13. Hydrodynamics and propulsion mechanism of self-propelled catalytic micromotors: model and experiment.

    PubMed

    Li, Longqiu; Wang, Jiyuan; Li, Tianlong; Song, Wenping; Zhang, Guangyu

    2014-10-14

    The hydrodynamic behavior and propulsion mechanism of self-propelled micromotors are studied theoretically and experimentally. A hydrodynamic model to describe bubble growth and detachment is proposed to investigate the mechanism of a self-propelled conical tubular catalytic micromotor considering bubble geometric asymmetry and buoyancy force. The growth force caused by the growth of the bubble surface against the fluid is the driving force for micromotor motion. Also, the buoyancy force plays a primary role in bubble detachment. The effect of geometrical parameters on the micromotor velocity and drag force is presented. The bubble radius ratio is investigated for different micromotor radii to determine its hydrodynamic behavior during bubble ejection. The average micromotor velocity is found to be strongly dependent on the semi-cone angle, expelling frequency and bubble radius ratio. The semi-cone angle has a significant effect on the expelling frequency for conical tubular micromotors. The predicted results are compared to already existing experimental data for cylindrical micromotors (semi-cone angle δ = 0°) and conical micromotors. A good agreement is found between the theoretical calculation and experimental results. This model provides a profound explanation for the propulsion mechanism of a catalytic micromotor and can be used to optimize the micromotor design for its biomedical and environmental applications.

  14. Response Evaluation of Malignant Liver Lesions After TACE/SIRT: Comparison of Manual and Semi-Automatic Measurement of Different Response Criteria in Multislice CT.

    PubMed

    Höink, Anna Janina; Schülke, Christoph; Koch, Raphael; Löhnert, Annika; Kammerer, Sara; Fortkamp, Rasmus; Heindel, Walter; Buerke, Boris

    2017-11-01

    Purpose  To compare measurement precision and interobserver variability in the evaluation of hepatocellular carcinoma (HCC) and liver metastases in MSCT before and after transarterial local ablative therapies. Materials and Methods  Retrospective study of 72 patients with malignant liver lesions (42 metastases; 30 HCCs) before and after therapy (43 SIRT procedures; 29 TACE procedures). Established (LAD; SAD; WHO) and vitality-based parameters (mRECIST; mLAD; mSAD; EASL) were assessed manually and semi-automatically by two readers. The relative interobserver difference (RID) and intraclass correlation coefficient (ICC) were calculated. Results  The median RID for vitality-based parameters was lower for semi-automatic than for manual measurement of mLAD (manual 12.5 %; semi-automatic 3.4 %), mSAD (manual 12.7 %; semi-automatic 5.7 %) and EASL (manual 10.4 %; semi-automatic 1.8 %). The difference in established parameters was not statistically significant (p > 0.05). The ICCs of LAD (manual 0.984; semi-automatic 0.982), SAD (manual 0.975; semi-automatic 0.958) and WHO (manual 0.984; semi-automatic 0.978) are high, both in manual and semi-automatic measurements. The ICCs of manual measurements of mLAD (0.897), mSAD (0.844) and EASL (0.875) are lower. This decrease is not found in semi-automatic measurements of mLAD (0.997), mSAD (0.992) and EASL (0.998). Conclusion  Vitality-based tumor measurements of HCC and metastases after transarterial local therapies should be performed semi-automatically due to greater measurement precision, thus increasing the reproducibility and in turn the reliability of therapeutic decisions. Key Points  · Liver lesion measurements according to EASL and mRECIST are more precise when performed semi-automatically. · The higher reproducibility may facilitate a more reliable classification of therapy response. · Measurements according to RECIST and WHO offer equivalent precision semi-automatically and manually. 
Citation Format · Höink AJ, Schülke C, Koch R et al. Response Evaluation of Malignant Liver Lesions After TACE/SIRT: Comparison of Manual and Semi-Automatic Measurement of Different Response Criteria in Multislice CT. Fortschr Röntgenstr 2017; 189: 1067 - 1075. © Georg Thieme Verlag KG Stuttgart · New York.

  15. Extracting the exponential behaviors in the market data

    NASA Astrophysics Data System (ADS)

    Watanabe, Kota; Takayasu, Hideki; Takayasu, Misako

    2007-08-01

    We introduce a mathematical criterion defining bubbles or crashes in financial market price fluctuations by considering an exponential fitting of the given data. By applying this criterion we can automatically extract the periods in which bubbles and crashes are identified. From stock market data covering the so-called Internet bubble, it is found that the characteristic length of a bubble period is about 100 days.
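
    The criterion can be illustrated as follows: fit log(price) linearly over a sliding window and flag windows whose fitted exponential rate exceeds a threshold. The window length and threshold below are illustrative, not the paper's values.

```python
import math

def growth_rate(prices):
    """Least-squares slope of log(price) versus time: the exponential rate."""
    ys = [math.log(p) for p in prices]
    n = len(ys)
    mx = (n - 1) / 2.0
    my = sum(ys) / n
    sxx = sum((t - mx) ** 2 for t in range(n))
    sxy = sum((t - mx) * (y - my) for t, y in zip(range(n), ys))
    return sxy / sxx

def flag_bubbles(prices, window=20, threshold=0.01):
    """Mark sliding windows whose fitted exponential rate exceeds the threshold."""
    return [growth_rate(prices[i:i + window]) > threshold
            for i in range(len(prices) - window + 1)]

# Synthetic series: flat, then exponential growth at 2% per step
series = [100.0] * 30 + [100.0 * math.exp(0.02 * t) for t in range(30)]
flags = flag_bubbles(series)
```

    Crashes would be flagged symmetrically with a negative threshold; the run lengths of flagged windows give the bubble-period statistics.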

  16. A simple bubble-flowmeter with quasicontinuous registration.

    PubMed

    Ludt, H; Herrmann, H D

    1976-07-22

    The construction of a simple bubble-flowmeter is described. The instrument has the following features: 1. automatic bubble injection; 2. precise measurement of the bubble passage time by a digital counter; 3. quasicontinuous registration of the flow rate; 4. alternative operation with clear fluid (water) and coloured fluid (blood); 5. low volume; 6. a closed measuring system for measurements in low- and high-pressure systems.
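
    The flow-rate computation behind such a flowmeter is straightforward: the volume of tubing between the two detection points divided by the bubble's passage time. A sketch with hypothetical dimensions (not the instrument's actual geometry):

```python
import math

def flow_rate_ml_per_min(tube_diameter_cm, sensor_distance_cm, passage_time_s):
    """Volumetric flow from the transit time of a bubble between two
    detection points: flow = segment volume / passage time."""
    area_cm2 = math.pi * (tube_diameter_cm / 2.0) ** 2
    volume_ml = area_cm2 * sensor_distance_cm   # 1 cm^3 == 1 ml
    return volume_ml / passage_time_s * 60.0

# Hypothetical geometry: 0.2 cm bore, sensors 10 cm apart, 2 s transit
q = flow_rate_ml_per_min(0.2, 10.0, 2.0)
```

    Re-injecting a bubble as soon as the previous one is detected is what makes the registration quasicontinuous.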

  17. Patient warming excess heat: the effects on orthopedic operating room ventilation performance.

    PubMed

    Belani, Kumar G; Albrecht, Mark; McGovern, Paul D; Reed, Mike; Nachtsheim, Christopher

    2013-08-01

    Patient warming has become a standard of care for the prevention of unintentional hypothermia based on benefits established in general surgery. However, these benefits may not fully translate to contamination-sensitive surgery (i.e., implants), because patient warming devices release excess heat that may disrupt the intended ceiling-to-floor ventilation airflows and expose the surgical site to added contamination. Therefore, we studied the effects of 2 popular patient warming technologies, forced air and conductive fabric, versus control conditions on ventilation performance in an orthopedic operating room with a mannequin draped for total knee replacement. Ventilation performance was assessed by releasing neutrally buoyant detergent bubbles ("bubbles") into the nonsterile region under the head-side of the anesthesia drape. We then tracked whether the excess heat from upper body patient warming mobilized the "bubbles" into the surgical site. Formally, a randomized replicated design assessed the effect of device (forced air, conductive fabric, control) and anesthesia drape height (low-drape, high-drape) on the number of bubbles photographed over the surgical site. The direct mass-flow exhaust from forced air warming generated hot air convection currents that mobilized bubbles over the anesthesia drape and into the surgical site, resulting in a significant increase in bubble counts for the factor of patient warming device (P < 0.001). Forced air had an average count of 132.5 versus 0.48 for conductive fabric (P = 0.003) and 0.01 for control conditions (P = 0.008) across both drape heights. Differences in average bubble counts across both drape heights were insignificant between conductive fabric and control conditions (P = 0.87). The factor of drape height had no significant effect (P = 0.94) on bubble counts. 
Excess heat from forced air warming resulted in the disruption of ventilation airflows over the surgical site, whereas conductive patient warming devices had no noticeable effect on ventilation airflows. These findings warrant future research into the effects of forced air warming excess heat on clinical outcomes during contamination-sensitive surgery.

  18. Size-sensitive particle trajectories in three-dimensional micro-bubble acoustic streaming flows

    NASA Astrophysics Data System (ADS)

    Volk, Andreas; Rossi, Massimiliano; Hilgenfeldt, Sascha; Rallabandi, Bhargav; Kähler, Christian; Marin, Alvaro

    2015-11-01

    Oscillating microbubbles generate steady streaming flows with interesting features and promising applications for microparticle manipulation. The flow around oscillating semi-cylindrical bubbles has been typically assumed to be independent of the axial coordinate. However, it has been recently revealed that particle motion is strongly three-dimensional: Small tracer particles follow vortical trajectories with pronounced axial displacements near the bubble, weaving a toroidal stream-surface. A well-known consequence of bubble streaming flows is size-dependent particle migration, which can be exploited for sorting and trapping of microparticles in microfluidic devices. In this talk, we will show how the three-dimensional toroidal topology found for small tracer particles is modified as the particle size increases up to 1/3 of the bubble radius. Our results show size-sensitive particle positioning along the axis of the semi-cylindrical bubble. In order to analyze the three-dimensional sorting and trapping capabilities of the system, experiments with an imposed flow and polydisperse particle solutions are also shown.

  19. The use of portable 2D echocardiography and 'frame-based' bubble counting as a tool to evaluate diving decompression stress.

    PubMed

    Germonpré, Peter; Papadopoulou, Virginie; Hemelryck, Walter; Obeid, Georges; Lafère, Pierre; Eckersley, Robert J; Tang, Meng-Xing; Balestra, Costantino

    2014-03-01

    'Decompression stress' is commonly evaluated by scoring circulating bubble numbers post dive using Doppler or cardiac echography. This information may be used to develop safer decompression algorithms, assuming that the lower the numbers of venous gas emboli (VGE) observed post dive, the lower the statistical risk of decompression sickness (DCS). Current echocardiographic evaluation of VGE, using the Eftedal and Brubakk method, has some disadvantages as it is less well suited for large-scale evaluation of recreational diving profiles. We propose and validate a new 'frame-based' VGE-counting method which offers a continuous scale of measurement. Nine 'raters' of varying familiarity with echocardiography were asked to grade 20 echocardiograph recordings using both the Eftedal and Brubakk grading and the new 'frame-based' counting method. They were also asked to count the number of bubbles in 50 still-frame images, some of which were randomly repeated. A Wilcoxon Spearman ρ calculation was used to assess test-retest reliability of each rater for the repeated still frames. For the video images, weighted kappa statistics, with linear and quadratic weightings, were calculated to measure agreement between raters for the Eftedal and Brubakk method. Bland-Altman plots and intra-class correlation coefficients were used to measure agreement between raters for the frame-based counting method. Frame-based counting showed a better inter-rater agreement than the Eftedal and Brubakk grading, even with relatively inexperienced assessors, and has good intra- and inter-rater reliability. Frame-based bubble counting could be used to evaluate post-dive decompression stress, and offers possibilities for computer-automated algorithms to allow near-real-time counting.
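
    The weighted kappa statistic used above penalizes rater disagreements by their ordinal distance, with linear or quadratic weights. A minimal implementation of Cohen's weighted kappa, run on toy ratings rather than the study's data:

```python
def weighted_kappa(r1, r2, categories, quadratic=True):
    """Cohen's weighted kappa for two raters over ordinal categories."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)
    obs = [[0.0] * k for _ in range(k)]   # observed joint proportions
    p1 = [0.0] * k                        # marginal proportions, rater 1
    p2 = [0.0] * k                        # marginal proportions, rater 2
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1.0 / n
        p1[idx[a]] += 1.0 / n
        p2[idx[b]] += 1.0 / n
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            d = abs(i - j) / (k - 1)
            w = d * d if quadratic else d
            num += w * obs[i][j]          # observed weighted disagreement
            den += w * p1[i] * p2[j]      # chance-expected disagreement
    return 1.0 - num / den

# Two raters grading 8 recordings on a 0-4 ordinal scale, one near-miss
a = [0, 1, 2, 3, 4, 2, 1, 0]
b = [0, 1, 2, 3, 4, 2, 1, 1]
kappa = weighted_kappa(a, b, categories=[0, 1, 2, 3, 4])
```

    For the continuous frame-based counts, agreement is instead summarized with Bland-Altman limits and intra-class correlation, as the abstract describes.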

  20. Regimes of Micro-bubble Formation Using Gas Injection into Ladle Shroud

    NASA Astrophysics Data System (ADS)

    Chang, Sheng; Cao, Xiangkun; Zou, Zongshu

    2018-03-01

    Gas injection into a ladle shroud is a practical approach to produce micro-bubbles in tundishes, to promote inclusion removal from liquid steel. A semi-empirical model was established to characterize the bubble formation considering the effect of shearing action combined with the non-fully bubble break-up by turbulence. The model shows a good accuracy in predicting the size of bubbles formed in complex flow within the ladle shroud.

  2. Field-driven chiral bubble dynamics analysed by a semi-analytical approach

    NASA Astrophysics Data System (ADS)

    Vandermeulen, J.; Leliaert, J.; Dupré, L.; Van Waeyenberge, B.

    2017-12-01

    Field-driven chiral bubble dynamics in the presence of the Dzyaloshinskii-Moriya interaction are currently a topic of thorough investigation. In this paper, a semi-analytical approach is used to derive equations of motion that express the bubble wall (BW) velocity and the change in in-plane magnetization angle as a function of the micromagnetic parameters of the involved interactions, thereby taking into account the two-dimensional nature of the bubble wall. It is demonstrated that the equations of motion enable an accurate description of expanding and shrinking convex bubble dynamics, and an expression for the transition field between shrinkage and expansion is derived. In addition, these equations of motion show that the BW velocity depends not only on the driving force but also on the BW curvature. The absolute BW velocity increases for both a shrinking and an expanding bubble, but for different reasons: for expanding bubbles, it is due to the increasing importance of the driving force, while for shrinking bubbles, it is due to the increasing importance of contributions related to the BW curvature. Finally, using this approach we show how the recently proposed magnetic bubblecade memory can operate in the flow regime in the presence of a tilted sinusoidal magnetic field and at greatly reduced bubble sizes compared to the original device prototype.

  3. The rate of bubble growth in a superheated liquid in pool boiling

    NASA Astrophysics Data System (ADS)

    Abdollahi, Mohammad Reza; Jafarian, Mehdi; Jamialahmadi, Mohammad

    2017-12-01

    A semi-empirical model for the estimation of the rate of bubble growth in nucleate pool boiling is presented, considering a new equation to estimate the temperature history of the bubble in the bulk of the liquid. The conservation equations of energy, mass and momentum were first derived and solved analytically. The present analytical model predicts that the bubble radius grows as √t · erf(N√t), whereas previous studies have mainly correlated the growth rate with √t alone. In the next step, the analytical solutions were used to develop a new semi-empirical equation: the analytical solution was first non-dimensionalised, and then experimental data available in the literature were used to tune the dimensionless coefficients appearing in the dimensionless equation. Finally, the reliability of the proposed semi-empirical model was assessed by comparing its predictions with experimental data from the literature that had not been used in tuning the dimensionless parameters, as well as with other models proposed in the literature. These comparisons show that the model gives more accurate predictions than previously proposed models, with a deviation of less than 10% over a wide range of operating conditions.
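    The reported growth law is easy to explore numerically. The sketch below evaluates R(t) = C·√t·erf(N√t); the prefactor C and growth constant N are arbitrary placeholders for illustration, not values from the paper. It also shows the limiting behaviour: near-linear growth at small times and the classical √t law at large times.

```python
import math

def bubble_radius(t, C=1.0, N=1.0):
    """Bubble radius R(t) = C * sqrt(t) * erf(N * sqrt(t)).

    For small t, erf(N*sqrt(t)) ~ 2*N*sqrt(t)/sqrt(pi), so R grows ~ t;
    for large t, erf(...) -> 1 and the classical R ~ sqrt(t) law is recovered.
    """
    return C * math.sqrt(t) * math.erf(N * math.sqrt(t))
```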

  4. Usefulness of a rotation-revolution mixer for mixing powder-liquid reline material.

    PubMed

    Yamaga, Yoshio; Kanatani, Mitsugu; Nomura, Shuichi

    2015-01-01

    The purpose of this study was to evaluate the distribution of bubbles, degree of mixing, flowability and mechanical strength of a powder-liquid reline material mixed manually and with a rotation-revolution (planetary) mixer, and to determine the usefulness of a rotation-revolution mixer for this application. Powder-liquid reline material (Mild Rebaron, GC, Tokyo, Japan) was mixed at a powder-to-liquid ratio of 1:0.62 according to the manufacturer's instructions. Two mixing methods were used: manual mixing ("manual-mixing") and automatic mixing with a rotation-revolution mixer (Super Rakuneru Fine, GC, Tokyo, Japan; "automatic-mixing"). Disc-shaped specimens, 30 mm in diameter and 1.0 mm thick, were used to observe the distribution of bubbles at 10× magnification. Flowability tests were carried out according to JIS T6521 for denture-base hard reline materials. A three-point bending test was carried out with a universal testing machine, and the elastic modulus and flexural stress at the proportional limit were calculated. Manual-mixed specimens were inhomogeneous, with a median of 4 bubbles, whereas automatic-mixed specimens were homogeneous and bubble-free. Flowability was within the JIS range under all mixing conditions and did not differ significantly across conditions. The elastic modulus was the same for manual-mixed and automatic-mixed specimens; the flexural stress at the proportional limit, however, differed significantly between them. The results confirm that a rotation-revolution mixer is useful for mixing powder-liquid reline material, and automatic mixing may be recommended for clinical practice. Copyright © 2014 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  5. Numerical investigation of shock induced bubble collapse in water

    NASA Astrophysics Data System (ADS)

    Apazidis, N.

    2016-04-01

    A semi-conservative, stable, interface-capturing numerical scheme for shock propagation in heterogeneous systems is applied to the problem of shock propagation in liquid-gas systems. The scheme is based on the volume-fraction formulation of the equations of motion for the liquid and gas phases with separate equations of state. The semi-conservative formulation of the governing equations ensures the absence of spurious pressure oscillations at the material interfaces between liquid and gas. Interaction of a planar shock in water with a single spherical bubble as well as with twin adjacent bubbles is investigated. Several stages of the interaction process are considered, including focusing of the transmitted shock within the deformed bubble, creation of a water-hammer shock, and generation of a high-speed liquid jet in the later stages of the process.

  6. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    PubMed Central

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thickness. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) the number of segments method, and (iii) semi-automatic and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than the number of segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The relative error of the number of segments method was significantly greater than those of semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was one-half to two-thirds of that for semi-automatic computer-aided diagnosis. CONCLUSIONS A novel lobar volumetry computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number of segments method. 
Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use, it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418
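    The agreement analysis described above, correlating each method against the reference standard and comparing relative errors, reduces to two small formulas. The sketch below is illustrative only; the volume values are invented, not measurements from the study.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def relative_error(measured, reference):
    """Unsigned relative error of a measurement against the reference standard."""
    return abs(measured - reference) / reference
```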

  7. Groping for quantitative digital 3-D image analysis: an approach to quantitative fluorescence in situ hybridization in thick tissue sections of prostate carcinoma.

    PubMed

    Rodenacker, K; Aubele, M; Hutzler, P; Adiga, P S

    1997-01-01

    In molecular pathology, numerical chromosome aberrations have been found to be decisive for the prognosis of malignancy in tumours. The existence of such aberrations can be detected by interphase fluorescence in situ hybridization (FISH). The gain or loss of certain base sequences in the deoxyribonucleic acid (DNA) can be estimated by counting the number of FISH signals per cell nucleus. The quantitative evaluation of such events is a necessary condition for prospective use in diagnostic pathology. To avoid occlusions of signals, the cell nucleus has to be analyzed in three dimensions. Confocal laser scanning microscopy is the means to obtain series of optical thin sections from fluorescence-stained or marked material that fulfil the conditions mentioned above. A graphical user interface (GUI) to a software package for display, inspection, counting and (semi-)automatic analysis of 3-D images for pathologists is outlined, including the underlying methods developed for 3-D image interaction and segmentation. The preparative methods are briefly described. Main emphasis is given to the methodical questions of computer-aided analysis of large 3-D image data sets for pathologists. Several automated analysis steps can be performed for segmentation and subsequent quantification. However, tumour material is, in contrast to isolated or cultured cells, a difficult material even for visual inspection. At present, a fully automated digital image analysis of 3-D data is not in sight. A semi-automatic segmentation method is thus presented here.

  8. μ-PIV measurements of the ensemble flow fields surrounding a migrating semi-infinite bubble.

    PubMed

    Yamaguchi, Eiichiro; Smith, Bradford J; Gaver, Donald P

    2009-08-01

    Microscale particle image velocimetry (μ-PIV) measurements of the ensemble flow fields surrounding a steadily migrating semi-infinite bubble were obtained through the novel adaptation of a computer-controlled linear-motor flow system. The system was programmed to generate a square-wave velocity input in order to produce accurate, constant bubble propagation repeatedly and effectively through a fused glass capillary tube. We present a novel technique for repositioning the coordinate axis to the bubble-tip frame of reference in each instantaneous field, through analysis of the sudden change in the standard deviation of centerline velocity profiles across the bubble interface. Ensemble averages were then computed in this bubble-tip frame of reference. Combined fluid systems of water/air, glycerol/air and glycerol/Si-oil were used to investigate flows comparable to the computational simulations described in Smith and Gaver (2008) and to past experimental observations of interfacial shape. Fluorescent particle images were also analyzed to measure the residual film thickness trailing behind the bubble. The flow fields and film thickness agree very well with the computational simulations as well as with existing experimental and analytical results. Particle accumulation and migration associated with the flow patterns near the bubble tip after long experimental durations are discussed as potential sources of error in the experimental method.
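    The repositioning idea can be sketched in one dimension: locate the interface at the largest jump in the centreline standard-deviation profile, shift each instantaneous realisation so the interface sits at a common index, then ensemble-average. The helper names and synthetic step profiles below are hypothetical, not from the published method.

```python
def locate_interface(std_profile):
    """Index of the largest jump between neighbouring samples of the
    centreline-velocity standard deviation, taken as the bubble tip."""
    diffs = [abs(std_profile[i + 1] - std_profile[i])
             for i in range(len(std_profile) - 1)]
    return diffs.index(max(diffs)) + 1

def ensemble_average_at_tip(profiles, half_width):
    """Shift each profile so its interface lies at the window centre, then
    average the window [tip - half_width, tip + half_width) across all
    realisations."""
    windows = []
    for p in profiles:
        tip = locate_interface(p)
        windows.append(p[tip - half_width: tip + half_width])
    n = len(windows)
    return [sum(w[i] for w in windows) / n for i in range(2 * half_width)]
```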

  9. Beneficial effect of enriched air nitrox on bubble formation during scuba diving. An open-water study.

    PubMed

    Brebeck, Anne-Kathrin; Deussen, Andreas; Range, Ursula; Balestra, Costantino; Cleveland, Sinclair; Schipke, Jochen D

    2018-03-01

    Bubble formation during scuba diving might induce decompression sickness. This prospective, randomised and double-blind study included 108 advanced recreational divers (38 females). Fifty-four pairs of divers, one breathing air and the other breathing nitrox28, undertook a standardised dive (24 ± 1 msw; 62 ± 5 min) in the Red Sea. Venous gas bubbles were counted (Doppler) 30-<45 min (early) and 45-60 min (late) post-dive at jugular, subclavian and femoral sites. Only 7% (air) vs. 11% (air28®) of divers were bubble-free after a dive (n.s.). Independent of sampling time and breathing gas, there were more bubbles in the jugular than in the femoral vein. More bubbles were counted in the air group than in the air28 group (pooled vein: early: 1845 vs. 948, P = 0.047; late: 1817 vs. 953, P = 0.088). The number of bubbles was sex-dependent: 29% of female air divers but only 14% of male divers were bubble-free (P = 0.058). Air28® helps to reduce venous gas emboli in recreational divers. The bubble number depended on the breathing gas, sampling site and sex. Thus, both exact reporting of the dive and, in particular, standardised sampling characteristics seem mandatory for comparing results from different studies, to further investigate the hitherto incoherent relation between inert gas bubbles and DCS.

  10. Existence problem of proton semi-bubble structure in the 2_1^+ state of 34Si

    NASA Astrophysics Data System (ADS)

    Wu, Feng; Bai, C. L.; Yao, J. M.; Zhang, H. Q.; Zhang, X. Z.

    2017-09-01

    The fully self-consistent Hartree-Fock (HF) plus random phase approximation (RPA) based on Skyrme-type interactions is used to study the existence problem of a proton semi-bubble structure in the 2_1^+ state of 34Si. The experimental excitation energy and transition strength of the 2_1^+ state in 34Si can be reproduced quite well. The tensor effect is also studied: the tensor interaction has a notable impact on the excitation energy of the 2_1^+ state and a small effect on the B(E2) value, while its effect on the density distributions in the ground and 2_1^+ states of 34Si is negligible. Our present results with T36 and T44 show that the 2_1^+ state of 34Si is mainly generated by proton transitions from the π1d_{5/2} orbit to the π2s_{1/2} orbit, and the existence of a proton semi-bubble structure in this state is very unlikely.

  11. Automated vehicle counting using image processing and machine learning

    NASA Astrophysics Data System (ADS)

    Meany, Sean; Eskew, Edward; Martinez-Castro, Rosana; Jang, Shinae

    2017-04-01

    Vehicle counting is used by governments to improve roadways and the flow of traffic, and by private businesses for purposes such as determining the value of locating a new store in an area. A vehicle count can be performed manually or automatically. Manual counting requires an individual to be on-site and tally the traffic electronically or by hand; however, this can lead to miscounts due to factors such as human error. A common form of automatic counting involves pneumatic tubes, but pneumatic tubes disrupt traffic during installation and removal, and can be damaged by passing vehicles. Vehicle counting can also be performed using a camera at the count site recording video of the traffic, with counting performed manually post-recording or by automatic algorithms. This paper presents a low-cost procedure for automatic vehicle counting using remote video cameras with an automatic counting algorithm. The procedure utilizes a Raspberry Pi micro-computer to detect when a car is in a lane and generate an accurate count of vehicle movements. The method uses background subtraction to process the images and a machine learning algorithm to provide the count. This method avoids the fatigue issues encountered in manual video counting and prevents the disruption of roadways that occurs when installing pneumatic tubes.
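    The background-subtraction step can be illustrated with a self-contained sketch: subtract a reference background frame, threshold the difference into a foreground mask, and count sufficiently large 4-connected components. This is a toy illustration on tiny grey-level grids, not the paper's Raspberry Pi pipeline, and the threshold and size parameters are arbitrary assumptions.

```python
from collections import deque

def count_moving_objects(frame, background, thresh=30, min_size=3):
    """Background subtraction followed by connected-component counting.

    Pixels differing from the background by more than `thresh` form the
    foreground mask; 4-connected components with at least `min_size`
    pixels are counted as moving objects (e.g. vehicles in a lane).
    """
    h, w = len(frame), len(frame[0])
    fg = [[abs(frame[y][x] - background[y][x]) > thresh for x in range(w)]
          for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if fg[y][x] and not seen[y][x]:
                # Flood-fill one component and measure its size.
                size, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    size += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and fg[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if size >= min_size:
                    count += 1
    return count
```

    The `min_size` filter discards isolated noise pixels that survive thresholding.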

  12. A translating stage system for µ-PIV measurements surrounding the tip of a migrating semi-infinite bubble.

    PubMed

    Smith, B J; Yamaguchi, E; Gaver, D P

    2010-01-01

    We have designed, fabricated and evaluated a novel translating stage system (TSS) that augments a conventional micro particle image velocimetry (µ-PIV) system. The TSS has been used to enhance the ability to measure flow fields surrounding the tip of a migrating semi-infinite bubble in a glass capillary tube under both steady and pulsatile reopening conditions. With conventional µ-PIV systems, observations near the bubble tip are challenging because the forward progress of the bubble rapidly sweeps the air-liquid interface across the microscopic field of view. The translating stage mechanically cancels the mean bubble tip velocity, keeping the interface within the microscope field of view and providing a tenfold increase in data collection efficiency compared to fixed-stage techniques. This dramatic improvement allows nearly continuous observation of the flow field over long propagation distances. A large (136-frame) ensemble-averaged velocity field recorded with the TSS near the tip of a steadily migrating bubble is shown to compare well with fixed-stage results under identical flow conditions. Use of the TSS allows the ensemble-averaged measurement of pulsatile bubble propagation flow fields, which would be practically impossible using conventional fixed-stage techniques. We demonstrate our ability to analyze these time-dependent two-phase flows using the ensemble-averaged flow field at four points in the oscillatory cycle.

  13. Modeling Cryptosporidium spp. Oocyst Inactivation in Bubble-Diffuser Ozone Contactors

    DTIC Science & Technology

    1998-07-01

    requirements for Giardia lamblia (G. lamblia) and viruses under the Surface Water Treatment Rule (SWTR). Minimum CT requirements include relatively...parvum and C. muris) oocysts in ozone bubble-diffuser contactors. The model is calibrated with semi-batch kinetic data, verified with pilot-scale

  14. Cartilage formation in the CELLS 'double bubble' hardware

    NASA Technical Reports Server (NTRS)

    Duke, P. J.; Arizpe, Jorge; Montufar-Solis, Dina

    1991-01-01

    The CELLS experiment, scheduled to be flown on the first International Microgravity Laboratory, is designed to study the effect of microgravity on cartilage formation by measuring parameters of growth in a differentiating cartilage cell culture. This paper investigates the conditions for this experiment by studying cartilage differentiation in the 'bubble exchange' hardware with the 'double bubble' design, in which the bubbles are joined by a flange that also overlays the gasket. Four types of double bubbles (double gas-permeable membranes) were tested: injection-molded bubbles 0.01- and 0.005-in. thick, and compression-molded bubbles 0.015- and 0.01-in. thick. It was found that double bubble membranes of 0.005- and 0.010-in. thickness supported cartilage differentiation, while the 0.015-in. bubbles did not. It was also found that nodule count, used in this study as a parameter, is not the best measure of the amount of cartilage differentiation.

  15. Pulse shaping circuit for active counting of superheated emulsion

    NASA Astrophysics Data System (ADS)

    Murai, Ikuo; Sawamura, Teruko

    2005-08-01

    A pulse shaping circuit for active counting of superheated emulsions is described. A piezoelectric transducer senses bubble formation acoustically, and the acoustic signal is transformed into a shaped pulse for counting. The circuit has a short signal-processing time, on the order of 10 ms.

  16. Comparison between manual and semi-automatic segmentation of nasal cavity and paranasal sinuses from CT images.

    PubMed

    Tingelhoff, K; Moral, A I; Kunkel, M E; Rilk, M; Wagner, I; Eichhorn, K G; Wahl, F M; Bootz, F

    2007-01-01

    Segmentation of medical image data has become increasingly important in recent years. The results are used for diagnosis, surgical planning or workspace definition of robot-assisted systems. The purpose of this paper is to find out whether manual or semi-automatic segmentation is adequate for the ENT surgical workflow, or whether fully automatic segmentation of the paranasal sinuses and nasal cavity is needed. We present a comparison of manual and semi-automatic segmentation of the paranasal sinuses and the nasal cavity. Manual segmentation was performed with custom software, whereas semi-automatic segmentation was realized with a commercial product (Amira). For this study we used a CT dataset of the paranasal sinuses consisting of 98 transversal slices, each 1.0 mm thick, with a resolution of 512 x 512 pixels. For the analysis of both segmentation procedures we used volume, extension (width, length and height), segmentation time and 3D reconstruction. The segmentation time was reduced from 960 minutes with manual to 215 minutes with semi-automatic segmentation. The highest variances were found when segmenting the nasal cavity. For the paranasal sinuses, the volume differences between manual and semi-automatic segmentation were not significant. Depending on the required segmentation accuracy, both approaches deliver useful results and could be used, e.g., for robot-assisted systems. Nevertheless, neither procedure is useful for the everyday surgical workflow, because both take too much time. Fully automatic and reproducible segmentation algorithms are needed for segmentation of the paranasal sinuses and nasal cavity.

  17. Preliminary clinical evaluation of semi-automated nailfold capillaroscopy in the assessment of patients with Raynaud's phenomenon.

    PubMed

    Murray, Andrea K; Feng, Kaiyan; Moore, Tonia L; Allen, Phillip D; Taylor, Christopher J; Herrick, Ariane L

    2011-08-01

    Nailfold capillaroscopy is well established in screening patients with Raynaud's phenomenon for underlying SSc-spectrum disorders, by identifying abnormal capillaries. Our aim was to compare semi-automatic feature measurement from newly developed software with manual measurements, and to determine the degree to which semi-automated data allow disease group classification. Images from 46 healthy controls, 21 patients with PRP and 49 with SSc were preprocessed, and semi-automated measurements of intercapillary distance and capillary width, tortuosity and derangement were performed. These were compared with manual measurements, and the features were used to classify images into the three subject groups. Comparison of automatic and manual measures for distance, width, tortuosity and derangement gave correlations of r = 0.583, 0.624, 0.495 (p < 0.001) and 0.195 (p = 0.040), respectively. For the automatic measures, correlations were found between width and intercapillary distance, r = 0.374, and between width and tortuosity, r = 0.573 (p < 0.001). Significant differences between subject groups were found for all features (p < 0.002). Overall, 75% of images were correctly matched to the clinical classification using semi-automated features, compared with 71% for manual measurements. Semi-automatic and manual measurements of distance, width and tortuosity showed moderate (but statistically significant) correlations; the correlation for derangement was weaker. Semi-automatic measurements are faster than manual measurements, identify differences between groups, and are as good as manual measurements for between-group classification. © 2011 John Wiley & Sons Ltd.

  18. Multifractal-based nuclei segmentation in FISH images.

    PubMed

    Reljin, Nikola; Slavkovic-Ilic, Marijeta; Tapia, Coya; Cihoric, Nikola; Stankovic, Srdjan

    2017-09-01

    A method for nuclei segmentation in fluorescence in-situ hybridization (FISH) images, based on inverse multifractal analysis (IMFA), is proposed. From the blue channel of the FISH image in RGB format, the matrix of Hölder exponents, in one-to-one correspondence with the image pixels, is determined first. The following semi-automatic procedure is proposed: initial nuclei segmentation is performed automatically from the matrix of Hölder exponents by applying a predefined hard threshold; the user then evaluates the result and can refine the segmentation by changing the threshold, if necessary. After successful nuclei segmentation, the HER2 (human epidermal growth factor receptor 2) score can be determined in the usual way: by counting red and green dots within the segmented nuclei and finding their ratio. The IMFA segmentation method was tested on 100 clinical cases, evaluated by a skilled pathologist. The test results show that the new method has advantages over previously reported methods.
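    The threshold-then-refine loop and the dot-ratio scoring described above can be sketched as below. The Hölder-exponent matrix, the threshold value and the dot coordinates are all invented for illustration, and the helper names are hypothetical rather than from the published method.

```python
def segment_nuclei(holder, threshold):
    """Binary nuclei mask: pixels whose Holder exponent is at or below the
    (user-adjustable) hard threshold are taken as nuclei."""
    return [[value <= threshold for value in row] for row in holder]

def her2_ratio(red_dots, green_dots, mask):
    """Ratio of red (HER2) to green dot counts falling inside the
    segmented nuclei; dots are (row, col) coordinates."""
    red = sum(1 for (y, x) in red_dots if mask[y][x])
    green = sum(1 for (y, x) in green_dots if mask[y][x])
    return red / green
```

    In the semi-automatic workflow, `segment_nuclei` would simply be re-run with a new threshold whenever the user rejects the initial mask.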

  19. Development of advanced image analysis techniques for the in situ characterization of multiphase dispersions occurring in bioreactors.

    PubMed

    Galindo, Enrique; Larralde-Corona, C Patricia; Brito, Teresa; Córdova-Aguilar, Ma Soledad; Taboada, Blanca; Vega-Alvarado, Leticia; Corkidi, Gabriel

    2005-03-30

    Fermentation bioprocesses typically involve two liquid phases (i.e. water and organic compounds) and one gas phase (air), together with suspended solids (i.e. biomass), which are the components to be dispersed. Characterization of multiphase dispersions is required because it determines mass transfer efficiency and bioreactor homogeneity; it is also needed for the appropriate design of contacting equipment, helping to establish optimum operational conditions. This work describes the development of image-analysis-based techniques, with advantages in terms of data acquisition and processing, for the characterization of oil drop and bubble diameters in complex simulated fermentation broths. The system consists of fully digital acquisition of in situ images obtained from the inside of a mixing tank using a CCD camera synchronized with a stroboscopic light source; the images are processed with versatile commercial software. To improve the automation of particle recognition and counting, the Hough transform (HT) was used, so bubbles and oil drops were automatically detected and the processing time was reduced by 55% without losing accuracy with respect to a fully manual analysis. The system has been used for the detailed characterization of a number of operational conditions, including oil content, biomass morphology, presence of surfactants (such as proteins) and viscosity of the aqueous phase.
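    The circle-Hough voting step behind this kind of bubble and drop detection can be sketched in a few lines: each edge point votes for the candidate centres lying at a fixed radius around it, and accumulator peaks mark circle centres. This is a minimal single-radius sketch; practical implementations (including the improved variants in these papers) add a radius search, gradient direction and edge linking, all of which are omitted here.

```python
import math

def hough_circle_votes(edge_points, radius, n_angles=72):
    """Accumulate centre votes for circles of a known radius.

    Every edge point (x, y) votes for candidate centres on a circle of
    the given radius around it; true centres collect votes from many
    edge points and appear as peaks in the accumulator dictionary.
    """
    acc = {}
    for x, y in edge_points:
        for k in range(n_angles):
            theta = 2.0 * math.pi * k / n_angles
            centre = (round(x - radius * math.cos(theta)),
                      round(y - radius * math.sin(theta)))
            acc[centre] = acc.get(centre, 0) + 1
    return acc
```

    Applied to a synthetic circular bubble outline, the strongest accumulator cell recovers the true centre.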

  20. Automatic Detection of Mitosis and Nuclei From Cytogenetic Images by CellProfiler Software for Mitotic Index Estimation.

    PubMed

    González, Jorge Ernesto; Radl, Analía; Romero, Ivonne; Barquinero, Joan Francesc; García, Omar; Di Giorgio, Marina

    2016-12-01

    Mitotic index (MI) estimation, expressed as the percentage of cells in mitosis, plays an important role as a quality-control endpoint. To this end, MI is applied to check the lots of media and reagents to be used throughout the assay, and also to check cellular viability after blood-sample shipping, indicating satisfactory or unsatisfactory conditions for the progression of cell culture. The objective of this paper was to apply the open-source CellProfiler software to the automatic detection of mitotic figures and nuclei in digitized images of cultured human lymphocytes for MI assessment, and to compare its performance with semi-automatic and visual detection. Lymphocytes were irradiated and cultured for mitosis detection. Sets of images from the cultures were analyzed visually and the findings were compared with those obtained using CellProfiler. The CellProfiler pipeline detects nuclei and mitoses with 80% sensitivity and more than 99% specificity. We conclude that CellProfiler is a reliable tool for counting mitoses and nuclei in cytogenetic images; it saves considerable time compared with manual operation and reduces the variability derived from the scoring criteria of different scorers. The automated CellProfiler pipeline achieves good agreement with the visual counting workflow, i.e. it allows fully automated mitotic and nuclei scoring in cytogenetic images, yielding reliable information with minimal user intervention. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
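    Evaluating a detector against a visual (ground-truth) reading, as in the sensitivity and specificity figures quoted above, reduces to a confusion matrix. The sketch below assumes per-object labels have already been matched between the automatic and visual workflows; the label data are invented for illustration.

```python
def detection_metrics(predicted, actual):
    """Sensitivity and specificity from paired boolean labels.

    `predicted` and `actual` are equal-length sequences where True marks
    an object classified as a mitotic figure (detector output vs. the
    visual ground truth, respectively).
    """
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    tn = sum(1 for p, a in zip(predicted, actual) if not p and not a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    sensitivity = tp / (tp + fn)   # fraction of true mitoses detected
    specificity = tn / (tn + fp)   # fraction of non-mitoses correctly rejected
    return sensitivity, specificity
```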

  1. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method

    PubMed Central

    Veta, Mitko; van Diest, Paul J.; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P. W.

    2016-01-01

    Background Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. Methods The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an “external” dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. Results The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility. 
The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial agreement with human experts. PMID:27529701

  2. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method.

    PubMed

    Veta, Mitko; van Diest, Paul J; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P W

    2016-01-01

    Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an "external" dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility.
The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial agreement with human experts.
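The object-level agreement analysis described above compares observers on individual candidate objects rather than on totals. A minimal sketch of one common summary statistic for such comparisons, Cohen's kappa, computed on hypothetical binary labels (1 = mitosis, 0 = not) for ten candidate objects; this illustrates the statistic, not the paper's exact protocol:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary label sequences of equal length."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(1 for x, y in zip(a, b) if x == y) / n
    pa1 = sum(a) / n          # observer A's rate of positive labels
    pb1 = sum(b) / n          # observer B's rate of positive labels
    # chance agreement under independent labeling
    expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (observed - expected) / (1 - expected)

obs_a = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]   # hypothetical observer A
obs_b = [1, 0, 0, 0, 1, 0, 1, 1, 0, 0]   # hypothetical observer B
kappa = cohens_kappa(obs_a, obs_b)
```

Note that the two observers above report the same total (four mitoses each) while disagreeing on two individual objects, which is exactly the situation the study highlights: identical counts can hide object-level disagreement.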

  3. Automatic irradiation control by an optical feedback technique for selective retina treatment (SRT) in a rabbit model

    NASA Astrophysics Data System (ADS)

    Seifert, Eric; Roh, Young-Jung; Fritz, Andreas; Park, Young Gun; Kang, Seungbum; Theisen-Kunde, Dirk; Brinkmann, Ralf

    2013-06-01

    Selective Retina Therapy (SRT) targets the Retinal Pigment Epithelium (RPE) without affecting neighboring layers such as the photoreceptors or the choroid. SRT-related RPE defects are ophthalmoscopically invisible. Owing to this invisibility and to the variation of the threshold radiant exposure for RPE damage, the treating physician does not know whether the treatment was successful. Measurement techniques enabling correct dosing are therefore a demanded element in SRT devices. The acquired signal can be used for monitoring or for automatic irradiation control. Existing monitoring techniques are based on the detection of micro-bubbles, which are the origin of RPE cell damage for pulse durations in the ns and µs time regimes. The detection can be performed by optical or acoustical approaches. Monitoring based on an acoustical approach has already been used to study the beneficial effects of SRT on diabetic macular edema and central serous retinopathy. We have developed the first real-time feedback technique able to detect micro-bubble-induced characteristics in the backscattered laser light fast enough to cease the laser irradiation within a burst. To this end, the laser energy within a burst of at most 30 pulses is increased linearly with every pulse, and the irradiation is ceased as soon as micro-bubbles are detected. With this automatic approach it was possible to observe ophthalmoscopically invisible lesions, an intact photoreceptor layer and a reconstruction of the RPE within one week.

  4. AUTOMATIC COUNTING APPARATUS

    DOEpatents

    Howell, W.D.

    1957-08-20

    An apparatus for automatically recording the results of counting operations on trains of electrical pulses is described. The disadvantages of prior devices utilizing the two common methods of obtaining the count rate are overcome by this apparatus: in the case of time-controlled operation, the disclosed system automatically records any information stored by the scaler but not transferred to the printer at the end of the predetermined time-controlled operations, and, in the case of count-controlled operation, provision is made to prevent a weak sample from occupying the apparatus for an excessively long period of time.
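The two operating modes contrasted in the patent, time-controlled (count for a fixed interval) and count-controlled (time how long a preset count takes, with a timeout so a weak sample cannot tie up the apparatus), can be sketched as follows; the pulse timestamps and timeout value are hypothetical:

```python
def time_controlled(pulse_times, interval):
    """Counts recorded within a fixed measuring interval (seconds)."""
    return sum(1 for t in pulse_times if t <= interval)

def count_controlled(pulse_times, preset_count, timeout):
    """Time needed to accumulate a preset count; aborts at `timeout`
    so a weak sample cannot occupy the apparatus indefinitely."""
    if len(pulse_times) >= preset_count:
        t = pulse_times[preset_count - 1]
        if t <= timeout:
            return t
    return None  # timed out: report no result rather than wait forever

pulses = [0.1 * k for k in range(1, 101)]           # steady 10 pulses/s
n = time_controlled(pulses, 5.0)                    # 50 counts in 5 s
t = count_controlled(pulses, 30, timeout=60.0)      # about 3.0 s to 30 counts
weak = count_controlled(pulses, 500, timeout=60.0)  # too weak: None
```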

  5. Validation of semi-automatic segmentation of the left atrium

    NASA Astrophysics Data System (ADS)

    Rettmann, M. E.; Holmes, D. R., III; Camp, J. J.; Packer, D. L.; Robb, R. A.

    2008-03-01

    Catheter ablation therapy has become increasingly popular for the treatment of left atrial fibrillation. The effect of this treatment on left atrial morphology, however, has not yet been completely quantified. Initial studies have indicated a decrease in left atrial size with a concomitant decrease in pulmonary vein diameter. In order to effectively study if catheter based therapies affect left atrial geometry, robust segmentations with minimal user interaction are required. In this work, we validate a method to semi-automatically segment the left atrium from computed-tomography scans. The first step of the technique utilizes seeded region growing to extract the entire blood pool including the four chambers of the heart, the pulmonary veins, aorta, superior vena cava, inferior vena cava, and other surrounding structures. Next, the left atrium and pulmonary veins are separated from the rest of the blood pool using an algorithm that searches for thin connections between user defined points in the volumetric data or on a surface rendering. Finally, pulmonary veins are separated from the left atrium using a three dimensional tracing tool. A single user segmented three datasets three times using both the semi-automatic technique as well as manual tracing. The user interaction time for the semi-automatic technique was approximately forty-five minutes per dataset and the manual tracing required between four and eight hours per dataset depending on the number of slices. A truth model was generated using a simple voting scheme on the repeated manual segmentations. A second user segmented each of the nine datasets using the semi-automatic technique only. Several metrics were computed to assess the agreement between the semi-automatic technique and the truth model including percent differences in left atrial volume, DICE overlap, and mean distance between the boundaries of the segmented left atria. 
Overall, the semi-automatic approach was demonstrated to be repeatable within and between raters, and accurate when compared to the truth model. Finally, we generated a visualization to assess the spatial variability of the segmentation errors between the semi-automatic approach and the truth model. The visualization demonstrates that the highest errors occur at the boundaries between the left atrium and the pulmonary veins, as well as between the left atrium and the left atrial appendage. In conclusion, we describe a semi-automatic approach for left atrial segmentation that demonstrates repeatability and accuracy, with the advantage of a significant reduction in user interaction time.
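The agreement metrics named above (percent volume difference and DICE overlap) can be illustrated on small binary masks; the arrays below are synthetic stand-ins for segmented CT volumes, not data from the study:

```python
import numpy as np

def dice(a, b):
    """DICE overlap: 2|A and B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def percent_volume_diff(a, b):
    """Volume difference of mask a relative to truth mask b, in percent."""
    return 100.0 * (a.sum() - b.sum()) / b.sum()

truth = np.zeros((10, 10, 10), dtype=bool)
truth[2:8, 2:8, 2:8] = True        # truth model: 6x6x6 = 216 voxels
semi = np.zeros_like(truth)
semi[3:8, 2:8, 2:8] = True         # semi-automatic result: 180 voxels

d = dice(semi, truth)              # 2*180 / (180 + 216)
v = percent_volume_diff(semi, truth)
```

A mean boundary distance, the third metric in the study, would additionally require extracting the mask surfaces; it is omitted here to keep the sketch short.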

  6. Sensitivity of a bubble growth to the cheese material properties during ripening

    NASA Astrophysics Data System (ADS)

    Fokoua, G.; Grenier, D.; Lucas, T.

    2016-10-01

    In this study, a model of transport phenomena describes the growth of a single bubble in semi-hard cheese. Carbon dioxide production, its transport to the bubble interface, equilibrium laws and mechanics were coupled. Semi-hard cheese behaves mainly elastically when loads are applied quickly, as during chewing (a few seconds). However, when loaded slowly by increasing gas pressure during ripening in a warm room, the mechanical behavior of cheese can be modelled simply as that of a viscous material (Grenier et al. [9]). This holds as long as viscosity remains low compared to the rate of gas production. This paper investigates a wider range of viscosity (from η = 6.32 × 10^7 Pa·s at the core to η = 2.88 × 10^8 Pa·s at the rind) than that used in previous studies. FEM simulations have shown that the higher viscosities encountered close to the rind of a cheese block can partly explain the increase in gas pressure within bubbles from the core to the rind (up to 3.4 kPa). These results confirm that mechanics does not really control the evolution of bubble volume in cheese. However, mechanics can explain the greater pressure observed close to the rind, even though gas production there is lower than at the core.
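The role of viscosity can be illustrated with the standard Newtonian relation for a bubble growing in a viscous liquid, ΔP = 4η(dR/dt)/R. The two viscosities below come from the abstract, but the relative growth rate is hypothetical, so the resulting pressures are only an order-of-magnitude sketch:

```python
def viscous_overpressure(eta, rel_growth_rate):
    """Pressure excess (Pa) for dynamic viscosity eta (Pa*s) and
    relative growth rate (dR/dt)/R (1/s), Newtonian approximation."""
    return 4.0 * eta * rel_growth_rate

rel_rate = 3e-6                              # hypothetical (dR/dt)/R, 1/s
dp_core = viscous_overpressure(6.32e7, rel_rate)   # core viscosity
dp_rind = viscous_overpressure(2.88e8, rel_rate)   # rind viscosity
extra = dp_rind - dp_core   # extra overpressure near the rind, Pa
```

At the same growth rate the overpressure scales linearly with viscosity, which is the qualitative mechanism behind the core-to-rind pressure increase reported above.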

  7. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    NASA Astrophysics Data System (ADS)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    The High Resolution Imaging Science Experiment (HiRISE) onboard the Mars Reconnaissance Orbiter (MRO) has been used to monitor the seasonal evolution of several regions at high southern latitudes. Of particular interest have been jet-like activities that may result from the process described by Kieffer (2007), involving translucent CO2 ice. These jets are assumed to create fan-shaped ground features, as studied e.g. in Hansen et al. (2010) and Portyankina et al. (2010). In Thomas et al. (2009), a small region of interest (ROI) inside the south polar Inca City region (81° S, 296° E) was defined for which the seasonal change of the number of fans was determined. This ROI was chosen for its strong visual variability in ground features. The mostly manual counting work showed that the number of apparent fans increases monotonically for a considerable amount of time, from the beginning of the springtime observations at Ls of 178° until approximately 230°, following the increase of solar energy available for the aforementioned processes of the Kieffer model. This indicates that the number of visual fan features can be used as an activity measure for the seasonal evolution of this area, in addition to the commonly used evolution studies of surface reflectance. Motivated by these results, we would like to determine the fan count evolution for more south polar areas such as Ithaca, Manhattan, Giza and others. To increase the reproducibility of the results by avoiding potential variability in fan shape recognition by the human eye, and to increase production efficiency, efforts are being undertaken to automate the fan counting procedure. The techniques used, cleanly separated into different stages of the procedure, the difficulties at each stage, and an overview of the tools used at each step will be presented. After showing a proof of concept in Aye et al. (2010) for a ROI comparable to the one previously used for manual counting in Thomas et al. (2009), we now show results of these semi-automatically determined seasonal fan count evolutions for the Inca City, Ithaca and Manhattan ROIs, compare these evolutionary patterns with each other, and compare them with surface reflectance evolutions from both HiRISE and CRISM for the same locations. References: Aye, K.-M. et al. (2010), LPSC 2010, 2707; Hansen, C. et al. (2010), Icarus, 205(1), 283-295; Kieffer, H.H. (2007), JGR 112; Portyankina, G. et al. (2010), Icarus, 205(1), 311-320; Thomas, N. et al. (2009), EPSC Abstracts, Vol. 4, EPSC2009-478.

  8. Reevaluation of pollen quantitation by an automatic pollen counter.

    PubMed

    Muradil, Mutarifu; Okamoto, Yoshitaka; Yonekura, Syuji; Chazono, Hideaki; Hisamitsu, Minako; Horiguchi, Shigetoshi; Hanazawa, Toyoyuki; Takahashi, Yukie; Yokota, Kunihiko; Okumura, Satoshi

    2010-01-01

    Accurate and detailed pollen monitoring is useful for selection of medication and for allergen avoidance in patients with allergic rhinitis. Burkard and Durham pollen samplers are commonly used, but are labor and time intensive. In contrast, automatic pollen counters allow simple real-time pollen counting; however, these instruments have difficulty in distinguishing pollen from small nonpollen airborne particles. Misidentification and underestimation rates for an automatic pollen counter were examined to improve the accuracy of the pollen count. The characteristics of the automatic pollen counter were determined in a chamber study with exposure to cedar pollens or soil grains. The cedar pollen counts were monitored in 2006 and 2007, and compared with those from a Durham sampler. The pollen counts from the automatic counter showed a good correlation (r > 0.7) with those from the Durham sampler when pollen dispersal was high, but a poor correlation (r < 0.5) when pollen dispersal was low. The new correction method, which took into account the misidentification and underestimation, improved this correlation to r > 0.7 during the pollen season. The accuracy of automatic pollen counting can be improved using a correction to include rates of underestimation and misidentification in a particular geographical area.
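The abstract states that the correction accounts for misidentification (non-pollen particles counted as pollen) and underestimation (pollen the counter misses) but does not give the formula. One plausible form, with hypothetical rates, is:

```python
def corrected_count(raw_count, misid_rate, underest_rate):
    """Correct an automatic pollen count (illustrative formula).

    misid_rate: fraction of the raw count that is not actually pollen.
    underest_rate: fraction of true pollen the counter fails to record.
    """
    true_detected = raw_count * (1.0 - misid_rate)   # remove false positives
    return true_detected / (1.0 - underest_rate)     # rescale for misses

# hypothetical rates for a particular geographical area
c = corrected_count(200, misid_rate=0.15, underest_rate=0.20)
```

The paper's point is that both rates are area-specific and must be calibrated locally (here, via the chamber study and comparison with a Durham sampler).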

  9. Reliable enumeration of malaria parasites in thick blood films using digital image analysis.

    PubMed

    Frean, John A

    2009-09-23

    Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. 
The requirements (a digital microscope camera, a personal computer, and good-quality staining of slides) are reasonably easy to meet.
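The size-adaptive counting idea above can be sketched as connected-component labeling followed by a size-band filter around the median particle area; the tiny binary image and the band limits are illustrative, not the paper's parameters:

```python
def label_particles(img):
    """Areas of 4-connected components in a binary image (list of lists)."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for i in range(h):
        for j in range(w):
            if img[i][j] and not seen[i][j]:
                stack, area = [(i, j)], 0
                seen[i][j] = True
                while stack:               # iterative flood fill
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return areas

def count_in_size_band(areas, low=0.5, high=2.0):
    """Keep particles within [low, high] x median area (noise rejection)."""
    med = sorted(areas)[len(areas) // 2]
    return sum(1 for a in areas if low * med <= a <= high * med)

img = [[0, 1, 1, 0, 0, 0, 1, 0],
       [0, 1, 1, 0, 0, 0, 0, 0],
       [0, 0, 0, 0, 1, 1, 0, 0],
       [1, 0, 0, 0, 1, 1, 0, 0]]
areas = label_particles(img)       # two 4-pixel parasites, two specks
count = count_in_size_band(areas)  # specks fall outside the size band
```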

  10. Bubble measuring instrument and method

    NASA Technical Reports Server (NTRS)

    Magari, Patrick J. (Inventor); Kline-Schoder, Robert (Inventor)

    2003-01-01

    Method and apparatus are provided for a non-invasive bubble measuring instrument operable for detecting, distinguishing, and counting gaseous embolisms such as bubbles over a selectable range of bubble sizes of interest. A selected measurement volume in which bubbles may be detected is insonified by two distinct frequencies from a pump transducer and an image transducer, respectively. The image transducer frequency is much higher than the pump transducer frequency. The relatively low-frequency pump signal is used to excite bubbles to resonate at a frequency related to their diameter. The image transducer is operated in a pulse-echo mode at a controllable repetition rate that transmits bursts of high-frequency ultrasonic signal to the measurement volume in which bubbles may be detected and then receives the echo. From the echo or received signal, a beat signal related to the repetition rate may be extracted and used to indicate the presence or absence of a resonant bubble. In a preferred embodiment, software control maintains the beat signal at a preselected frequency while varying the pump transducer frequency to excite bubbles of different diameters to resonate depending on the range of bubble diameters selected for investigation.
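The relation between pump frequency and resonant bubble size is commonly modeled by the Minnaert resonance formula, f0 = (1/2πR)·sqrt(3γp0/ρ); the patent does not state its exact model, so the following is only an illustrative sketch for an air bubble in water:

```python
import math

def minnaert_freq(radius_m, p0=101325.0, gamma=1.4, rho=1000.0):
    """Approximate resonance frequency (Hz) of a gas bubble in a liquid.

    p0: ambient pressure (Pa), gamma: gas polytropic exponent,
    rho: liquid density (kg/m^3). Surface tension is neglected.
    """
    return math.sqrt(3.0 * gamma * p0 / rho) / (2.0 * math.pi * radius_m)

f = minnaert_freq(1e-3)   # a 1 mm-radius air bubble in water
```

Sweeping the pump frequency, as the instrument does, therefore sweeps the resonant bubble radius inversely: halving the radius roughly doubles the resonance frequency.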

  11. Bubble Measuring Instrument and Method

    NASA Technical Reports Server (NTRS)

    Kline-Schoder, Robert (Inventor); Magari, Patrick J. (Inventor)

    2002-01-01

    Method and apparatus are provided for a non-invasive bubble measuring instrument operable for detecting, distinguishing, and counting gaseous embolisms such as bubbles over a selectable range of bubble sizes of interest. A selected measurement volume in which bubbles may be detected is insonified by two distinct frequencies from a pump transducer and an image transducer. respectively. The image transducer frequency is much higher than the pump transducer frequency. The relatively low-frequency pump signal is used to excite bubbles to resonate at a frequency related to their diameter. The image transducer is operated in a pulse-echo mode at a controllable repetition rate that transmits bursts of high-frequency ultrasonic signal to the measurement volume in which bubbles may be detected and then receives the echo. From the echo or received signal, a beat signal related to the repetition rate may be extracted and used to indicate the presence or absence of a resonant bubble. In a preferred embodiment, software control maintains the beat signal at a preselected frequency while varying the pump transducer frequency to excite bubbles of different diameters to resonate depending on the range of bubble diameters selected for investigation.

  12. Bubble vector in automatic merging

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Butler, T. G.

    1987-01-01

    It is shown that it is within the capability of the DMAP language to build a set of vectors that can grow incrementally to be applied automatically and economically within a DMAP loop that serves to append sub-matrices that are generated within a loop to a core matrix. The method of constructing such vectors is explained.
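DMAP itself operates on NASTRAN data blocks, but the append-in-a-loop pattern described above, growing a core matrix by sub-matrices generated inside a loop, can be sketched in numpy; the loop body standing in for the sub-matrix generator is hypothetical:

```python
import numpy as np

core = np.zeros((2, 0))                 # empty core matrix with 2 rows
for k in range(3):                      # loop generating sub-matrices
    sub = np.full((2, 2), float(k))     # stand-in generated sub-matrix
    core = np.hstack([core, sub])       # append to the growing core

# core is now 2 x 6, columns [0, 0, 1, 1, 2, 2]
```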

  13. Automatic counting and classification of bacterial colonies using hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Detection and counting of bacterial colonies on agar plates is a routine microbiology practice to get a rough estimate of the number of viable cells in a sample. There have been a variety of different automatic colony counting systems and software algorithms mainly based on color or gray-scale pictu...

  14. Application of a semi-automatic cartilage segmentation method for biomechanical modeling of the knee joint.

    PubMed

    Liukkonen, Mimmi K; Mononen, Mika E; Tanska, Petri; Saarakkala, Simo; Nieminen, Miika T; Korhonen, Rami K

    2017-10-01

    Manual segmentation of articular cartilage from knee joint 3D magnetic resonance images (MRI) is a time consuming and laborious task. Thus, automatic methods are needed for faster and reproducible segmentations. In the present study, we developed a semi-automatic segmentation method based on radial intensity profiles to generate 3D geometries of knee joint cartilage which were then used in computational biomechanical models of the knee joint. Six healthy volunteers were imaged with a 3T MRI device and their knee cartilages were segmented both manually and semi-automatically. The values of cartilage thicknesses and volumes produced by these two methods were compared. Furthermore, the influences of possible geometrical differences on cartilage stresses and strains in the knee were evaluated with finite element modeling. The semi-automatic segmentation and 3D geometry construction of one knee joint (menisci, femoral and tibial cartilages) was approximately two times faster than with manual segmentation. Differences in cartilage thicknesses, volumes, contact pressures, stresses, and strains between segmentation methods in femoral and tibial cartilage were mostly insignificant (p > 0.05) and random, i.e. there were no systematic differences between the methods. In conclusion, the devised semi-automatic segmentation method is a quick and accurate way to determine cartilage geometries; it may become a valuable tool for biomechanical modeling applications with large patient groups.

  15. A multiparametric assay for quantitative nerve regeneration evaluation.

    PubMed

    Weyn, B; van Remoortere, M; Nuydens, R; Meert, T; van de Wouwer, G

    2005-08-01

    We introduce an assay for the semi-automated quantification of nerve regeneration by image analysis. Digital images of histological sections of regenerated nerves are recorded using an automated inverted microscope and merged into high-resolution mosaic images representing the entire nerve. These are analysed by a dedicated image-processing package that computes nerve-specific features (e.g. nerve area, fibre count, myelinated area) and fibre-specific features (area, perimeter, myelin sheath thickness). The assay's performance and the correlation of the automatically computed data with visually obtained data are determined on a set of 140 semithin sections from the distal part of a rat tibial nerve from four different experimental treatment groups (control, sham, sutured, cut) taken at seven different time points after surgery. Results show a high correlation between the manually and automatically derived data, and a high discriminative power towards treatment. Extra value is added by the large feature set. In conclusion, the assay is fast and offers data that currently can be obtained only by a combination of laborious and time-consuming tests.

  16. Experimental Study for Automatic Colony Counting System Based Onimage Processing

    NASA Astrophysics Data System (ADS)

    Fang, Junlong; Li, Wenzhe; Wang, Guoxin

    Colony counting in many colony experiments is currently performed manually, which makes it difficult to carry out quickly and accurately. A new automatic colony counting system was developed. Making use of image-processing technology, a study was made on the feasibility of objectively distinguishing white bacterial colonies from clear plates according to RGB color theory. An optimal chromatic value was obtained based upon a large number of experiments on the distribution of the chromatic value. It has been proved that the method greatly improves the accuracy and efficiency of colony counting, and that the counting result is not affected by the inoculation method or by the shape or size of the colony. It is revealed that automatic detection of colony quantity using image-processing technology could be an effective approach.
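The RGB-based discrimination above can be sketched as a per-pixel brightness-plus-chroma test: a pixel belongs to a white colony when all three channels are bright and nearly equal. The threshold values here are hypothetical, not the paper's optimal chromatic value:

```python
def is_white_colony_pixel(r, g, b, brightness=180, max_spread=25):
    """True for bright, low-chroma (near-white) pixels on a clear plate."""
    darkest = min(r, g, b)
    spread = max(r, g, b) - darkest      # chroma proxy: channel spread
    return darkest >= brightness and spread <= max_spread

pixels = [(230, 228, 225),   # white colony: bright, channels nearly equal
          (120, 60, 40),     # agar background: dark, reddish
          (200, 190, 250)]   # glare/artifact: bright but strongly tinted
flags = [is_white_colony_pixel(*p) for p in pixels]
```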

  17. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    PubMed Central

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure for recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]. These methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods, such as [3] and [4], require user input prior to the automatic segmentation and are inherently different from our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other examples in publication. PMID:25769273
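Accepting or rejecting merges from a hierarchy of recommendations amounts to building connected groups of superpixels. A minimal union-find sketch, with hypothetical region ids and user decisions standing in for the guided-correction step:

```python
class UnionFind:
    """Disjoint-set structure tracking merged superpixel groups."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

# Suggested merges in recommendation order; the user confirms or rejects.
suggested = [(0, 1, True), (1, 2, True), (3, 4, False), (4, 5, True)]
uf = UnionFind(6)
for a, b, accepted in suggested:
    if accepted:
        uf.union(a, b)

segments = len({uf.find(i) for i in range(6)})   # distinct neurites left
```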

  18. Automatic, time-interval traffic counts for recreation area management planning

    Treesearch

    D. L. Erickson; C. J. Liu; H. K. Cordell

    1980-01-01

    Automatic, time-interval recorders were used to count directional vehicular traffic on a multiple entry/exit road network in the Red River Gorge Geological Area, Daniel Boone National Forest. Hourly counts of entering and exiting traffic differed according to recorder location, but an aggregated distribution showed a delayed peak in exiting traffic thought to be...

  19. Molecular Dynamics Investigation of Each Bubble Behavior in Coarsening of Cavitation Bubbles in a Finite Space

    NASA Astrophysics Data System (ADS)

    Tsuda, Shin-Ichi; Nakano, Yuta; Watanabe, Satoshi

    2017-11-01

    Recently, several studies using Molecular Dynamics (MD) simulation have investigated Ostwald ripening of cavitation bubbles in a finite space. These previous studies focused on a characteristic length of the bubbles as one of the spatially-averaged quantities, but the behavior of each bubble was not investigated in detail. The objective of this study is to clarify the characteristics of individual bubble behavior in Ostwald ripening, and we conducted MD simulation of a Lennard-Jones fluid in a semi-confined space. As a result, the time dependency of the characteristic length of bubbles as a spatially-averaged quantity suggested that the driving force of the Ostwald ripening is Evaporation/Condensation (EC) across the liquid-vapor surface, the same result as in the previous works. The radius change of the relatively larger bubbles also showed the same tendency as a classical EC model. However, bubbles sufficiently smaller than the critical size, e.g. bubbles just before collapsing, showed a different characteristic from the classical EC model. These smaller bubbles tend to be limited by mechanical non-equilibrium, in which the viscosity of the liquid is dominant, rather than by EC across the liquid-vapor surface. This work was supported by JSPS KAKENHI Grant Number JP16K06085.

  20. Inhibition of bubble coalescence: effects of salt concentration and speed of approach.

    PubMed

    Del Castillo, Lorena A; Ohnishi, Satomi; Horn, Roger G

    2011-04-01

    Bubble coalescence experiments have been performed using a sliding bubble apparatus, in which mm-sized bubbles in an aqueous electrolyte solution without added surfactant rose toward an air meniscus at different speeds obtained by varying the inclination of a closed glass cylinder containing the liquid. The coalescence times of single bubbles contacting the meniscus were monitored using a high speed camera. Results clearly show that stability against coalescence of colliding air bubbles is influenced by both the salt concentration and the approach speed of the bubbles. Contrary to the widespread belief that bubbles in pure water are unstable, we demonstrate that bubbles formed in highly purified water and colliding with the meniscus at very slow approach speeds can survive for minutes or even hours. At higher speeds, bubbles in water only survive for a few seconds, and at still higher speeds they coalesce instantly. Addition of a simple electrolyte (KCl) removes the low-speed stability and shifts the transition between transient stability and instant coalescence to higher approach speeds. At high electrolyte concentration no bubbles were observed to coalesce instantly. These observations are consistent with recent results of Yaminsky et al. (Langmuir 26 (2010) 8061) and the transitions between different regions of behavior are in semi-quantitative agreement with Yaminsky's model. Copyright © 2010 Elsevier Inc. All rights reserved.

  1. Size Control of Sessile Microbubbles for Reproducibly Driven Acoustic Streaming

    NASA Astrophysics Data System (ADS)

    Volk, Andreas; Kähler, Christian J.

    2018-05-01

    Acoustically actuated bubbles are receiving growing interest in microfluidic applications, as they induce a streaming field that can be used for particle sorting and fluid mixing. An essential but often unspoken challenge in such applications is to maintain a constant bubble size in order to achieve reproducible conditions. We present an automated system for the size control of a cylindrical bubble that is formed at a blind side pit of a polydimethylsiloxane microchannel. Using a pressure control system, we adapt the protrusion depth of the bubble into the microchannel to a precision of approximately 0.5 μm on a timescale of seconds. By comparing the streaming fields generated by bubbles of width 80 μm with protrusion depths between -12 and 60 μm, we find that the mean velocity of the induced streaming field varies by more than a factor of 4. We also find a qualitative change in the topology of the streaming field. Both observations confirm the importance of the bubble size control system for achieving reproducible and reliable bubble-driven streaming experiments.
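The pressure-based size control can be sketched as a feedback loop driving the measured protrusion depth to a setpoint. The proportional control law and all constants below are assumptions for illustration; the abstract does not specify the actual controller:

```python
def control_protrusion(depth_um, target_um, gain=0.8, steps=50):
    """Iterate a proportional update toward the target protrusion depth.

    The gain*error term stands in for the pressure actuator's response;
    returns the final protrusion depth in micrometers.
    """
    for _ in range(steps):
        error = target_um - depth_um
        depth_um += gain * error
    return depth_um

# drive a retracted bubble (-12 um) to a 40 um protrusion setpoint
final = control_protrusion(depth_um=-12.0, target_um=40.0)
```

With this gain the error shrinks by a factor of five per step, so the loop settles well within the roughly 0.5 μm precision quoted above.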

  2. Synergistic effect of microbubble emulsion and sonic or ultrasonic agitation on endodontic biofilm in vitro.

    PubMed

    Halford, Andrew; Ohl, Claus-Dieter; Azarpazhooh, Amir; Basrani, Bettina; Friedman, Shimon; Kishen, Anil

    2012-11-01

    Irrigation dynamics and antibacterial activity determine the efficacy of root canal disinfection. Sonic or ultrasonic agitation of irrigants is expected to improve irrigation dynamics. This study examined the effects of microbubble emulsion (ME) combined with sonic or ultrasonic agitation on irrigation dynamics and reduction of biofilm bacteria within root canal models. Two experiments were conducted. First, high-speed imaging was used to characterize the bubble dynamics generated in ME by sonic or ultrasonic agitation within canals of polymer tooth models. Second, 5.25% NaOCl irrigation or ME was sonically or ultrasonically agitated in canals of extracted teeth with 7-day-grown Enterococcus faecalis biofilms. Dentinal shavings from canal walls were sampled at 1 mm and 3 mm from the apical terminus, and colony-forming units (CFUs) were enumerated. Mean log CFU/mL values were analyzed with analysis of variance and post hoc tests. High-speed imaging demonstrated strongly oscillating and vaporizing bubbles generated within ME during ultrasonic but not sonic agitation. Compared with CFU counts in controls, NaOCl-sonic and NaOCl-ultrasonic yielded significantly lower counts (P < .05) at both measurement levels. ME-sonic yielded significantly lower counts (P = .002) at 3 mm, whereas ME-ultrasonic yielded highly significantly lower counts (P = .000) at both measurement levels. At 3 mm, ME-ultrasonic yielded significantly lower CFU counts (P = .000) than ME-sonic, NaOCl-sonic, and NaOCl-ultrasonic. Enhanced bubble dynamics and reduced E. faecalis biofilm bacteria beyond the level achieved by sonic or ultrasonic agitation of NaOCl suggested a synergistic effect of ME combined with ultrasonic agitation. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  3. Automatic vehicle counting using background subtraction method on gray scale images and morphology operation

    NASA Astrophysics Data System (ADS)

    Adi, K.; Widodo, A. P.; Widodo, C. E.; Pamungkas, A.; Putranto, A. B.

    2018-05-01

Traffic on roads needs to be monitored, which requires counting the number of vehicles passing along a road; this is especially important for highway transportation management. It is therefore necessary to develop a system that can count the number of vehicles automatically. Video processing methods make such automatic counting possible. This research developed a vehicle-counting system for a toll road. The system comprises video acquisition, frame extraction, and image processing of each frame. Video acquisition was conducted in the morning, at noon, in the afternoon, and in the evening. The system employs background subtraction and morphology methods on gray-scale images for vehicle counting. The best counting results were obtained in the morning, with an accuracy of 86.36%, whereas the lowest accuracy, 21.43%, occurred in the evening. The difference between the morning and evening results is caused by the different illumination, which changes the pixel values in the images.
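The pipeline described, background subtraction on gray-scale frames followed by a morphological opening and blob counting, can be sketched as follows (a minimal illustration assuming a fixed camera and a median background model; the threshold and 3x3 kernel are illustrative choices, not the authors' parameters):

```python
import numpy as np

def binary_erode(mask):
    """3x3 erosion: a pixel survives only if its whole 3x3 neighborhood is set."""
    p = np.pad(mask, 1, constant_values=False)
    out = np.ones_like(mask, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy:p.shape[0] - 1 + dy, 1 + dx:p.shape[1] - 1 + dx]
    return out

def binary_dilate(mask):
    """3x3 dilation: a pixel is set if any 3x3 neighbor is set."""
    p = np.pad(mask, 1, constant_values=False)
    out = np.zeros_like(mask, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:p.shape[0] - 1 + dy, 1 + dx:p.shape[1] - 1 + dx]
    return out

def count_vehicles(frames, current, thresh=50):
    """Count foreground blobs in `current` against a median background model."""
    background = np.median(np.stack(frames), axis=0)
    fg = np.abs(current.astype(float) - background) > thresh
    fg = binary_dilate(binary_erode(fg))      # morphological opening removes specks
    # count 4-connected foreground blobs with an iterative flood fill
    seen = np.zeros_like(fg, dtype=bool)
    count = 0
    for y, x in zip(*np.nonzero(fg)):
        if seen[y, x]:
            continue
        count += 1
        stack = [(y, x)]
        while stack:
            cy, cx = stack.pop()
            if (0 <= cy < fg.shape[0] and 0 <= cx < fg.shape[1]
                    and fg[cy, cx] and not seen[cy, cx]):
                seen[cy, cx] = True
                stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return count
```

With an empty background and a frame containing two bright 3x3 "vehicles", the sketch returns a count of 2.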

  4. Study on ambient noise generated from breaking waves simulated by a wave maker in a tank

    NASA Astrophysics Data System (ADS)

    Wei, Ruey-Chang; Chan, Hsiang-Chih

    2002-11-01

This paper studies ambient noise in the surf zone, simulated by a piston-type wave maker in a tank. The experiment analyzed the bubbles of a breaking wave by using a hydrophone to receive the acoustic signal, and images of the bubbles were recorded by a digital video camera to observe their distribution. The slope of the simulated seabed is 1:5, and the dimensions of the water tank are 35 m × 1 m × 1.2 m. The studied parameters of ambient noise generated by breaking-wave bubbles were wave height, period, and water depth. A short-time Fourier transform was applied to obtain the acoustic spectrum of the bubbles, and MATLAB programs were used to calculate the mean sound pressure level and determine the number of bubbles. Bubbles with resonant frequencies from 0.5 to 10 kHz were studied, counted from peaks in the spectrum. The number of bubbles generated by breaking waves could be estimated from the bubble energy distributions. The sound pressure level of ambient noise was highly related to the wave height and period, with a correlation coefficient of 0.7.
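Relating spectral peaks to bubble size typically rests on the Minnaert resonance; a quick sketch of that standard conversion (textbook formula, not the authors' MATLAB code) shows that the 0.5-10 kHz band corresponds to millimetre-scale bubbles:

```python
import math

def minnaert_radius(f_hz, p0=101325.0, rho=1000.0, gamma=1.4):
    """Equilibrium radius (m) of an air bubble whose Minnaert resonance is f_hz.

    f0 = (1 / (2*pi*a)) * sqrt(3*gamma*p0 / rho), solved here for a.
    """
    return math.sqrt(3.0 * gamma * p0 / rho) / (2.0 * math.pi * f_hz)

print(minnaert_radius(500))    # ≈ 6.6 mm at the low end of the band
print(minnaert_radius(10000))  # ≈ 0.33 mm at the high end
```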

  5. An evaluation of automatic coronary artery calcium scoring methods with cardiac CT using the orCaScore framework.

    PubMed

    Wolterink, Jelmer M; Leiner, Tim; de Vos, Bob D; Coatrieux, Jean-Louis; Kelm, B Michael; Kondo, Satoshi; Salgado, Rodrigo A; Shahzad, Rahil; Shu, Huazhong; Snoeren, Miranda; Takx, Richard A P; van Vliet, Lucas J; van Walsum, Theo; Willems, Tineke P; Yang, Guanyu; Zheng, Yefeng; Viergever, Max A; Išgum, Ivana

    2016-05-01

    The amount of coronary artery calcification (CAC) is a strong and independent predictor of cardiovascular disease (CVD) events. In clinical practice, CAC is manually identified and automatically quantified in cardiac CT using commercially available software. This is a tedious and time-consuming process in large-scale studies. Therefore, a number of automatic methods that require no interaction and semiautomatic methods that require very limited interaction for the identification of CAC in cardiac CT have been proposed. Thus far, a comparison of their performance has been lacking. The objective of this study was to perform an independent evaluation of (semi)automatic methods for CAC scoring in cardiac CT using a publicly available standardized framework. Cardiac CT exams of 72 patients distributed over four CVD risk categories were provided for (semi)automatic CAC scoring. Each exam consisted of a noncontrast-enhanced calcium scoring CT (CSCT) and a corresponding coronary CT angiography (CCTA) scan. The exams were acquired in four different hospitals using state-of-the-art equipment from four major CT scanner vendors. The data were divided into 32 training exams and 40 test exams. A reference standard for CAC in CSCT was defined by consensus of two experts following a clinical protocol. The framework organizers evaluated the performance of (semi)automatic methods on test CSCT scans, per lesion, artery, and patient. Five (semi)automatic methods were evaluated. Four methods used both CSCT and CCTA to identify CAC, and one method used only CSCT. The evaluated methods correctly detected between 52% and 94% of CAC lesions with positive predictive values between 65% and 96%. Lesions in distal coronary arteries were most commonly missed and aortic calcifications close to the coronary ostia were the most common false positive errors. The majority (between 88% and 98%) of correctly identified CAC lesions were assigned to the correct artery. 
Linearly weighted Cohen's kappa for patient CVD risk categorization by the evaluated methods ranged from 0.80 to 1.00. A publicly available standardized framework for the evaluation of (semi)automatic methods for CAC identification in cardiac CT is described. An evaluation of five (semi)automatic methods within this framework shows that automatic per patient CVD risk categorization is feasible. CAC lesions at ambiguous locations such as the coronary ostia remain challenging, but their detection had limited impact on CVD risk determination.
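The patient-level agreement statistic used above, linearly weighted Cohen's kappa, can be sketched as follows (a generic implementation with illustrative ratings, not the orCaScore evaluation code):

```python
def linear_weighted_kappa(a, b, k):
    """Linearly weighted Cohen's kappa for two raters over k ordered categories 0..k-1."""
    n = len(a)
    # observed weighted disagreement, with weight |i-j|/(k-1)
    obs = sum(abs(x - y) / (k - 1) for x, y in zip(a, b)) / n
    # chance-expected weighted disagreement from the marginal distributions
    pa = [a.count(c) / n for c in range(k)]
    pb = [b.count(c) / n for c in range(k)]
    exp = sum(pa[i] * pb[j] * abs(i - j) / (k - 1)
              for i in range(k) for j in range(k))
    return 1.0 - obs / exp

# Illustrative risk categories (0-2) assigned by two methods to four patients:
print(linear_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 1], k=3))  # ≈ 0.714
```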

  6. Morphological Studies of Rising Equatorial Spread F Bubbles

    DTIC Science & Technology

    1977-11-01

depletions. In the present paper, we wish to discuss equatorial Spread F bubble shapes and vertical rise rates within the context of the collisional...simulation results are needed to ascertain which model fits best. All of the models described in this paper, based on collisional Rayleigh-Taylor type...Analysis of Barium Clouds - Semi-Annual Technical Report, RADC-TR-72-103, Vol. I, Avco Everett Research Laboratory, Everett, Mass., January 1972

  7. Ejection of Metal Particles into Superfluid 4He by Laser Ablation.

    PubMed

    Buelna, Xavier; Freund, Adam; Gonzalez, Daniel; Popov, Evgeny; Eloranta, Jussi

    2016-10-05

The dynamics following laser ablation of a metal target immersed in superfluid 4He is studied by time-resolved shadowgraph photography. The delayed ejection of hot micrometer-sized particles from the target surface into the liquid was indirectly observed by monitoring the formation and growth of gaseous bubbles around the particles. The experimentally determined average particle velocity distribution appears similar to that previously measured in vacuum but exhibits a sharp cutoff at the speed of sound of the liquid. The propagation of the subsonic particles terminates in slightly elongated non-spherical gas bubbles residing near the target, whereas faster particles reveal an unusual hydrodynamic response of the liquid. Based on the previously established semi-empirical model developed for macroscopic objects, the ejected transonic particles exhibit supercavitating flow to reduce their hydrodynamic drag. Supersonic particles appear to follow a completely different propagation mechanism, as they leave discrete and semi-continuous bubble trails in the liquid. The relatively low number density of the observed non-spherical gas bubbles indicates that only large micron-sized particles are visualized in the experiments. Although the unique properties of superfluid helium allow a detailed characterization of these processes, the developed technique can be used to study the hydrodynamic response of any liquid to fast propagating objects on the micrometer scale.

  8. Semi Automatic Ontology Instantiation in the domain of Risk Management

    NASA Astrophysics Data System (ADS)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a Semi-Automatic Ontology Instantiation method from natural language text in the domain of Risk Management. The method is composed of three steps: 1) annotation with part-of-speech tags, 2) extraction of semantic relation instances, and 3) the ontology instantiation process. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European community) as a Generic Domain Ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.
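The middle step of the pipeline, extracting semantic relation instances from text before a human validates them, can be caricatured with a pattern-based sketch (a toy illustration only, not the authors' NLP machinery; the "causes" pattern, field names, and ontology dict are all invented for this example):

```python
import re

def extract_relations(text):
    """Toy step-2 extractor: find 'X causes Y' patterns as candidate relations."""
    pattern = re.compile(r"(\w+(?: \w+)?) causes (\w+(?: \w+)?)")
    return [{"subject": s, "relation": "causes", "object": o}
            for s, o in pattern.findall(text)]

def instantiate(ontology, relations):
    """Toy step 3: add (human-validated) relation instances to an ontology dict."""
    ontology.setdefault("instances", []).extend(relations)
    return ontology

rels = extract_relations("Chemical exposure causes skin irritation.")
onto = instantiate({"domain": "RiskManagement"}, rels)
print(onto["instances"])
```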

  9. The influence of polymeric membrane gas spargers on hydrodynamics and mass transfer in bubble column bioreactors.

    PubMed

    Tirunehe, Gossaye; Norddahl, B

    2016-04-01

The gas sparging performance of flat-sheet and tubular polymeric membranes was investigated in a 3.1 m bubble column bioreactor operated in semi-batch mode. Air-water and air-CMC (carboxymethyl cellulose) solutions of 0.5, 0.75 and 1.0% w/w were used as the interacting gas-liquid media. The CMC solutions were employed to simulate the rheological properties of bioreactor broth. Gas holdup, bubble size distribution, interfacial area and gas-liquid mass transfer were studied in the homogeneous bubbly-flow hydrodynamic regime with a superficial gas velocity (U(G)) range of 0.0004-0.0025 m/s. The study indicated that the tubular membrane sparger produced the highest gas holdup and densely populated fine bubbles with a narrow size distribution. An increase in liquid viscosity promoted a shift in the bubble size distribution toward large stable bubbles and a smaller specific interfacial area. The tubular membrane sparger achieved a greater interfacial area and an enhanced overall mass transfer coefficient (K(L)a) by a factor of 1.2-1.9 compared to the flat-sheet membrane.
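The specific interfacial area studied above is commonly estimated from the gas holdup and the Sauter mean bubble diameter via a = 6·εG/d32; a quick sketch of that standard relation (illustrative numbers, not the paper's measurements):

```python
def interfacial_area(gas_holdup, d32):
    """Specific gas-liquid interfacial area a = 6*eps_G/d32 (m^2 per m^3 of dispersion)."""
    return 6.0 * gas_holdup / d32

# Hypothetical operating point: 10% gas holdup with 2 mm Sauter-mean bubbles.
print(interfacial_area(0.10, 0.002))  # ≈ 300 m^2/m^3
```

Finer bubbles at the same holdup raise the area in direct proportion, which is why the fine-bubble tubular sparger achieves the larger interfacial area.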

  10. Three-dimensional features on oscillating microbubbles streaming flows

    NASA Astrophysics Data System (ADS)

    Rossi, Massimiliano; Marin, Alvaro G.; Wang, Cheng; Hilgenfeldt, Sascha; Kähler, Christian J.

    2013-11-01

Ultrasound-driven oscillating microbubbles have been used as active actuators in microfluidic devices to perform manifold tasks such as mixing, sorting and manipulation of microparticles. A common configuration consists of side bubbles, created by trapping air pockets in blind channels perpendicular to the main channel direction. This configuration results in bubbles with a semi-cylindrical shape that create a streaming flow generally considered quasi-two-dimensional. However, recent experiments performed with three-dimensional velocimetry methods have shown that microparticles can follow significant three-dimensional trajectories, especially in regions close to the bubble interface. Several reasons will be discussed, such as boundary effects of the bottom/top wall, deformation of the bubble interface leading to more complex vibrational modes, or bubble-particle interactions. In the present investigation, precise measurements of particle trajectories close to the bubble interface will be performed by means of 3D Astigmatic Particle Tracking Velocimetry. The results will allow us to characterize quantitatively the three-dimensional features of the streaming flow and to estimate its implications for practical applications such as particle trapping, sorting or mixing.

  11. BubbleGUM: automatic extraction of phenotype molecular signatures and comprehensive visualization of multiple Gene Set Enrichment Analyses.

    PubMed

    Spinelli, Lionel; Carpentier, Sabrina; Montañana Sanchis, Frédéric; Dalod, Marc; Vu Manh, Thien-Phong

    2015-10-19

Recent advances in the analysis of high-throughput expression data have led to the development of tools that scale up their focus from the single-gene to the gene-set level. For example, the popular Gene Set Enrichment Analysis (GSEA) algorithm can detect moderate but coordinated expression changes of groups of presumably related genes between pairs of experimental conditions. This considerably improves the extraction of information from high-throughput gene expression data. However, although many gene sets covering a large panel of biological fields are available in public databases, the ability to generate home-made gene sets relevant to one's biological question is crucial but remains a substantial challenge for most biologists lacking statistical or bioinformatic expertise. This is all the more the case when attempting to define a gene set specific to one condition compared to many others. Thus, there is a crucial need for easy-to-use software for the generation of relevant home-made gene sets from complex datasets, their use in GSEA, and the correction of the results when applied to multiple comparisons of many experimental conditions. We developed BubbleGUM (GSEA Unlimited Map), a tool that automatically extracts molecular signatures from transcriptomic data and performs exhaustive GSEA with multiple-testing correction. One original feature of BubbleGUM resides in its capacity to integrate and compare numerous GSEA results in an easy-to-grasp graphical representation. We applied our method to generate transcriptomic fingerprints for murine cell types and to assess their enrichment in human cell types. This analysis allowed us to confirm homologies between mouse and human immunocytes. BubbleGUM is open-source software that automatically generates molecular signatures out of complex expression datasets and directly assesses their enrichment by GSEA on independent datasets.
Enrichments are displayed in a graphical output that helps with interpreting the results. This innovative methodology has recently been used to answer important questions in functional genomics, such as the degree of similarity between microarray datasets from different laboratories or with different experimental models or clinical cohorts. BubbleGUM is executable through an intuitive interface so that both bioinformaticians and biologists can use it. It is available at http://www.ciml.univ-mrs.fr/applications/BubbleGUM/index.html .
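At the core of each GSEA run is a running-sum enrichment score over a ranked gene list; a minimal unweighted (Kolmogorov-Smirnov-style) sketch of that score, not BubbleGUM's actual implementation:

```python
def enrichment_score(ranked_genes, gene_set):
    """Maximum deviation of a running sum: +1/Nhit on hits, -1/Nmiss on misses."""
    hits = [g in gene_set for g in ranked_genes]
    n_hit = sum(hits)
    n_miss = len(ranked_genes) - n_hit
    running, best = 0.0, 0.0
    for h in hits:
        running += 1.0 / n_hit if h else -1.0 / n_miss
        if abs(running) > abs(best):
            best = running
    return best

# A set concentrated at the top of the ranking scores highly:
print(enrichment_score(["g1", "g2", "g3", "g4", "g5", "g6"], {"g1", "g2"}))  # → 1.0
```

In full GSEA the hit increments are weighted by the genes' ranking metric and the score is compared against a permutation null; this sketch keeps only the running-sum idea.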

  12. Modeling of single film bubble and numerical study of the plateau structure in foam system

    NASA Astrophysics Data System (ADS)

    Sun, Zhong-guo; Ni, Ni; Sun, Yi-jie; Xi, Guang

    2018-02-01

The single-film bubble has a special geometry in which a certain amount of gas is shrouded by a thin layer of liquid film, with surface tension acting on both the inside and outside surfaces of the bubble. Based on the mesh-less moving particle semi-implicit (MPS) method, a single-film double-gas-liquid-interface surface tension (SDST) model is established for the single-film bubble, which has two gas-liquid interfaces, one on each side of the film. Within this framework, the conventional surface-free-energy surface tension model is improved by using a higher-order potential energy equation between particles, and the modification results in higher accuracy and better symmetry properties. The complex interface movement in the oscillation process of the single-film bubble is numerically captured, as well as typical flow phenomena and deformation characteristics of the liquid film. In addition, the basic behaviors of the coalescence and connection process between two and even three single-film bubbles are studied, and cases with bubbles of different sizes are also included. Furthermore, the classic Plateau structure in the foam system is reproduced and numerically shown to be in a steady state for multi-bubble connections.

  13. Respiratory Changes and Consequences for Treatment of Decompression Bubbles Following Severe Decompression Accidents

    DTIC Science & Technology

    2001-06-01

conditions hypobares ou hyperbares] To order the complete compilation report, use: ADA395680 The component part is provided here to allow users access to...the following report: TITLE: Operational Medical Issues in Hypo- and Hyperbaric Conditions [les questions médicales à caractère opérationnel liées aux...anaesthetised animals subjected to controlled primary and treatment hyperbaric procedures; the range of bubble counts was from zero to fatal. Treatment

  14. Semi-automatic tracking, smoothing and segmentation of hyoid bone motion from videofluoroscopic swallowing study.

    PubMed

    Kim, Won-Seok; Zeng, Pengcheng; Shi, Jian Qing; Lee, Youngjo; Paik, Nam-Jong

    2017-01-01

Motion analysis of the hyoid bone via videofluoroscopic study has been used in clinical research, but the classical manual tracking method is generally labor intensive and time consuming. Although some automatic tracking methods have been developed, masked points could not be tracked, and the smoothing and segmentation necessary for functional motion analysis prior to registration were not provided by previous software. We developed software to track hyoid bone motion semi-automatically. It works even when the hyoid bone is masked by the mandible, and it has been validated in dysphagia patients with stroke. In addition, we added the functions of semi-automatic smoothing and segmentation. Data from 30 patients were used to develop the software, and data from 17 patients were used for validation, of which the trajectories of 8 patients were partly masked. Pearson correlation coefficients between the manual and automatic tracking are high and statistically significant (0.942 to 0.991, P-value < 0.0001). Relative errors between automatic and manual tracking in terms of the x-axis, y-axis and 2D range of hyoid bone excursion range from 3.3% to 9.2%. We also developed an automatic method to segment each hyoid bone trajectory into four phases (elevation phase, anterior movement phase, descending phase and returning phase). The semi-automatic hyoid bone tracking from VFSS data by our software is valid compared to the conventional manual tracking method. In addition, the ability to indicate automatically when to switch from automatic to manual mode in extreme cases, and calibration without attaching a radiopaque object, are convenient and useful for users. Semi-automatic smoothing and segmentation provide further information for functional motion analysis, which is beneficial for further statistical analysis such as functional classification and prognostication for dysphagia.
Therefore, this software could provide researchers in the field of dysphagia with a convenient, useful, all-in-one platform for analyzing hyoid bone motion. Further development of our method to track other swallowing-related structures or objects, such as the epiglottis and bolus, and to carry out 2D curve registration may be needed for a more comprehensive functional data analysis of dysphagia with big data.
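Tracking validity above is quantified by the Pearson correlation between manual and automatic coordinate series; a self-contained sketch of that statistic (generic implementation with made-up trajectory values, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical hyoid y-coordinates (mm) from manual vs. automatic tracking:
manual = [0.0, 2.1, 7.9, 12.2, 9.8, 3.1]
auto = [0.1, 2.0, 8.3, 11.8, 10.1, 2.9]
print(round(pearson_r(manual, auto), 3))
```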

  15. Trauma Pod: a semi-automated telerobotic surgical system.

    PubMed

    Garcia, Pablo; Rosen, Jacob; Kapoor, Chetan; Noakes, Mark; Elbert, Greg; Treat, Michael; Ganous, Tim; Hanson, Matt; Manak, Joe; Hasser, Chris; Rohler, David; Satava, Richard

    2009-06-01

    The Trauma Pod (TP) vision is to develop a rapidly deployable robotic system to perform critical acute stabilization and/or surgical procedures, autonomously or in a teleoperative mode, on wounded soldiers in the battlefield who might otherwise die before treatment in a combat hospital could be provided. In the first phase of a project pursuing this vision, a robotic TP system was developed and its capability demonstrated by performing selected surgical procedures on a patient phantom. The system demonstrates the feasibility of performing acute stabilization procedures with the patient being the only human in the surgical cell. The teleoperated surgical robot is supported by autonomous robotic arms and subsystems that carry out scrub-nurse and circulating-nurse functions. Tool change and supply delivery are performed automatically and at least as fast as performed manually by nurses. Tracking and counting of the supplies is performed automatically. The TP system also includes a tomographic X-ray facility for patient diagnosis and two-dimensional (2D) fluoroscopic data to support interventions. The vast amount of clinical protocols generated in the TP system are recorded automatically. Automation and teleoperation capabilities form the basis for a more comprehensive acute diagnostic and management platform that will provide life-saving care in environments where surgical personnel are not present.

  16. Diagnostic accuracy of semi-automatic quantitative metrics as an alternative to expert reading of CT myocardial perfusion in the CORE320 study.

    PubMed

    Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C

To determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for the interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics as continuous values were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by the area under the receiver operating characteristic curve (AUC). Of the 377 included patients, 66% were male, the median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.9), respectively. In vessel-based analyses the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative metrics are as accurate as CTA-CTP expert reading for detecting hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
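The AUC used above equals the probability that a randomly chosen diseased case scores higher than a randomly chosen non-diseased one; a minimal sketch of that Mann-Whitney formulation (generic implementation with illustrative scores, not the CORE320 analysis):

```python
def auc(scores, labels):
    """AUC as the fraction of (positive, negative) pairs ranked correctly; ties count 1/2."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical CTP metric scores against a binary reference standard:
print(auc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0]))  # → 1.0 (perfect separation)
print(auc([0.9, 0.8, 0.4, 0.3], [1, 0, 0, 1]))  # → 0.5 (chance level)
```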

  17. Does semi-automatic bone-fragment segmentation improve the reproducibility of the Letournel acetabular fracture classification?

    PubMed

    Boudissa, M; Orfeuvre, B; Chabanas, M; Tonetti, J

    2017-09-01

The Letournel classification of acetabular fracture shows poor reproducibility in inexperienced observers, despite the introduction of 3D imaging. We therefore developed a method of semi-automatic segmentation based on CT data. The present prospective study aimed to assess: (1) whether semi-automatic bone-fragment segmentation increased the rate of correct classification; (2) if so, in which fracture types; and (3) feasibility using the open-source itksnap 3.0 software package without incurring extra cost for users. Semi-automatic segmentation of acetabular fractures significantly increases the rate of correct classification by orthopedic surgery residents. Twelve orthopedic surgery residents classified 23 acetabular fractures. Six used conventional 3D reconstructions provided by the center's radiology department (conventional group) and 6 others used reconstructions obtained by semi-automatic segmentation using the open-source itksnap 3.0 software package (segmentation group). Bone fragments were identified by specific colors. Correct classification rates were compared between groups using the chi-squared test. Assessment was repeated 2 weeks later to determine intra-observer reproducibility. Correct classification rates were significantly higher in the segmentation group: 114/138 (83%) versus 71/138 (52%); P<0.0001. The difference was greater for simple (36/36 (100%) versus 17/36 (47%); P<0.0001) than for complex fractures (79/102 (77%) versus 54/102 (53%); P=0.0004). Mean segmentation time per fracture was 27 ± 3 min [range, 21-35 min]. The segmentation group showed excellent intra-observer correlation coefficients, overall (ICC=0.88) and for simple (ICC=0.92) and complex fractures (ICC=0.84). Semi-automatic segmentation, identifying the various bone fragments, was effective in increasing the rate of correct acetabular fracture classification on the Letournel system by orthopedic surgery residents. It may be considered for routine use in education and training.
Level of evidence: III, prospective case-control study of a diagnostic procedure. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
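The group comparison reported above (114/138 versus 71/138 correct) can be reproduced with the standard 2x2 chi-squared statistic; a sketch using the counts given in the abstract:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Correct vs. incorrect classifications: segmentation group 114/24, conventional 71/67.
stat = chi2_2x2(114, 24, 71, 67)
print(round(stat, 2))  # ≈ 30.31, consistent with the reported P < 0.0001
```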

  18. UAS-based automatic bird count of a common gull colony

    NASA Astrophysics Data System (ADS)

    Grenzdörffer, G. J.

    2013-08-01

The standard procedure for counting birds is a manual one. However, a manual bird count is a time-consuming and cumbersome process, requiring several people to go from nest to nest counting the birds and the clutches. High-resolution imagery generated with a UAS (Unmanned Aircraft System) offers an interesting alternative. Experiences and results of UAS surveys for automatic bird counts over the last two years are presented for the bird reserve island of Langenwerder. For 2011, 1568 birds (± 5%) were detected on the image mosaic, based on multispectral image classification and GIS-based post-processing. Building on the experiences of 2011, the automatic bird count became more efficient and accurate in 2012: 1938 birds were counted, with an accuracy of approx. ± 3%. Additionally, a separation of breeding and non-breeding birds was performed under the assumption that standing birds cast a visible shadow. The final section of the paper is devoted to the analysis of the 3D point cloud, which was used to determine the height of the vegetation and the extent and depth of closed sinks, which are unsuitable for breeding birds.

  19. Detection and 3D representation of pulmonary air bubbles in HRCT volumes

    NASA Astrophysics Data System (ADS)

    Silva, Jose S.; Silva, Augusto F.; Santos, Beatriz S.; Madeira, Joaquim

    2003-05-01

Bubble emphysema is a disease characterized by the presence of air bubbles within the lungs. To identify pulmonary air bubbles, two alternative methods were developed using High-Resolution Computed Tomography (HRCT) exams. The search volume is confined to the pulmonary volume by a previously developed pulmonary contour detection algorithm. The first detection method follows a slice-by-slice approach and uses selection criteria based on the Hounsfield levels, dimensions, shape and localization of the bubbles. Candidate regions that do not exhibit axial coherence along at least two sections are excluded. Intermediate sections are interpolated for a more realistic representation of lungs and bubbles. The second detection method, after the pulmonary volume delimitation, follows a fully 3D approach: a global threshold is applied to the entire lung volume, returning candidate regions, and 3D morphological operators are used to remove spurious structures and to circumscribe the bubbles. Bubble representation is accomplished by two alternative methods. The first generates bubble surfaces based on the voxel volumes previously detected; the second assumes that bubbles are approximately spherical and, to obtain better 3D representations, fits super-quadrics to the bubble volume. The fitting process is based on a non-linear least-squares optimization method, in which a super-quadric is adapted to a regular grid of points defined on each bubble. All methods were applied to real and semi-synthetic data in which artificial, randomly deformed bubbles were embedded in the interior of healthy lungs. Quantitative results regarding bubble geometric features are either similar to the a priori known values used in the simulation tests or indicate clinically acceptable dimensions and locations when dealing with real data.
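The paper fits super-quadrics by non-linear least squares, but the spherical special case it starts from already admits a linear least-squares solution; a sketch of that simpler fit with synthetic boundary points (an illustration, not the authors' algorithm):

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit.

    Each point satisfies |p|^2 = 2 c . p + (r^2 - |c|^2), which is linear in the
    unknowns (cx, cy, cz, k) with k = r^2 - |c|^2.
    """
    pts = np.asarray(points, dtype=float)
    A = np.hstack([2 * pts, np.ones((len(pts), 1))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Six points on a sphere of radius 2 centred at (1, 2, 3):
pts = [(3, 2, 3), (-1, 2, 3), (1, 4, 3), (1, 0, 3), (1, 2, 5), (1, 2, 1)]
center, radius = fit_sphere(pts)
print(center, radius)  # center ≈ (1, 2, 3), radius ≈ 2
```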

  20. Arraycount, an algorithm for automatic cell counting in microwell arrays.

    PubMed

    Kachouie, Nezamoddin; Kang, Lifeng; Khademhosseini, Ali

    2009-09-01

Microscale technologies have emerged as a powerful tool for studying and manipulating biological systems and miniaturizing experiments. However, the lack of software complementing these techniques has made it difficult to apply them to many high-throughput experiments. This work establishes Arraycount, an approach to automatically count cells in microwell arrays. The procedure consists of fluorescent microscope imaging of cells seeded in the microwells of a microarray system, followed by computer analysis of the images to recognize the array and count the cells inside each microwell. To start counting, green and red fluorescent images (representing live and dead cells, respectively) are extracted from the original image and processed separately. A template-matching algorithm is proposed in which pre-defined well and cell templates are matched against the red and green images to locate microwells and cells. Subsequently, local maxima in the correlation maps are determined and the local-maxima maps are thresholded. At the end, the software records the cell counts for each detected microwell on the original image in a high-throughput manner. The automated counting was shown to be accurate compared with manual counting, with a difference of approximately 1-2 cells per microwell: based on cell concentration, the absolute difference between manual and automatic counting measurements was 2.5-13%.
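The template-matching step can be sketched with a normalized cross-correlation map whose high values mark detections (a simplified illustration that thresholds the correlation map directly rather than extracting local maxima; the synthetic image and template are invented, not Arraycount's data):

```python
import numpy as np

def match_template(image, template):
    """Pearson correlation of the template against every image patch."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    corr = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for y in range(corr.shape[0]):
        for x in range(corr.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            pn = np.sqrt((p ** 2).sum())
            corr[y, x] = (p * t).sum() / (pn * tn) if pn > 0 else 0.0
    return corr

def count_cells(image, template, thresh=0.99):
    """Count detections as correlation-map values above a high threshold."""
    return int((match_template(image, template) >= thresh).sum())

# Synthetic well image: two 'cells' that exactly match a 3x3 bright-centre template.
template = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], dtype=float)
img = np.zeros((12, 12))
img[2:5, 2:5] = template
img[7:10, 6:9] = template
print(count_cells(img, template))  # → 2
```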

  1. Black holes as bubble nucleation sites

    NASA Astrophysics Data System (ADS)

    Gregory, Ruth; Moss, Ian G.; Withers, Benjamin

    2014-03-01

    We consider the effect of inhomogeneities on the rate of false vacuum decay. Modelling the inhomogeneity by a black hole, we construct explicit Euclidean instantons which describe the nucleation of a bubble of true vacuum centred on the inhomogeneity. We find that inhomogeneity significantly enhances the nucleation rate over that of the Coleman-de Luccia instanton — the black hole acts as a nucleation site for the bubble. The effect is larger than previously believed due to the contributions to the action from conical singularities. For a sufficiently low initial mass, the original black hole is replaced by flat space during this process, as viewed by a single causal patch observer. Increasing the initial mass, we find a critical value above which a black hole remnant survives the process. This resulting black hole can have a higher mass than the original black hole, but always has a lower entropy. We compare the process to bubble-to-bubble transitions, where there is a semi-classical Lorentzian description in the WKB approximation.

  2. Digital X-ray portable scanner based on monolithic semi-insulating GaAs detectors: General description and first “quantum” images

    NASA Astrophysics Data System (ADS)

    Dubecký, F.; Perd'ochová, A.; Ščepko, P.; Zat'ko, B.; Sekerka, V.; Nečas, V.; Sekáčová, M.; Hudec, M.; Boháček, P.; Huran, J.

    2005-07-01

    The present work describes a portable digital X-ray scanner based on bulk undoped semi-insulating (SI) GaAs monolithic strip line detectors. The scanner operates in "quantum" imaging mode ("single photon counting"), with a potential improvement in the dynamic range and contrast of the observed X-ray images. The "heart" of the scanner (the detection unit) is based on SI GaAs strip line detectors. The measured detection efficiency of the SI GaAs detector reached over 60% (compared to a theoretical value of ~75%) for the detection of 60 keV photons at a reverse bias of 200 V. The read-out electronics consists of 20 modules fabricated using progressive SMD technology with automatic assembly of electronic devices. Signals from counters included in the digital parts of the modules are collected in a PC via a USB port and evaluated by custom-developed software allowing X-ray image reconstruction. The collected data were used to create the first X-ray "quantum" images of various test objects using the imaging software developed.

  3. A computational model of microbubble transport through a blood-filled vessel bifurcation

    NASA Astrophysics Data System (ADS)

    Calderon, Andres

    2005-11-01

    We are developing a novel gas embolotherapy technique to occlude blood vessels and starve tumors using gas bubbles that are produced by the acoustic vaporization of liquid perfluorocarbon droplets. The droplets are small enough to pass through the microcirculation, but the subsequent bubbles are large enough to lodge in vessels. The uniformity of tumor infarction depends on the transport of the blood-borne bubbles before they stick. We examine the transport of a semi-infinite bubble through a single bifurcation in a liquid-filled two-dimensional channel. The flow is governed by the conservation of fluid mass and momentum equations. Reynolds numbers in the microcirculation are small, and we solve the governing equations using the boundary element method. The effect of gravity on bubble splitting is investigated and the results are compared with our previous bench-top experiments and with a quasi-steady one-dimensional analysis. The effects of daughter tube outlet pressures and bifurcation geometry are also considered. The findings suggest that slow-moving bubbles will favor the upper branch of the bifurcation, but that increasing the bubble speed leads to more even splitting. It is also found that some bifurcation geometries and flow conditions result in severe thinning of the liquid film separating the bubble from the wall, suggesting the possibility of bubble-wall contact. This work is supported by NSF grant BES-0301278 and NIH grant EB003541.

  4. The Speed of Axial Propagation of a Cylindrical Bubble Through a Cylindrical Vortex

    NASA Technical Reports Server (NTRS)

    Shariff, Karim; Mansour, Nagi N. (Technical Monitor)

    2002-01-01

    Inspired by the rapid elongation of air columns injected into vortices by dolphins, we present an exact inviscid solution for the axial speed (assumed steady) of propagation of the tip of a semi-infinite cylindrical bubble along the axis of a cylindrical vortex. The bubble is assumed to be held at constant pressure by being connected to a reservoir, the lungs of the dolphin, say. For a given bubble pressure, there is a modest critical rotation rate above which steadily propagating bubbles exist. For a bubble at ambient pressure, the propagation speed of the bubble (relative to axial velocity within the vortex) varies between 0.5 and 0.6 of the maximum rotational speed of the vortex. Surprisingly, the bubble tip can propagate (almost as rapidly) even when the pressure minimum in the vortex core is greater than the bubble pressure; in this case, solutions exhibit a dimple on the nose of the bubble. A situation important for incipient vortex cavitation, and one which dolphins also demonstrate, is elongation of a free bubble, i.e., one whose internal pressure may vary. Under the assumption that the acceleration term is small (checked a posteriori), the steady solution is applied at each instant during the elongation. Three types of behavior are then possible depending on physical parameters and initial conditions: (A) Unabated elongation with slowly increasing bubble pressure, and nearly constant volume. Volume begins to decrease in the late stages. (B1) Elongation with decreasing bubble pressure. A limit point of the steady solution is encountered at a finite bubble length. (B2) Unabated elongation with decreasing bubble pressure and indefinite creation of volume. This is made possible by the existence of propagating solutions at bubble pressures below the minimum vortex pressure. As the bubble stretches, its radius initially decreases but then becomes constant; this is also observed in experiments on incipient vortex cavitation.

  5. A Modular Hierarchical Approach to 3D Electron Microscopy Image Segmentation

    PubMed Central

    Liu, Ting; Jones, Cory; Seyedhosseini, Mojtaba; Tasdizen, Tolga

    2014-01-01

    The study of neural circuit reconstruction, i.e., connectomics, is a challenging problem in neuroscience. Automated and semi-automated electron microscopy (EM) image analysis can be tremendously helpful for connectomics research. In this paper, we propose a fully automatic approach for intra-section segmentation and inter-section reconstruction of neurons using EM images. A hierarchical merge tree structure is built to represent multiple region hypotheses and supervised classification techniques are used to evaluate their potentials, based on which we resolve the merge tree with consistency constraints to acquire final intra-section segmentation. Then, we use a supervised learning based linking procedure for the inter-section neuron reconstruction. Also, we develop a semi-automatic method that utilizes the intermediate outputs of our automatic algorithm and achieves intra-segmentation with minimal user intervention. The experimental results show that our automatic method can achieve close-to-human intra-segmentation accuracy and state-of-the-art inter-section reconstruction accuracy. We also show that our semi-automatic method can further improve the intra-segmentation accuracy. PMID:24491638

  6. Impact of translation on named-entity recognition in radiology texts

    PubMed Central

    Pedro, Vasco

    2017-01-01

    Abstract Radiology reports describe the results of radiography procedures and have the potential of being a useful source of information which can bring benefits to health care systems around the world. One way to automatically extract information from the reports is by using Text Mining tools. The problem is that these tools are mostly developed for English and reports are usually written in the native language of the radiologist, which is not necessarily English. This creates an obstacle to the sharing of Radiology information between different communities. This work explores the solution of translating the reports to English before applying the Text Mining tools, probing the question of what translation approach should be used. We created MRRAD (Multilingual Radiology Research Articles Dataset), a parallel corpus of Portuguese research articles related to Radiology and a number of alternative translations (human, automatic and semi-automatic) to English. This is a novel corpus which can be used to move forward the research on this topic. Using MRRAD we studied which kind of automatic or semi-automatic translation approach is more effective on the Named-entity recognition task of finding RadLex terms in the English version of the articles. Considering the terms extracted from human translations as our gold standard, we calculated how similar to this standard were the terms extracted using other translations. We found that a completely automatic translation approach using Google leads to F-scores (between 0.861 and 0.868, depending on the extraction approach) similar to the ones obtained through a more expensive semi-automatic translation approach using Unbabel (between 0.862 and 0.870). To better understand the results we also performed a qualitative analysis of the type of errors found in the automatic and semi-automatic translations. Database URL: https://github.com/lasigeBioTM/MRRAD PMID:29220455
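    The evaluation described here, which treats the RadLex terms extracted from human translations as the gold standard and scores the terms extracted from each machine or semi-automatic translation against them, amounts to a set-based F-score. A minimal sketch with made-up term sets (not terms from the MRRAD corpus):

```python
def f_score(gold_terms, extracted_terms):
    """F1 of an extracted term set against a gold-standard term set."""
    gold, found = set(gold_terms), set(extracted_terms)
    tp = len(gold & found)  # terms recovered from both translations
    if tp == 0:
        return 0.0
    precision = tp / len(found)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)
```

    An F-score near 0.86, as reported for the Google and Unbabel translations, means the two term sets agree on the large majority of extracted terms.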

  7. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting.

    PubMed

    Waithe, Dominic; Rennert, Peter; Brostow, Gabriel; Piper, Matthew D W

    2015-01-01

    We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given that the average time per vial for manual counting is approximately 40 seconds, our software introduces a time-saving advantage for experiments starting with as few as 20 vials. We also describe an optional acrylic box that serves as a digital camera mount and provides controlled lighting during image acquisition, ensuring conditions consistent with those used in this study.
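    The abstract mentions correcting bias observed in the algorithm's output but does not specify how. One common approach, shown here purely as an assumption rather than QuantiFly's actual method, is to fit a linear calibration from paired automatic and manual counts and apply it to later estimates:

```python
def fit_linear_correction(auto_counts, manual_counts):
    """Least-squares fit of manual = a * auto + b over paired counts,
    usable to de-bias subsequent automatic estimates (illustrative only)."""
    n = len(auto_counts)
    mx = sum(auto_counts) / n
    my = sum(manual_counts) / n
    sxx = sum((x - mx) ** 2 for x in auto_counts)
    sxy = sum((x - mx) * (y - my) for x, y in zip(auto_counts, manual_counts))
    a = sxy / sxx          # slope: systematic over/under-counting factor
    b = my - a * mx        # intercept: constant offset
    return a, b
```

    A corrected count is then `a * raw_count + b`, rounded to the nearest integer.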


  9. CO2 Absorption from Biogas by Glycerol: Conducted in Semi-Batch Bubble Column

    NASA Astrophysics Data System (ADS)

    puji lestari, Pratiwi; Mindaryani, Aswati; Wirawan, S. K.

    2018-03-01

    Biogas is a renewable energy source that has been developed recently. The main contents of biogas are methane and carbon dioxide (CO2), with methane as the principal component and CO2 as the main impurity. The quality of biogas depends on its CO2 content: the lower the CO2 level, the higher the biogas quality. Absorption is one method of reducing the CO2 level. The selection of the absorbent and of appropriate operating parameters is an important factor in CO2 absorption from biogas. This study aimed to find the design parameters for CO2 absorption using glycerol, represented by the overall mass transfer coefficient (KLa) and Henry's constant (H). The study was conducted in a semi-batch bubble column. Mixed gas was contacted with glycerol in the bubble column, and the CO2 concentrations in the feed gas at the column inlet and outlet were analysed by gas chromatography. The variables observed in this study were superficial gas velocity and temperature. The results showed that higher superficial gas velocity and lower temperature increased the rate of the absorption process and the amount of CO2 absorbed.
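    The overall mass transfer coefficient KLa reported here is the rate constant of the standard two-film absorption model. Under the usual well-mixed, constant-driving-force assumptions, the liquid-phase concentration follows C(t) = C*(1 - exp(-KLa t)), where C* = p/H is the Henry's-law saturation concentration, and KLa can be backed out from a single measurement. A sketch (the parameter values are illustrative, not the study's results):

```python
import math

def absorbed_concentration(kla, c_star, t):
    """Liquid-phase CO2 concentration from dC/dt = KLa * (C* - C), C(0) = 0,
    i.e. C(t) = C* * (1 - exp(-KLa * t)).
    kla: overall mass transfer coefficient (1/s); c_star: saturation
    concentration p/H from Henry's law; t: contact time (s)."""
    return c_star * (1.0 - math.exp(-kla * t))

def estimate_kla(c_star, c_t, t):
    """Invert the model to back out KLa from one concentration measurement."""
    return -math.log(1.0 - c_t / c_star) / t
```

    Higher superficial gas velocity raises KLa (more interfacial area), while lower temperature raises C* (greater solubility), consistent with the trends the study reports.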

  10. The Nature of Indexing: How Humans and Machines Analyze Messages and Texts for Retrieval. Part II: Machine Indexing, and the Allocation of Human versus Machine Effort.

    ERIC Educational Resources Information Center

    Anderson, James D.; Perez-Carballo, Jose

    2001-01-01

    Discussion of human intellectual indexing versus automatic indexing focuses on automatic indexing. Topics include keyword indexing; negative vocabulary control; counting words; comparative counting and weighting; stemming; words versus phrases; clustering; latent semantic indexing; citation indexes; bibliographic coupling; co-citation; relevance…

  11. Design and development of a prototypical software for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small- and medium-sized enterprises (SME)

    NASA Astrophysics Data System (ADS)

    Möller, Thomas; Bellin, Knut; Creutzburg, Reiner

    2015-03-01

    The aim of this paper is to show the recent progress in the design and prototypical development of a software suite, Copra Breeder, for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small- and medium-sized enterprises.

  12. The Pathophysiology of Decompression Sickness and the Effects of Doppler Detectable Bubbles.

    DTIC Science & Technology

    1980-12-18

    Excerpt from the table of contents: ... Doppler Ultrasound and a calibrated Venous Gas Emboli Scale; C. Electronic Counting of Doppler Bubble Signals; III. Pulmonary Embolism Studies: A. Background; B. Right Ventricular Systolic Pressure following Gas Embolization and Venous Gas Phase Content; C. Effects of Pulmonary Gas Embolism on the Development of Limb-Bend Decompression Sickness; IV. Gas Phase Formation in Highly Perfused Tissues: A. Renal; B. Cerebral.

  13. Hi-fidelity multi-scale local processing for visually optimized far-infrared Herschel images

    NASA Astrophysics Data System (ADS)

    Li Causi, G.; Schisano, E.; Liu, S. J.; Molinari, S.; Di Giorgio, A.

    2016-07-01

    In the context of the "Hi-Gal" multi-band full-plane mapping program for the Galactic Plane, as imaged by the Herschel far-infrared satellite, we have developed a semi-automatic tool which produces high-definition, high-quality color maps optimized for visual perception of extended features, like bubbles and filaments, against the high background variations. We project the map tiles of three selected bands onto a 3-channel panorama, which spans the central 130 degrees of galactic longitude by 2.8 degrees of galactic latitude, at a pixel scale of 3.2", in Cartesian galactic coordinates. We then process this image piecewise, applying a custom multi-scale local stretching algorithm, reinforced by a local multi-scale color balance. Finally, we apply an edge-preserving contrast enhancement to perform artifact-free detail sharpening. Thanks to this tool, we have produced a stunning giga-pixel color image of the far-infrared Galactic Plane that we made publicly available with the recent release of the Hi-Gal mosaics and compact source catalog.

  14. Assessing the Performance of a Machine Learning Algorithm in Identifying Bubbles in Dust Emission

    NASA Astrophysics Data System (ADS)

    Xu, Duo; Offner, Stella S. R.

    2017-12-01

    Stellar feedback created by radiation and winds from massive stars plays a significant role in both physical and chemical evolution of molecular clouds. This energy and momentum leaves an identifiable signature (“bubbles”) that affects the dynamics and structure of the cloud. Most bubble searches are performed “by eye,” which is usually time-consuming, subjective, and difficult to calibrate. Automatic classifications based on machine learning make it possible to perform systematic, quantifiable, and repeatable searches for bubbles. We employ a previously developed machine learning algorithm, Brut, and quantitatively evaluate its performance in identifying bubbles using synthetic dust observations. We adopt magnetohydrodynamics simulations, which model stellar winds launching within turbulent molecular clouds, as an input to generate synthetic images. We use a publicly available three-dimensional dust continuum Monte Carlo radiative transfer code, HYPERION, to generate synthetic images of bubbles in three Spitzer bands (4.5, 8, and 24 μm). We designate half of our synthetic bubbles as a training set, which we use to train Brut along with citizen-science data from the Milky Way Project (MWP). We then assess Brut’s accuracy using the remaining synthetic observations. We find that Brut’s performance after retraining increases significantly, and it is able to identify yellow bubbles, which are likely associated with B-type stars. Brut continues to perform well on previously identified high-score bubbles, and over 10% of the MWP bubbles are reclassified as high-confidence bubbles, which were previously marginal or ambiguous detections in the MWP data. We also investigate the influence of the size of the training set, dust model, evolutionary stage, and background noise on bubble identification.

  15. A semi-automatic traffic sign detection, classification, and positioning system

    NASA Astrophysics Data System (ADS)

    Creusen, I. M.; Hazelhoff, L.; de With, P. H. N.

    2012-01-01

    The availability of large-scale databases containing street-level panoramic images offers the possibility to perform semi-automatic surveying of real-world objects such as traffic signs. These inventories can be performed significantly more efficiently than using conventional methods. Governmental agencies are interested in these inventories for maintenance and safety reasons. This paper introduces a complete semi-automatic traffic sign inventory system. The system consists of several components. First, a detection algorithm locates the 2D position of the traffic signs in the panoramic images. Second, a classification algorithm is used to identify the traffic sign. Third, the 3D position of the traffic sign is calculated using the GPS position of the photographs. Finally, the results are listed in a table for quick inspection and are also visualized in a web browser.
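    The third step above, recovering a 3D position from 2D detections in several GPS-tagged panoramas, reduces in the ground plane to intersecting bearing rays shot from two known camera positions. A minimal sketch under that simplification (the paper's actual multi-view estimation may differ):

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two ground-plane bearing rays (angles in radians, measured
    from the x-axis) from two known camera positions; returns (x, y)."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t1 * d1 = p2 + t2 * d2 for t1 via the 2x2 determinant.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        raise ValueError("parallel bearings: sign cannot be triangulated")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

    In a real inventory the bearings come from the detected 2D sign positions in each panorama and the camera positions from the GPS metadata; more than two views would be combined by least squares.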

  16. Time and Space Resolved Heat Transfer Measurements Under Nucleate Bubbles with Constant Heat Flux Boundary Conditions

    NASA Technical Reports Server (NTRS)

    Myers, Jerry G.; Hussey, Sam W.; Yee, Glenda F.; Kim, Jungho

    2003-01-01

    Investigations into single bubble pool boiling phenomena are often complicated by the difficulties in obtaining time and space resolved information in the bubble region. This usually occurs because the heaters and diagnostics used to measure heat transfer data are often on the order of, or larger than, the bubble characteristic length or region of influence. This has contributed to the development of many different and sometimes contradictory models of pool boiling phenomena and dominant heat transfer mechanisms. Recent investigations by Yaddanapyddi and Kim and Demiray and Kim have obtained time and space resolved heat transfer information at the bubble/heater interface under constant temperature conditions using a novel micro-heater array (10x10 array, each heater 100 microns on a side) that is semi-transparent and doubles as a measurement sensor. By using active feedback to maintain a state of constant temperature at the heater surface, they showed that the area of influence of bubbles generated in FC-72 was much smaller than predicted by standard models and that micro-conduction/micro-convection due to re-wetting dominated heat transfer effects. This study seeks to expand on the previous work by making time and space resolved measurements under bubbles nucleating on a micro-heater array operated under constant heat flux conditions. In the planned investigation, wall temperature measurements made under a single bubble nucleation site will be synchronized with high-speed video to allow analysis of the bubble energy removal from the wall.

  17. Transesophageal Echocardiographic Study of Decompression-Induced Venous Gas Emboli

    NASA Technical Reports Server (NTRS)

    Butler, B. D.; Morris, W. P.

    1995-01-01

    Transesophageal echo-cardiography was used to evaluate venous bubbles produced in nine anesthetized dogs following decompression from 2.84 bar after 120 min at pressure. In five dogs a pulsed Doppler cuff probe was placed around the inferior vena cava for bubble grade determination. The transesophageal echo images demonstrated several novel or less defined events. In each case where the pulmonary artery was clearly visualized, the venous bubbles were seen to oscillate back and forth several times, bringing into question the effect of coincidental counting in routine bubble grade analysis using precordial Doppler. A second finding was that in all cases, extensive bubbling occurred in the portal veins with complete extraction by the liver sinusoids, with one exception where a portal-to-hepatic venous anastomosis was observed. Compression of the bowel released copious numbers of bubbles into the portal veins, sometimes more than were released into the inferior vena cava. Finally, large masses of foam were routinely observed in the non-dependent regions of the inferior vena cava that not only delayed the appearance of bubbles in the pulmonary artery but also allowed additional opportunity for further reaction with blood products and for coalescence to occur before reaching the pulmonary microcirculation. These novel observations are discussed in relation to the decompression process.

  18. A Graph-Based Recovery and Decomposition of Swanson’s Hypothesis using Semantic Predications

    PubMed Central

    Cameron, Delroy; Bodenreider, Olivier; Yalamanchili, Hima; Danh, Tu; Vallabhaneni, Sreeram; Thirunarayan, Krishnaprasad; Sheth, Amit P.; Rindflesch, Thomas C.

    2014-01-01

    Objectives This paper presents a methodology for recovering and decomposing Swanson’s Raynaud Syndrome–Fish Oil Hypothesis semi-automatically. The methodology leverages the semantics of assertions extracted from biomedical literature (called semantic predications) along with structured background knowledge and graph-based algorithms to semi-automatically capture the informative associations originally discovered manually by Swanson. Demonstrating that Swanson’s manually intensive techniques can be undertaken semi-automatically paves the way for fully automatic semantics-based hypothesis generation from scientific literature. Methods Semantic predications obtained from biomedical literature allow the construction of labeled directed graphs which contain various associations among concepts from the literature. By aggregating such associations into informative subgraphs, some of the relevant details originally articulated by Swanson have been uncovered. However, by leveraging background knowledge to bridge important knowledge gaps in the literature, a methodology has been developed for semi-automatically capturing the detailed associations originally explicated in natural language by Swanson. Results Our methodology not only recovered the 3 associations commonly recognized as Swanson’s Hypothesis, but also decomposed them into an additional 16 detailed associations, formulated as chains of semantic predications. Altogether, 14 out of the 19 associations that can be attributed to Swanson were retrieved using our approach. To the best of our knowledge, such an in-depth recovery and decomposition of Swanson’s Hypothesis has never before been attempted. Conclusion In this work, therefore, we presented a methodology for semi-automatically recovering and decomposing Swanson’s RS-DFO Hypothesis using semantic representations and graph algorithms. Our methodology provides new insights into potential prerequisites for semantics-driven Literature-Based Discovery (LBD). These suggest that three critical aspects of LBD include: 1) the need for more expressive representations beyond Swanson’s ABC model; 2) an ability to accurately extract semantic information from text; and 3) the semantic integration of scientific literature with structured background knowledge. PMID:23026233
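    The chains of semantic predications described above are naturally modelled as paths in a labeled directed graph. A minimal sketch using breadth-first search over (subject, predicate, object) triples; the triples below are illustrative stand-ins, not Swanson's actual predications:

```python
from collections import deque

def find_chain(predications, start, goal):
    """BFS over a labeled directed graph built from (subject, predicate,
    object) triples; returns one shortest chain linking start to goal,
    or None if no chain exists."""
    graph = {}
    for subj, pred, obj in predications:
        graph.setdefault(subj, []).append((pred, obj))
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        node, path = queue.popleft()
        if node == goal:
            return path
        for pred, nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [(node, pred, nxt)]))
    return None
```

    A two-hop chain of this kind is exactly Swanson's ABC pattern; the paper's point is that richer, longer chains plus background knowledge recover more of the hypothesis than the ABC model alone.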

  19. Semi-Automatic Extraction Algorithm for Images of the Ciliary Muscle

    PubMed Central

    Kao, Chiu-Yen; Richdale, Kathryn; Sinnott, Loraine T.; Ernst, Lauren E.; Bailey, Melissa D.

    2011-01-01

    Purpose To develop and evaluate a semi-automatic algorithm for segmentation and morphological assessment of the dimensions of the ciliary muscle in Visante™ Anterior Segment Optical Coherence Tomography images. Methods Geometric distortions in Visante images analyzed as binary files were assessed by imaging an optical flat and human donor tissue. The appropriate pixel/mm conversion factor to use for air (n = 1) was estimated by imaging calibration spheres. A semi-automatic algorithm was developed to extract the dimensions of the ciliary muscle from Visante images. Measurements were also made manually using Visante software calipers. Intraclass correlation coefficients (ICC) and Bland-Altman analyses were used to compare the methods. A multilevel model was fitted to estimate the variance of algorithm measurements that was due to within- and between-examiner differences in scleral spur selection versus biological variability. Results The optical flat and the human donor tissue appeared without geometric distortions when imaged in binary file format. Bland-Altman analyses revealed that caliper measurements tended to underestimate ciliary muscle thickness at 3 mm posterior to the scleral spur in subjects with the thickest ciliary muscles (t = 3.6, p < 0.001). The percent variance due to within- or between-examiner differences in scleral spur selection was small (6%) compared to the variance due to biological differences across subjects (80%). Using the mean of measurements from three images achieved an estimated ICC of 0.85. Conclusions The semi-automatic algorithm successfully segmented the ciliary muscle for further measurement. Using the algorithm to follow the scleral curvature to locate more posterior measurements is critical to avoid underestimating thickness. This semi-automatic algorithm will allow repeatable, efficient, and masked ciliary muscle measurements in large datasets. PMID:21169877

  20. Automatic milking systems in the Protected Designation of Origin Montasio cheese production chain: effects on milk and cheese quality.

    PubMed

    Innocente, N; Biasutti, M

    2013-02-01

    Montasio cheese is a typical Italian semi-hard, semi-cooked cheese produced in northeastern Italy from unpasteurized (raw or thermised) cow milk. The Protected Designation of Origin label regulations for Montasio cheese require that local milk be used from twice-daily milking. The number of farms milking with automatic milking systems (AMS) has increased rapidly in the last few years in the Montasio production area. The objective of this study was to evaluate the effects of a variation in milking frequency, associated with the adoption of an automatic milking system, on milk quality and on the specific characteristics of Montasio cheese. Fourteen farms were chosen, all located in the Montasio production area, with an average herd size of 60 (Simmental, Holstein-Friesian, and Brown Swiss breeds). In 7 experimental farms, the cows were milked 3 times per day with an AMS, whereas in the other 7 control farms, cows were milked twice daily in conventional milking parlors (CMP). The study showed that the main components, the hygienic quality, and the cheese-making features of milk were not affected by the milking system adopted. In fact, the control and experimental milks did not reveal a statistically significant difference in fat, protein, and lactose contents; in the casein index; or in the HPLC profiles of casein and whey protein fractions. Milk from farms that used an AMS always showed somatic cell counts and total bacterial counts below the legal limits imposed by European Union regulations for raw milk. Finally, bulk milk clotting characteristics (clotting time, curd firmness, and time to curd firmness of 20 mm) did not differ between milk from AMS and milk from CMP. Montasio cheese was made from milk collected from the 2 groups of farms milking either with AMS or with CMP. Three different cheese-making trials were performed during the year at different times. As expected, considering the results of the milk analysis, the moisture, fat, and protein contents of the experimental and control cheeses were comparable. The milking system was not seen to significantly affect the biochemical processes associated with ripening. In fact, all cheeses showed a normal proteolysis trend and a characteristic volatile compound profile during aging. Therefore, the milking system does not appear to modify the distinctive characteristics of this cheese, which remain dependent on the area and methodology of production. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  1. Tests with VHR images for the identification of olive trees and other fruit trees in the European Union

    NASA Astrophysics Data System (ADS)

    Masson, Josiane; Soille, Pierre; Mueller, Rick

    2004-10-01

    In the context of the Common Agricultural Policy (CAP), the European Commission has a strong interest in counting and individually locating fruit trees. An automatic counting algorithm developed by the JRC (OLICOUNT) was used in the past for olive trees only, on 1 m black-and-white orthophotos, but with limitations in the case of young trees or irregular groves. This study investigates the improvement of fruit tree identification using VHR images on a large set of data in three test sites: one in Crete (Greece), one in the south-east of France with a majority of olive trees and associated fruit trees, and the last one in Florida on citrus trees. OLICOUNT was compared with two other automatic tree-counting applications, one using the CRISP software on citrus trees and the other completely automatic, based on regional minima (morphological image analysis). Additional investigation was undertaken to refine the methods. This paper describes the automatic methods and presents the results derived from the tests.
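
    The regional-minima idea behind the fully automatic counter can be illustrated with a toy sketch. This is an assumption-laden stand-in, not the JRC implementation: it marks pixels strictly darker than all eight neighbours, whereas production morphological operators also merge plateaus of equal-valued pixels.

```python
import numpy as np

def regional_minima(img):
    """Mark pixels strictly smaller than all 8 neighbours.

    Simplified stand-in for the morphological regional-minima
    operator; plateaus of equal-valued pixels are not merged.
    """
    padded = np.pad(np.asarray(img, float), 1, mode="edge")
    h, w = img.shape
    minima = np.ones((h, w), dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            shifted = padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
            minima &= img < shifted
    return minima

# Tree crowns imaged as dark spots on brighter soil: each crown
# contributes one regional minimum, so counting minima counts trees.
image = np.full((7, 7), 200)
image[1, 1] = 50   # first hypothetical tree
image[4, 5] = 60   # second hypothetical tree
print(int(regional_minima(image).sum()))  # → 2
```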

  2. The Minnaert bubble: an acoustic approach

    NASA Astrophysics Data System (ADS)

    Devaud, Martin; Hocquet, Thierry; Bacri, Jean-Claude; Leroy, Valentin

    2008-11-01

    We propose an ab initio introduction to the well-known Minnaert pulsating bubble at graduate level. After a brief recall of the standard material, we begin with a detailed discussion of the radial movements of an air bubble in water. This discussion is managed from an acoustic point of view, using the Lagrangian rather than the Eulerian variables. In unbounded water, the air-water system has a continuum of eigenmodes, some of which correspond to regular Fabry-Pérot resonances. A singular resonance, the lowest one, is shown to coincide with that of Minnaert. In bounded water, the eigenmode spectrum is discrete, with a finite fundamental frequency. A spectacular quasi-locking of the latter occurs if it happens to exceed the Minnaert frequency, which provides an unforeseen one-bubble alternative version of the famous 'hot chocolate effect'. In the (low) frequency domain in which sound propagation inside the bubble reduces to a simple 'breathing' (i.e. inflation/deflation), the light air bubble can be 'dressed' by the outer water pressure forces and turned into the heavy Minnaert bubble. Thanks to this unexpected renormalization process, we demonstrate that the Minnaert bubble definitely behaves like a true harmonic oscillator of the spring-bob type, but with damping and forcing terms in apparent disagreement with those commonly admitted in the literature. Finally, we underline the double role played by the water. In order to distinguish the water motion associated with water compressibility (i.e. the sound) from the simple incompressible accompaniment of the bubble breathing, we introduce a new picture, analogous to the electromagnetic radiative picture in the Coulomb gauge, which naturally leads us to split the water displacement into an instantaneous and a retarded part. The Minnaert renormalized mass of the dressed bubble is then automatically recovered.
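
    As a numeric aside, the Minnaert frequency discussed in the abstract follows from the standard result f0 = (1/2πa)·sqrt(3γp0/ρ). A quick check with textbook values for air and water (illustrative constants, not figures from the paper):

```python
import math

def minnaert_frequency(radius_m, gamma=1.4, p0=101325.0, rho=998.0):
    """Minnaert resonance f0 = sqrt(3*gamma*p0/rho) / (2*pi*a)."""
    return math.sqrt(3.0 * gamma * p0 / rho) / (2.0 * math.pi * radius_m)

f0 = minnaert_frequency(1e-3)  # 1 mm radius air bubble in water
print(round(f0))  # ≈ 3.3 kHz, the familiar audible "plink"
```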

  3. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    PubMed

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation, both pre- and post-surgically. For the proposed method, an observer places landmarks in a single 3D volume, which are then automatically propagated to the other volumes in the time sequence. From the landmarks in each volume, the patellar displacement, patellar tilt, and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15°-20° for three patients, but with less evident differences for two of the patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods, but with the distinct advantage of supporting continuous motions during image acquisition.

  4. Application of a Novel Semi-Automatic Technique for Determining the Bilateral Symmetry Plane of the Facial Skeleton of Normal Adult Males.

    PubMed

    Roumeliotis, Grayson; Willing, Ryan; Neuert, Mark; Ahluwalia, Romy; Jenkyn, Thomas; Yazdani, Arjang

    2015-09-01

    The accurate assessment of symmetry in the craniofacial skeleton is important for cosmetic and reconstructive craniofacial surgery. Although there have been several published attempts to develop an accurate system for determining the correct plane of symmetry, all are inaccurate and time consuming. Here, the authors applied a novel semi-automatic method for the calculation of craniofacial symmetry, based on principal component analysis and iterative corrective point computation, to a large sample of normal adult male facial computerized tomography scans obtained clinically (n = 32). The authors hypothesized that this method would generate planes of symmetry that produce less error, when one side of the face is compared to the other, than a plane defined by cephalometric landmarks. When a three-dimensional model of one side of the face was reflected across the semi-automatic plane of symmetry, there was less error than when it was reflected across the cephalometric plane. The semi-automatic plane was also more accurate when the locations of bilateral cephalometric landmarks (eg, frontozygomatic sutures) were compared across the face. The authors conclude that this method allows for accurate and fast measurement of craniofacial symmetry. This has important implications for studying the development of the facial skeleton, and clinical applications in reconstruction.
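
    The authors' PCA-plus-iterative-correction pipeline is not reproduced here, but the way a candidate plane is judged, by reflecting one side of the face across it and measuring the mismatch, can be sketched from paired bilateral landmarks. The coordinates and the perpendicular-bisector construction below are hypothetical illustrations, not the paper's method:

```python
import numpy as np

def symmetry_plane(left_pts, right_pts):
    """Plane through the mean midpoint of bilateral landmark pairs,
    with normal along the mean left-to-right direction.
    Returns (unit normal n, point c); plane = {x : n.(x - c) = 0}."""
    L = np.asarray(left_pts, float)
    R = np.asarray(right_pts, float)
    n = (L - R).mean(axis=0)
    n /= np.linalg.norm(n)
    c = (L + R).mean(axis=0) / 2.0
    return n, c

def reflect(points, n, c):
    """Reflect points across the plane defined by (n, c)."""
    p = np.asarray(points, float)
    d = (p - c) @ n
    return p - 2.0 * d[:, None] * n

# Hypothetical bilateral landmarks mirrored in x (e.g. suture pairs):
left = np.array([[30.0, 10.0, 5.0], [28.0, 0.0, 12.0]])
right = np.array([[-30.0, 10.0, 5.0], [-28.0, 0.0, 12.0]])
n, c = symmetry_plane(left, right)
# A good plane reflects left landmarks onto their right partners.
print(np.allclose(reflect(left, n, c), right))  # → True
```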

  5. The role of awareness of repetition during the development of automaticity in a dot-counting task

    PubMed Central

    Shadbolt, Emma

    2018-01-01

    This study examined whether being aware of the repetition of stimuli in a simple numerosity task could aid the development of automaticity. The numerosity task used in this study was a simple counting task. Thirty-four participants were divided into two groups. One group was instructed that the stimuli would repeat many times throughout the experiment. The results showed no significant differences in the way automatic processing developed between the groups. Similarly, there was no correlation between the point at which automatic processing developed and the point at which participants felt they benefitted from the repetition of stimuli. These results suggest that extra-trial features of a task may have no effect on the development of automaticity, a finding consistent with the instance theory of automatisation. PMID:29404220

  6. DESIGN OF A PATTERN RECOGNITION DIGITAL COMPUTER WITH APPLICATION TO THE AUTOMATIC SCANNING OF BUBBLE CHAMBER NEGATIVES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCormick, B.H.; Narasimhan, R.

    1963-01-01

    The overall computer system contains three main parts: an input device, a pattern recognition unit (PRU), and a control computer. The bubble chamber picture is divided into a grid of 1-mm squares on the film. It is then processed in parallel in a two-dimensional array of 1024 identical processing modules (stalactites) of the PRU. The array can function as a two-dimensional shift register in which the results of successive shifting operations can be accumulated. The pattern recognition process is generally controlled by a conventional arithmetic computer. (A.G.W.)

  7. In vitro fertilization (IVF) from low or high antral follicle count pubertal beef heifers using semi-defined culture conditions

    USDA-ARS?s Scientific Manuscript database

    Antral follicle counts (AFC) vary among pubertal beef heifers. Our objective was to compare the in vitro maturation and fertilization of oocytes collected from low and high AFC heifers. Previously we reported results using serum-based IVF media and in this study report results using semi-defined m...

  8. System for definition of the central-chest vasculature

    NASA Astrophysics Data System (ADS)

    Taeprasartsit, Pinyo; Higgins, William E.

    2009-02-01

    Accurate definition of the central-chest vasculature from three-dimensional (3D) multi-detector CT (MDCT) images is important for pulmonary applications. For instance, the aorta and pulmonary artery help in automatic definition of the Mountain lymph-node stations for lung-cancer staging. This work presents a system for defining major vascular structures in the central chest. The system provides automatic methods for extracting the aorta and pulmonary artery and semi-automatic methods for extracting the other major central chest arteries/veins, such as the superior vena cava and azygos vein. Automatic aorta and pulmonary artery extraction are performed by model fitting and selection. The system also extracts certain vascular structure information to validate outputs. A semi-automatic method extracts vasculature by finding the medial axes between provided important sites. Results of the system are applied to lymph-node station definition and guidance of bronchoscopic biopsy.

  9. Development of an Automatic Echo-counting Program for HROFFT Spectrograms

    NASA Astrophysics Data System (ADS)

    Noguchi, Kazuya; Yamamoto, Masa-Yuki

    2008-06-01

    Radio meteor observations by Ham-band beacon or FM radio broadcasts using “Ham-band Radio meteor Observation Fast Fourier Transform” (HROFFT), an automatically operating software package, have become widespread in recent years. Previously, counting of meteor echoes on the spectrograms of radio meteor observations was performed manually by observers. In the present paper, we introduce an automatic meteor echo counting software application. Although the output images of HROFFT contain both the features of meteor echoes and those of various types of noise, a newly developed image processing technique has been applied, resulting in software that serves as a useful auto-counting tool. A slight error remains in the processing of spectrograms when the observation site is affected by many disturbing noise sources. Nevertheless, comparison between software and manual counting revealed an agreement of almost 90%. Therefore, we can easily obtain a dataset of detection time, duration, signal strength, and Doppler shift for each meteor echo from the HROFFT spectrograms. Using this software, statistical analysis of meteor activity can be based on the results obtained at many Ham-band Radio meteor Observation (HRO) sites throughout the world, providing a very useful “standard” for monitoring meteor stream activities in real time.

  10. Preclinical Biokinetic Modelling of Tc-99m Radiopharmaceuticals Obtained from Semi-Automatic Image Processing.

    PubMed

    Cornejo-Aragón, Luz G; Santos-Cuevas, Clara L; Ocampo-García, Blanca E; Chairez-Oria, Isaac; Diaz-Nieto, Lorenza; García-Quiroz, Janice

    2017-01-01

    The aim of this study was to develop a semi-automatic image processing algorithm (AIPA) based on the simultaneous information provided by X-ray and radioisotopic images to determine the biokinetic models of Tc-99m radiopharmaceuticals from quantification of image radiation activity in murine models. These radioisotopic images were obtained by a CCD (charge-coupled device) camera coupled to an ultrathin phosphorus screen in a preclinical multimodal imaging system (Xtreme, Bruker). The AIPA consisted of different image processing methods for background, scattering and attenuation correction in the activity quantification. A set of parametric identification algorithms was used to obtain the biokinetic models that characterize the interaction between different tissues and the radiopharmaceuticals considered in the study. The set of biokinetic models corresponded to the Tc-99m biodistribution observed in different ex vivo studies. This fact confirmed the contribution of the semi-automatic image processing technique developed in this study.

  11. Fuzzy logic and image processing techniques for the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Urrutia-Fucugauchi, J.; Rodríguez-Castellanos, A.

    2011-06-01

    Since interpretation of seismic data is usually a tedious and repetitive task, the ability to perform it automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process make fuzzy logic an appropriate tool for dealing with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of an MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The construction of the fuzzy inference system and the various image processing techniques employed are presented. We conclude that this is a well-suited problem for fuzzy logic, since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC from expert manual interpretation.
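
    A minimal sketch of the kind of fuzzy if-then rule the abstract describes, using triangular membership functions and min-AND inference. The linguistic variables, membership parameters, and input values below are invented for illustration and are not taken from the paper:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: rises from a, peaks at b,
    falls back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Toy rule: IF reflector amplitude is "high" AND lateral continuity is
# "low" THEN the pixel belongs to the chaotic MTC facies (min-AND).
amplitude_high = triangular(0.8, 0.5, 1.0, 1.5)
continuity_low = triangular(0.2, -0.5, 0.0, 0.5)
mtc_degree = min(amplitude_high, continuity_low)
print(round(mtc_degree, 3))  # → 0.6
```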

  12. Fully automatic region of interest selection in glomerular filtration rate estimation from 99mTc-DTPA renogram.

    PubMed

    Lin, Kun-Ju; Huang, Jia-Yann; Chen, Yung-Sheng

    2011-12-01

    Glomerular filtration rate (GFR) is a commonly accepted standard estimate of renal function. Gamma camera-based methods for estimating renal uptake of (99m)Tc-diethylenetriaminepentaacetic acid (DTPA) without blood or urine sampling have been widely used. Of these, the method introduced by Gates has been the most common. Currently, most gamma cameras are equipped with a commercial program for GFR determination, a semi-quantitative analysis based on manually drawing a region of interest (ROI) over each kidney; the GFR value is then computed automatically from the scintigraphic determination of (99m)Tc-DTPA uptake within the kidney. Delineating the kidney area is difficult when applying a fixed threshold value. Moreover, hand-drawn ROIs are tedious, time consuming, and highly dependent on operator skill. Thus, we developed a fully automatic renal ROI estimation system based on the temporal changes in intensity counts, an intensity-pair distribution image contrast enhancement method, adaptive thresholding, and morphological operations that can locate the kidney area and obtain the GFR value from a (99m)Tc-DTPA renogram. To evaluate the performance of the proposed approach, 30 clinical dynamic renograms were analyzed. The fully automatic approach failed in one patient with very poor renal function. Four patients had a unilateral kidney, and the others had bilateral kidneys. The automatic contours from the remaining 54 kidneys were compared with manually drawn contours and included in area error and boundary error analyses. There was high correlation between two physicians' manual contours and the contours obtained by our approach. For the area error analysis, the mean true positive area overlap is 91%, the mean false negative rate is 13.4%, and the mean false positive rate is 9.3%. The boundary error is 1.6 pixels.
The GFR calculated using this automatic computer-aided approach is reproducible and may be applied to help nuclear medicine physicians in clinical practice.
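
    Of the pipeline stages above, the adaptive-thresholding step is the easiest to sketch in isolation. The snippet below uses a generic Otsu threshold on a synthetic frame as a stand-in; it is not the authors' intensity-pair distribution enhancement, and the "kidney" blob is simulated:

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Adaptive threshold maximising between-class variance (Otsu).
    Returns the upper edge of the maximising histogram bin."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)                # class-0 weight
    mu = np.cumsum(p * centers)      # class-0 cumulative mean mass
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mu[-1] * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return edges[np.nanargmax(var_between) + 1]

# Synthetic renogram-like frame: dim noisy background, bright blob.
rng = np.random.default_rng(0)
frame = rng.normal(10.0, 2.0, (64, 64))
frame[20:40, 20:40] += 50.0          # simulated kidney uptake
mask = frame > otsu_threshold(frame)
print(mask[25:35, 25:35].all(), mask[:10, :10].any())  # → True False
```

In the abstract's full pipeline, such a mask would then be cleaned with morphological operations before the renal contour is extracted.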

  13. User Interaction in Semi-Automatic Segmentation of Organs at Risk: a Case Study in Radiotherapy.

    PubMed

    Ramkumar, Anjana; Dolz, Jose; Kirisli, Hortense A; Adebahr, Sonja; Schimek-Jasch, Tanja; Nestle, Ursula; Massoptier, Laurent; Varga, Edit; Stappers, Pieter Jan; Niessen, Wiro J; Song, Yu

    2016-04-01

    Accurate segmentation of organs at risk is an important step in radiotherapy planning. Because manual segmentation is a tedious procedure, prone to inter- and intra-observer variability, there is growing interest in automated segmentation methods. However, automatic methods frequently fail to provide satisfactory results, and post-processing corrections are often needed. Semi-automatic segmentation methods are designed to overcome these problems by combining physicians' expertise and computers' potential. This study evaluates two semi-automatic segmentation methods with different types of user interaction, named "strokes" and "contour", to provide insights into the role and impact of human-computer interaction. Two physicians participated in the experiment. In total, 42 case studies were carried out on five different types of organs at risk. For each case study, both the human-computer interaction process and the quality of the segmentation results were measured subjectively and objectively. Furthermore, different measures of the process and the results were correlated. A total of 36 quantifiable and ten non-quantifiable correlations were identified for each type of interaction. Among those pairs of measures, 20 for the contour method and 22 for the strokes method were strongly or moderately correlated, either directly or inversely. Based on those correlated measures, it is concluded that: (1) in the design of semi-automatic segmentation methods, user interactions need to be less cognitively challenging; (2) based on the observed workflows and preferences of physicians, there is a need for flexibility in the interface design; and (3) the correlated measures provide insights that can be used to improve user interaction design.

  14. Different binarization processes validated against manual counts of fluorescent bacterial cells.

    PubMed

    Tamminga, Gerrit G; Paulitsch-Fuchs, Astrid H; Jansen, Gijsbert J; Euverink, Gert-Jan W

    2016-09-01

    State-of-the-art software methods (such as fixed-value or statistical approaches) to create a binary image of fluorescent bacterial cells are not as accurate and precise as they should be for counting bacteria and measuring their area. To overcome these bottlenecks, we introduce biological significance into the derivation of a binary image from a greyscale microscopic image. Using our biological-significance approach, we automatically count about the same number of cells as a researcher would by manual/visual counting, whereas the fixed-value or statistical approaches yield about 20% fewer cells in automatic counting. In our procedure we included area measurements of the bacterial cells to determine the right parameters for background subtraction and threshold values: in an iterative process, the threshold and background-subtraction values are incremented until the number of particles smaller than a typical bacterial cell falls below the number of particles of bacterial-cell size. This research also shows that every image has a specific threshold that depends on the optical system, magnification, staining procedure, and exposure time. The biological-significance approach shows that automatic counting can be performed with the same accuracy, precision, and reproducibility as manual counting, and can be used to count bacterial cells across different optical systems (Leica, Olympus and Navitar), magnification factors (200× and 400×), staining procedures (DNA (propidium iodide) and RNA (FISH)) and substrates (polycarbonate filter or glass). Copyright © 2016 Elsevier B.V. All rights reserved.
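
    The iterative stopping rule can be paraphrased in code: raise the threshold until sub-cellular specks no longer outnumber cell-sized particles. Everything below is an illustrative sketch; `min_cell_area`, the synthetic image, and the omission of the background-subtraction loop are all assumptions on top of the abstract:

```python
import numpy as np

def particle_areas(mask):
    """Areas of 4-connected components, via iterative flood fill."""
    visited = np.zeros(mask.shape, dtype=bool)
    h, w = mask.shape
    areas = []
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not visited[i, j]:
                stack, area = [(i, j)], 0
                visited[i, j] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return areas

def significant_threshold(img, min_cell_area=9, step=1.0):
    """Increment the threshold until particles smaller than a typical
    cell are outnumbered by cell-sized particles."""
    t = float(img.min())
    while t < img.max():
        areas = particle_areas(img > t)
        small = sum(a < min_cell_area for a in areas)
        cells = sum(a >= min_cell_area for a in areas)
        if cells and small < cells:
            return t
        t += step
    return t

# Two bright 4x4 "cells" plus two dim single-pixel noise specks.
img = np.zeros((20, 20))
img[2:6, 2:6] = 100.0
img[10:14, 10:14] = 100.0
img[0, 19] = img[19, 0] = 30.0
t = significant_threshold(img)
print(t, len([a for a in particle_areas(img > t) if a >= 9]))  # → 30.0 2
```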

  15. AUTOMATIC COUNTER

    DOEpatents

    Robinson, H.P.

    1960-06-01

    An automatic counter of alpha particle tracks recorded by a sensitive emulsion of a photographic plate is described. The counter includes a source of modulated dark-field illumination for developing light flashes from the recorded particle tracks as the photographic plate is automatically scanned in narrow strips. Photoelectric means convert the light flashes to proportional current pulses for application to an electronic counting circuit. Photoelectric means are further provided for developing a phase reference signal from the photographic plate in such a manner that signals arising from particle tracks not parallel to the edge of the plate are out of phase with the reference signal. The counting circuit includes provision for rejecting the out-of-phase signals resulting from unoriented tracks as well as signals resulting from spurious marks on the plate such as scratches, dust, or grain clumpings. The output of the circuit is hence indicative only of the tracks that would be counted by a human operator.

  16. An Evaluation of the Accuracy of the Subtraction Method Used for Determining Platelet Counts in Advanced Platelet-Rich Fibrin and Concentrated Growth Factor Preparations

    PubMed Central

    Watanabe, Taisuke; Isobe, Kazushige; Suzuki, Taiji; Kawabata, Hideo; Nakamura, Masayuki; Tsukioka, Tsuneyuki; Okudera, Toshimitsu; Okudera, Hajime; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Kawase, Tomoyuki

    2017-01-01

    Platelet concentrates should be quality-assured for purity and identity prior to clinical use. Unlike for the liquid form of platelet-rich plasma, platelet counts cannot be directly determined in solid fibrin clots and are instead calculated by subtracting the counts in other liquid or semi-clotted fractions from those in whole blood samples. Having long questioned the validity of this method, we herein examined the possible loss of platelets in the preparation process. Blood samples collected from healthy male donors were immediately centrifuged for advanced platelet-rich fibrin (A-PRF) and concentrated growth factors (CGF) according to recommended centrifugal protocols. Blood cells in liquid and semi-clotted fractions were directly counted. Platelets aggregated on clot surfaces were observed by scanning electron microscopy. A higher centrifugal force increased the numbers of platelets and platelet aggregates in the liquid red blood cell fraction and the semi-clotted red thrombus, in the presence and absence of the anticoagulant, respectively. Nevertheless, the calculated platelet counts in A-PRF/CGF preparations were much higher than expected, rendering the currently accepted subtraction method inaccurate for determining platelet counts in fibrin clots. To ensure the quality of solid types of platelet concentrates chairside in a timely manner, a simple and accurate platelet-counting method should be developed immediately. PMID:29563413
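
    The subtraction method in question is simple bookkeeping, which makes its failure mode easy to state: platelets lost to clot-surface aggregation never appear in the measured fractions, so the calculated clot count silently absorbs the loss. The numbers below are illustrative, not data from the study:

```python
def platelets_in_clot(whole_blood_count, fraction_counts):
    """Subtraction method: clot platelets inferred as the whole-blood
    count minus all measured liquid/semi-clotted fractions
    (arbitrary units)."""
    return whole_blood_count - sum(fraction_counts)

# If platelets aggregate on the clot surface and are counted nowhere,
# the fractions undercount and the clot estimate inflates.
estimate = platelets_in_clot(25.0, [3.0, 4.0])
print(estimate)  # → 18.0
```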

  17. Bubble propagation in Hele-Shaw channels with centred constrictions

    NASA Astrophysics Data System (ADS)

    Franco-Gómez, Andrés; Thompson, Alice B.; Hazel, Andrew L.; Juel, Anne

    2018-04-01

    We study the propagation of finite bubbles in a Hele-Shaw channel, where a centred occlusion (termed a rail) is introduced to provide a small axially uniform depth constriction. For bubbles wide enough to span the channel, the system’s behaviour is similar to that of semi-infinite fingers and a symmetric static solution is stable. Here, we focus on smaller bubbles, in which case the symmetric static solution is unstable and the static bubble is displaced towards one of the deeper regions of the channel on either side of the rail. Using a combination of experiments and numerical simulations of a depth-averaged model, we show that a bubble propagating axially due to a small imposed flow rate can be stabilised in a steady symmetric mode centred on the rail through a subtle interaction between stabilising viscous forces and destabilising surface tension forces. However, for sufficiently large capillary numbers Ca, the ratio of viscous to surface tension forces, viscous forces in turn become destabilising thus returning the bubble to an off-centred propagation regime. With decreasing bubble size, the range of Ca for which steady centred propagation is stable decreases, and eventually vanishes through the coalescence of two supercritical pitchfork bifurcations. The depth-averaged model is found to accurately predict all the steady modes of propagation observed experimentally, and provides a comprehensive picture of the underlying steady bifurcation structure. However, for sufficiently large imposed flow rates, we find that initially centred bubbles do not converge onto a steady mode of propagation. Instead they transiently explore weakly unstable steady modes, an evolution which results in their break-up and eventual settling into a steady propagating state of changed topology.

  18. Semi-automatic volume measurement for orbital fat and total extraocular muscles based on Cube FSE-flex sequence in patients with thyroid-associated ophthalmopathy.

    PubMed

    Tang, X; Liu, H; Chen, L; Wang, Q; Luo, B; Xiang, N; He, Y; Zhu, W; Zhang, J

    2018-05-24

    To investigate the accuracy of two semi-automatic segmentation measurements based on the magnetic resonance imaging (MRI) three-dimensional (3D) Cube fast spin echo (FSE)-flex sequence in phantoms, and to evaluate the feasibility of determining the volumetric alterations of orbital fat (OF) and total extraocular muscles (TEM) in patients with thyroid-associated ophthalmopathy (TAO) by semi-automatic segmentation. Forty-four fatty (n=22) and lean (n=22) phantoms were scanned using the Cube FSE-flex sequence on a 3 T MRI system. Their volumes were measured by manual segmentation (MS) and two semi-automatic segmentation algorithms (region growing [RG] and multi-dimensional threshold [MDT]). Pearson correlation and Bland-Altman analysis were used to evaluate the measuring accuracy of MS, RG, and MDT in phantoms as compared with the true volume. Then, OF and TEM volumes of 15 TAO patients and 15 normal controls were measured using MDT. Paired-sample t-tests were used to compare the volumes and volume ratios of different orbital tissues between TAO patients and controls. Each segmentation (MS, RG, MDT) had a significant correlation (p<0.01) with the true volume. There was a minimal bias for MS, and a stronger agreement between MDT and the true volume than between RG and the true volume, in both fatty and lean phantoms. The reproducibility of Cube FSE-flex-determined MDT was adequate. The volumetric ratios of OF/globe (p<0.01), TEM/globe (p<0.01), whole orbit/globe (p<0.01) and bone orbit/globe (p<0.01) were significantly greater in TAO patients than in healthy controls. MRI Cube FSE-flex-determined MDT is a relatively accurate semi-automatic segmentation method that can be used to evaluate OF and TEM volumes in the clinic. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  19. Recurrent neural network based virtual detection line

    NASA Astrophysics Data System (ADS)

    Kadikis, Roberts

    2018-04-01

    The paper proposes an efficient method for detection of moving objects in the video. The objects are detected when they cross a virtual detection line. Only the pixels of the detection line are processed, which makes the method computationally efficient. A Recurrent Neural Network processes these pixels. The machine learning approach allows one to train a model that works in different and changing outdoor conditions. Also, the same network can be trained for various detection tasks, which is demonstrated by the tests on vehicle and people counting. In addition, the paper proposes a method for semi-automatic acquisition of labeled training data. The labeling method is used to create training and testing datasets, which in turn are used to train and evaluate the accuracy and efficiency of the detection method. The method shows similar accuracy as the alternative efficient methods but provides greater adaptability and usability for different tasks.
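
    The central trick, feeding only the detection line's pixels from each frame into a recurrent model, can be sketched with an untrained Elman-style cell. The weights here are random stand-ins and the row index is arbitrary; the paper's network is trained on the labelled data described above:

```python
import numpy as np

def rnn_over_line(frames, W_in, W_h, w_out, row):
    """Per-frame crossing scores from a minimal Elman recurrence
    h_t = tanh(W_in x_t + W_h h_{t-1}) over detection-line pixels."""
    h = np.zeros(W_h.shape[0])
    scores = []
    for frame in frames:
        x = frame[row] / 255.0       # only the virtual line is read
        h = np.tanh(W_in @ x + W_h @ h)
        scores.append(float(w_out @ h))
    return scores

rng = np.random.default_rng(1)
frames = rng.integers(0, 255, size=(5, 64, 64))   # 5 toy frames
W_in = rng.normal(0.0, 0.1, (8, 64))              # input weights
W_h = rng.normal(0.0, 0.1, (8, 8))                # recurrent weights
w_out = rng.normal(0.0, 0.1, 8)                   # readout weights
scores = rnn_over_line(frames, W_in, W_h, w_out, row=32)
print(len(scores))  # → 5
```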

  20. Measurements of the neutron dose and energy spectrum on the International Space Station during expeditions ISS-16 to ISS-21.

    PubMed

    Smith, M B; Akatov, Yu; Andrews, H R; Arkhangelsky, V; Chernykh, I V; Ing, H; Khoshooniy, N; Lewis, B J; Machrafi, R; Nikolaev, I; Romanenko, R Y; Shurshakov, V; Thirsk, R B; Tomi, L

    2013-01-01

    As part of the international Matroshka-R and Radi-N experiments, bubble detectors have been used on board the ISS in order to characterise the neutron dose and the energy spectrum of neutrons. Experiments using bubble dosemeters inside a tissue-equivalent phantom were performed during the ISS-16, ISS-18 and ISS-19 expeditions. During the ISS-20 and ISS-21 missions, the bubble dosemeters were supplemented by a bubble-detector spectrometer, a set of six detectors that was used to determine the neutron energy spectrum at various locations inside the ISS. The temperature-compensated spectrometer set used is the first to be developed specifically for space applications and its development is described in this paper. Results of the dose measurements indicate that the dose received at two different depths inside the phantom is not significantly different, suggesting that bubble detectors worn by a person provide an accurate reading of the dose received inside the body. The energy spectra measured using the spectrometer are in good agreement with previous measurements and do not show a strong dependence on the precise location inside the station. To aid the understanding of the bubble-detector response to charged particles in the space environment, calculations have been performed using a Monte-Carlo code, together with data collected on the ISS. These calculations indicate that charged particles contribute <2% to the bubble count on the ISS, and can therefore be considered as negligible for bubble-detector measurements in space.

  1. A Benchmark of Vehicle Maintenance Training Between the U.S. Air Force and a Civilian Industry Leader

    DTIC Science & Technology

    1992-09-01

    ...was chosen to identify tasks performed by recognized competent automotive service personnel (entry-level personnel were not included in the survey)... Diagnose the cause of poor, intermittent, or no electric door and hatch/trunk lock operation. 10. Repair or replace switches, relays, actuators ... Semi-Automatic Temperature Controls i. Check operation of automatic and semi-automatic heating, ventilation and air-conditioning (HVAC) controls

  2. Implementation of a microcontroller-based semi-automatic coagulator.

    PubMed

    Chan, K; Kirumira, A; Elkateeb, A

    2001-01-01

    The coagulator is an instrument used in hospitals to detect clot formation as a function of time. Generally, these coagulators are very expensive and therefore not affordable for doctors' offices and small clinics. The objective of this project was to design and implement a low-cost semi-automatic coagulator (SAC) prototype. The SAC is capable of assaying up to 12 samples and can perform the following tests: prothrombin time (PT), activated partial thromboplastin time (APTT), and PT/APTT combination. The prototype has been tested successfully.

  3. Semi-automatic version of the potentiometric titration method for characterization of uranium compounds.

    PubMed

    Cristiano, Bárbara F G; Delgado, José Ubiratan; da Silva, José Wanderley S; de Barros, Pedro D; de Araújo, Radier M S; Dias, Fábio C; Lopes, Ricardo T

    2012-09-01

    The potentiometric titration method was used for characterization of uranium compounds to be applied in intercomparison programs. The method is applied with traceability assured using a potassium dichromate primary standard. A semi-automatic version was developed to reduce the analysis time and the operator variation. The standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization and compatible with those obtained by manual techniques. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Application of DNA Chip Scanning Technology for Automatic Detection of Chlamydia trachomatis and Chlamydia pneumoniae Inclusions

    PubMed Central

    Bogdanov, Anita; Endrész, Valeria; Urbán, Szabolcs; Lantos, Ildikó; Deák, Judit; Burián, Katalin; Önder, Kamil; Ayaydin, Ferhan; Balázs, Péter

    2014-01-01

    Chlamydiae are obligate intracellular bacteria that propagate in the inclusion, a specific niche inside the host cell. The standard method for counting chlamydiae is immunofluorescent staining and manual counting of chlamydial inclusions. High- or medium-throughput estimation of the reduction in chlamydial inclusions should be the basis of testing antichlamydial compounds and other drugs that positively or negatively influence chlamydial growth, yet low-throughput manual counting is the common approach. To overcome the time-consuming and subjective manual counting, we developed an automatic inclusion-counting system based on a commercially available DNA chip scanner. Fluorescently labeled inclusions are detected by the scanner, and the image is processed by ChlamyCount, a custom plug-in of the ImageJ software environment. ChlamyCount was able to measure the inclusion counts over a 1-log-unit dynamic range with a high correlation to the theoretical counts. ChlamyCount was capable of accurately determining the MICs of the novel antimicrobial compound PCC00213 and the already known antichlamydial antibiotics moxifloxacin and tetracycline. ChlamyCount was also able to measure the chlamydial growth-altering effect of drugs that influence host-bacterium interaction, such as gamma interferon, DEAE-dextran, and cycloheximide. ChlamyCount is an easily adaptable system for testing antichlamydial antimicrobials and other compounds that influence Chlamydia-host interactions. PMID:24189259
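
    The thresholding-and-labeling core of such an inclusion counter can be sketched in a few lines. This is a simplified stand-in for the ChlamyCount plug-in, not its actual implementation; the function name, 4-connectivity, and fixed threshold are illustrative assumptions.

```python
from collections import deque

def count_inclusions(image, threshold):
    """Count connected bright regions (4-connectivity) above `threshold`.

    Each connected component of above-threshold pixels is taken to be
    one fluorescently labeled chlamydial inclusion.
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                count += 1                      # new inclusion found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill its pixels
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# Two separated bright blobs on a dark background -> 2 inclusions
img = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 0, 8],
]
print(count_inclusions(img, threshold=5))  # → 2
```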

  5. Closed circuit TV system automatically guides welding arc

    NASA Technical Reports Server (NTRS)

    Stephans, D. L.; Wall, W. A., Jr.

    1968-01-01

    Closed circuit television /CCTV/ system automatically guides a welding torch to position the welding arc accurately along weld seams. Digital counting and logic techniques incorporated in the control circuitry, ensure performance reliability.

  6. Comparison of the effects of model-based iterative reconstruction and filtered back projection algorithms on software measurements in pulmonary subsolid nodules.

    PubMed

    Cohen, Julien G; Kim, Hyungjin; Park, Su Bin; van Ginneken, Bram; Ferretti, Gilbert R; Lee, Chang Hyun; Goo, Jin Mo; Park, Chang Min

    2017-08-01

    To evaluate the differences between filtered back projection (FBP) and model-based iterative reconstruction (MBIR) algorithms on semi-automatic measurements in subsolid nodules (SSNs). Unenhanced CT scans of 73 SSNs obtained using the same protocol and reconstructed with both FBP and MBIR algorithms were evaluated by two radiologists. Diameter, mean attenuation, mass and volume of whole nodules and their solid components were measured. Intra- and interobserver variability and differences between FBP and MBIR were then evaluated using the Bland-Altman method and Wilcoxon tests. Longest diameter, volume and mass of nodules and those of their solid components were significantly higher using MBIR (p < 0.05) with mean differences of 1.1% (limits of agreement, -6.4 to 8.5%), 3.2% (-20.9 to 27.3%) and 2.9% (-16.9 to 22.7%) and 3.2% (-20.5 to 27%), 6.3% (-51.9 to 64.6%), 6.6% (-50.1 to 63.3%), respectively. The limits of agreement between FBP and MBIR were within the range of intra- and interobserver variability for both algorithms with respect to the diameter, volume and mass of nodules and their solid components. There were no significant differences in intra- or interobserver variability between FBP and MBIR (p > 0.05). Semi-automatic measurements of SSNs significantly differed between FBP and MBIR; however, the differences were within the range of measurement variability. • Intra- and interobserver reproducibility of measurements did not differ between FBP and MBIR. • Differences in SSNs' semi-automatic measurement induced by reconstruction algorithms were not clinically significant. • Semi-automatic measurement may be conducted regardless of reconstruction algorithm. • SSNs' semi-automated classification agreement (pure vs. part-solid) did not significantly differ between algorithms.
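
    The Bland-Altman comparison used above reduces to the mean paired difference and its 95% limits of agreement (mean ± 1.96 × SD of the differences). A minimal sketch; the function name and sample data are illustrative, not from the study:

```python
import statistics

def bland_altman_limits(a, b):
    """Mean difference and 95% limits of agreement for paired measurements.

    The limits are mean(diff) +/- 1.96 * SD(diff), the quantities used
    when comparing two methods measuring the same nodules.
    """
    diffs = [x - y for x, y in zip(a, b)]
    mean = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample standard deviation
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Illustrative paired diameters (mm) from two reconstructions
mbir = [10.4, 12.1, 11.5, 13.2]
fbp  = [10.0, 12.0, 11.0, 13.0]
mean_diff, lower, upper = bland_altman_limits(mbir, fbp)
```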

  7. A semi-automatic computer-aided method for surgical template design

    NASA Astrophysics Data System (ADS)

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-02-01

    This paper presents a generalized integrated framework of semi-automatic surgical template design. Several algorithms were implemented including the mesh segmentation, offset surface generation, collision detection, ruled surface generation, etc., and dedicated software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with signed scalar of vertex is utilized to partition the inner surface from the input surface mesh based on the indicated point loop. Then, the offset surface of the inner surface is obtained through contouring the distance field of the inner surface, and segmented to generate the outer surface. Ruled surface is employed to connect inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. It has been applied to the template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method.

  8. A semi-automatic computer-aided method for surgical template design

    PubMed Central

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-01-01

    This paper presents a generalized integrated framework of semi-automatic surgical template design. Several algorithms were implemented including the mesh segmentation, offset surface generation, collision detection, ruled surface generation, etc., and dedicated software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with signed scalar of vertex is utilized to partition the inner surface from the input surface mesh based on the indicated point loop. Then, the offset surface of the inner surface is obtained through contouring the distance field of the inner surface, and segmented to generate the outer surface. Ruled surface is employed to connect inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. It has been applied to the template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method. PMID:26843434

  9. A semi-automatic computer-aided method for surgical template design.

    PubMed

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-02-04

    This paper presents a generalized integrated framework of semi-automatic surgical template design. Several algorithms were implemented including the mesh segmentation, offset surface generation, collision detection, ruled surface generation, etc., and dedicated software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with signed scalar of vertex is utilized to partition the inner surface from the input surface mesh based on the indicated point loop. Then, the offset surface of the inner surface is obtained through contouring the distance field of the inner surface, and segmented to generate the outer surface. Ruled surface is employed to connect inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. It has been applied to the template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method.

  10. Towards the Real-Time Evaluation of Collaborative Activities: Integration of an Automatic Rater of Collaboration Quality in the Classroom from the Teacher's Perspective

    ERIC Educational Resources Information Center

    Chounta, Irene-Angelica; Avouris, Nikolaos

    2016-01-01

    This paper presents the integration of a real time evaluation method of collaboration quality in a monitoring application that supports teachers in class orchestration. The method is implemented as an automatic rater of collaboration quality and studied in a real time scenario of use. We argue that automatic and semi-automatic methods which…

  11. Optical Detection Of Cryogenic Leaks

    NASA Technical Reports Server (NTRS)

    Wyett, Lynn M.

    1988-01-01

    Conceptual system identifies leakage without requiring shutdown for testing. Proposed device detects and indicates leaks of cryogenic liquids automatically. Detector makes it unnecessary to shut equipment down so it can be checked for leakage by soap-bubble or helium-detection methods. Not necessary to mix special gases or other materials with cryogenic liquid flowing through equipment.

  12. Count Me In! on the Automaticity of Numerosity Processing

    ERIC Educational Resources Information Center

    Naparstek, Sharon; Henik, Avishai

    2010-01-01

    Extraction of numerosity (i.e., enumeration) is an essential component of mathematical abilities. The current study asked how automatic is the processing of numerosity and whether automatic activation is task dependent. Participants were presented with displays containing a variable number of digits and were asked to pay attention to the number of…

  13. Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz

    2004-04-01

    Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered to be the most appropriate method. The aim of our study is to develop a semi-automatic image analysis software, especially applicable for scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software delivers various scoring systems (Larsen-Score and Ratingen-Rau-Score) which can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and measurements of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs from hands and feet of more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) Structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.

  14. AUTOMATIC HAND COUNTER

    DOEpatents

    Mann J.R.; Wainwright, A.E.

    1963-06-11

    An automatic, personnel-operated, alpha-particle hand monitor is described which functions as a qualitative instrument to indicate to the person using it whether his hands are "cold" or "hot." The monitor is activated by a push button and includes several capacitor-triggered thyratron tubes. Upon release of the push button, the monitor starts the counting of the radiation present on the hands of the person. If the count of the radiation exceeds a predetermined level within a predetermined time, then a capacitor will trigger a first thyratron tube to light a "hot" lamp. If, however, the count is below such level during this time period, another capacitor will fire a second thyratron to light a "safe" lamp. (AEC)

  15. Automatic and semi-automatic approaches for arteriolar-to-venular computation in retinal photographs

    NASA Astrophysics Data System (ADS)

    Mendonça, Ana Maria; Remeseiro, Beatriz; Dashtbozorg, Behdad; Campilho, Aurélio

    2017-03-01

    The Arteriolar-to-Venular Ratio (AVR) is a popular dimensionless measure which allows the assessment of patients' condition for the early diagnosis of different diseases, including hypertension and diabetic retinopathy. This paper presents two new approaches for AVR computation in retinal photographs which include a sequence of automated processing steps: vessel segmentation, caliber measurement, optic disc segmentation, artery/vein classification, region of interest delineation, and AVR calculation. Both approaches have been tested on the INSPIRE-AVR dataset, and compared with a ground-truth provided by two medical specialists. The obtained results demonstrate the reliability of the fully automatic approach, which provides AVR ratios very similar to at least one of the observers. Furthermore, the semi-automatic approach, which includes the manual modification of the artery/vein classification if needed, allows the error to be reduced significantly, to a level below the human error.
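
    As a sketch of the final AVR step: the calibers of the widest arterioles and venules are commonly summarized with the revised Knudtson pairing formula and then divided (CRAE/CRVE). The abstract does not spell out this convention, so the 0.88/0.95 constants and pairing scheme below are a hedged assumption based on common practice:

```python
import math

def combine(widths, k):
    """Iteratively combine vessel calibers pairwise (revised Knudtson
    formula): the widest is paired with the narrowest,
    w = k * sqrt(w1^2 + w2^2), until one summary caliber remains."""
    w = sorted(widths)
    while len(w) > 1:
        carry = []
        if len(w) % 2 == 1:              # odd count: carry the median over
            carry = [w.pop(len(w) // 2)]
        combined = [k * math.hypot(w[i], w[-1 - i]) for i in range(len(w) // 2)]
        w = sorted(combined + carry)
    return w[0]

def avr(artery_widths, vein_widths):
    """Arteriolar-to-Venular Ratio = CRAE / CRVE."""
    crae = combine(artery_widths, 0.88)  # central retinal artery equivalent
    crve = combine(vein_widths, 0.95)    # central retinal vein equivalent
    return crae / crve
```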

  16. A computationally efficient modelling of laminar separation bubbles

    NASA Technical Reports Server (NTRS)

    Dini, Paolo; Maughmer, Mark D.

    1990-01-01

    In predicting the aerodynamic characteristics of airfoils operating at low Reynolds numbers, it is often important to account for the effects of laminar (transitional) separation bubbles. Previous approaches to the modelling of this viscous phenomenon range from fast but sometimes unreliable empirical correlations for the length of the bubble and the associated increase in momentum thickness, to more accurate but significantly slower displacement-thickness iteration methods employing inverse boundary-layer formulations in the separated regions. Since the penalty in computational time associated with the more general methods is unacceptable for airfoil design applications, use of an accurate yet computationally efficient model is highly desirable. To this end, a semi-empirical bubble model was developed and incorporated into the Eppler and Somers airfoil design and analysis program. The generality and the efficiency was achieved by successfully approximating the local viscous/inviscid interaction, the transition location, and the turbulent reattachment process within the framework of an integral boundary-layer method. Comparisons of the predicted aerodynamic characteristics with experimental measurements for several airfoils show excellent and consistent agreement for Reynolds numbers from 2,000,000 down to 100,000.

  17. Adaptive and automatic red blood cell counting method based on microscopic hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Liu, Xi; Zhou, Mei; Qiu, Song; Sun, Li; Liu, Hongying; Li, Qingli; Wang, Yiting

    2017-12-01

    Red blood cell counting, as a routine examination, plays an important role in medical diagnoses. Although automated hematology analyzers are widely used, manual microscopic examination by a hematologist or pathologist is still unavoidable, which is time-consuming and error-prone. This paper proposes a fully automatic red blood cell counting method which is based on microscopic hyperspectral imaging of blood smears and combines spatial and spectral information to achieve high precision. The acquired hyperspectral image data of the blood smear in the visible and near-infrared spectral range are firstly preprocessed, and then a quadratic blind linear unmixing algorithm is used to get endmember abundance images. Based on mathematical morphological operation and an adaptive Otsu's method, a binarization process is performed on the abundance images. Finally, the connected component labeling algorithm with magnification-based parameter setting is applied to automatically select the binary images of red blood cell cytoplasm. Experimental results show that the proposed method can perform well and has potential for clinical applications.

  18. Design for Manufacturing and Assembly in Apparel. Part 1. Handbook

    DTIC Science & Technology

    1994-02-01

    reduced and the inverted pleat was eliminated to take advantage of the automatic seam stitcher. The shape and size of the side back section seam...coin pocket. The size and shape of the pocket would be designed to best utilize the equipment. An automatic dart stitcher may be utilized to stitch the...with stacker Semi-automatic serging units with stacker Automatic seaming units/profile stitchers Programmable seaming units for various operations

  19. Broken bridges: a counter-example of the ER=EPR conjecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Pisin; Wu, Chih-Hung; Yeom, Dong-han, E-mail: pisinchen@phys.ntu.edu.tw, E-mail: b02202007@ntu.edu.tw, E-mail: innocent.yeom@gmail.com

    In this paper, we provide a counter-example to the ER=EPR conjecture. In an anti-de Sitter space, we construct a pair of maximally entangled but separated black holes. Due to the vacuum decay of the anti-de Sitter background toward a deeper vacuum, these two parts can be trapped by bubbles. If these bubbles are reasonably large, then within the scrambling time, there should appear an Einstein-Rosen bridge between the two black holes. Now by tracing more details on the bubble dynamics, one can identify parameters such that one of the two bubbles either monotonically shrinks or expands. Because of the change of vacuum energy, one side of the black hole would evaporate completely. Due to the shrinking of the apparent horizon, a signal of one side of the Einstein-Rosen bridge can be viewed from the opposite side. We analytically and numerically demonstrate that within a reasonable semi-classical parameter regime, such a process can happen. Bubbles are a non-perturbative effect, which is the crucial reason that allows the transmission of information between the two black holes through the Einstein-Rosen bridge, even though the probability is highly suppressed. Therefore, the ER=EPR conjecture cannot be generic in its present form and its validity may be restricted.

  20. Development of bubble memory recorder onboard Japan Earth Resources Satellite-1

    NASA Astrophysics Data System (ADS)

    Araki, Tsunehiko; Ishida, Chu; Ochiai, Kiyoshi; Nozue, Tatsuhiro; Tachibana, Kyozo; Yoshida, Kazutoshi

    The Bubble Memory Recorder (BMR) developed for use on the Earth Resources Satellite is described in terms of its design, capabilities, and functions. The specifications of the BMR are given listing memory capacity, functions, and interface types for data, command, and telemetry functions. The BMR has an emergency signal interface to provide contingency recording, and a satellite-separation signal interface can be turned on automatically by signal input. Data are stored in a nonvolatile memory device so that the memory is retained during power outages. The BMR is characterized by a capability for random access, nonvolatility, and a solid-state design that is useful for space operations since it does not disturb spacecraft attitude.

  1. Quantum dots-based double imaging combined with organic dye imaging to establish an automatic computerized method for cancer Ki67 measurement.

    PubMed

    Wang, Lin-Wei; Qu, Ai-Ping; Liu, Wen-Lou; Chen, Jia-Mei; Yuan, Jing-Ping; Wu, Han; Li, Yan; Liu, Juan

    2016-02-03

    As a widely used proliferative marker, Ki67 has important impacts on cancer prognosis, especially for breast cancer (BC). However, variations in analytical practice make it difficult for pathologists to manually measure the Ki67 index. This study aims to establish quantum dots (QDs)-based double imaging of nuclear Ki67 as a red signal by QDs-655 and cytoplasmic cytokeratin (CK) as a yellow signal by QDs-585, combined with organic dye imaging of the cell nucleus as a blue signal by 4',6-diamidino-2-phenylindole (DAPI), and to develop a computer-aided automatic method for Ki67 index measurement. The newly developed automatic computerized Ki67 measurement could efficiently recognize and count Ki67-positive cancer cell nuclei (red signals) and cancer cell nuclei (blue signals) within cancer cell cytoplasm (yellow signals). Comparisons of the computerized, visual, and marked Ki67 indices for 30 patients (90 images) with Ki67 ≤ 10% (low grade), 10% < Ki67 < 50% (moderate grade), and Ki67 ≥ 50% (high grade) showed that computerized Ki67 counting is better than visual Ki67 counting, especially for the low and moderate Ki67 grades. Based on QDs-based double imaging and organic dye imaging of BC tissues, this study successfully developed an automatic computerized Ki67 counting method to measure the Ki67 index.
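
    The counting rule reduces to a ratio of Ki67-positive cancer nuclei to all cancer nuclei. A minimal sketch with the three-channel logic collapsed into boolean flags; the field names are hypothetical, not from the paper:

```python
def ki67_index(nuclei):
    """Ki67 index (%) = Ki67-positive cancer nuclei / all cancer nuclei.

    `nuclei` is a list of dicts like {"in_ck_region": bool, "ki67": bool}:
    only nuclei lying inside CK-positive (cancer) cytoplasm are counted,
    mirroring the red/blue/yellow channel logic described above.
    """
    cancer = [n for n in nuclei if n["in_ck_region"]]
    if not cancer:
        return 0.0
    positive = sum(1 for n in cancer if n["ki67"])
    return 100.0 * positive / len(cancer)

# 1 positive of 4 cancer nuclei; the stromal nucleus is ignored
cells = ([{"in_ck_region": True, "ki67": True}]
         + [{"in_ck_region": True, "ki67": False}] * 3
         + [{"in_ck_region": False, "ki67": True}])
print(ki67_index(cells))  # → 25.0
```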

  2. A procedural method for the efficient implementation of full-custom VLSI designs

    NASA Technical Reports Server (NTRS)

    Belk, P.; Hickey, N.

    1987-01-01

    An imbedded language system for the layout of very large scale integration (VLSI) circuits is examined. It is shown that through the judicious use of this system, a large variety of circuits can be designed with circuit density and performance comparable to traditional full-custom design methods, but with design costs more comparable to semi-custom design methods. The high performance of this methodology is attributable to the flexibility of procedural descriptions of VLSI layouts and to a number of automatic and semi-automatic tools within the system.

  3. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, a semi-active RFID watt-hour meter is applied to automatic verification lines and intelligent warehouse management. Through the transmission, test, auxiliary and monitoring systems, the scheduling, binding, control and data exchange of watt-hour meters are realized, providing more accurate positioning, more efficient management, faster data updates and all information at a glance. This effectively improves the quality, efficiency and automation of verification, and realizes more efficient data and warehouse management.

  4. Automatic identification of bullet signatures based on consecutive matching striae (CMS) criteria.

    PubMed

    Chu, Wei; Thompson, Robert M; Song, John; Vorburger, Theodore V

    2013-09-10

    The consecutive matching striae (CMS) numeric criteria for firearm and toolmark identifications have been widely accepted by forensic examiners, although there have been questions concerning its observer subjectivity and limited statistical support. In this paper, based on signal processing and extraction, a model for the automatic and objective counting of CMS is proposed. The position and shape information of the striae on the bullet land is represented by a feature profile, which is used for determining the CMS number automatically. Rapid counting of CMS number provides a basis for ballistics correlations with large databases and further statistical and probability analysis. Experimental results in this report using bullets fired from ten consecutively manufactured barrels support this developed model. Published by Elsevier Ireland Ltd.
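
    The core of automatic CMS counting, matching striae between two profiles and taking the longest consecutive run, can be sketched as follows. The position-list representation and tolerance are simplifying assumptions for illustration, not the paper's signal-processing model:

```python
def max_cms(ref, comp, tol=1.0):
    """Longest run of consecutively matching striae (CMS number).

    `ref` and `comp` are sorted lists of striae positions (e.g. in
    micrometres) extracted from two bullet land profiles; a stria is
    "matched" when a counterpart lies within `tol` of its position.
    """
    # Mark each reference stria as matched or not
    matched = [any(abs(r - c) <= tol for c in comp) for r in ref]
    # Longest run of consecutive matches
    best = run = 0
    for m in matched:
        run = run + 1 if m else 0
        best = max(best, run)
    return best

# Three consecutive striae line up within tolerance; the last two do not
print(max_cms([1, 2, 3, 10, 11], [1.1, 2.1, 3.2, 20], tol=0.5))  # → 3
```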

  5. Hematological markers and biochemical profiles in terms of gender and age of captive collared peccaries (Tayassu tajacu) in eastern Amazon.

    PubMed

    Jorge, E M; Silva, C J O; Ritter, R A; Monteiro, M V B; Albuquerque, N I; Kahwage, P R; Monteiro, F O B; Costa, C T C; Rahal, S C; Silva Filho, E

    2015-11-25

    Complete blood counts and blood biochemical analyses are laboratory tests that allow the monitoring of physiological condition, nutrition, and health in free-living or captive wild animals. When interpreting these tests, it is essential to compare the results with reference ranges that are suitable for the species. Few studies have been conducted on the hematological and biochemical characteristics of Tayassu tajacu, particularly for animals raised in the Amazon biome. The objectives of this study were to evaluate the influence of age and gender on the hematological and biochemical profiles of captive T. tajacu, and to establish reference intervals for these parameters. Complete blood counts and biochemical analyses were performed using manual methods and semi-automatic equipment, respectively. There were significant differences in relation to age in hematocrit and hemoglobin levels, and mean cell volumes, in captive T. tajacu. No basophils were observed, and the neutrophil:lymphocyte ratio was less than 1. Levels of total protein, urea, phosphorus, and alkaline phosphatase were significantly affected by age (P < 0.05). Gender did not affect any of the results. The hematological and biochemical parameters for this species were determined, and may be used as reference ranges for captive T. tajacu.

  6. 10 CFR Appendix J to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...'s true energy consumption characteristics as to provide materially inaccurate comparative data... clothes washers should be totally representative of the design, construction, and control system that will...

  7. 10 CFR Appendix J to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...'s true energy consumption characteristics as to provide materially inaccurate comparative data... clothes washers should be totally representative of the design, construction, and control system that will...

  8. 10 CFR Appendix J to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...'s true energy consumption characteristics as to provide materially inaccurate comparative data... clothes washers should be totally representative of the design, construction, and control system that will...

  9. Automatic measurements and computations for radiochemical analyses

    USGS Publications Warehouse

    Rosholt, J.N.; Dooley, J.R.

    1960-01-01

    In natural radioactive sources the most important radioactive daughter products useful for geochemical studies are protactinium-231, the alpha-emitting thorium isotopes, and the radium isotopes. To resolve the abundances of these thorium and radium isotopes by their characteristic decay and growth patterns, a large number of repeated alpha activity measurements on the two chemically separated elements were made over extended periods of time. Alpha scintillation counting with automatic measurements and sample changing is used to obtain the basic count data. Generation of the required theoretical decay and growth functions, varying with time, and the least squares solution of the overdetermined simultaneous count rate equations are done with a digital computer. Examples of the complex count rate equations which may be solved and results of a natural sample containing four α-emitting isotopes of thorium are illustrated. These methods facilitate the determination of the radioactive sources on the large scale required for many geochemical investigations.
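
    The least-squares solution of an overdetermined system of count-rate equations can be illustrated with a synthetic two-component decay; the decay constants and initial activities below are illustrative, not the actual thorium/radium values:

```python
import numpy as np

# Counts measured at several times are modeled as a sum of known decay
# curves with unknown initial activities: c(t) = a1*exp(-l1*t) + a2*exp(-l2*t).
# Repeated measurements give more equations than unknowns, solved by
# least squares as in the paper's computer reduction of count data.
l1, l2 = 0.1, 0.01                   # decay constants (1/h), illustrative
t = np.linspace(0.0, 48.0, 12)       # twelve measurement times (h)
A = np.column_stack([np.exp(-l1 * t), np.exp(-l2 * t)])

a_true = np.array([500.0, 120.0])    # "unknown" initial count rates
counts = A @ a_true                  # noiseless synthetic measurements

# Least-squares estimate recovers the activities from the count data
a_hat, *_ = np.linalg.lstsq(A, counts, rcond=None)
```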

  10. To do it or to let an automatic tool do it? The priority of control over effort.

    PubMed

    Osiurak, François; Wagner, Clara; Djerbi, Sara; Navarro, Jordan

    2013-01-01

    The aim of the present study is to provide experimental data relevant to the issue of what leads humans to use automatic tools. Two answers can be offered. The first is that humans strive to minimize physical and/or cognitive effort (principle of least effort). The second is that humans tend to keep their perceived control over the environment (principle of more control). These two factors certainly play a role, but the question raised here is to what do people give priority in situations wherein both manual and automatic actions take the same time - minimizing effort or keeping perceived control? To answer that question, we built four experiments in which participants were confronted with a recurring choice between performing a task manually (physical effort) or in a semi-automatic way (cognitive effort) versus using an automatic tool that completes the task for them (no effort). In this latter condition, participants were required to follow the progression of the automatic tool step by step. Our results showed that participants favored the manual or semi-automatic condition over the automatic condition. However, when they were offered the opportunity to perform recreational tasks in parallel, the shift toward manual condition disappeared. The findings give support to the idea that people give priority to keeping control over minimizing effort.

  11. Automatic three-dimensional tracking of particles with high-numerical-aperture digital lensless holographic microscopy.

    PubMed

    Restrepo, John F; Garcia-Sucerquia, Jorge

    2012-02-15

    We present an automatic procedure for 3D tracking of micrometer-sized particles with high-NA digital lensless holographic microscopy. The method uses a two-feature approach to search for the best focal planes and to distinguish particles from artifacts or other elements on the reconstructed stream of the holograms. A set of reconstructed images is axially projected onto a single image. From the projected image, the centers of mass of all the reconstructed elements are identified. Starting from the centers of mass, the morphology of the profile of the maximum intensity along the reconstruction direction allows particles to be distinguished from other elements. The method is tested with modeled holograms and applied to automatically track micrometer-sized bubbles in a sample of 4 mm3 of soda.
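
    The projection-and-centroid steps can be sketched directly. This is a minimal 2D z-stack analogy of the procedure, not the authors' implementation; the threshold and data layout are assumptions:

```python
def axial_projection(stack):
    """Maximum-intensity projection of a z-stack onto a single image."""
    return [[max(stack[z][r][c] for z in range(len(stack)))
             for c in range(len(stack[0][0]))]
            for r in range(len(stack[0]))]

def center_of_mass(image, threshold):
    """Intensity-weighted centroid (row, col) of pixels above `threshold`."""
    m = sx = sy = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v >= threshold:
                m += v
                sy += v * r
                sx += v * c
    return (sy / m, sx / m)

# Two reconstruction planes, each with one bright element
stack = [
    [[0, 0, 0], [0, 9, 0], [0, 0, 0]],   # plane z=0
    [[0, 0, 5], [0, 0, 0], [0, 0, 0]],   # plane z=1
]
proj = axial_projection(stack)           # both elements appear in one image
```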

  12. Thin Film Interference: An Experiment with Microwaves and Paraffin Oil

    ERIC Educational Resources Information Center

    D'Anna, Michele; Corridoni, Tommaso

    2015-01-01

    Thin film interference manifests itself in a wide range of visually pleasing situations in everyday life (in the colored effects caused by a drop of oil on water, in soap bubbles, etc.) and is also involved in important technical applications (semi-reflecting mirrors, anti-reflection lenses, etc.). Yet, despite its familiarity, high school…

  13. Relative acoustic frequency response of induced methane, carbon dioxide and air gas bubble plumes, observed laterally.

    PubMed

    Kubilius, Rokas; Pedersen, Geir

    2016-10-01

    There is an increased need to detect, identify, and monitor natural and manmade seabed gas leaks. Fisheries echosounders are well suited to monitoring large volumes of water, and acoustic frequency response [normalized acoustic backscatter, when a measure at one selected frequency is used as the denominator, r(f)] is commonly used to identify echoes from fish and zooplankton species. Information on gas plume r(f) would be valuable for automatic detection of subsea leaks and for separating bubble plumes from natural targets such as swimbladder-bearing fish. Controlled leaks were produced with a specially designed instrument frame suspended in mid-water in a sheltered fjord. The frame was equipped with echosounders, a stereo-camera, and gas-release nozzles. The r(f) of laterally observed methane, carbon dioxide, and air plumes (0.040-29 l/min) were measured at 70, 120, 200, and 333 kHz, with bubble sizes determined optically. The observed bubble size range (1-25 mm) was comparable to that reported in the literature for natural cold seeps of methane. A decrease in r(f) with increasing frequency was observed, namely, r(f) of about 0.7, 0.6, and 0.5 at 120, 200, and 333 kHz when normalized to 70 kHz. Measured plume r(f) is also compared to resolved, single-bubble target-strength-based, and modeled r(f).
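    As a small illustration of the r(f) definition used above (backscatter at each frequency divided by backscatter at one reference frequency), assuming linear-domain backscatter values; the function name and input values are ours:

```python
# Frequency response r(f): backscatter at each frequency normalized by the
# backscatter at a chosen reference frequency (70 kHz here, as in the study).
def frequency_response(backscatter, ref_freq=70):
    """backscatter: dict mapping frequency in kHz -> mean volume backscatter
    (linear units, not dB). Returns r(f) normalized to ref_freq."""
    ref = backscatter[ref_freq]
    return {f: sv / ref for f, sv in backscatter.items()}

# Hypothetical linear backscatter values chosen to reproduce the plume
# trend reported in the abstract (r ~ 0.7, 0.6, 0.5 relative to 70 kHz):
r = frequency_response({70: 2.0, 120: 1.4, 200: 1.2, 333: 1.0})
```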

  14. Noise reduction by the application of an air-bubble curtain in offshore pile driving

    NASA Astrophysics Data System (ADS)

    Tsouvalas, A.; Metrikine, A. V.

    2016-06-01

    Underwater noise pollution is a by-product of marine industrial operations. In particular, the noise generated when a foundation pile is driven into the soil with an impact hammer is considered to be harmful for aquatic species. In an attempt to reduce the ecological footprint, several noise mitigation techniques have been investigated. Among the various solutions proposed, the air-bubble curtain is often applied due to its efficacy in noise reduction. In this paper, a model is proposed for the investigation of the sound reduction during marine piling when an air-bubble curtain is placed around the pile. The model consists of the pile, the surrounding water and soil media, and the air-bubble curtain, which is positioned at a certain distance from the pile surface. The solution approach is semi-analytical and is based on the dynamic sub-structuring technique and the modal decomposition method. Two main results of the paper can be distinguished. First, a new model is proposed that can be used for predictions of the noise levels in a computationally efficient manner. Second, an analysis is presented of the principal mechanisms that are responsible for the noise reduction due to the application of the air-bubble curtain in marine piling. Understanding these mechanisms turns out to be crucial for exploiting the maximum efficiency of the system. It is shown that the principal mechanism of noise reduction depends strongly on the frequency content of the radiated sound and the characteristics of the bubbly medium. For piles of large diameter which radiate most of the acoustic energy at relatively low frequencies, the noise reduction is mainly attributed to the mismatch of the acoustic impedances between the seawater and the bubbly layer. 
On the contrary, for smaller piles and when the radiated acoustic energy is concentrated at frequencies close to, or higher than, the resonance frequency of the air bubbles, the sound absorption within the bubbly layer becomes critical.

  15. Validation of a semi-automatic protocol for the assessment of the tear meniscus central area based on open-source software

    NASA Astrophysics Data System (ADS)

    Pena-Verdeal, Hugo; Garcia-Resua, Carlos; Yebra-Pimentel, Eva; Giraldez, Maria J.

    2017-08-01

    Purpose: Different lower tear meniscus parameters can be clinically assessed in dry eye diagnosis. The aim of this study was to propose and analyse the variability of a semi-automatic method for measuring the lower tear meniscus central area (TMCA) using open-source software. Material and methods: In a group of 105 subjects, one video of the lower tear meniscus after fluorescein instillation was recorded by a digital camera attached to a slit-lamp. A short light beam (3x5 mm) with moderate illumination in the central portion of the meniscus (6 o'clock) was used. Images were extracted from each video by a masked observer. Using open-source software based on Java (NIH ImageJ), a further observer measured, in a masked and randomized order, the TMCA in the area illuminated by the short light beam with two methods: (1) a manual method, where the TMCA was measured by hand; (2) a semi-automatic method, where TMCA images were converted to 8-bit binary images, holes inside the resulting shape were filled, and the area of the isolated shape was obtained. Finally, both measurements, manual and semi-automatic, were compared. Results: A paired t-test showed no statistical difference between the results of the two techniques (p = 0.102). Pearson correlation between the techniques showed a significant, near-perfect positive correlation (r = 0.99; p < 0.001). Conclusions: This study presents a useful tool to objectively measure the frontal central area of the meniscus in photographs using free open-source software.
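    The semi-automatic pipeline of method (2) (binarize, fill holes, measure area) can be sketched as below; this is a rough plain-Python stand-in for the ImageJ steps, with the threshold and pixel calibration as placeholders:

```python
# Binarize the fluorescein image, fill holes enclosed by the meniscus shape,
# and convert the pixel count to a physical area via a calibration factor.
from collections import deque

def measure_tmca(image, threshold, pixel_area_mm2):
    rows, cols = len(image), len(image[0])
    binary = [[1 if image[r][c] >= threshold else 0 for c in range(cols)]
              for r in range(rows)]
    # Flood-fill background from the image border; any zero pixel NOT reached
    # from the border is a hole inside the shape and gets filled.
    outside = [[False] * cols for _ in range(rows)]
    queue = deque((r, c) for r in range(rows) for c in range(cols)
                  if (r in (0, rows - 1) or c in (0, cols - 1))
                  and binary[r][c] == 0)
    for r, c in queue:
        outside[r][c] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and binary[nr][nc] == 0 and not outside[nr][nc]):
                outside[nr][nc] = True
                queue.append((nr, nc))
    filled = sum(1 for r in range(rows) for c in range(cols)
                 if binary[r][c] == 1 or not outside[r][c])
    return filled * pixel_area_mm2
```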

  16. Semi-automatic mapping of cultural heritage from airborne laser scanning using deep learning

    NASA Astrophysics Data System (ADS)

    Due Trier, Øivind; Salberg, Arnt-Børre; Holger Pilø, Lars; Tonning, Christer; Marius Johansen, Hans; Aarsten, Dagrun

    2016-04-01

    This paper proposes to use deep learning to improve semi-automatic mapping of cultural heritage from airborne laser scanning (ALS) data. Automatic detection methods, based on traditional pattern recognition, have been applied in a number of cultural heritage mapping projects in Norway for the past five years. Automatic detection of pits and heaps has been combined with visual interpretation of the ALS data for the mapping of deer hunting systems, iron production sites, grave mounds and charcoal kilns. However, the performance of the automatic detection methods varies substantially between ALS datasets. For the mapping of deer hunting systems on flat gravel and sand sediment deposits, the automatic detection results were almost perfect. However, some false detections appeared in the terrain outside of the sediment deposits. These could be explained by other pit-like landscape features, like parts of river courses, spaces between boulders, and modern terrain modifications. However, these were easy to spot during visual interpretation, and the number of missed individual pitfall traps was still low. For the mapping of grave mounds, the automatic method produced a large number of false detections, reducing the usefulness of the semi-automatic approach. The mound structure is a very common natural terrain feature, and the grave mounds are less distinct in shape than the pitfall traps. Still, applying automatic mound detection to an entire municipality did lead to a new discovery of an Iron Age grave field with more than 15 individual mounds. Automatic mound detection also proved to be useful for a detailed re-mapping of Norway's largest Iron Age graveyard, which contains almost 1000 individual graves. Combined pit and mound detection has been applied to the mapping of more than 1000 charcoal kilns that were used by an ironworks 350-200 years ago. The majority of charcoal kilns were indirectly detected as either pits on the circumference, a central mound, or both. 
However, kilns with a flat interior and a shallow ditch along the circumference were often missed by the automatic detection method. The success of automatic detection seems to depend on two factors: (1) the density of ALS ground hits on the cultural heritage structures being sought, and (2) to what extent these structures stand out from natural terrain structures. The first factor may, to some extent, be improved by using a higher number of ALS pulses per square meter. The second factor is difficult to change, and also highlights another challenge: how to make a general automatic method that is applicable in all types of terrain within a country. The mixed experience with traditional pattern recognition for semi-automatic mapping of cultural heritage led us to consider deep learning as an alternative approach. The main principle is that a general feature detector has been trained on a large image database. The feature detector is then tailored to a specific task by using a modest number of images of true and false examples of the features being sought. Results of using deep learning are compared with previous results using traditional pattern recognition.

  17. Control of treatment size in cavitation-enhanced high-intensity focused ultrasound using radio-frequency echo signals

    NASA Astrophysics Data System (ADS)

    Tomiyasu, Kentaro; Takagi, Ryo; Iwasaki, Ryosuke; Yoshizawa, Shin; Umemura, Shin-ichiro

    2017-07-01

    In high-intensity focused ultrasound (HIFU) treatment, controlling the ultrasound dose at each focal target spot is important because the length of the coagulated region in front of the focal point varies owing to differences in absorption at each focal target spot and attenuation in the intervening tissues. In this study, changes detected in the power spectra of HIFU echoes were used to control the HIFU duration in the “trigger HIFU” sequence, with the aim of increasing coagulation size through the enhancement of ultrasonic heating by the cavitation induced by a preceding extremely high-intensity short “trigger” pulse. The results show that this method can detect boiling bubbles and the subsequently generated cavitation bubbles at an early stage. By automatically stopping HIFU exposure immediately after detecting the bubbles, overheating was prevented and the deviation of the length of the coagulated region was reduced.

  18. Proposal of a taste evaluating method of the sponge cake by using 3D range sensor

    NASA Astrophysics Data System (ADS)

    Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko

    2002-10-01

    Nowadays, image processing techniques are being applied to the food industry in many situations. Most of this research addresses quality control in plants, and there are hardly any cases of measuring 'taste'. We are developing a system for measuring deliciousness using image sensing. In this paper, we propose a method for estimating the deliciousness of a sponge cake. In food science, a sponge cake is considered more delicious when the bubbles on its surface are small and numerous. We propose a method for automatically detecting bubbles on the cut surface of a sponge cake using 3-D image processing. The deliciousness is then estimated from statistical information about the detected bubbles, based on food science.

  19. Passenger Flow Estimation and Characteristics Expansion

    DOT National Transportation Integrated Search

    2016-04-01

    Mark R. McCord (ORCID ID 0000-0002-6293-3143) The objectives of this study are to investigate the estimation of bus passenger boarding-to-alighting (B2A) flows using Automatic Passenger Count (APC) and Automatic Fare Collection (AFC) (fare-box) data ...

  20. Semi-automated CCTV surveillance: the effects of system confidence, system accuracy and task complexity on operator vigilance, reliance and workload.

    PubMed

    Dadashi, N; Stedmon, A W; Pridmore, T P

    2013-09-01

    Recent advances in computer vision technology have led to the development of various automatic surveillance systems; however, their effectiveness is adversely affected by many factors and they are not completely reliable. This study investigated the potential of a semi-automated surveillance system to reduce CCTV operator workload in both detection and tracking activities. A further focus of interest was the degree of user reliance on the automated system. A simulated prototype was developed which mimicked an automated system that provided different levels of system confidence information. Dependent variable measures were taken for secondary task performance, reliance and subjective workload. When the automatic component of a semi-automatic CCTV surveillance system provided reliable system confidence information to operators, workload significantly decreased and spare mental capacity significantly increased. Providing feedback about system confidence and accuracy appears to be one important way of making the status of the automated component of the surveillance system more 'visible' to users and hence more effective to use. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Automatic detection and counting of cattle in UAV imagery based on machine vision technology (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Rahnemoonfar, Maryam; Foster, Jamie; Starek, Michael J.

    2017-05-01

    Beef production is the main agricultural industry in Texas, and livestock are managed in pasture and rangeland which are usually huge in size and not easily accessible by vehicles. The current method for livestock location identification and counting is visual observation, which is very time consuming and costly. For animals on large tracts of land, manned aircraft may be necessary to count animals, which is noisy, disturbs the animals, and may introduce a source of error in counts. Such manual approaches are expensive, slow and labor intensive. In this paper we study the combination of small unmanned aerial vehicles (sUAVs) and machine vision technology as a valuable solution to manual animal surveying. A fixed-wing UAV fitted with GPS and a digital RGB camera for photogrammetry was flown at the Welder Wildlife Foundation in Sinton, TX. Over 600 acres were flown in four UAS flights, and individual photographs were used to develop orthomosaic imagery. To detect animals in UAV imagery, a fully automatic technique was developed based on spatial and spectral characteristics of objects. This automatic technique can even detect small animals that are partially occluded by bushes. Experimental results in comparison to ground truth show the effectiveness of our algorithm.

  2. Degree counting and Shadow system for Toda system of rank two: One bubbling

    NASA Astrophysics Data System (ADS)

    Lee, Youngae; Lin, Chang-Shou; Wei, Juncheng; Yang, Wen

    2018-04-01

    We initiate the program for computing the Leray-Schauder topological degree for Toda systems of rank two. This program still contains a lot of challenging problems for analysts. As the first step, we prove that if a sequence of solutions (u_1k, u_2k) blows up, then one of h_j e^{u_jk} / ∫_M h_j e^{u_jk} dv_g, j = 1, 2, tends to a sum of Dirac measures. This is the so-called phenomenon of weak concentration. Our purposes in this article are (i) to introduce the shadow system due to the bubbling phenomena when one of the parameters ρ_i crosses 4π and ρ_j ∉ 4πN, where 1 ≤ i ≠ j ≤ 2; (ii) to show how to calculate the topological degree of Toda systems by computing the topological degree of the general shadow systems; (iii) to calculate the topological degree of the shadow system for one-point blow-up. We believe that the degree counting formula for the shadow system would be useful in other problems.

  3. Automatic, semi-automatic and manual validation of urban drainage data.

    PubMed

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step in detecting wrong, anomalous data is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation scores generation and scores interpretation. This paper will present the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the scores interpretation, needs to be further investigated on the developed system.
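    A toy sketch of the "validation scores generation" step; the specific checks and limits below are illustrative, not those applied to the Belgrade data:

```python
# Each sample gets a score per validation method; here a physical-range
# check and a rate-of-change (spike) check. A low combined score flags a
# suspect point for removal and interpolation.
def range_score(value, low, high):
    return 1.0 if low <= value <= high else 0.0

def spike_score(prev, value, max_step):
    return 1.0 if prev is None or abs(value - prev) <= max_step else 0.0

def validate_series(series, low, high, max_step):
    """Return per-sample combined scores in [0, 1] for a flow/level series."""
    scores, prev = [], None
    for v in series:
        s = min(range_score(v, low, high), spike_score(prev, v, max_step))
        scores.append(s)
        prev = v
    return scores
```

    A real system would combine many such method scores, which is exactly the "scores interpretation" step the abstract flags as needing further work.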

  4. Semi-automatic brain tumor segmentation by constrained MRFs using structural trajectories.

    PubMed

    Zhao, Liang; Wu, Wei; Corso, Jason J

    2013-01-01

    Quantifying volume and growth of a brain tumor is a primary prognostic measure and hence has received much attention in the medical imaging community. Most methods have sought a fully automatic segmentation, but the variability in shape and appearance of brain tumors has limited their success and further adoption in the clinic. In reaction, we present a semi-automatic brain tumor segmentation framework for multi-channel magnetic resonance (MR) images. This framework does not require prior model construction and only requires manual labels on one automatically selected slice. All other slices are labeled by an iterative multi-label Markov random field optimization with hard constraints. Structural trajectories (the medical image analog to optical flow) and 3D image over-segmentation are used to capture pixel correspondences between consecutive slices for pixel labeling. We show robustness and effectiveness through an evaluation on the 2012 MICCAI BRATS Challenge Dataset; our results indicate superior performance to baselines and demonstrate the utility of the constrained MRF formulation.

  5. Automatic sequential fluid handling with multilayer microfluidic sample isolated pumping

    PubMed Central

    Liu, Jixiao; Fu, Hai; Yang, Tianhang; Li, Songjing

    2015-01-01

    To sequentially handle fluids is of great significance in quantitative biology, analytical chemistry, and bioassays. However, the technological options are limited when building such microfluidic sequential processing systems, and one of the encountered challenges is the need for reliable, efficient, and mass-production available microfluidic pumping methods. Herein, we present a bubble-free and pumping-control unified liquid handling method that is compatible with large-scale manufacture, termed multilayer microfluidic sample isolated pumping (mμSIP). The core part of the mμSIP is the selective permeable membrane that isolates the fluidic layer from the pneumatic layer. The air diffusion from the fluidic channel network into the degassing pneumatic channel network leads to fluidic channel pressure variation, which further results in consistent bubble-free liquid pumping into the channels and the dead-end chambers. We characterize the mμSIP by comparing the fluidic actuation processes with different parameters and a flow rate range of 0.013 μl/s to 0.097 μl/s is observed in the experiments. As the proof of concept, we demonstrate an automatic sequential fluid handling system aiming at digital assays and immunoassays, which further proves the unified pumping-control and suggests that the mμSIP is suitable for functional microfluidic assays with minimal operations. We believe that the mμSIP technology and demonstrated automatic sequential fluid handling system would enrich the microfluidic toolbox and benefit further inventions. PMID:26487904

  6. The structure of separated flow regions occurring near the leading edge of airfoils including transition

    NASA Technical Reports Server (NTRS)

    Mueller, T. J.

    1986-01-01

    A semi-empirical method for predicting separation bubble characteristics was evaluated using low Reynolds number test data. On the basis of these data, several observations were made. First, a sizable growth in the momentum thickness can occur in the laminar portion of a separation bubble. This is in direct contrast to the theory and is apparently due to low Reynolds number effects. Secondly, the transition Reynolds number (R_l1), which governs the extent of a bubble's laminar region, was found to be much lower than that used in the method. At present, there does not seem to be any evidence supporting a single value for R_l1. Apparently, R_l1 is affected by the freestream disturbance environment, the airfoil's pressure distribution, and possibly the chord Reynolds number as well. Thirdly, the growth in momentum thickness over a bubble's turbulent region was predicted reasonably well by the method, provided that Roberts' suggested value for the mean dissipation coefficient was used. Finally, the present data do not substantiate the universality of the velocity profile at reattachment. However, measurement error may be responsible for this result.

  7. High-content image analysis (HCIA) assay has the highest correlation with direct counting cell suspension compared to the ATP, WST-8 and Alamar blue assays for measurement of cytotoxicity.

    PubMed

    Tahara, Haruna; Matsuda, Shun; Yamamoto, Yusuke; Yoshizawa, Hiroe; Fujita, Masaharu; Katsuoka, Yasuhiro; Kasahara, Toshihiko

    2017-11-01

    Various cytotoxicity assays measuring indicators such as enzyme activity, dye uptake, or cellular ATP content are often performed using 96-well microplates. However, recent reports show that cytotoxicity assays such as the ATP assay and MTS assay underestimate cytotoxicity when compounds such as anti-cancer drugs or mutagens induce cell hypertrophy whilst increasing intracellular ATP content. Therefore, we attempted to evaluate the reliability of a high-content image analysis (HCIA) assay to count cell number in a 96-well microplate automatically without using a cell-number indicator. We compared cytotoxicity results of 25 compounds obtained from ATP, WST-8, Alamar blue, and HCIA assays with those directly measured using an automatic cell counter, repeating individual experiments thrice. The number of compounds showing low correlation in cell viability measured using cytotoxicity assays compared to automatic cell counting (r^2 < 0.8, in at least 2 of 3 experiments) was as follows: ATP assay, 7; WST-8 assay, 2; Alamar blue assay, 3; HCIA cytotoxicity assay, 0. Compounds for which correlation was poor in the three assays other than the HCIA assay induced an increase in nuclear and cell size. However, correlation between cell viability measured by the automatic cell counter and the HCIA assay was strong regardless of nuclear and cell size. Additionally, correlation coefficients between IC50 values obtained from the automatic cell counter and from cytotoxicity assays were as follows: ATP assay, 0.80; WST-8 assay, 0.84; Alamar blue assay, 0.84; and HCIA assay, 0.98. From the above, we showed that the HCIA cytotoxicity assay produces similar data to the automatic cell counter and is highly accurate in measuring cytotoxicity. Copyright © 2017 Elsevier Inc. All rights reserved.
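    The agreement criterion used above (r^2 between an assay's viability values and direct cell counts, flagging r^2 < 0.8) reduces to a plain Pearson correlation; this is a generic sketch with made-up example values, not the authors' analysis code:

```python
# Pearson correlation coefficient between two paired measurement series.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical assay viabilities vs. direct counts (% of control);
# r^2 < 0.8 would flag the assay as poorly correlated with direct counting.
r_squared = pearson_r([100, 80, 55, 30], [98, 82, 50, 33]) ** 2
```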

  8. Development of Semi-Automatic Lathe by using Intelligent Soft Computing Technique

    NASA Astrophysics Data System (ADS)

    Sakthi, S.; Niresh, J.; Vignesh, K.; Anand Raj, G.

    2018-03-01

    This paper discusses the enhancement of a conventional lathe machine to a semi-automated lathe machine by implementing a soft computing method. In the present scenario, the lathe machine plays a vital role in the engineering division of the manufacturing industry. While manual lathe machines are economical, their accuracy and efficiency are not up to the mark. On the other hand, CNC machines provide the desired accuracy and efficiency, but require a huge capital outlay. In order to overcome this situation, a semi-automated approach to the conventional lathe machine is developed by employing stepper motors on the horizontal and vertical drives, controlled by an Arduino UNO microcontroller. Based on the input parameters of the lathe operation, the Arduino code is generated and transferred to the UNO board. Thus, upgrading from manual to semi-automatic lathe machines can significantly increase accuracy and efficiency while, at the same time, keeping a check on investment cost, and consequently provide a much-needed boost to the manufacturing industry.

  9. Advanced readout methods for superheated emulsion detectors

    NASA Astrophysics Data System (ADS)

    d'Errico, F.; Di Fulvio, A.

    2018-05-01

    Superheated emulsions develop visible vapor bubbles when exposed to ionizing radiation. They consist of droplets of a metastable liquid, emulsified in an inert matrix. The formation of a bubble cavity is accompanied by sound waves. Evaporated bubbles also exhibit a lower refractive index compared to the inert gel matrix. These two physical phenomena have been exploited to count the number of evaporated bubbles and thus measure the interacting radiation flux. Systems based on piezoelectric transducers have been traditionally used to acquire the acoustic (pressure) signals generated by bubble evaporation. Such systems can operate at ambient noise levels exceeding 100 dB; however, they are affected by a significant dead time (>10 ms). An optical readout technique relying on the scattering of light by neutron-induced bubbles has been recently improved in order to minimize measurement dead time and ambient noise sensitivity. Beams of infra-red light from light-emitting diode (LED) sources cross the active area of the detector and are deflected by evaporated bubbles. The scattered light correlates with bubble density. Planar photodiodes are affixed along the detector length in optimized positions, allowing the detection of scattered light from the bubbles and minimizing the detection of direct light from the LEDs. A low-noise signal-conditioning stage has been designed and realized to amplify the current induced in the photodiodes by scattered light and to subtract the background signal due to intrinsic scattering within the detector matrix. The proposed amplification architecture maximizes the measurement signal-to-noise ratio, yielding a readout uncertainty of 6% (±1 SD), with 1000 evaporated bubbles in a detector active volume of 150 ml (6 cm detector diameter). In this work, we prove that the intensity of scattered light also relates to the bubble size, which can be controlled by applying an external pressure to the detector emulsion. 
This effect can be exploited during the readout procedure to minimize shadowing effects between bubbles, which becomes severe when the bubbles number in the several thousands. The detector we used in this work is based on superheated C-318 (octafluorocyclobutane), emulsified in 100 μm ± 10% (1 SD) diameter drops in an inert matrix of approximately 150 ml. The detector was operated at room temperature and ambient pressure.

  10. Robust extraction of the aorta and pulmonary artery from 3D MDCT image data

    NASA Astrophysics Data System (ADS)

    Taeprasartsit, Pinyo; Higgins, William E.

    2010-03-01

    Accurate definition of the aorta and pulmonary artery from three-dimensional (3D) multi-detector CT (MDCT) images is important for pulmonary applications. This work presents robust methods for defining the aorta and pulmonary artery in the central chest. The methods work on both contrast-enhanced and no-contrast 3D MDCT image data. The automatic methods use a common approach employing model fitting and selection and adaptive refinement. In the occasional event that more precise vascular extraction is desired or the method fails, we also provide an alternate semi-automatic fail-safe method. The semi-automatic method extracts the vasculature by extending the medial axes in a user-guided direction. A ground-truth study over a series of 40 human 3D MDCT images demonstrates the efficacy, accuracy, robustness, and efficiency of the methods.

  11. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis.

    PubMed

    Schäfer, Sebastian; Nylund, Kim; Sævik, Fredrik; Engjom, Trond; Mézl, Martin; Jiřík, Radovan; Dimcevski, Georg; Gilja, Odd Helge; Tönnies, Klaus

    2015-08-01

    This paper presents a system for correcting motion influences in time-dependent 2D contrast-enhanced ultrasound (CEUS) images to assess tissue perfusion characteristics. The system consists of a semi-automatic frame selection method to find images with out-of-plane motion as well as a method for automatic motion compensation. Translational and non-rigid motion compensation is applied by introducing a temporal continuity assumption. A study consisting of 40 clinical datasets was conducted to compare the perfusion with simulated perfusion using pharmacokinetic modeling. Overall, the proposed approach decreased the mean average difference between the measured perfusion and the pharmacokinetic model estimation. It was non-inferior to a manual approach for three out of four patient cohorts and reduced the analysis time by 41% compared to manual processing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Semi-automated identification of cones in the human retina using circle Hough transform

    PubMed Central

    Bukowska, Danuta M.; Chew, Avenell L.; Huynh, Emily; Kashani, Irwin; Wan, Sue Ling; Wan, Pak Ming; Chen, Fred K

    2015-01-01

    A large number of human retinal diseases are characterized by a progressive loss of cones, the photoreceptors critical for visual acuity and color perception. Adaptive Optics (AO) imaging presents a potential method to study these cells in vivo. However, AO imaging in ophthalmology is a relatively new phenomenon and quantitative analysis of these images remains difficult and tedious using manual methods. This paper illustrates a novel semi-automated quantitative technique enabling registration of AO images to macular landmarks, cone counting and its radius quantification at specified distances from the foveal center. The new cone counting approach employs the circle Hough transform (cHT) and is compared to automated counting methods, as well as arbitrated manual cone identification. We explore the impact of varying the circle detection parameter on the validity of cHT cone counting and discuss the potential role of using this algorithm in detecting both cones and rods separately. PMID:26713186
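    The circle Hough transform (cHT) at the core of the counting approach can be illustrated compactly; the parameters and helper names below are ours, not those of the published algorithm:

```python
# Circle Hough transform for a single known radius: every edge pixel votes
# for all possible centers lying at distance `radius` from it; peaks in the
# vote accumulator are candidate circle (cone) centers.
import math

def circle_hough(edge_points, radius, shape, n_angles=90):
    """edge_points: iterable of (row, col); returns a 2D vote accumulator."""
    rows, cols = shape
    acc = [[0] * cols for _ in range(rows)]
    for r, c in edge_points:
        for k in range(n_angles):
            theta = 2 * math.pi * k / n_angles
            cr = int(round(r - radius * math.sin(theta)))
            cc = int(round(c - radius * math.cos(theta)))
            if 0 <= cr < rows and 0 <= cc < cols:
                acc[cr][cc] += 1
    return acc

def top_center(acc):
    """Position of the strongest accumulator peak."""
    return max(((acc[r][c], (r, c)) for r in range(len(acc))
                for c in range(len(acc[0]))))[1]
```

    In practice (as the paper discusses) the radius is itself a detection parameter swept over a plausible range of cone sizes, and many local peaks, not just the global one, are accepted as cones.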

  13. Discrete-vortex simulation of pulsating flow on a turbulent leading-edge separation bubble

    NASA Technical Reports Server (NTRS)

    Sung, Hyung Jin; Rhim, Jae Wook; Kiya, Masaru

    1992-01-01

    Studies are made of the turbulent separation bubble in a two-dimensional semi-infinite blunt plate aligned to a uniform free stream with a pulsating component. The discrete-vortex method is applied to simulate this flow situation because this approach is effective for representing the unsteady motions of the turbulent shear layer and the effect of viscosity near the solid surface. The numerical simulation provides reasonable predictions when compared with the experimental results. A particular frequency with a minimum reattachment is related to the drag reduction. The most effective frequency is dependent on the amplified shedding frequency. The turbulent flow structure is scrutinized. This includes the time-mean and fluctuations of the velocity and the surface pressure, together with correlations between the fluctuating components. A comparison between the pulsating flow and the non-pulsating flow at the particular frequency of the minimum reattachment length of the separation bubble suggests that the large-scale vortical structure is associated with the shedding frequency and the flow instabilities.

  14. Comparison and assessment of semi-automatic image segmentation in computed tomography scans for image-guided kidney surgery.

    PubMed

    Glisson, Courtenay L; Altamar, Hernan O; Herrell, S Duke; Clark, Peter; Galloway, Robert L

    2011-11-01

    Image segmentation is integral to implementing intraoperative guidance for kidney tumor resection. Results seen in computed tomography (CT) data are affected by target organ physiology as well as by the segmentation algorithm used. This work studies variables involved in using level set methods found in the Insight Toolkit to segment kidneys from CT scans and applies the results to an image guidance setting. A composite algorithm drawing on the strengths of multiple level set approaches was built using the Insight Toolkit. This algorithm requires image contrast state and seed points to be identified as input, and functions independently thereafter, selecting and altering method and variable choice as needed. Semi-automatic results were compared to expert hand segmentation results directly and by the use of the resultant surfaces for registration of intraoperative data. Direct comparison using the Dice metric showed average agreement of 0.93 between semi-automatic and hand segmentation results. Use of the segmented surfaces in closest point registration of intraoperative laser range scan data yielded average closest point distances of approximately 1 mm. Application of both inverse registration transforms from the previous step to all hand segmented image space points revealed that the distance variability introduced by registering to the semi-automatically segmented surface versus the hand segmented surface was typically less than 3 mm both near the tumor target and at distal points, including subsurface points. Use of the algorithm shortened user interaction time and provided results which were comparable to the gold standard of hand segmentation. Further, the use of the algorithm's resultant surfaces in image registration provided comparable transformations to surfaces produced by hand segmentation. These data support the applicability and utility of such an algorithm as part of an image guidance workflow.
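    The Dice metric used for the direct comparison above is straightforward to compute from two binary segmentation masks; a minimal sketch (the toy masks are invented for illustration):

```python
import numpy as np

def dice(a, b):
    """Dice similarity of two binary masks: 2|A∩B| / (|A| + |B|)."""
    a = np.asarray(a, bool)
    b = np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

hand = np.zeros((10, 10), bool); hand[2:8, 2:8] = True   # 36 px "expert" mask
auto = np.zeros((10, 10), bool); auto[3:8, 2:8] = True   # 30 px "algorithm" mask
print(round(dice(hand, auto), 3))  # 2*30 / (36+30) = 0.909
```

    A Dice value of 1.0 means perfect overlap; the 0.93 reported in the study indicates close agreement between the semi-automatic and hand segmentations.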

  15. Self-calibrating models for dynamic monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1996-01-01

    A method for automatically building qualitative and semi-quantitative models of dynamic systems, and using them for monitoring and fault diagnosis, is developed and demonstrated. The qualitative approach and semi-quantitative method are applied to monitoring observation streams, and to design of non-linear control systems.

  16. Very Portable Remote Automatic Weather Stations

    Treesearch

    John R. Warren

    1987-01-01

    Remote Automatic Weather Stations (RAWS) were introduced to Forest Service and Bureau of Land Management field units in 1978 following development, test, and evaluation activities conducted jointly by the two agencies. The original configuration was designed for semi-permanent installation. Subsequently, a need for a more portable RAWS was expressed, and one was...

  17. Research in Automatic Russian-English Scientific and Technical Lexicography. Final Report.

    ERIC Educational Resources Information Center

    Wayne State Univ., Detroit, MI.

    Techniques of reversing English-Russian scientific and technical dictionaries into Russian-English versions through semi-automated compilation are described. Sections on manual and automatic processing discuss pre- and post-editing, the task program, updater (correction of errors and revision by specialist in a given field), the system employed…

  18. Label free cell-tracking and division detection based on 2D time-lapse images for lineage analysis of early embryo development.

    PubMed

    Cicconet, Marcelo; Gutwein, Michelle; Gunsalus, Kristin C; Geiger, Davi

    2014-08-01

    In this paper we report a database and a series of techniques related to the problem of tracking cells, and detecting their divisions, in time-lapse movies of mammalian embryos. Our contributions are (1) a method for counting embryos in a well, and cropping each individual embryo across frames, to create individual movies for cell tracking; (2) a semi-automated method for cell tracking that works up to the 8-cell stage, along with a software implementation available to the public (this software was used to build the reported database); (3) an algorithm for automatic tracking up to the 4-cell stage, based on histograms of mirror symmetry coefficients captured using wavelets; (4) a cell-tracking database containing 100 annotated examples of mammalian embryos up to the 8-cell stage; and (5) statistical analysis of various timing distributions obtained from those examples. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Effects of Image Compression on Automatic Count of Immunohistochemically Stained Nuclei in Digital Images

    PubMed Central

    López, Carlos; Lejeune, Marylène; Escrivà, Patricia; Bosch, Ramón; Salvadó, Maria Teresa; Pons, Lluis E.; Baucells, Jordi; Cugat, Xavier; Álvaro, Tomás; Jaén, Joaquín

    2008-01-01

    This study investigates the effects of digital image compression on automatic quantification of immunohistochemical nuclear markers. We examined 188 images with a previously validated computer-assisted analysis system. A first group was composed of 47 images captured in TIFF format, and the other three groups contained the same images converted from TIFF to JPEG format with 3×, 23× and 46× compression. Counts from the TIFF images were compared with those from the other three groups. Overall, differences in the counts increased with the degree of compression. Low-complexity images (≤100 cells/field, without clusters or with small-area clusters) had small differences (<5 cells/field in 95–100% of cases) and high-complexity images showed substantial differences (<35–50 cells/field in 95–100% of cases). Compression does not compromise the accuracy of immunohistochemical nuclear marker counts obtained by computer-assisted analysis systems for digital images with low complexity and could be an efficient method for storing these images. PMID:18755997

  20. Automated Analysis of Human Sperm Number and Concentration (Oligospermia) Using Otsu Threshold Method and Labelling

    NASA Astrophysics Data System (ADS)

    Susrama, I. G.; Purnama, K. E.; Purnomo, M. H.

    2016-01-01

    Oligospermia is a male fertility issue defined as a low sperm concentration in the ejaculate. Normal sperm concentration is 20-120 million/ml, while oligospermia patients have a sperm concentration below 20 million/ml. Sperm tests to determine oligospermia are done in the fertility laboratory by checking fresh sperm according to the WHO 2010 standards [9]. The sperm are viewed under a microscope using an improved Neubauer counting chamber and counted manually. To automate this count, this research developed a system called Automated Analysis of Sperm Concentration Counters (A2SC2), which analyses and counts the sperm concentration using Otsu threshold segmentation and morphological operations. The data used were fresh sperm samples from 10 people, analysed directly in the laboratory. Test results using the A2SC2 method showed an accuracy of 91%. Thus, in this study, A2SC2 can be used to calculate the number and concentration of sperm automatically.
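    Otsu thresholding, the segmentation step named above, picks the gray level that maximizes the between-class variance of the image histogram; objects can then be counted in the resulting binary mask. A self-contained sketch (the synthetic image and all parameters are invented for illustration, not the A2SC2 implementation):

```python
import numpy as np

def otsu_threshold(img):
    """Return the gray level maximizing between-class variance (Otsu)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()       # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        m0 = (np.arange(t) * p[:t]).sum() / w0  # class means
        m1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2          # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# bimodal toy image: dark background (~30) with a bright patch (~200)
rng = np.random.default_rng(0)
img = rng.normal(30, 5, (64, 64))
img[10:14, 10:14] = rng.normal(200, 5, (4, 4))
img = np.clip(img, 0, 255).astype(np.uint8)
t = otsu_threshold(img)
mask = img >= t   # foreground mask; connected components would then be counted
```

    In a counting pipeline, morphological opening would next remove speckle and a connected-component labelling step would yield the object count.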

  1. Bubble Entropy: An Entropy Almost Free of Parameters.

    PubMed

    Manis, George; Aktaruzzaman, Md; Sassi, Roberto

    2017-11-01

    Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count instead the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least, widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
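    The core step, replacing each embedded vector by the number of swaps bubble sort needs to order it, can be sketched as follows. This is a simplified illustration that takes the Shannon entropy of the swap-count distribution at a single embedding dimension; the published definition additionally normalizes across dimensions m and m+1, which this sketch omits:

```python
import math
from collections import Counter

def swap_count(vec):
    """Number of swaps bubble sort performs to order `vec` ascending."""
    v, swaps = list(vec), 0
    for i in range(len(v) - 1):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def swap_distribution_entropy(series, m):
    """Entropy of the swap-count distribution over all length-m vectors."""
    counts = Counter(swap_count(series[i:i + m])
                     for i in range(len(series) - m + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log(c / n) for c in counts.values())

x = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
print(swap_count([3, 1, 2]))            # 2 swaps: 312 -> 132 -> 123
print(swap_distribution_entropy(x, 3))
```

    Note how the amplitude scale of the series never enters: only the ordering of values within each vector matters, which is why the scale factor r disappears from the definition.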

  2. A Semi-Automatic Alignment Method for Math Educational Standards Using the MP (Materialization Pattern) Model

    ERIC Educational Resources Information Center

    Choi, Namyoun

    2010-01-01

    Educational standards alignment, which matches similar or equivalent concepts of educational standards, is a necessary task for educational resource discovery and retrieval. Automated or semi-automated alignment systems for educational standards have been recently available. However, existing systems frequently result in inconsistency in…

  3. Semi-automated identification of leopard frogs

    USGS Publications Warehouse

    Petrovska-Delacrétaz, Dijana; Edwards, Aaron; Chiasson, John; Chollet, Gérard; Pilliod, David S.

    2014-01-01

    Principal component analysis is used to implement a semi-automatic recognition system to identify recaptured northern leopard frogs (Lithobates pipiens). Results of both open set and closed set experiments are given. The presented algorithm is shown to provide accurate identification of 209 individual leopard frogs from a total set of 1386 images.

  4. SABRE--A Novel Software Tool for Bibliographic Post-Processing.

    ERIC Educational Resources Information Center

    Burge, Cecil D.

    1989-01-01

    Describes the software architecture and application of SABRE (Semi-Automated Bibliographic Environment), which is one of the first products to provide a semi-automatic environment for relevancy ranking of citations obtained from searches of bibliographic databases. Features designed to meet the review, categorization, culling, and reporting needs…

  5. Automatic choroid cells segmentation and counting in fluorescence microscopic image

    NASA Astrophysics Data System (ADS)

    Fei, Jianjun; Zhu, Weifang; Shi, Fei; Xiang, Dehui; Lin, Xiao; Yang, Lei; Chen, Xinjian

    2016-03-01

    In this paper, we propose a method to automatically segment and count rhesus choroid-retinal vascular endothelial cells (RF/6A) in fluorescence microscopic images, based on shape classification, bottleneck detection and an accelerated Dijkstra algorithm. The proposed method includes four main steps. First, a thresholding filter and morphological operations are applied to reduce the noise. Second, a shape classifier is used to decide whether a connected component needs to be segmented; in this step, an AdaBoost classifier is applied with a set of shape features. Third, bottleneck positions are found based on the contours of the connected components. Finally, cell segmentation and counting are completed using the accelerated Dijkstra algorithm with the gradient information between the bottleneck positions. The results show the feasibility and efficiency of the proposed method.
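    The splitting step relies on shortest paths between bottleneck points. The Dijkstra core can be sketched in generic form on a toy adjacency list (this is plain Dijkstra, not the paper's accelerated variant, and the weights here stand in for its gradient-based costs):

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path costs from `src` in a weighted graph given as
    {node: [(neighbour, weight), ...]}."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# toy pixel-adjacency graph; weights play the role of gradient cost
g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2), ("d", 5)],
     "c": [("d", 1)], "d": []}
print(dijkstra(g, "a"))  # {'a': 0.0, 'b': 1.0, 'c': 3.0, 'd': 4.0}
```

    In the segmentation setting the path of least gradient cost between two bottleneck points becomes the cut line that splits touching cells.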

  6. Semi-automatic 3D lung nodule segmentation in CT using dynamic programming

    NASA Astrophysics Data System (ADS)

    Sargent, Dustin; Park, Sun Young

    2017-02-01

    We present a method for semi-automatic segmentation of lung nodules in chest CT that can be extended to general lesion segmentation in multiple modalities. Most semi-automatic algorithms for lesion segmentation or similar tasks use region-growing or edge-based contour finding methods such as level-set. However, lung nodules and other lesions are often connected to surrounding tissues, which makes these algorithms prone to growing the nodule boundary into the surrounding tissue. To solve this problem, we apply a 3D extension of the 2D edge linking method with dynamic programming to find a closed surface in a spherical representation of the nodule ROI. The algorithm requires a user to draw a maximal diameter across the nodule in the slice in which the nodule cross section is the largest. We report the lesion volume estimation accuracy of our algorithm on the FDA lung phantom dataset, and the RECIST diameter estimation accuracy on the lung nodule dataset from the SPIE 2016 lung nodule classification challenge. The phantom results in particular demonstrate that our algorithm has the potential to mitigate the disparity in measurements performed by different radiologists on the same lesions, which could improve the accuracy of disease progression tracking.
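    In 2D, the edge-linking idea can be sketched as dynamic programming over a polar resampling of the nodule ROI: each angle column picks one radius, adjacent columns may differ by at most one radius step, and the contour must close on itself. A toy version (the cost matrix and continuity constraint are invented for illustration, not the paper's 3D spherical formulation):

```python
import numpy as np

def closed_polar_contour(cost):
    """Minimal-cost closed contour in a polar cost image.

    cost[r, t]: cost of placing the boundary at radius r, angle t.
    DP over angles with |delta r| <= 1 continuity; the contour closes
    by requiring the same radius at the first and last angle.
    """
    R, T = cost.shape
    best = None
    for r0 in range(R):                       # fix the starting radius
        dp = np.full(R, np.inf)
        dp[r0] = cost[r0, 0]
        for t in range(1, T):
            nxt = np.full(R, np.inf)
            for r in range(R):
                lo, hi = max(0, r - 1), min(R, r + 2)
                nxt[r] = dp[lo:hi].min() + cost[r, t]
            dp = nxt
        total = dp[r0]                        # close the loop at r0
        if best is None or total < best:
            best = total
    return best

cost = np.ones((6, 8))
cost[3, :] = 0.0                              # a clean ring at radius 3
print(closed_polar_contour(cost))  # 0.0
```

    Because the optimum is taken over whole contours rather than grown from a seed region, the boundary cannot leak into attached tissue the way region-growing can.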

  7. Automatic Detection of Pitching and Throwing Events in Baseball With Inertial Measurement Sensors.

    PubMed

    Murray, Nick B; Black, Georgia M; Whiteley, Rod J; Gahan, Peter; Cole, Michael H; Utting, Andy; Gabbett, Tim J

    2017-04-01

    Throwing loads are known to be closely related to injury risk. However, for logistic reasons, typically only pitchers have their throws counted, and then only during innings. Accordingly, all other throws made are not counted, so estimates of throws made by players may be inaccurately recorded and underreported. A potential solution to this is the use of wearable microtechnology to automatically detect, quantify, and report pitch counts in baseball. This study investigated the accuracy of detection of baseball pitching and throwing in both practice and competition using a commercially available wearable microtechnology unit. Seventeen elite youth baseball players (mean ± SD age 16.5 ± 0.8 y, height 184.1 ± 5.5 cm, mass 78.3 ± 7.7 kg) participated in this study. Participants performed pitching, fielding, and throwing during practice and competition while wearing a microtechnology unit. Sensitivity and specificity of a pitching and throwing algorithm were determined by comparing automatic measures (ie, microtechnology unit) with direct measures (ie, manually recorded pitching counts). The pitching and throwing algorithm was sensitive during both practice (100%) and competition (100%). Specificity was poorer during both practice (79.8%) and competition (74.4%). These findings demonstrate that the microtechnology unit is sensitive to detect pitching and throwing events, but further development of the pitching algorithm is required to accurately and consistently quantify throwing loads using microtechnology.
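    Sensitivity and specificity here follow the usual confusion-matrix definitions. A quick sketch with invented counts chosen to mirror the reported practice rates (the study itself reports only the percentages, not raw counts):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts: every true throw detected, some non-throws flagged
sens, spec = sensitivity_specificity(tp=120, fn=0, tn=355, fp=90)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```

    Perfect sensitivity with lower specificity, as reported, means the unit misses no throws but over-counts by flagging some non-throwing movements.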

  8. The Neutral Islands during the Late Epoch of Reionization

    NASA Astrophysics Data System (ADS)

    Xu, Yidong; Yue, Bin; Chen, Xuelei

    2018-05-01

    The large-scale structure of the ionization field during the epoch of reionization (EoR) can be modeled by excursion set theory. While the growth of ionized regions during the early stage is described by the "bubble model", the shrinking of neutral regions after percolation of the ionized region calls for an "island model". An excursion-set-based analytical model and a semi-numerical code (islandFAST) have been developed. The ionizing background and the bubbles inside the islands are also included in the treatment. With two kinds of absorbers of ionizing photons, i.e. the large-scale under-dense neutral islands and the small-scale over-dense clumps, the ionizing background is self-consistently evolved in the model.

  9. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    USGS Publications Warehouse

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.

  10. Intentional Subitizing: Exploring the Role of Automaticity in Enumeration

    ERIC Educational Resources Information Center

    Pincham, Hannah L.; Szucs, Denes

    2012-01-01

    Subitizing is traditionally described as the rapid, preattentive and automatic enumeration of up to four items. Counting, by contrast, describes the enumeration of larger sets of items and requires slower serial shifts of attention. Although recent research has called into question the preattentive nature of subitizing, whether or not numerosities…

  11. Flow visualization study of close-coupled canard wing and strake wing configuration

    NASA Technical Reports Server (NTRS)

    Miner, D. D.; Gloss, B. B.

    1975-01-01

    The Langley 1/8-scale V/STOL model tunnel was used to qualitatively determine the flow fields associated with semi-span close-coupled canard wing and strake wing models. Small helium-filled bubbles were injected upstream of the models to make the flow visible. Photographs were taken over the angle-of-attack range of -10 deg to 40 deg.

  12. 10 CFR Appendix J2 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... hardness or less) using 27.0 grams + 4.0 grams per pound of cloth load of AHAM Standard detergent Formula 3... repellent finishes, such as fluoropolymer stain resistant finishes shall not be applied to the test cloth...

  13. The Use of Opto-Electronics in Viscometry.

    ERIC Educational Resources Information Center

    Mazza, R. J.; Washbourn, D. H.

    1982-01-01

    Describes a semi-automatic viscometer which incorporates a microprocessor system and uses optoelectronics to detect flow of liquid through the capillary, flow time being displayed on a timer with accuracy of 0.01 second. The system could be made fully automatic with an additional microprocessor circuit and inclusion of a pump. (Author/JN)

  14. Integration of wireless sensor networks into automatic irrigation scheduling of a center pivot

    USDA-ARS?s Scientific Manuscript database

    A six-span center pivot system was used as a platform for testing two wireless sensor networks (WSN) of infrared thermometers. The cropped field was a semi-circle, divided into six pie shaped sections of which three were irrigated manually and three were irrigated automatically based on the time tem...

  15. Variability in circulating gas emboli after a same scuba diving exposure.

    PubMed

    Papadopoulou, V; Germonpré, P; Cosgrove, D; Eckersley, R J; Dayton, P A; Obeid, G; Boutros, A; Tang, M-X; Theunissen, S; Balestra, C

    2018-06-01

    A reduction in ambient pressure or decompression from scuba diving can result in ultrasound-detectable venous gas emboli (VGE). These environmental exposures carry a risk of decompression sickness (DCS) which is mitigated by adherence to decompression schedules; however, bubbles are routinely observed for dives well within these limits and significant inter-personal variability in DCS risk exists. Here, we assess the variability and evolution of VGE for 2 h post-dive using echocardiography, following a standardized pool dive in calm warm conditions. 14 divers performed either one or two (with a 24 h interval) standardized scuba dives to 33 mfw (400 kPa) for 20 min of immersion time at NEMO 33 in Brussels, Belgium. Measurements were performed at 21, 56, 91 and 126 min post-dive: bubbles were counted for all 68 echocardiography recordings and the average over ten consecutive cardiac cycles taken as the bubble score. Significant inter-personal variability was demonstrated despite all divers following the same protocol in controlled pool conditions: in the detection or not of VGE, in the peak VGE score, as well as time to VGE peak. In addition, intra-personal differences in 2/3 of the consecutive day dives were seen (lower VGE counts or faster clearance). Since VGE evolution post-dive varies between people, more work is clearly needed to isolate contributing factors. In this respect, going toward a more continuous evaluation, or developing new means to detect decompression stress markers, may offer the ability to better assess dynamic correlations to other physiological parameters.

  16. Optoelectronic imaging of speckle using image processing method

    NASA Astrophysics Data System (ADS)

    Wang, Jinjiang; Wang, Pengfei

    2018-01-01

    A detailed image-processing treatment of laser speckle interferometry is proposed as an example for a postgraduate course. Several image processing methods are used together in the optoelectronic imaging system: partial differential equations (PDEs) reduce the effect of noise; thresholding segmentation is likewise based on the heat equation with PDEs; the central line is extracted from the image skeleton, with branches removed automatically; the phase level is calculated by spline interpolation; and the fringe phase is unwrapped. Finally, the image processing method was used to automatically measure a bubble in rubber under negative pressure, which could be used in tire inspection.
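    The PDE denoising step mentioned above amounts to evolving the image under the heat equation, u_t = Δu. A minimal explicit-Euler sketch (the step sizes and synthetic image are invented for the example):

```python
import numpy as np

def heat_smooth(u, steps=20, dt=0.2):
    """Isotropic diffusion (heat equation) via explicit Euler steps.

    dt <= 0.25 keeps the 2D scheme stable; borders are replicated,
    which conserves the image mean exactly.
    """
    u = u.astype(float).copy()
    for _ in range(steps):
        p = np.pad(u, 1, mode="edge")
        lap = (p[:-2, 1:-1] + p[2:, 1:-1] +
               p[1:-1, :-2] + p[1:-1, 2:] - 4 * u)  # 5-point Laplacian
        u += dt * lap
    return u

rng = np.random.default_rng(1)
noisy = 100 + 10 * rng.standard_normal((32, 32))
smooth = heat_smooth(noisy)
print(noisy.std(), smooth.std())  # variance drops, mean is preserved
```

    Each step averages every pixel toward its neighbours, which suppresses uncorrelated noise faster than it blurs the broad fringe structure the later phase steps need.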

  17. A semi-automatic annotation tool for cooking video

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  18. Counting and RAN: Predictors of Arithmetic Calculation and Reading Fluency

    ERIC Educational Resources Information Center

    Koponen, Tuire; Salmi, Paula; Eklund, Kenneth; Aro, Tuija

    2013-01-01

    This study examined whether counting and rapid automatized naming (RAN) could operate as significant predictors of both later arithmetic calculation and reading fluency. The authors also took an important step to clarify the cognitive mechanisms underlying these predictive relationships by controlling for the effect of phonological awareness and…

  19. Laboratory and semi-field evaluations of two (transfluthrin) spatial repellent devices against Aedes aegypti (L.) (Diptera: Culicidae).

    PubMed

    McPhatter, Lee P; Mischler, Paula D; Webb, Meiling Z; Chauhan, Kamal; Lindroth, Erica J; Richardson, Alec G; Debboun, Mustapha

    2017-01-01

    Two transfluthrin-based spatial repellent products (Raid Dual Action Insect Repellent and Home Freshener and Raid Shield (currently not commercially available), SC Johnson, Racine WI) were evaluated for spatial repellent effects against female Aedes aegypti (L.) mosquitoes under laboratory (wind tunnel) and semi-field (outdoor enclosure) conditions. The placement of either product in the wind tunnel significantly reduced host-seeking behaviors. The mean baseline (control) landing counts for the Raid Dual Action and Raid Shield were reduced by 95% and 74% respectively. Mean probing counts for the Raid Dual Action were reduced by 95%, while the probing counts for the Raid Shield were decreased by 69%. Baseline blood-feeding success was significantly reduced for both treatments: Raid Dual Action (100%) and Raid Shield (96%). Semi-field evaluations were conducted in outdoor enclosures at the Navy Entomology Center of Excellence, Jacksonville, Florida. A moderate reduction in mosquito entry into military style tents resulted when either product was placed near the tent opening. The Raid Shield reduced mosquito entry into tents by 88%, while the Dual Action decreased entry by 66%.

  20. Counting the number of Feynman graphs in QCD

    NASA Astrophysics Data System (ADS)

    Kaneko, T.

    2018-05-01

    Information about the number of Feynman graphs for a given physical process in a given field theory is especially useful for confirming the result of a Feynman graph generator used in an automatic system for perturbative calculations. A method of counting the number of Feynman graphs weighted by their symmetry factors was established based on zero-dimensional field theory, and was used in scalar theories and QED. In this article the method is generalized to more complicated models by direct calculation of generating functions on a computer algebra system. The method is applied to QCD with and without counter terms, where many higher-order terms are calculated automatically.
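    The zero-dimensional trick can be illustrated for vacuum graphs in scalar phi^4 theory: the Gaussian moment <phi^(4n)> = (4n-1)!! counts all Wick contractions, so the g^n coefficient of Z = <exp(g phi^4/4!)> equals the sum of symmetry-factor weights of the order-n vacuum graphs. A small exact-arithmetic sketch (this reproduces the standard textbook computation, not the article's computer-algebra implementation):

```python
from fractions import Fraction

def double_factorial(n):
    """n!! for odd n: number of pairings of n+1 objects."""
    out = 1
    while n > 1:
        out *= n
        n -= 2
    return out

def weighted_vacuum_graphs(order):
    """Sum of symmetry-factor weights of phi^4 vacuum graphs at a given
    order in the coupling: coefficient of g^n is (4n-1)!! / (4!^n n!).
    """
    n = order
    n_fact = 1
    for k in range(2, n + 1):
        n_fact *= k
    return Fraction(double_factorial(4 * n - 1), 24 ** n * n_fact)

print(weighted_vacuum_graphs(1))  # 1/8: the "figure-eight" graph
print(weighted_vacuum_graphs(2))  # 35/384
```

    Comparing such weighted counts against a graph generator's output over several orders is exactly the kind of consistency check the abstract describes.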

  1. Acoustic mapping of shallow water gas releases using shipborne multibeam systems

    NASA Astrophysics Data System (ADS)

    Urban, Peter; Köser, Kevin; Weiß, Tim; Greinert, Jens

    2015-04-01

    Water column imaging (WCI) shipborne multibeam systems are effective tools for investigating marine free gas (bubble) release. Like single- and split-beam systems they are very sensitive to gas bubbles in the water column, and they have the advantage of a wide swath opening angle of 120° or more, allowing better mapping and possible 3D investigation of targets in the water column. On the downside, WCI data are degraded by specific noise from side-lobe effects and are usually not calibrated for target backscattering strength analysis. Most approaches so far have concentrated on manual inspection of bubbles in the water column data. Such inspection allows the detection of bubble streams (flares) and gives an impression of the strength of detected flares and of the gas release. Because of the subjective character of these inspections, it is difficult to know how well an area has been covered by a flare mapping survey, and subjective impressions of flare strength can easily be fooled by the many acoustic artifacts multibeam systems create. Here we present a semi-automated approach that uses the behavior of bubble streams in varying water currents to detect and map their exact source positions. The focus of the method is the application of objective rules for flare detection, which makes it possible to extract information about the quality of the seepage mapping survey, perform automated noise reduction, and create acoustic maps with quality discriminators indicating how well an area has been mapped.

  2. A conceptual study of automatic and semi-automatic quality assurance techniques for round image processing

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This report summarizes the results of a study conducted by Engineering and Economics Research (EER), Inc. under NASA Contract Number NAS5-27513. The study involved the development of preliminary concepts for automatic and semiautomatic quality assurance (QA) techniques for ground image processing. A distinction is made between quality assessment and the more comprehensive quality assurance which includes decision making and system feedback control in response to quality assessment.

  3. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmenting 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e., >80%, when auto-enhancement is applied for live-wire segmentation.

  4. Reticulocyte analysis using flow cytometry.

    PubMed

    Corberand, J X

    1996-12-01

Automation of the reticulocyte count by means of flow cytometry has considerably improved the quality of this investigation. This article deals firstly with the reasons for the poor performance of the microscopic technique and with the physiological principles underlying identification and classification of reticulocytes using RNA labeling. It then outlines the automated methods currently on the market, which can be classified in three categories: a) "general-purpose" cytofluorometers, which in clinical laboratories usually handle lymphocyte immunophenotyping; b) the only commercially available cytofluorometer dedicated to the reticulocyte count; this instrument has the advantage of requiring no human intervention, as it merely needs to be fed with samples; c) hematology analyzers with specific modules for automatic counting of reticulocytes previously incubated with a non-fluorescent dye. Of the various fluorescent markers available, thiazole orange, DEQTC iodide and auramine are most often used for this basic hematology test. The quality of the count, the availability of new reticulocyte indices (maturation index, percentage of young reticulocytes) and the rapidity of the count give this test renewed value in the practical approach to the diagnosis of anemia, and also open new perspectives in the surveillance of aplastic anemia after chemotherapy or bone marrow grafting.

  5. The cavitation induced Becquerel effect and the hot spot theory of sonoluminescence.

    PubMed

    Prevenslik, T V

    2003-06-01

Over 150 years ago, Becquerel discovered that ultraviolet illumination of one of a pair of identical electrodes in liquid water produces an electric current, a phenomenon called the Becquerel effect. Recently, a similar effect was observed if the water surrounding one electrode is made to cavitate by focused acoustic radiation, which by similarity is referred to as the cavitation induced Becquerel effect. The current in the cavitation induced Becquerel effect was found to be semi-logarithmic in the standard electrode potential, which is consistent with the oxidation of the electrode surface by the photo-decomposition theory of photoelectrochemistry. But oxidation of the electrode surface usually requires high temperatures, such as those claimed in cavitation. Absent high bubble temperatures, cavitation may produce vacuum ultraviolet (VUV) light that excites water molecules in the electrode film to higher H₂O* energy states, the excited states oxidizing the electrode surface by chemical reaction. Solutions of the Rayleigh-Plesset equation during bubble collapse that include the condensation of water vapor show that any increase in temperature or pressure of the water vapor by compression heating is compensated by the condensation of vapor to the bubble wall, the bubbles collapsing almost isothermally. Hence, the cavitation induced Becquerel effect is likely caused by cavitation induced VUV light at ambient temperature.
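    The near-isothermal collapse argument rests on solutions of the Rayleigh-Plesset equation. Its standard textbook form, for bubble radius R(t), liquid density ρ, viscosity μ, surface tension σ, bubble pressure p_B and far-field pressure p_∞, is:

```latex
\rho \left( R\ddot{R} + \tfrac{3}{2}\dot{R}^{2} \right)
  \;=\; p_B(t) - p_\infty(t) - \frac{2\sigma}{R} - \frac{4\mu\dot{R}}{R}
```

    The cited work augments this balance with a term for condensation of water vapor at the bubble wall, which is what suppresses the compression heating.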

  6. Stationary bubbles and their tunneling channels toward trivial geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Pisin; Yeom, Dong-han; Domènech, Guillem

    2016-04-01

In the path integral approach, one has to sum over all histories that start from the same initial condition in order to obtain the final condition as a superposition of histories. Applying this to black hole dynamics, we consider stable and unstable stationary bubbles as a reasonable and regular initial condition. We find examples where the bubble can either form a black hole or tunnel toward a trivial geometry, i.e., one with neither a singularity nor an event horizon. We investigate the dynamics and tunneling channels of true vacuum bubbles for various tensions. In particular, in line with the idea of superposition of geometries, we build a classically stable stationary thin-shell solution in a Minkowski background whose fate is probabilistically given by non-perturbative effects. Since there exists a tunneling channel toward a trivial geometry in the entire path integral, the entire information is encoded in the wave function. This demonstrates that unitarity is preserved and there is no loss of information when viewed from the entire wave function of the universe, whereas a semi-classical observer, who can see only a definitive geometry, would find an effective loss of information. This may provide a resolution to the information loss dilemma.

  7. Stationary bubbles and their tunneling channels toward trivial geometry

    DOE PAGES

    Chen, Pisin; Domènech, Guillem; Sasaki, Misao; ...

    2016-04-07

In the path integral approach, one has to sum over all histories that start from the same initial condition in order to obtain the final condition as a superposition of histories. Applying this to black hole dynamics, we consider stable and unstable stationary bubbles as a reasonable and regular initial condition. We find examples where the bubble can either form a black hole or tunnel toward a trivial geometry, i.e., one with neither a singularity nor an event horizon. We investigate the dynamics and tunneling channels of true vacuum bubbles for various tensions. In particular, in line with the idea of superposition of geometries, we build a classically stable stationary thin-shell solution in a Minkowski background whose fate is probabilistically given by non-perturbative effects. Since there exists a tunneling channel toward a trivial geometry in the entire path integral, the entire information is encoded in the wave function. This demonstrates that unitarity is preserved and there is no loss of information when viewed from the entire wave function of the universe, whereas a semi-classical observer, who can see only a definitive geometry, would find an effective loss of information. Ultimately, this may provide a resolution to the information loss dilemma.

  8. A semi-automatic 2D-to-3D video conversion with adaptive key-frame selection

    NASA Astrophysics Data System (ADS)

    Ju, Kuanyu; Xiong, Hongkai

    2014-11-01

To compensate for the deficit of 3D content, 2D to 3D video conversion (2D-to-3D) has recently attracted more attention from both industrial and academic communities. Semi-automatic 2D-to-3D conversion, which estimates the corresponding depth of non-key-frames through key-frames, is more desirable owing to its advantage of balancing labor cost and 3D effects. The location of key-frames plays a role in the quality of depth propagation. This paper proposes a semi-automatic 2D-to-3D scheme with adaptive key-frame selection to keep temporal continuity more reliable and to reduce the depth propagation errors caused by occlusion. The potential key-frames are localized in terms of clustered color variation and motion intensity. The distance of the key-frame interval is also taken into account to keep the accumulated propagation errors under control and to guarantee minimal user interaction. Once their depth maps are aligned with user interaction, the non-key-frame depth maps are automatically propagated by shifted bilateral filtering. Considering that the depth of objects may change due to object motion or camera zoom in/out effects, a bi-directional depth propagation scheme is adopted in which a non-key frame is interpolated from two adjacent key frames. The experimental results show that the proposed scheme has better performance than an existing 2D-to-3D scheme with a fixed key-frame interval.
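    The bilateral-filter propagation step weights each key-frame depth sample by spatial proximity and color similarity. A minimal single-frame sketch follows (the "shift" by motion compensation is omitted, grayscale intensities stand in for color, and the parameter values are invented):

```python
import math

def propagate_depth(key_color, key_depth, cur_color,
                    sigma_s=2.0, sigma_c=20.0, radius=2):
    """Propagate a key frame's depth to the current frame: each pixel
    averages key-frame depths, weighted by spatial distance (sigma_s)
    and color difference (sigma_c) in a (2*radius+1)^2 window."""
    h, w = len(cur_color), len(cur_color[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ky, kx = y + dy, x + dx
                    if 0 <= ky < h and 0 <= kx < w:
                        ws = math.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                        dc = cur_color[y][x] - key_color[ky][kx]
                        wc = math.exp(-(dc * dc) / (2 * sigma_c ** 2))
                        num += ws * wc * key_depth[ky][kx]
                        den += ws * wc
            out[y][x] = num / den
    return out
```

    The color term keeps depth from bleeding across object boundaries, which is why occlusion and bi-directional interpolation from two key frames matter in the full scheme.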

  9. Comparison of computer systems and ranking criteria for automatic melanoma detection in dermoscopic images.

    PubMed

    Møllersen, Kajsa; Zortea, Maciel; Schopf, Thomas R; Kirchesch, Herbert; Godtliebsen, Fred

    2017-01-01

Melanoma is the deadliest form of skin cancer, and early detection is crucial for patient survival. Computer systems can assist in melanoma detection, but are not widespread in clinical practice. In 2016, an open challenge in the classification of dermoscopic images of skin lesions was announced. A training set of 900 images with corresponding class labels and semi-automatic/manual segmentation masks was released for the challenge. An independent test set of 379 images, of which 75 were of melanomas, was used to rank the participants. This article demonstrates the impact of ranking criteria, segmentation method and classifier, and highlights the clinical perspective. We compare five different measures of diagnostic accuracy by analysing the resulting ranking of the computer systems in the challenge. The choice of performance measure had great impact on the ranking: systems that were ranked among the top three for one measure dropped to the bottom half when the performance measure was changed. Nevus Doctor, a computer system previously developed by the authors, was used to participate in the challenge and to investigate the impact of segmentation and classifier. The diagnostic accuracy when using an automatic versus the semi-automatic/manual segmentation is investigated. The unexpectedly small impact of segmentation method suggests that improving the automatic segmentation's resemblance to semi-automatic/manual segmentation will not improve diagnostic accuracy substantially. A small set of similar classification algorithms is used to investigate the impact of the classifier on diagnostic accuracy. The variability in diagnostic accuracy across classifier algorithms was larger than across segmentation methods, suggesting a focus for future investigations. From a clinical perspective, the misclassification of a melanoma as benign has far greater cost than the misclassification of a benign lesion. For computer systems to have clinical impact, their performance should be ranked by a high-sensitivity measure.
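    The ranking effect described above is easy to reproduce: a system tuned for sensitivity and one tuned for specificity can swap places when ranked by overall accuracy. The confusion counts below are hypothetical, chosen only so each system sees 75 melanomas and 304 benign lesions as in the challenge test set.

```python
def metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    return sens, spec, acc

# Hypothetical systems evaluated on 75 melanomas + 304 benign lesions:
sens_a, spec_a, acc_a = metrics(tp=70, fn=5, tn=210, fp=94)   # high sensitivity
sens_b, spec_b, acc_b = metrics(tp=45, fn=30, tn=290, fp=14)  # high specificity
```

    Ranked by accuracy, system B wins; ranked by sensitivity, system A wins, and sensitivity is the clinically safer criterion since a missed melanoma is far costlier than a false alarm.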

  10. Comparison Of Semi-Automatic And Automatic Slick Detection Algorithms For Jiyeh Power Station Oil Spill, Lebanon

    NASA Astrophysics Data System (ADS)

    Osmanoglu, B.; Ozkan, C.; Sunar, F.

    2013-10-01

After air strikes on July 14 and 15, 2006, the Jiyeh Power Station started leaking oil into the eastern Mediterranean Sea. The power station is located about 30 km south of Beirut, and the slick covered about 170 km of coastline, threatening the neighboring countries Turkey and Cyprus. Because of the ongoing conflict between Israel and Lebanon, cleaning efforts could not start immediately, resulting in 12 000 to 15 000 tons of fuel oil leaking into the sea. In this paper we compare results from automatic and semi-automatic slick detection algorithms. The automatic detection method combines the probabilities calculated for each pixel from each image to obtain a joint probability, minimizing the adverse effects of the atmosphere on oil spill detection. The method can readily utilize X-, C- and L-band data where available. Furthermore, wind and wave speed observations can be used for a more accurate analysis. For this study, we utilize Envisat ASAR ScanSAR data. A probability map is generated based on the radar backscatter, the effect of wind and the dampening value. The semi-automatic algorithm is based on supervised classification. An Artificial Neural Network Multilayer Perceptron (ANN MLP) is used as the classifier, since it is more flexible and efficient than the conventional maximum likelihood classifier for multisource and multi-temporal data. The learning algorithm for the ANN MLP is the Levenberg-Marquardt (LM) algorithm. Training and test data for the supervised classification are composed from textural information derived from the SAR images. This approach is semi-automatic because tuning the classifier parameters and composing the training data require human interaction. We point out the similarities and differences between the two methods and their results, as well as their advantages and disadvantages. Because of the lack of ground truth data, we compare the obtained results to each other, as well as to other published oil slick area assessments.
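    One simple way to fuse per-pixel slick probabilities from several acquisitions is an independence-based Bayesian combination. This is offered as an illustrative assumption, not necessarily the authors' exact fusion rule:

```python
def joint_probability(per_image_probs):
    """Naive-Bayes fusion of independent per-pixel slick probabilities:
    P = prod(p_i) / (prod(p_i) + prod(1 - p_i))."""
    p_slick = p_clean = 1.0
    for p in per_image_probs:
        p_slick *= p
        p_clean *= 1.0 - p
    return p_slick / (p_slick + p_clean)
```

    Two acquisitions that individually give 0.8 and 0.7 reinforce each other to a joint probability above 0.9, which is how multi-image fusion can suppress atmospheric false alarms present in only one image.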

  11. Advanced subgrid-scale modeling for convection-dominated species transport at fluid interfaces with application to mass transfer from rising bubbles

    NASA Astrophysics Data System (ADS)

    Weiner, Andre; Bothe, Dieter

    2017-10-01

This paper presents a novel subgrid-scale (SGS) model for simulating convection-dominated species transport at deformable fluid interfaces. One possible application is the Direct Numerical Simulation (DNS) of mass transfer from rising bubbles. The transport of a dissolving gas along the bubble-liquid interface is determined by two transport phenomena: convection in the streamwise direction and diffusion in the interface-normal direction. The convective transport for technical bubble sizes is several orders of magnitude higher, leading to a thin concentration boundary layer around the bubble. A true DNS, fully resolving both hydrodynamic and mass transfer length scales, results in infeasible computational costs. Our approach is therefore a DNS of the flow field combined with an SGS model to compute the mass transfer between bubble and liquid. An appropriate model function is used to compute the numerical fluxes on all cell faces of an interface cell. This allows the mass transfer to be predicted correctly even if the concentration boundary layer is fully contained in a single cell layer around the interface. We show that the SGS model reduces the resolution requirements at the interface by a factor of ten and more. The integral flux correction is also applicable to other thin boundary layer problems. Two flow regimes are investigated to validate the model. A semi-analytical solution for creeping flow is used to assess local and global mass transfer quantities. For higher Reynolds numbers ranging from Re = 100 to Re = 460 and Péclet numbers between Pe = 10⁴ and Pe = 4·10⁶, we compare the global Sherwood number against correlations from the literature. In terms of accuracy, the predicted mass transfer never deviates more than 4% from the reference values.
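    For orientation on the global quantities involved: a clean spherical bubble at high Péclet number has the classical Higbie penetration-theory Sherwood number Sh = (2/√π)·√Pe. This is a standard reference correlation of the kind such simulations are compared against, not the paper's SGS model itself:

```python
import math

def sherwood_higbie(peclet):
    """Sherwood number for a clean spherical bubble from Higbie
    penetration theory: Sh = (2 / sqrt(pi)) * sqrt(Pe)."""
    return 2.0 / math.sqrt(math.pi) * math.sqrt(peclet)
```

    At Pe = 10⁴ this already gives Sh on the order of 100, i.e. a concentration boundary layer roughly 100 times thinner than the bubble diameter, which is why an under-resolved interface cell needs a subgrid flux model.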

  12. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

Most of the procedures in the neutron activation analysis (NAA) process that has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient, especially the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  13. Digital controller for a Baum folding machine. [providing automatic counting and machine shutoff]

    NASA Technical Reports Server (NTRS)

    Bryant, W. H. (Inventor)

    1974-01-01

A digital controller for controlling the operation of a folding machine enables automatic folding of a desired number of sheets responsive to entry of that number into a selector. The controller includes three decade counter stages for corresponding rows of units, tens and hundreds push buttons. Each stage includes a decimal-to-BCD encoder, a buffer register, and a digital (binary) counter. The BCD representation of the selected count for each digit is loaded into the respective decade down counters. Pulses generated by a sensor and associated circuitry are used to decrease the count in the decade counters. When the content of the decade counter reaches either 0 or 1, a solenoid control valve is actuated, which interrupts operation of the machine. A repeat switch, when actuated, prevents clearing of the buffer registers so that multiple groups of the same number of sheets can be folded without reentering the number into the selector.
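    A software analogue of the controller logic can make the behavior concrete. The class and names below are hypothetical; the actual device is hardware built from decade down counters and a solenoid valve, and this sketch stops at a count of zero (the hardware stops at 0 or 1 depending on mode):

```python
class FolderController:
    """Load a three-digit sheet count from the units/tens/hundreds
    selector, decrement on each sheet-sensor pulse, and stop the
    machine (solenoid valve) when the count runs out."""

    def __init__(self, hundreds, tens, units):
        self.count = 100 * hundreds + 10 * tens + units
        self.running = True

    def pulse(self):
        """One pulse per folded sheet, as generated by the sensor."""
        if self.running:
            self.count -= 1
            if self.count == 0:
                self.running = False  # solenoid valve interrupts the machine
```

    A repeat feature would simply reload `count` from the retained selector digits instead of requiring re-entry, mirroring the buffer registers that the repeat switch protects from clearing.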

  14. p̄p interactions at 2.32 GeV/c

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.K.; Fields, T.; Rhines, D.S.

    1978-01-01

A bubble-chamber experiment based on 304 000 events of p̄p interactions at 2.32 GeV/c is described. The film was automatically scanned and measured by the POLLY II system. Details of the data-analysis methods are given. We report results on cross sections for constrained final states, tests of C invariance, and inclusive pion and ρ⁰ multiplicity parameters for annihilation final states.

  15. SURE reliability analysis: Program and mathematics

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; White, Allan L.

    1988-01-01

The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values, directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  16. [Left ventricular volume determination by first-pass radionuclide angiocardiography using a semi-geometric count-based method].

    PubMed

    Kinoshita, S; Suzuki, T; Yamashita, S; Muramatsu, T; Ide, M; Dohi, Y; Nishimura, K; Miyamae, T; Yamamoto, I

    1992-01-01

A new radionuclide technique for the calculation of left ventricular (LV) volume by the first-pass (FP) method was developed and examined. Using a semi-geometric count-based method, the LV volume can be measured by the following equations: CV = CM/(L/d); V = (CT/CV) × d³ = (CT/CM) × L × d². (V = LV volume, CV = voxel count, CM = the maximum LV count, CT = the total LV count, L = LV depth where the maximum count was obtained, and d = pixel size.) This theorem was applied to FP LV images obtained in the 30-degree right anterior oblique position. Frame-mode acquisition was performed and the LV end-diastolic maximum count and total count were obtained. The maximum LV depth was obtained as the maximum width of the LV on the FP end-diastolic image, using the assumption that the LV cross-section is circular. These values were substituted into the above equation and the LV end-diastolic volume (FP-EDV) was calculated. A routine equilibrium (EQ) study was done, and the end-diastolic maximum count and total count were obtained. The LV maximum depth was measured on the FP end-diastolic frame, as the maximum length of the LV image. Using these values, the EQ-EDV was calculated and the FP-EDV was compared to the EQ-EDV. The correlation coefficient for these two values was r = 0.96 (n = 23, p less than 0.001), and the standard error of the estimated volume was 10 ml. (ABSTRACT TRUNCATED AT 250 WORDS)
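    In code, the count-based volume estimate reduces to a one-line computation: the counts cancel, leaving the geometric factor L·d² in mm³. The numeric values below are invented for illustration:

```python
def lv_volume_mm3(total_count, max_count, depth_mm, pixel_mm):
    """Semi-geometric count-based LV volume: V = (CT / CM) * L * d^2.
    CT, CM are image counts; L is LV depth (mm); d is pixel size (mm)."""
    return (total_count / max_count) * depth_mm * pixel_mm ** 2
```

    For example, CT = 2,000,000, CM = 10,000, L = 80 mm and d = 3 mm give 144,000 mm³, i.e. 144 ml, a plausible end-diastolic volume.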

  17. Unbinding of targeted ultrasound contrast agent microbubbles by secondary acoustic forces.

    PubMed

    Garbin, Valeria; Overvelde, Marlies; Dollet, Benjamin; de Jong, Nico; Lohse, Detlef; Versluis, Michel

    2011-10-07

Targeted molecular imaging with ultrasound contrast agent microbubbles is achieved by incorporating targeting ligands on the bubble coating and allows for specific imaging of tissues affected by diseases. Improved understanding of the interplay between the acoustic forces acting on the bubbles during insonation with ultrasound and other forces (e.g. shear due to blood flow, binding of targeting ligands to receptors on cell membranes) can help improve the efficacy of this technique. This work focuses on the effects of the secondary acoustic radiation force, which causes bubbles to attract each other and may affect the adhesion of targeted bubbles. First, we examine the translational dynamics of ultrasound contrast agent microbubbles in contact with (but not adherent to) a semi-rigid membrane due to the secondary acoustic radiation force. An equation of motion that effectively accounts for the proximity of the membrane is developed, and the predictions of the model are compared with experimental data extracted from optical recordings at 15 million frames per second. A time-averaged model is also proposed and validated. In the second part of the paper, initial results on the translation due to the secondary acoustic radiation force of targeted, adherent bubbles are presented. Adherent bubbles are also found to move due to the secondary acoustic radiation force, and a restoring force is observed that brings them back to their initial positions. For increasing magnitude of the secondary acoustic radiation force, a threshold is reached above which the adhesion of targeted microbubbles is disrupted. This points to the fact that secondary acoustic radiation forces can cause adherent bubbles to detach and alter the spatial distribution of targeted contrast agents bound to tissues during activation with ultrasound. While the details of the rupture of intermolecular bonds remain elusive, this work motivates the use of the secondary acoustic radiation force to measure the strength of adhesion of targeted microbubbles.
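    For reference, the secondary acoustic radiation force between two pulsating bubbles (the secondary Bjerknes force) has the textbook time-averaged form, for bubbles with volumes V₁, V₂ separated by distance d in a liquid of density ρ:

```latex
F_{12} \;=\; -\,\frac{\rho}{4\pi d^{2}}\,
  \left\langle \dot{V}_1\,\dot{V}_2 \right\rangle
```

    With this sign convention the force is attractive for in-phase volume oscillations, which is the regime responsible for pulling neighbouring bubbles together and stressing the ligand-receptor bonds of adherent bubbles.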

  18. A Generalized Eulerian-Lagrangian Analysis, with Application to Liquid Flows with Vapor Bubbles

    NASA Technical Reports Server (NTRS)

    Dejong, Frederik J.; Meyyappan, Meyya

    1993-01-01

Under a NASA MSFC SBIR Phase 2 effort, an analysis has been developed for liquid flows with vapor bubbles such as those in liquid rocket engine components. The analysis is based on a combined Eulerian-Lagrangian technique, in which Eulerian conservation equations are solved for the liquid phase, while Lagrangian equations of motion are integrated in computational coordinates for the vapor phase. The novel aspect of the Lagrangian analysis developed under this effort is that it combines features of the so-called particle distribution approach with those of the so-called particle trajectory approach and can, in fact, be considered a generalization of both of those traditional methods. The result of this generalization is a reduction in CPU time and memory requirements. Particle time step (stability) limitations have been eliminated by semi-implicit integration of the particle equations of motion (and, for certain applications, the particle temperature equation), although practical limitations remain in effect for reasons of accuracy. The analysis has been applied to the simulation of cavitating flow through a single-bladed section of a labyrinth seal. Models for the simulation of bubble formation and growth have been included, as well as models for bubble drag and heat transfer. The results indicate that bubble formation is more or less 'explosive': for a given flow field, the number density of bubble nucleation sites is very sensitive to the vapor properties and the surface tension. The bubble motion, on the other hand, is much less sensitive to the properties, but is affected strongly by the local pressure gradients in the flow field. In situations where either the material properties or the flow field are not known with sufficient accuracy, parametric studies can be carried out rapidly to assess the effect of the important variables. Future work will include application of the analysis to cavitation in inducer flow fields.
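    The stability gain from semi-implicit integration can be seen on the linear drag term alone. Treating dv/dt = (v_l − v_p)/τ implicitly in the particle velocity v_p gives an update that never overshoots the carrier-fluid velocity, however large the time step. This is a sketch of the idea, not the full equation of motion used in the cited analysis:

```python
def step_semi_implicit(v_p, v_l, tau, dt):
    """One semi-implicit step of particle drag dv/dt = (v_l - v_p)/tau.
    Solving v_new = v_p + dt*(v_l - v_new)/tau for v_new gives
    v_new = (v_p + dt*v_l/tau) / (1 + dt/tau), stable for any dt."""
    return (v_p + dt * v_l / tau) / (1.0 + dt / tau)
```

    With an explicit update the step would be unstable for dt > 2τ; the semi-implicit form instead relaxes monotonically toward v_l, which is why the particle time-step limit disappears.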

  19. Dryout and Rewetting in the Pool Boiling Experiment Flown on STS-72 (PBE-2 B) and STS-77 (PBE-2 A)

    NASA Technical Reports Server (NTRS)

    Merte, Herman, Jr.; Lee, Ho Sung; Keller, Robert B.

    1998-01-01

Experiments were conducted in the microgravity of space in which a pool of liquid (R-113), initially at a precisely defined pressure and temperature, is subjected to a step imposed heat flux from a semi-transparent thin-film heater forming part of one wall of the container, such that boiling is initiated and maintained for a defined period of time at a constant pressure level. A total of nine tests were conducted at three levels of heat flux and three levels of subcooling in each of the two space experiments, flown in GAS canisters on STS-72 and STS-77. Three (3) modes of propagation of boiling across the heater surface and subsequent vapor bubble growth were observed, in addition to the two (2) modes observed in the previous microgravity pool boiling space flights on STS-47, -57, and -60. Of particular interest were the extremely dynamic or "explosive" growths, which were determined to be the consequence of the large increase in the liquid-vapor interface area associated with the appearance of a corrugated or rough interface. Predictions of the circumstances for its onset have been carried out; assumptions were necessary regarding the character of disturbances necessary for the instabilities to grow. Also, a new vapor bubble phenomenon was observed in which small vapor bubbles migrated toward a larger bubble, eventually coalescing with this larger bubble. The heat transfer was enhanced approximately 30% as a result of these migrating bubbles, which is believed to be a vapor bubble manifestation of Marangoni convection and/or molecular momentum effects, sometimes referred to as vapor recoil. The circumstances of heat flux and liquid subcooling necessary to produce heater surface dryout for an initially stagnant liquid subjected to an imposed heat flux have been more closely identified.

  20. Automated Mobile System for Accurate Outdoor Tree Crop Enumeration Using an Uncalibrated Camera.

    PubMed

    Nguyen, Thuy Tuong; Slaughter, David C; Hanson, Bradley D; Barber, Andrew; Freitas, Amy; Robles, Daniel; Whelan, Erin

    2015-07-28

    This paper demonstrates an automated computer vision system for outdoor tree crop enumeration in a seedling nursery. The complete system incorporates both hardware components (including an embedded microcontroller, an odometry encoder, and an uncalibrated digital color camera) and software algorithms (including microcontroller algorithms and the proposed algorithm for tree crop enumeration) required to obtain robust performance in a natural outdoor environment. The enumeration system uses a three-step image analysis process based upon: (1) an orthographic plant projection method integrating a perspective transform with automatic parameter estimation; (2) a plant counting method based on projection histograms; and (3) a double-counting avoidance method based on a homography transform. Experimental results demonstrate the ability to count large numbers of plants automatically with no human effort. Results show that, for tree seedlings having a height up to 40 cm and a within-row tree spacing of approximately 10 cm, the algorithms successfully estimated the number of plants with an average accuracy of 95.2% for trees within a single image and 98% for counting of the whole plant population in a large sequence of images.
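    The projection-histogram counting step can be illustrated by counting runs of occupied columns in the column sums of a binary plant mask. The data below are toy values; the actual system additionally applies the orthographic perspective transform and the homography-based double-counting check:

```python
def count_plants(column_hist, threshold=1):
    """Count plants as contiguous runs of columns whose projection
    (column sum of the binary plant mask) reaches the threshold."""
    count, in_run = 0, False
    for v in column_hist:
        if v >= threshold and not in_run:
            count += 1        # a new run of occupied columns begins
            in_run = True
        elif v < threshold:
            in_run = False    # gap between plants
    return count
```

    With roughly 10 cm tree spacing, the gaps between seedlings show up as near-zero bins separating the histogram peaks, so run counting approximates the plant count within a single image.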

  1. Automated Mobile System for Accurate Outdoor Tree Crop Enumeration Using an Uncalibrated Camera

    PubMed Central

    Nguyen, Thuy Tuong; Slaughter, David C.; Hanson, Bradley D.; Barber, Andrew; Freitas, Amy; Robles, Daniel; Whelan, Erin

    2015-01-01

    This paper demonstrates an automated computer vision system for outdoor tree crop enumeration in a seedling nursery. The complete system incorporates both hardware components (including an embedded microcontroller, an odometry encoder, and an uncalibrated digital color camera) and software algorithms (including microcontroller algorithms and the proposed algorithm for tree crop enumeration) required to obtain robust performance in a natural outdoor environment. The enumeration system uses a three-step image analysis process based upon: (1) an orthographic plant projection method integrating a perspective transform with automatic parameter estimation; (2) a plant counting method based on projection histograms; and (3) a double-counting avoidance method based on a homography transform. Experimental results demonstrate the ability to count large numbers of plants automatically with no human effort. Results show that, for tree seedlings having a height up to 40 cm and a within-row tree spacing of approximately 10 cm, the algorithms successfully estimated the number of plants with an average accuracy of 95.2% for trees within a single image and 98% for counting of the whole plant population in a large sequence of images. PMID:26225982

  2. New semi-quantitative 123I-MIBG estimation method compared with scoring system in follow-up of advanced neuroblastoma: utility of total MIBG retention ratio versus scoring method.

    PubMed

    Sano, Yuko; Okuyama, Chio; Iehara, Tomoko; Matsushima, Shigenori; Yamada, Kei; Hosoi, Hajime; Nishimura, Tsunehiko

    2012-07-01

The purpose of this study is to evaluate a new semi-quantitative estimation method using the ¹²³I-MIBG retention ratio to assess response to chemotherapy for advanced neuroblastoma. Thirteen children with advanced neuroblastoma (International Neuroblastoma Risk Group Staging System: stage M) were examined in a total of 51 studies with ¹²³I-MIBG scintigraphy (before and during chemotherapy). We propose a new semi-quantitative method using the MIBG retention ratio (count obtained with the delayed image/count obtained with the early image, with decay correction) to estimate MIBG accumulation. We analyzed the total ¹²³I-MIBG retention ratio (TMRR: total body count obtained with the delayed image/total body count obtained with the early image, with decay correction) and compared it with a scoring method in terms of correlation with tumor markers. TMRR showed significantly higher correlations with urinary catecholamine metabolites before chemotherapy (VMA: r² = 0.45, P < 0.05; HVA: r² = 0.627, P < 0.01) than the MIBG score (VMA: r² = 0.19, P = 0.082; HVA: r² = 0.25, P = 0.137). There were relatively good correlations between serial changes of TMRR and those of urinary catecholamine metabolites (VMA: r² = 0.274, P < 0.001; HVA: r² = 0.448, P < 0.0001) compared with serial changes of the MIBG score and those of the tumor markers (VMA: r² = 0.01, P = 0.537; HVA: r² = 0.084, P = 0.697) during chemotherapy for advanced neuroblastoma. TMRR could be a useful semi-quantitative method for estimating early response to chemotherapy of advanced neuroblastoma because of its high correlation with urinary catecholamine metabolites.
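    TMRR is a ratio of whole-body counts with the delayed acquisition corrected for physical decay. A small sketch using the 13.22 h physical half-life of ¹²³I (the count values in the test are invented):

```python
import math

def tmrr(early_counts, delayed_counts, hours_between, half_life_h=13.22):
    """Total MIBG retention ratio: delayed / early whole-body counts,
    with the delayed counts corrected for 123-I physical decay."""
    decay = math.exp(-math.log(2.0) * hours_between / half_life_h)
    return (delayed_counts / decay) / early_counts
```

    Without the decay correction a 24 h delayed image would understate retention by a factor of roughly 3.5, so the correction dominates the numerator.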

  3. 20 CFR 220.133 - Skill requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... needs little or no judgment to do simple duties that can be learned on the job in a short period of time... claimant can usually learn to do the job in 30 days, and little job training and judgment are needed. The... machines which are automatic or operated by others); or (4) Machine tending. (c) Semi-skilled work. Semi...

  4. 20 CFR 220.133 - Skill requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... needs little or no judgment to do simple duties that can be learned on the job in a short period of time... claimant can usually learn to do the job in 30 days, and little job training and judgment are needed. The... machines which are automatic or operated by others); or (4) Machine tending. (c) Semi-skilled work. Semi...

  5. 20 CFR 220.133 - Skill requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... needs little or no judgment to do simple duties that can be learned on the job in a short period of time... claimant can usually learn to do the job in 30 days, and little job training and judgment are needed. The... machines which are automatic or operated by others); or (4) Machine tending. (c) Semi-skilled work. Semi...

  6. 20 CFR 220.133 - Skill requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... needs little or no judgment to do simple duties that can be learned on the job in a short period of time... claimant can usually learn to do the job in 30 days, and little job training and judgment are needed. The... machines which are automatic or operated by others); or (4) Machine tending. (c) Semi-skilled work. Semi...

  7. 20 CFR 220.133 - Skill requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... needs little or no judgment to do simple duties that can be learned on the job in a short period of time... claimant can usually learn to do the job in 30 days, and little job training and judgment are needed. The... machines which are automatic or operated by others); or (4) Machine tending. (c) Semi-skilled work. Semi...

  8. Extraction of sandy bedforms features through geodesic morphometry

    NASA Astrophysics Data System (ADS)

    Debese, Nathalie; Jacq, Jean-José; Garlan, Thierry

    2016-09-01

    State-of-the-art echosounders reveal fine-scale details of mobile sandy bedforms, which are commonly found on continental shelves. At present, their dynamics are still far from being completely understood. These bedforms are a serious threat to navigation security, anthropic structures and activities, placing emphasis on research breakthroughs. Bedform geometries and their dynamics are closely linked; therefore, one approach is to develop semi-automatic tools aiming at extracting their structural features from bathymetric datasets. Current approaches mimic manual processes or rely on morphological simplification of bedforms. Such 1D and 2D approaches cannot address the wide range of bedform types and complexities. In contrast, this work attempts to follow a 3D global semi-automatic approach based on a bathymetric TIN. The currently extracted primitives are the salient ridge and valley lines of the sand structures, i.e., waves and mega-ripples. The main difficulty is eliminating the ripples that are found to heavily overprint any observations. To this end, an anisotropic filter that is able to discard these structures while still enhancing the wave ridges is proposed. The second part of the work addresses the semi-automatic interactive extraction and 3D augmented display of the main line structures. The proposed protocol also allows geoscientists to interactively insert topological constraints.

  9. Singular value decomposition of received ultrasound signal to separate tissue, blood flow, and cavitation signals

    NASA Astrophysics Data System (ADS)

    Ikeda, Hayato; Nagaoka, Ryo; Lafond, Maxime; Yoshizawa, Shin; Iwasaki, Ryosuke; Maeda, Moe; Umemura, Shin-ichiro; Saijo, Yoshifumi

    2018-07-01

    High-intensity focused ultrasound is a noninvasive treatment applied by externally irradiating ultrasound to the body to coagulate the target tissue thermally. Recently, it has been proposed as a noninvasive treatment for vascular occlusion to replace conventional invasive treatments. Cavitation bubbles generated by the focused ultrasound can accelerate the effect of thermal coagulation. However, the tissues surrounding the target may be damaged by cavitation bubbles generated outside the treatment area. Conventional methods based on Doppler analysis only in the time domain are not suitable for monitoring blood flow in the presence of cavitation. In this study, we proposed a novel filtering method based on the differences in spatiotemporal characteristics, to separate tissue, blood flow, and cavitation by employing singular value decomposition. Signals from cavitation and blood flow were extracted automatically using spatial and temporal covariance matrices.
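The separation step above can be sketched with a generic SVD clutter filter: stack the slow-time frames into a Casorati (time x space) matrix, decompose, and assign the leading singular components to near-stationary tissue, a middle band to flow/cavitation, and the tail to noise. This is a rough sketch of the spatiotemporal idea, not the authors' implementation; the band indices `low_cut`/`high_cut` are hypothetical tuning parameters.

```python
import numpy as np

def svd_clutter_filter(frames, low_cut, high_cut):
    """Split a slow-time stack of ultrasound frames into tissue, flow/
    cavitation, and noise components via singular value decomposition.

    frames: array of shape (n_frames, n_pixels), the Casorati matrix.
    low_cut: number of leading components treated as tissue clutter.
    high_cut: index after which components are treated as noise.
    """
    U, s, Vt = np.linalg.svd(frames, full_matrices=False)

    def band(i0, i1):
        # Reconstruct the partial sum of singular components i0..i1-1
        return (U[:, i0:i1] * s[i0:i1]) @ Vt[i0:i1, :]

    tissue = band(0, low_cut)
    flow = band(low_cut, high_cut)
    noise = band(high_cut, len(s))
    return tissue, flow, noise
```

Because the three bands partition the singular spectrum, they sum back exactly to the input, which makes the filter easy to validate.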

  10. Semi-automatic knee cartilage segmentation

    NASA Astrophysics Data System (ADS)

    Dam, Erik B.; Folkesson, Jenny; Pettersen, Paola C.; Christiansen, Claus

    2006-03-01

    Osteo-Arthritis (OA) is a very common age-related cause of pain and reduced range of motion. A central effect of OA is wear-down of the articular cartilage that otherwise ensures smooth joint motion. Quantification of the cartilage breakdown is central in monitoring disease progression and therefore cartilage segmentation is required. Recent advances allow automatic cartilage segmentation with high accuracy in most cases. However, the automatic methods still fail in some problematic cases. For clinical studies, even if a few failing cases will be averaged out in the overall results, this reduces the mean accuracy and precision and thereby necessitates larger/longer studies. Since the severe OA cases are often most problematic for the automatic methods, there is even a risk that the quantification will introduce a bias in the results. Therefore, interactive inspection and correction of these problematic cases is desirable. For diagnosis on individuals, this is even more crucial since the diagnosis will otherwise simply fail. We introduce and evaluate a semi-automatic cartilage segmentation method combining an automatic pre-segmentation with an interactive step that allows inspection and correction. The automatic step consists of voxel classification based on supervised learning. The interactive step combines a watershed transformation of the original scan with the posterior probability map from the classification step at sub-voxel precision. We evaluate the method for the task of segmenting the tibial cartilage sheet from low-field magnetic resonance imaging (MRI) of knees. The evaluation shows that the combined method allows accurate and highly reproducible correction of the segmentation of even the worst cases in approximately ten minutes of interaction.

  11. Automatic airline baggage counting using 3D image segmentation

    NASA Astrophysics Data System (ADS)

    Yin, Deyu; Gao, Qingji; Luo, Qijun

    2017-06-01

    The baggage number needs to be checked automatically during baggage self-check-in. This paper proposes a fast airline baggage counting method using image segmentation of a height map projected from the scanned 3D point cloud of the baggage. Because there is a height drop at the actual edge of each bag, the edges can be detected with an edge-detection operator. Closed edge chains are then formed by linking the edge lines through morphological processing. Finally, the number of connected regions segmented by the closed chains is taken as the baggage number. A multi-bag experiment performed under different placement modes demonstrates the validity of the method.
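The paper's pipeline closes edge chains morphologically before counting the enclosed regions. A simpler sketch of the same final counting step, assuming a thresholded height map and 4-connected flood fill (the helper below is hypothetical, pure Python):

```python
def count_bags(height_map, min_height):
    """Count separate baggage regions in a height map by labelling
    4-connected components of pixels at or above a height threshold."""
    rows, cols = len(height_map), len(height_map[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if height_map[r][c] >= min_height and not seen[r][c]:
                count += 1  # new region found; flood-fill it
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and not seen[y][x]
                            and height_map[y][x] >= min_height):
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count
```

In the actual method the regions come from closed edge chains rather than a raw threshold, but the region-counting idea is the same.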

  12. In-laboratory development of an automatic track counting system for solid state nuclear track detectors

    NASA Astrophysics Data System (ADS)

    Uzun, Sefa Kemal; Demiröz, Işık; Ulus, İzzet

    2017-01-01

    In this study, an automatic track counting system was developed for solid state nuclear track detectors (SSNTD). First, the specifications of the required hardware components were determined, and accordingly a CCD camera, a microscope and a motorized stage were procured and integrated. The system was completed by developing parametric software in VB.Net. Finally, a set of tests intended for radon activity concentration measurement was applied. According to the test results, the system is suitable for routine radon measurement. If the system parameters are adjusted for another SSNTD application, it could also be used in other SSNTD fields such as neutron dosimetry or heavy-charged-particle detection.

  13. Tool Efficiency Analysis model research in SEMI industry

    NASA Astrophysics Data System (ADS)

    Lei, Ma; Nana, Zhang; Zhongqiu, Zhang

    2018-06-01

    One of the key goals in the SEMI industry is to improve equipment throughput and ensure that production efficiency is maximized. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a Tool Efficiency Analysis (TEA) system model that analyzes tool performance automatically using a finite state machine. The system was successfully applied to fab tools, verifying its effectiveness, and it yielded the parameter values used to measure equipment performance together with suggestions for improvement.
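A finite-state tool model of the kind described might be sketched as below. The state set loosely follows the SEMI E10 equipment-state categories, but the class, its transition rules, and the utilization metric are illustrative assumptions, not the paper's TEA model.

```python
class ToolStateTracker:
    """Minimal finite-state tracker for fab equipment states.

    State names loosely follow SEMI E10 categories; the tracker
    accumulates time per state and reports productive utilization.
    """
    STATES = {"PRODUCTIVE", "STANDBY", "ENGINEERING",
              "SCHEDULED_DOWN", "UNSCHEDULED_DOWN"}

    def __init__(self, state="STANDBY"):
        if state not in self.STATES:
            raise ValueError(state)
        self.state = state
        self.time_in = {s: 0.0 for s in self.STATES}

    def advance(self, hours):
        """Accumulate elapsed time in the current state."""
        self.time_in[self.state] += hours

    def transition(self, new_state):
        """Move to a new state; reject names outside the state set."""
        if new_state not in self.STATES:
            raise ValueError(new_state)
        self.state = new_state

    def utilization(self):
        """Fraction of tracked time spent in PRODUCTIVE."""
        total = sum(self.time_in.values())
        return self.time_in["PRODUCTIVE"] / total if total else 0.0
```

A real TEA system would derive the transitions from equipment events rather than explicit calls, but the bookkeeping is the same.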

  14. Is bacteriostatic saline superior to normal saline as an echocardiographic contrast agent?

    PubMed

    Cardozo, Shaun; Gunasekaran, Prasad; Patel, Hena; McGorisk, Timothy; Toosi, Mehrdad; Faraz, Haroon; Zalawadiya, Sandip; Alesh, Issa; Kottam, Anupama; Afonso, Luis

    2014-12-01

    Objective data on the performance characteristics and physical properties of commercially available saline formulations [normal saline (NS) vs. bacteriostatic normal saline (bNS)] are sparse. This study sought to compare the in vitro physical properties and in vivo characteristics of two commonly employed echocardiographic saline contrast agents in an attempt to assess superiority. Nineteen patients undergoing transesophageal echocardiograms were each administered agitated regular NS and bNS injections in random order and in a blinded manner according to a standardized protocol. Video time-intensity (TI) curves were constructed from a representative region of interest, placed paraseptally within the right atrium, in the bicaval view. TI curves were analyzed for maximal plateau acoustic intensity (Vmax, dB) and dwell time (DT, s), defined as time duration between onset of Vmax and decay of video intensity below clinically useful levels, reflecting the duration of homogenous opacification of the right atrium. To further characterize the physical properties of the bubbles in vitro, fixed aliquots of similarly agitated saline were injected into a glass well slide-cover slip assembly and examined using an optical microscope to determine bubble diameter in microns (µm) and concentration [bubble count/high power field (hpf)]. A higher acoustic intensity (a less negative dB level), higher bubble concentration and longer DT were considered properties of a superior contrast agent. For statistical analysis, a paired t test was conducted to evaluate the differences in means of Vmax and DT. Compared to NS, bNS administration was associated with superior opacification (video intensity -8.69 ± 4.7 vs. -10.46 ± 4.1 dB, P = 0.002), longer DT (17.3 ± 6.1 vs. 10.2 ± 3.7 s) in vivo and smaller mean bubble size (43.4 vs. 58.6 μm) and higher bubble concentration (1,002 vs. 298 bubble/hpf) in vitro. 
bNS provides higher intensity and more sustained opacification of the right atrium compared to NS. Higher bubble concentration and stability appear to be additional desirable rheological characteristics favoring bNS as a contrast agent.
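The two TI-curve metrics used above (Vmax and dwell time) are simple to compute from a sampled curve. A minimal sketch, assuming discrete time/intensity samples and treating the "clinically useful level" threshold as a caller-supplied parameter (the function name is invented here):

```python
def ti_curve_metrics(times, intensities, useful_level):
    """Return (Vmax, dwell time) from a video time-intensity curve.

    Vmax is the peak intensity (dB); dwell time is the duration from
    reaching Vmax until intensity first decays below `useful_level`.
    If it never decays below the level, dwell runs to the last sample.
    """
    vmax = max(intensities)
    t_peak = times[intensities.index(vmax)]
    t_end = times[-1]
    for t, v in zip(times, intensities):
        if t > t_peak and v < useful_level:
            t_end = t
            break
    return vmax, t_end - t_peak
```

Note that less negative dB values count as higher intensity, so `max()` picks the brightest sample directly.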

  15. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... characteristics of the clothes load placed in the clothes container, without allowing or requiring consumer... weight of the clothes load placed in the clothes container, without allowing or requiring consumer....4Clothes container means the compartment within the clothes washer that holds the clothes during the...

  16. Three-dimensional reconstruction from serial sections in PC-Windows platform by using 3D_Viewer.

    PubMed

    Xu, Yi-Hua; Lahvis, Garet; Edwards, Harlene; Pitot, Henry C

    2004-11-01

    Three-dimensional (3D) reconstruction from serial sections allows identification of objects of interest in 3D and clarifies the relationships among these objects. 3D_Viewer, developed in our laboratory for this purpose, has four major functions: image alignment, movie frame production, movie viewing, and shift-overlay image generation. Color images captured from serial sections were aligned; then the contours of objects of interest were highlighted in a semi-automatic manner. These 2D images were then automatically stacked at different viewing angles, and their composite images on a projected plane were recorded by an image transform-shift-overlay technique. These composite images are used in the object-rotation movie display. The design considerations of the program and the procedures used for 3D reconstruction from serial sections are described. This program, with a digital image-capture system, a semi-automatic contour-highlighting method, and an automatic image transform-shift-overlay technique, greatly speeds up the reconstruction process. Since images generated by 3D_Viewer are in a general graphic format, data sharing with others is easy. 3D_Viewer is written in MS Visual Basic 6 and is obtainable from our laboratory on request.

  17. Integration of tools for binding archetypes to SNOMED CT.

    PubMed

    Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan

    2008-10-27

    The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.

  18. Integration of tools for binding archetypes to SNOMED CT

    PubMed Central

    Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Åhlfeldt, Hans; Rector, Alan

    2008-01-01

    Background The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Methods Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. Results An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Conclusion Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail. PMID:19007444

  19. Direct Numerical Simulations of Multiphase Flows

    NASA Astrophysics Data System (ADS)

    Tryggvason, Gretar

    2013-03-01

    Many natural and industrial processes, such as rain and gas exchange between the atmosphere and oceans, boiling heat transfer, atomization and chemical reactions in bubble columns, involve multiphase flows. Often the mixture can be described as a disperse flow where one phase consists of bubbles or drops. Direct numerical simulations (DNS) of disperse flow have recently been used to study the dynamics of multiphase flows with a large number of bubbles and drops, often showing that the collective motion results in relatively simple large-scale structure. Here we review simulations of bubbly flows in vertical channels where the flow direction, as well as the bubble deformability, has profound implications for the flow structure and the total flow rate. Results obtained so far are summarized and open questions identified. The resolution for DNS of multiphase flows is usually determined by a dominant scale, such as the average bubble or drop size, but in many cases much smaller scales are also present. These scales often consist of thin films, threads, or tiny drops appearing during coalescence or breakup, or are due to the presence of additional physical processes that operate on a very different time scale than the fluid flow. The presence of these small-scale features demands excessive resolution for conventional numerical approaches. However, at small flow scales the effects of surface tension are generally strong, so the interface geometry is simple, and viscous forces dominate the flow and keep it simple as well. These are exactly the conditions under which analytical models can be used, and we will discuss efforts to combine a semi-analytical description of the small-scale processes with a fully resolved simulation of the rest of the flow. We will, in particular, present an embedded analytical description to capture the mass transfer from bubbles in liquids where the diffusion of mass is much slower than the diffusion of momentum. This results in very thin mass-boundary layers that are difficult to resolve, but the new approach allows us to simulate the mass transfer from many freely evolving bubbles and examine the effect of the interactions of the bubbles with each other and the flow. We will conclude by attempting to summarize the current status of DNS of multiphase flows. Supported by NSF and DOE (CASL).

  20. Addressing case specific biogas plant tasks: industry oriented methane yields derived from 5L Automatic Methane Potential Test Systems in batch or semi-continuous tests using realistic inocula, substrate particle sizes and organic loading.

    PubMed

    Kolbl, Sabina; Paloczi, Attila; Panjan, Jože; Stres, Blaž

    2014-02-01

    The primary aim of the study was to develop and validate an in-house upscale of the Automatic Methane Potential Test System II for studying real-time inocula and real-scale substrates in batch, codigestion and enzyme-enhanced hydrolysis experiments, in addition to semi-continuous operation of the developed equipment and experiments testing inoculum functional quality. The successful upscale to 5 L enabled comparison of different process configurations in shorter preparation times, with acceptable accuracy and the high throughput needed for industrial decision making. The adoption of the same scales, equipment and methodologies in batch and semi-continuous tests mirroring those at full-scale biogas plants resulted in matching methane yields between the two laboratory tests and full scale, thus confirming the increased decision-making value of the approach for industrial operations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. A 3D THz image processing methodology for a fully integrated, semi-automatic and near real-time operational system

    NASA Astrophysics Data System (ADS)

    Brook, A.; Cristofani, E.; Vandewal, M.; Matheis, C.; Jonuscheit, J.; Beigang, R.

    2012-05-01

    The present study proposes a fully integrated, semi-automatic and near real-time image processing methodology developed for Frequency-Modulated Continuous-Wave (FMCW) THz images with center frequencies around 100 GHz and 300 GHz. The quality control of multi-layered aeronautics composite materials and structures using Non-Destructive Testing is the main focus of this work. Image processing is applied to the 3-D images to extract useful information: the data are processed by extracting areas of interest, and the detected areas are subjected to image analysis for more detailed investigation managed by a spatial model. Finally, the post-processing stage examines and evaluates the spatial accuracy of the extracted information.

  2. The SURE reliability analysis program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  3. The SURE Reliability Analysis Program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  4. Semi automatic indexing of PostScript files using Medical Text Indexer in medical education.

    PubMed

    Mollah, Shamim Ara; Cimino, Christopher

    2007-10-11

    At Albert Einstein College of Medicine a large part of the online lecture material consists of PostScript files. As the collection grows it becomes essential to create a digital library with easy access to relevant, full-text-indexed sections of the lecture material; to create this index it is necessary to extract all the text from the document files that constitute the originals of the lectures. In this study we present a semi-automatic indexing method using a robust technique for extracting text from PostScript files and the National Library of Medicine's Medical Text Indexer (MTI) program for indexing the text. This model can be applied at other medical schools for indexing purposes.

  5. A semi-automatic method for positioning a femoral bone reconstruction for strict view generation.

    PubMed

    Milano, Federico; Ritacco, Lucas; Gomez, Adrian; Gonzalez Bernaldo de Quiros, Fernan; Risk, Marcelo

    2010-01-01

    In this paper we present a semi-automatic method for femoral bone positioning after 3D image reconstruction from Computed Tomography images. This serves as grounding for the definition of strict axial, longitudinal and anterior-posterior views, overcoming the problem of patient positioning biases in 2D femoral bone measuring methods. After the bone reconstruction is aligned to a standard reference frame, new tomographic slices can be generated, on which unbiased measures may be taken. This could allow not only accurate inter-patient comparisons but also intra-patient comparisons, i.e., comparisons of images of the same patient taken at different times. This method could enable medical doctors to diagnose and follow up several bone deformities more easily.

  6. Sequential Blood Filtration for Extracorporeal Circulation: Initial Results from a Proof-of-Concept Prototype.

    PubMed

    Herbst, Daniel P

    2014-09-01

    Micropore filters are used during extracorporeal circulation to prevent gaseous and solid particles from entering the patient's systemic circulation. Although these devices improve patient safety, limitations in current designs have prompted the development of a new concept in micropore filtration. A prototype of the new design was made using 40-μm filter screens and compared against four commercially available filters for performance in pressure loss and gross air handling. Pre- and postfilter bubble counts for 5- and 10-mL bolus injections in an ex vivo test circuit were recorded using a Doppler ultrasound bubble counter. Statistical analysis of results for bubble volume reduction between test filters was performed with one-way repeated-measures analysis of variance using Bonferroni post hoc tests. Changes in filter performance with changes in microbubble load were also assessed with dependent t tests using the 5- and 10-mL bolus injections as the paired sample for each filter. Significance was set at p < .05. All filters in the test group were comparable in pressure loss performance, showing a range of 26-33 mmHg at a flow rate of 6 L/min. In gross air-handling studies, the prototype showed improved bubble volume reduction, reaching statistical significance with three of the four commercial filters. All test filters showed decreased performance in bubble volume reduction when the microbubble load was increased. Findings from this research support the underpinning theories of a sequential arterial-line filter design and suggest that improvements in microbubble filtration may be possible using this technique.
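The paired comparisons above (the same bolus measured through different filters) rest on the paired t statistic. A minimal self-contained sketch of that calculation in pure Python; the function name is invented and no p-value lookup is included:

```python
import math

def paired_t(sample_a, sample_b):
    """Paired t statistic and degrees of freedom for matched samples,
    e.g. pre- vs post-filter bubble volumes for the same injections."""
    diffs = [a - b for a, b in zip(sample_a, sample_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1
```

In practice one would pass the statistic to a t-distribution CDF (e.g. from a statistics library) to obtain the p-value tested against the study's 0.05 threshold.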

  7. Sequential Blood Filtration for Extracorporeal Circulation: Initial Results from a Proof-of-Concept Prototype

    PubMed Central

    Herbst, Daniel P.

    2014-01-01

    Abstract: Micropore filters are used during extracorporeal circulation to prevent gaseous and solid particles from entering the patient’s systemic circulation. Although these devices improve patient safety, limitations in current designs have prompted the development of a new concept in micropore filtration. A prototype of the new design was made using 40-μm filter screens and compared against four commercially available filters for performance in pressure loss and gross air handling. Pre- and postfilter bubble counts for 5- and 10-mL bolus injections in an ex vivo test circuit were recorded using a Doppler ultrasound bubble counter. Statistical analysis of results for bubble volume reduction between test filters was performed with one-way repeated-measures analysis of variance using Bonferroni post hoc tests. Changes in filter performance with changes in microbubble load were also assessed with dependent t tests using the 5- and 10-mL bolus injections as the paired sample for each filter. Significance was set at p < .05. All filters in the test group were comparable in pressure loss performance, showing a range of 26–33 mmHg at a flow rate of 6 L/min. In gross air-handling studies, the prototype showed improved bubble volume reduction, reaching statistical significance with three of the four commercial filters. All test filters showed decreased performance in bubble volume reduction when the microbubble load was increased. Findings from this research support the underpinning theories of a sequential arterial-line filter design and suggest that improvements in microbubble filtration may be possible using this technique. PMID:26357790

  8. Computer measurement of particle sizes in electron microscope images

    NASA Technical Reports Server (NTRS)

    Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.

    1976-01-01

    Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting with reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.

  9. Advances in Automated Plankton Imaging: Enhanced Throughput, Automated Staining, and Extended Deployment Modes for Imaging FlowCytobot

    NASA Astrophysics Data System (ADS)

    Sosik, H. M.; Olson, R. J.; Brownlee, E.; Brosnahan, M.; Crockford, E. T.; Peacock, E.; Shalapyonok, A.

    2016-12-01

    Imaging FlowCytobot (IFCB) was developed to fill a need for automated identification and monitoring of nano- and microplankton, especially phytoplankton in the size range 10-200 micrometers, which are important in coastal blooms (including harmful algal blooms). IFCB uses a combination of flow cytometric and video technology to capture high resolution (1 micrometer) images of suspended particles. This proven, now commercially available, submersible instrument technology has been deployed at fixed time-series locations for extended periods (months to years) and in shipboard laboratories where underway water is automatically analyzed during surveys. Building from these successes, we have now constructed and evaluated three new prototype IFCB designs that extend measurement and deployment capabilities. To improve cell counting statistics without degrading image quality, a high-throughput version (IFCB-HT) incorporates in-flow acoustic focusing to non-disruptively pre-concentrate cells before the measurement area of the flow cell. To extend imaging to all heterotrophic cells (even those that do not exhibit chlorophyll fluorescence), Staining IFCB (IFCB-S) incorporates automated addition of a live-cell fluorescent stain (fluorescein diacetate) to samples before analysis. A horizontally-oriented IFCB-AV design addresses the need for spatial surveying from surface autonomous vehicles, including design features that reliably eliminate air bubbles and mitigate wave motion impacts. Laboratory evaluation and test deployments in waters near Woods Hole show the efficacy of each of these enhanced IFCB designs.

  10. Semi-Automatic Modelling of Building FAÇADES with Shape Grammars Using Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
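The rule-based façade generation described above (library objects positioned from architectural rules and proportions) can be illustrated with a toy procedural generator. This is not GDL and not the HBIM library; the function, its parameters, and the proportion rule are assumptions made for illustration only:

```python
def facade_openings(width, storeys, bay_width, window_ratio=0.5):
    """Procedurally place window openings on a façade: one window per
    bay per storey, centered in its bay, with window width a fixed
    proportion of the bay width. Returns (storey, x_offset, width)."""
    bays = int(width // bay_width)  # whole bays that fit on the façade
    openings = []
    for storey in range(storeys):
        for bay in range(bays):
            # Center the window within its bay
            x = bay * bay_width + bay_width * (1 - window_ratio) / 2
            openings.append((storey, round(x, 3), bay_width * window_ratio))
    return openings
```

Adjusting `bay_width` or `window_ratio` regenerates a different façade configuration, mirroring how user parameter adjustment drives the parametric model.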

  11. Fast and Frugal Heuristics Are Plausible Models of Cognition: Reply to Dougherty, Franco-Watkins, and Thomas (2008)

    ERIC Educational Resources Information Center

    Gigerenzer, Gerd; Hoffrage, Ulrich; Goldstein, Daniel G.

    2008-01-01

    M. R. Dougherty, A. M. Franco-Watkins, and R. Thomas (2008) conjectured that fast and frugal heuristics need an automatic frequency counter for ordering cues. In fact, only a few heuristics order cues, and these orderings can arise from evolutionary, social, or individual learning, none of which requires automatic frequency counting. The idea that…

  12. Fuel Performance Experiments and Modeling: Fission Gas Bubble Nucleation and Growth in Alloy Nuclear Fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDeavitt, Sean; Shao, Lin; Tsvetkov, Pavel

    2014-04-07

    Advanced fast reactor systems being developed under the DOE's Advanced Fuel Cycle Initiative are designed to destroy TRU isotopes generated in existing and future nuclear energy systems. Over the past 40 years, multiple experiments and demonstrations have been completed using U-Zr, U-Pu-Zr, U-Mo and other metal alloys. As a result, multiple empirical and semi-empirical relationships have been established to develop empirical performance modeling codes. Many mechanistic questions about fission gas mobility, bubble coalescence, and gas release have been answered through industrial experience, research, and empirical understanding. The advent of modern computational materials science, however, opens new doors of development such that physics-based multi-scale models may be developed to enable a new generation of predictive fuel performance codes that are not limited by empiricism.

  13. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... clothes washer design can achieve spin speeds in the 500g range. When this matrix is repeated 3 times, a...) or an equivalent extractor with same basket design (i.e. diameter, length, volume, and hole... materially inaccurate comparative data, field testing may be appropriate for establishing an acceptable test...

  14. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... The 500g requirement will only be used if a clothes washer design can achieve spin speeds in the 500g... Products, P.O. Box 5127, Toledo, OH 43611) or an equivalent extractor with same basket design (i.e... provide materially inaccurate comparative data, field testing may be appropriate for establishing an...

  15. Automatic choroid cells segmentation and counting based on approximate convexity and concavity of chain code in fluorescence microscopic image

    NASA Astrophysics Data System (ADS)

    Lu, Weihua; Chen, Xinjian; Zhu, Weifang; Yang, Lei; Cao, Zhaoyuan; Chen, Haoyu

    2015-03-01

    In this paper, we proposed a method based on the Freeman chain code to segment and count rhesus choroid-retinal vascular endothelial cells (RF/6A) automatically for fluorescence microscopy images. The proposed method consists of four main steps. First, a threshold filter and morphological transform were applied to reduce the noise. Second, the boundary information was used to generate the Freeman chain codes. Third, the concave points were found based on the relationship between the difference of the chain code and the curvature. Finally, cell segmentation and counting were completed based on the number of concave points and the area and shape of the cells. The proposed method was tested on 100 fluorescence microscopic cell images; the average true positive rate (TPR) was 98.13% and the average false positive rate (FPR) was 4.47%. The preliminary results showed the feasibility and efficiency of the proposed method.
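
The core of the third step, relating Freeman chain-code differences to local curvature, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the concavity threshold and the counter-clockwise boundary orientation are assumptions:

```python
def chain_code_diff(codes):
    # Difference of consecutive 8-connected Freeman codes, wrapped into the
    # signed range [-3, 4]; the sign encodes the turn direction and the
    # magnitude approximates local curvature along the boundary.
    diffs = []
    n = len(codes)
    for i in range(n):
        d = (codes[(i + 1) % n] - codes[i]) % 8
        if d > 4:
            d -= 8
        diffs.append(d)
    return diffs

def concave_points(codes, threshold=-1):
    # For a counter-clockwise traced boundary, negative code differences
    # are right (concave) turns; flag those at or below the threshold.
    return [i for i, d in enumerate(chain_code_diff(codes)) if d <= threshold]
```

For a convex contour (e.g. a square traced counter-clockwise) every turn is non-negative, so no concave points are reported; touching cells produce concave notches that split the boundary into segments.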

  16. Galaxy modelling. II. Multi-wavelength faint counts from a semi-analytic model of galaxy formation

    NASA Astrophysics Data System (ADS)

    Devriendt, J. E. G.; Guiderdoni, B.

    2000-11-01

    This paper predicts self-consistent faint galaxy counts from the UV to the submm wavelength range. The stardust spectral energy distributions described in Devriendt et al. (DGS99; Paper I) are embedded within the explicit cosmological framework of a simple semi-analytic model of galaxy formation and evolution. We begin with a description of the non-dissipative and dissipative collapses of primordial perturbations, and plug in standard recipes for star formation, stellar evolution and feedback. We also model the absorption of starlight by dust and its re-processing in the IR and submm. We then build a class of models which capture the luminosity budget of the universe through faint galaxy counts and redshift distributions in the whole wavelength range spanned by our spectra. In contrast with a rather stable behaviour in the optical and even in the far-IR, the submm counts are dramatically sensitive to variations in the cosmological parameters and changes in the star formation history. Faint submm counts are more easily accommodated within an open universe with a low value of Omega_0, or a flat universe with a non-zero cosmological constant. We confirm the suggestion of Guiderdoni et al. (GHBM98) that matching the current multi-wavelength data requires a population of heavily-extinguished, massive galaxies with large star formation rates (~500 M_sun yr^-1) at intermediate and high redshift (z >= 1.5). Such a population of objects probably is the consequence of an increase of interaction and merging activity at high redshift, but a realistic quantitative description can only be obtained through more detailed modelling of such processes. This study illustrates the implementation of multi-wavelength spectra into a semi-analytic model.
In spite of its simplicity, it already provides fair fits of the current data of faint counts, and a physically motivated way of interpolating and extrapolating these data to other wavelengths and fainter flux levels.

  17. Monitoring Cavitation in HIFU as an Aid to Assisting Treatment

    NASA Astrophysics Data System (ADS)

    Hsieh, Chang-yu; Smith, Penny Probert; Kennedy, James; Leslie, Thomas

    2007-05-01

    Rapid hyperthermia resulting in tissue necrosis is often associated with bubble activity (normally from cavitation) in HIFU treatment. Indeed, in some HIFU protocols, the evidence of cavitation is taken as an indicator of tissue lesions. In this paper we discuss two methods to delineate reliably the region in which cavitation occurs, so that a history of the cavitation events can be provided automatically during treatment. Results are shown on simulated images and from a clinical treatment session.

  18. Visual and semi-automatic non-invasive detection of interictal fast ripples: A potential biomarker of epilepsy in children with tuberous sclerosis complex.

    PubMed

    Bernardo, Danilo; Nariai, Hiroki; Hussain, Shaun A; Sankar, Raman; Salamon, Noriko; Krueger, Darcy A; Sahin, Mustafa; Northrup, Hope; Bebin, E Martina; Wu, Joyce Y

    2018-04-03

    We aim to establish that interictal fast ripples (FR; 250-500 Hz) are detectable on scalp EEG, and to investigate their association with epilepsy. Scalp EEG recordings of a subset of children with tuberous sclerosis complex (TSC)-associated epilepsy from two large multicenter observational TSC studies were analyzed and compared to control children without epilepsy or any other brain-based diagnoses. FR were identified both by human visual review and compared with semi-automated review utilizing a deep learning-based FR detector. Seven out of 7 children with TSC-associated epilepsy had scalp FR compared to 0 out of 4 children in the control group (p = 0.003). The automatic detector has a sensitivity of 98%, with an average of 11.2 false positives per minute. Non-invasive detection of interictal scalp FR was feasible by both visual and semi-automatic detection. Interictal scalp FR occurred exclusively in children with TSC-associated epilepsy and were absent in controls without epilepsy. The proposed detector achieves high sensitivity of FR detection; however, expert review of the results to reduce false positives is advised. Interictal FR are detectable on scalp EEG and may potentially serve as a biomarker of epilepsy in children with TSC. Copyright © 2018 International Federation of Clinical Neurophysiology. All rights reserved.
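
The study's detector is deep-learning based, so the sketch below is not its implementation; it only illustrates a conventional baseline for flagging fast-ripple candidates: band-pass the trace to 250-500 Hz, compute an amplitude envelope, and keep supra-threshold runs of minimum duration. The sampling rate, threshold, and duration parameters are assumptions:

```python
import numpy as np

def detect_fr_candidates(eeg, fs=2000.0, band=(250.0, 500.0),
                         n_sd=3.0, min_dur_s=0.01):
    # Band-pass via FFT bin masking (zero everything outside the FR band).
    spec = np.fft.rfft(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0
    x = np.fft.irfft(spec, n=len(eeg))
    # Rectified, smoothed amplitude envelope (5 ms moving average).
    win = max(1, int(0.005 * fs))
    env = np.convolve(np.abs(x), np.ones(win) / win, mode="same")
    # Threshold at mean + n_sd standard deviations of the envelope.
    thr = env.mean() + n_sd * env.std()
    above = env > thr
    # Merge supra-threshold samples into events of minimum duration.
    min_len = int(min_dur_s * fs)
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                events.append((start, i))
            start = None
    if start is not None and len(env) - start >= min_len:
        events.append((start, len(env)))
    return events
```

Events are returned as (start, end) sample indices; in practice expert review of candidates, as the authors advise, would follow this stage.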

  19. Semi-automatic object geometry estimation for image personalization

    NASA Astrophysics Data System (ADS)

    Ding, Hengzhou; Bala, Raja; Fan, Zhigang; Eschbach, Reiner; Bouman, Charles A.; Allebach, Jan P.

    2010-01-01

    Digital printing brings about a host of benefits, one of which is the ability to create short runs of variable, customized content. One form of customization that is receiving much attention lately is in photofinishing applications, whereby personalized calendars, greeting cards, and photo books are created by inserting text strings into images. It is particularly interesting to estimate the underlying geometry of the surface and incorporate the text into the image content in an intelligent and natural way. Current solutions either allow fixed text insertion schemes into preprocessed images, or provide manual text insertion tools that are time consuming and aimed only at the high-end graphic designer. It would thus be desirable to provide some level of automation in the image personalization process. We propose a semi-automatic image personalization workflow which includes two scenarios: text insertion and text replacement. In both scenarios, the underlying surfaces are assumed to be planar. A 3-D pinhole camera model is used for rendering text, whose parameters are estimated by analyzing existing structures in the image. Techniques in image processing and computer vision such as the Hough transform, the bilateral filter, and connected component analysis are combined, along with necessary user inputs. In particular, the semi-automatic workflow is implemented as an image personalization tool, which is presented in our companion paper. Experimental results including personalized images for both scenarios are shown, which demonstrate the effectiveness of our algorithms.
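
Since the underlying surfaces are assumed planar, mapping text onto them reduces to estimating a plane-to-image homography from structures detected in the image. The direct linear transform (DLT) sketch below is a minimal illustration of that step, not the authors' tool:

```python
import numpy as np

def estimate_homography(src, dst):
    # Direct Linear Transform: solve for H (up to scale) from 4+ point
    # correspondences (x, y) -> (u, v) via the SVD null space.
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    # Map 2D points through H using homogeneous coordinates.
    pts = np.asarray(pts, dtype=float)
    homo = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homo @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

Text laid out in a flat "texture" coordinate frame can then be warped into the photograph by pushing its glyph outlines through `apply_homography`.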

  20. Automatic scoring of dicentric chromosomes as a tool in large scale radiation accidents.

    PubMed

    Romm, H; Ainsbury, E; Barnard, S; Barrios, L; Barquinero, J F; Beinke, C; Deperas, M; Gregoire, E; Koivistoinen, A; Lindholm, C; Moquet, J; Oestreicher, U; Puig, R; Rothkamm, K; Sommer, S; Thierens, H; Vandersickel, V; Vral, A; Wojcik, A

    2013-08-30

    Mass casualty scenarios of radiation exposure require high throughput biological dosimetry techniques for population triage in order to rapidly identify individuals who require clinical treatment. The manual dicentric assay is a highly suitable technique, but it is also very time consuming and requires well trained scorers. In the framework of the MULTIBIODOSE EU FP7 project, semi-automated dicentric scoring has been established in six European biodosimetry laboratories. Whole blood was irradiated with a Co-60 gamma source resulting in 8 different doses between 0 and 4.5Gy and then shipped to the six participating laboratories. To investigate two different scoring strategies, cell cultures were set up with short term (2-3h) or long term (24h) colcemid treatment. Three classifiers for automatic dicentric detection were applied, two of which were developed specifically for these two different culture techniques. The automation procedure included metaphase finding, capture of cells at high resolution and detection of dicentric candidates. The automatically detected dicentric candidates were then evaluated by a trained human scorer, which led to the term 'semi-automated' being applied to the analysis. The six participating laboratories established at least one semi-automated calibration curve each, using the appropriate classifier for their colcemid treatment time. There was no significant difference between the calibration curves established, regardless of the classifier used. The ratio of false positive to true positive dicentric candidates was dose dependent. The total staff effort required for analysing 150 metaphases using the semi-automated approach was 2 min as opposed to 60 min for manual scoring of 50 metaphases. Semi-automated dicentric scoring is a useful tool in a large scale radiation accident as it enables high throughput screening of samples for fast triage of potentially exposed individuals. 
Furthermore, the results from the participating laboratories were comparable which supports networking between laboratories for this assay. Copyright © 2013 Elsevier B.V. All rights reserved.
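
Dicentric calibration curves conventionally take the linear-quadratic form Y = c + alpha*D + beta*D^2. The sketch below fits such a curve by least squares and inverts it for triage dose estimation; biodosimetry practice typically uses Poisson maximum-likelihood fitting instead, and the coefficients here are illustrative:

```python
import numpy as np

def fit_linear_quadratic(doses, yields):
    # Least-squares fit of Y = c + alpha*D + beta*D^2 to dose/yield pairs.
    # (A sketch: production biodosimetry fits by Poisson maximum likelihood.)
    beta, alpha, c = np.polyfit(doses, yields, 2)
    return c, alpha, beta

def estimate_dose(c, alpha, beta, observed_yield):
    # Invert the calibration curve: solve beta*D^2 + alpha*D + c = Y for the
    # smallest non-negative real root, i.e. the triage dose estimate.
    roots = np.roots([beta, alpha, c - observed_yield])
    real = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0]
    return min(real) if real else None
```

With a curve established per laboratory and classifier, screening reduces to counting dicentric candidates per cell and reading the dose off the inverted curve.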

  1. Pharmacological intervention against bubble-induced platelet aggregation in a rat model of decompression sickness

    PubMed Central

    Vallée, Nicolas; Ignatescu, Mihaela; Bourdon, Lionel

    2011-01-01

    Decompression sickness (DCS) with alterations in coagulation system and formation of platelet thrombi occurs when a subject undergoes a reduction in environmental pressure. Blood platelet consumption after decompression is clearly linked to bubble formation in humans and offers an index for evaluating DCS severity in animal models. Previous studies highlighted a predominant involvement of platelet activation and thrombin generation in bubble-induced platelet aggregation (BIPA). To study the mechanism of the BIPA in DCS, we examined the effect of acetylsalicylic acid (ASA), heparin (Hep), and clopidogrel (Clo), with anti-thrombotic dose pretreatment in a rat model of DCS. Male Sprague-Dawley rats (n = 208) were randomly assigned to one experimental group treated before the hyperbaric exposure and decompression protocol either with ASA (3×100 mg·kg−1·day−1, n = 30), Clo (50 mg·kg−1·day−1, n = 60), Hep (500 IU/kg, n = 30), or to untreated group (n = 49). Rats were first compressed to 1,000 kPa (90 msw) for 45 min and then decompressed to surface in 38 min. In a control experiment, rats were treated with ASA (n = 13), Clo (n = 13), or Hep (n = 13) and maintained at atmospheric pressure for an equivalent period of time. Onset of DCS symptoms and death were recorded during a 60-min observation period after surfacing. DCS evaluation included pulmonary and neurological signs. Blood samples for platelet count (PC) were taken 30 min before hyperbaric exposure and 30 min after surfacing. Clo reduces the DCS mortality risk (mortality rate: 3/60 with Clo, 15/30 with ASA, 21/30 with Hep, and 35/49 in the untreated group) and DCS severity (neurological DCS incidence: 9/60 with Clo, 6/30 with ASA, 5/30 with Hep, and 12/49 in the untreated group). Clo reduced the fall in platelet count and BIPA (−4.5% with Clo, −19.5% with ASA, −19.9% with Hep, and −29.6% in the untreated group).
ASA, which inhibits the thromboxane A2 pathway, and Hep, which inhibits thrombin generation, have no protective effect on DCS incidence. Clo, a specific ADP-receptor antagonist, reduces post-decompression platelet consumption. These results point to the predominant involvement of the ADP release in BIPA but cannot differentiate definitively between bubble-induced vessel wall injury and bubble-blood component interactions in DCS. PMID:21212250

  2. Automatic measurement of skin textures of the dorsal hand in evaluating skin aging.

    PubMed

    Gao, Qian; Yu, Jiaming; Wang, Fang; Ge, Tiantian; Hu, Liwen; Liu, Yang

    2013-05-01

    Changes in skin textures have been used to evaluate skin aging in many studies. In our previous study, we built some skin texture parameters that can be used to evaluate skin aging of the human dorsal hand. However, extracting this information from digital skin images by manual work is time-consuming and laborious. We therefore aimed to build a simple and effective method to automatically count some of those skin texture parameters by using digital image-processing technology. A total of 100 subjects aged 30 years and above were involved. Sun exposure history and demographic information were collected by using a questionnaire. The skin image of subjects' dorsal hand was obtained by using a portable skin detector. The number of grids, which is one of the skin texture parameters built in our previous study, was measured manually and automatically. The automated image analysis program was developed by using Matlab 7.1 software. The number of grids counted automatically (NGA) was significantly correlated with the number of grids counted manually (NGM) (r = 0.9287, P < 0.0001). In each age group, there were no significant differences between NGA and NGM. The NGA was negatively correlated with age and lifetime sun exposure, and decreased with increasing Beagley-Gibson score from 3 to 6. In addition, even after adjusting for NGA, the standard deviation of grid areas for each image was positively correlated with age, sun exposure, and Beagley-Gibson score. The method introduced in the present study can be used to measure some skin aging parameters automatically and objectively. It saves time, reduces labor, and avoids measurement errors between different investigators when evaluating a large number of skin images in a short time. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
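
The agreement reported between automated (NGA) and manual (NGM) counts, r = 0.9287, is a Pearson correlation; for reference, it can be computed directly from paired counts:

```python
import numpy as np

def pearson_r(x, y):
    # Pearson correlation between paired automated and manual counts:
    # covariance of the centred series over the product of their norms.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))
```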

  3. EMERGENCE OF GRANULAR-SIZED MAGNETIC BUBBLES THROUGH THE SOLAR ATMOSPHERE. II. NON-LTE CHROMOSPHERIC DIAGNOSTICS AND INVERSIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodríguez, Jaime de la Cruz; Hansteen, Viggo; Ortiz, Ada

    Magnetic flux emergence into the outer layers of the Sun is a fundamental mechanism for releasing energy into the chromosphere and the corona. In this paper, we study the emergence of granular-sized flux concentrations and the structuring of the corresponding physical parameters and atmospheric diagnostics in the upper photosphere and in the chromosphere. We make use of a realistic 3D MHD simulation of the outer layers of the Sun to study the formation of the Ca ii 8542 line. We also derive semi-empirical 3D models from non-LTE inversions of our observations. These models contain information on the line-of-sight stratifications of temperature, velocity, and the magnetic field. Our analysis explains the peculiar Ca ii 8542 Å profiles observed in the flux emerging region. Additionally, we derive detailed temperature and velocity maps describing the ascent of a magnetic bubble from the photosphere to the chromosphere. The inversions suggest that, in active regions, granular-sized bubbles emerge up to the lower chromosphere where the existing large-scale field hinders their ascent. We report hints of heating when the field reaches the chromosphere.

  4. Gas-Enhanced Ultra-High Shear Mixing: A Concept and Applications

    NASA Astrophysics Data System (ADS)

    Czerwinski, Frank; Birsan, Gabriel

    2017-04-01

    The processes of mixing, homogenizing, and deagglomeration are of paramount importance in many industries for modifying properties of liquids or liquid-based dispersions at room temperature and treatment of molten or semi-molten alloys at high temperatures, prior to their solidification. To implement treatments, a variety of technologies based on mechanical, electromagnetic, and ultrasonic principles are used commercially or tested at the laboratory scale. In a large number of techniques, especially those tailored toward metallurgical applications, the vital role is played by cavitation, generation of gas bubbles, and their interaction with the melt. This paper describes a novel concept exploring an integration of gas injection into the shear zone with ultra-high shear mixing. As revealed via experiments with a prototype of the cylindrical rotor-stator apparatus and transparent media, gases injected radially through the high-speed rotor generate highly refined bubbles of high concentration directly in the shear zone of the mixer. It is believed that an interaction of large volume of fine gas bubbles with the liquid, superimposed on ultra-high shear, will enhance mixing capabilities and cause superior refining and homogenizing of the liquids or solid-liquid slurries, thus allowing their effective property modification.

  5. Protein structural dynamics at the gas/water interface examined by hydrogen exchange mass spectrometry.

    PubMed

    Xiao, Yiming; Konermann, Lars

    2015-08-01

    Gas/water interfaces (such as air bubbles or foam) are detrimental to the stability of proteins, often causing aggregation. This represents a potential problem for industrial processes, for example, the production and handling of protein drugs. Proteins possess surfactant-like properties, resulting in a high affinity for gas/water interfaces. The tendency of previously buried nonpolar residues to maximize contact with the gas phase can cause significant structural distortion. Most earlier studies in this area employed spectroscopic tools that could only provide limited information. Here we use hydrogen/deuterium exchange (HDX) mass spectrometry (MS) for probing the conformational dynamics of the model protein myoglobin (Mb) in the presence of N₂ bubbles. HDX/MS relies on the principle that unfolded and/or highly dynamic regions undergo faster deuteration than tightly folded segments. In bubble-free solution Mb displays EX2 behavior, reflecting the occurrence of short-lived excursions to partially unfolded conformers. A dramatically different behavior is seen in the presence of N₂ bubbles; EX2 dynamics still take place, but in addition the protein shows EX1 behavior. The latter results from interconversion of the native state with conformers that are globally unfolded and long-lived. These unfolded species likely correspond to Mb that is adsorbed to the surface of gas bubbles. N₂ sparging also induces aggregation. To explain the observed behavior we propose a simple model, that is, "semi-unfolded" ↔ "native" ↔ "globally unfolded" → "aggregated". This model quantitatively reproduces the experimentally observed kinetics. To the best of our knowledge, the current study marks the first exploration of surface denaturation phenomena by HDX/MS. © 2015 The Protein Society.
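
The proposed "semi-unfolded" ↔ "native" ↔ "globally unfolded" → "aggregated" scheme is a linear kinetic model; a forward-Euler sketch with illustrative rate constants (not the fitted values from the paper) shows how such a scheme can be integrated:

```python
import numpy as np

def simulate_hdx_scheme(k_sn, k_ns, k_ng, k_gn, k_ga, t_end=100.0, dt=0.01):
    # States: S (semi-unfolded), N (native), G (globally unfolded),
    # A (aggregated), with S <-> N <-> G -> A. Forward Euler integration;
    # all rate constants are illustrative, not fitted values.
    S, N, G, A = 0.0, 1.0, 0.0, 0.0
    ts = np.arange(0.0, t_end, dt)
    traj = []
    for _ in ts:
        dS = k_ns * N - k_sn * S
        dN = k_sn * S + k_gn * G - (k_ns + k_ng) * N
        dG = k_ng * N - (k_gn + k_ga) * G
        dA = k_ga * G
        S += dS * dt; N += dN * dt; G += dG * dt; A += dA * dt
        traj.append((S, N, G, A))
    return ts, np.array(traj)
```

Total concentration is conserved at every step (the derivatives sum to zero), and the irreversible G → A step makes the aggregated fraction grow monotonically, mirroring the observed one-way loss to aggregation.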

  6. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring high accuracy of sequencing data, must contend with unavoidable sequencing errors. Several tools have been proposed to profile the sequencing quality, but few of them can quantify or correct the sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Different from most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and it further provides a novel function to correct wrong bases in the overlapping regions. Another new feature is to detect and visualise sequencing bubbles, which can commonly be found on the flowcell lanes and may raise sequencing errors. Besides normal per-cycle quality and base content plotting, AfterQC also provides features like polyX (a long sub-sequence of the same base X) filtering, automatic trimming and K-MER based strand bias profiling. For each single or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates the sequencer's bubble effects, trims reads at front and tail, detects the sequencing errors and corrects part of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run with a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder for all included FastQ files to be processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent.
Much more than just another new quality control (QC) tool, AfterQC is able to perform quality control, data filtering, error profiling and base correction automatically. Experimental results show that AfterQC can help to eliminate the sequencing errors for pair-end sequencing data to provide much cleaner outputs, and consequently help to reduce the false-positive variants, especially for the low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all the options automatically and require no argument in most cases.
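
The overlap analysis that AfterQC builds on can be sketched as aligning read 1 against the reverse complement of read 2 and taking the longest suffix/prefix overlap within a mismatch budget. This is a simplified illustration; AfterQC's actual scoring, adapter-cutting, and base-correction logic differ:

```python
def revcomp(seq):
    # Reverse complement of an ACGT sequence.
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def find_overlap(read1, read2, min_len=10, max_mismatch=2):
    # For a fragment shorter than twice the read length, the suffix of
    # read 1 overlaps the prefix of revcomp(read 2). Scan overlap lengths
    # from longest to shortest and accept the first within the budget.
    rc2 = revcomp(read2)
    for olen in range(min(len(read1), len(rc2)), min_len - 1, -1):
        tail, head = read1[-olen:], rc2[:olen]
        mismatches = sum(a != b for a, b in zip(tail, head))
        if mismatches <= max_mismatch:
            return olen, mismatches
    return 0, 0
```

Once the overlap is located, mismatching positions inside it are exactly where base correction can act, e.g. by keeping the base with the higher quality score.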

  7. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... The 500g requirement will only be used if a clothes washer design can achieve spin speeds in the 500g... Products, P.O. Box 5127, Toledo, OH 43611) or an equivalent extractor with same basket design (i.e... characteristics as to provide materially inaccurate comparative data, field testing may be appropriate for...

  8. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... The 500g requirement will only be used if a clothes washer design can achieve spin speeds in the 500g... Products, P.O. Box 5127, Toledo, OH 43611) or an equivalent extractor with same basket design (i.e... characteristics as to provide materially inaccurate comparative data, field testing may be appropriate for...

  9. Prototype Technology for Monitoring Volatile Organics. Volume 1.

    DTIC Science & Technology

    1988-03-01

    117, pp. 285-294. Grote, J.O. and Westendorf, R.G., "An Automatic Purge and Trap Concentrator," American Laboratory, December 1979. Khromchenko, Y.L...Environmental Monitoring and Support Laboratory, Office of Research and Development, Cincinnati, OH. Westendorf, R.G., "Closed-loop Stripping Analysis...Technique and Applications," American Laboratory, December 1982. Westendorf, R.G., "Development Application of A Semi-Automatic Purge and Trap Concentrator

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilke, R. N., E-mail: rwilke@gwdg.de; Wallentin, J.; Osterhoff, M.

    The Large Area Medipix-Based Detector Array (Lambda) has been used in a ptychographic imaging experiment on solar-cell nanowires. By using a semi-transparent central stop, the high flux density provided by nano-focusing Kirkpatrick–Baez mirrors can be fully exploited for high-resolution phase reconstructions. Suitable detection systems that are capable of recording high photon count rates with single-photon detection are instrumental for coherent X-ray imaging. The new single-photon-counting pixel detector ‘Lambda’ has been tested in a ptychographic imaging experiment on solar-cell nanowires using Kirkpatrick–Baez-focused 13.8 keV X-rays. Taking advantage of the high count rate of the Lambda and dynamic range expansion by the semi-transparent central stop, a high-dynamic-range diffraction signal covering more than seven orders of magnitude has been recorded, which corresponds to a photon flux density of about 10^5 photons nm^-2 s^-1 or a flux of ~10^10 photons s^-1 on the sample. By comparison with data taken without the semi-transparent central stop, an increase in resolution by a factor of 3–4 is determined: from about 125 nm to about 38 nm for the nanowire and from about 83 nm to about 21 nm for the illuminating wavefield.

  11. School Fire Protection: Contents Count

    ERIC Educational Resources Information Center

    American School and University, 1976

    1976-01-01

    The heart of a fire protection system is the sprinkler system. National Fire Protection Association (NFPA) statistics show that automatic sprinklers dramatically reduce fire damage and loss of life. (Author)

  12. Automatically processed alpha-track radon monitor

    DOEpatents

    Langner, Jr., G. Harold

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.

  13. Automatically processed alpha-track radon monitor

    DOEpatents

    Langner, G.H. Jr.

    1993-01-12

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.

  14. Automated quantification of proliferation with automated hot-spot selection in phosphohistone H3/MART1 dual-stained stage I/II melanoma.

    PubMed

    Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben

    2016-04-09

    Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses of H&E stains are counted manually in hot spots. Yet, its reproducibility and prognostic impact increase by immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which also may enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, the study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and automatically selected hot spot, i.e., a fixed 1-mm² square. Bland-Altman plots and hypothesis tests compared manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm² (P = 0.0071) and 0.23 cells per hot spot (P = 0.96) for automated counts in manually and automatically selected hot spots. In 77% of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance with an adjusted hazard ratio of 5.5 (95% CI, 1.3-24, P = 0.024) as opposed to 1.3 (95% CI, 0.61-2.9, P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated with their manual counterpart, but altogether their prognostic impact was noticeably reduced. Because correct recognition of only one PHH3/MART1-positive cell seems important, extremely high sensitivity and specificity of the algorithm is required for prognostic purposes.
Thus, automated analysis may still aid and improve the pathologists' detection of mitoses in melanoma and possibly other malignancies.
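    The automated hot-spot selection described above, i.e., locating the fixed 1-mm² square that contains the most PHH3/MART1-positive cells, can be sketched as a sliding-window maximum over a map of detected cell positions. The sketch below is an illustrative reconstruction under assumed names and units (a binary cell map in pixels), not the study's actual algorithm.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def find_hot_spot(cell_map, window_px):
        """Return the top-left corner (row, col) of the window_px x window_px
        square containing the largest number of marked cells in cell_map."""
        # Local mean times window area gives the windowed cell count.
        sums = uniform_filter(cell_map.astype(float), size=window_px,
                              mode="constant") * window_px ** 2
        # uniform_filter centers the window on each pixel; shift argmax to a corner.
        cy, cx = np.unravel_index(np.argmax(sums), sums.shape)
        half = window_px // 2
        return int(max(cy - half, 0)), int(max(cx - half, 0))
    ```

    With a slide scanned at a known resolution, window_px would be chosen so that the window covers exactly 1 mm² of tissue.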

  15. Clinical Evaluation of Different Pre-impression Preparation Procedures of Dental Arch

    PubMed Central

    Arora, Nitin; Arora, Monika; Gupta, Naveen; Agarwal, Manisha; Verma, Rohit; Rathod, Pankaj

    2015-01-01

    Background: Bubbles and voids on the occlusal surface impede actual intercuspation; pre-impression preparation aims to reduce the incidence of air bubbles and voids and influences the quality of occlusal reproduction and the actual clinical intercuspation in the articulator. The study was undertaken to determine the influence of different pre-impression preparation procedures of the antagonistic dental arch on the quality of the occlusal reproduction of the teeth in irreversible hydrocolloid impressions, and to determine the most reliable pre-impression preparation method for reducing the incidence of air bubbles. Materials and Methods: A total of 20 subjects were selected, each having a full complement of mandibular teeth from second molar to second molar with well-demarcated cusp heights. 200 impressions were made with irreversible hydrocolloid material. The impressions were divided into five groups of 40 impressions each, and each group received one specific type of pre-impression preparation. All the impressions were poured in die stone. A stereomicroscope with a graduated eyepiece was used to count the number of bubbles on the occlusal surfaces of premolars and molars. The mean and standard deviation were calculated for each group, and the Mann–Whitney U-test was applied to find significant differences between groups. Results: The fewest bubbles were found in the group in which the oral cavity was dried with a saliva ejector and fluid hydrocolloid was finger-painted onto the occlusal surfaces immediately before placement of the impression tray in the mouth. Conclusion: Finger painting the tooth surfaces with fluid hydrocolloid immediately before placement of the loaded impression tray in the mouth was the most reliable method. The oral cavity can be cleared of excess saliva more easily by vacuum suction than by use of an astringent solution. PMID:26229376

  16. Clinical Evaluation of Different Pre-impression Preparation Procedures of Dental Arch.

    PubMed

    Arora, Nitin; Arora, Monika; Gupta, Naveen; Agarwal, Manisha; Verma, Rohit; Rathod, Pankaj

    2015-07-01

    Bubbles and voids on the occlusal surface impede actual intercuspation; pre-impression preparation aims to reduce the incidence of air bubbles and voids and influences the quality of occlusal reproduction and the actual clinical intercuspation in the articulator. The study was undertaken to determine the influence of different pre-impression preparation procedures of the antagonistic dental arch on the quality of the occlusal reproduction of the teeth in irreversible hydrocolloid impressions, and to determine the most reliable pre-impression preparation method for reducing the incidence of air bubbles. A total of 20 subjects were selected, each having a full complement of mandibular teeth from second molar to second molar with well-demarcated cusp heights. 200 impressions were made with irreversible hydrocolloid material. The impressions were divided into five groups of 40 impressions each, and each group received one specific type of pre-impression preparation. All the impressions were poured in die stone. A stereomicroscope with a graduated eyepiece was used to count the number of bubbles on the occlusal surfaces of premolars and molars. The mean and standard deviation were calculated for each group, and the Mann-Whitney U-test was applied to find significant differences between groups. The fewest bubbles were found in the group in which the oral cavity was dried with a saliva ejector and fluid hydrocolloid was finger-painted onto the occlusal surfaces immediately before placement of the impression tray in the mouth. Finger painting the tooth surfaces with fluid hydrocolloid immediately before placement of the loaded impression tray in the mouth was therefore the most reliable method. The oral cavity can be cleared of excess saliva more easily by vacuum suction than by use of an astringent solution.

  17. Clinical Comparison of Two Methods of Graft Preparation in Descemet Membrane Endothelial Keratoplasty.

    PubMed

    Rickmann, Annekatrin; Opitz, Natalia; Szurman, Peter; Boden, Karl Thomas; Jung, Sascha; Wahl, Silke; Haus, Arno; Damm, Lara-Jil; Januschowski, Kai

    2018-01-01

    Descemet membrane endothelial keratoplasty (DMEK) has been improved over the last decade. The aim of this study was to compare the clinical outcome of the recently introduced liquid bubble method of graft preparation with standard manual preparation. This retrospective study evaluated the outcome of 200 patients after DMEK surgery using two different graft preparation techniques: 96 DMEK grafts were prepared by manual dissection and 104 by the novel liquid bubble technique. The mean follow-up time was 13.7 months (SD ± 8, range 6-36 months). Best corrected mean visual acuity (BCVA) increased statistically significantly for all patients, from 0.85 logMAR (SD ± 0.5) at baseline to 0.26 logMAR (SD ± 0.27) at the final follow-up (Wilcoxon, p = 0.001). Subgroup analysis of BCVA at the final follow-up showed no statistically significant difference between manual dissection and liquid bubble preparation (Mann-Whitney U test, p = 0.64). The mean central corneal thickness did not differ statistically between the two groups (manual dissection: 539 µm, SD ± 68 µm; liquid bubble technique: 534 µm, SD ± 52 µm; Mann-Whitney U test, p = 0.64). At the final follow-up, the mean endothelial cell count of donor grafts was not statistically significantly different: 1761 cells/mm² (-30.7%, SD ± 352) for manual dissection versus 1749 cells/mm² (-29.9%, SD ± 501) for the liquid bubble technique (Mann-Whitney U test, p = 0.73). The re-DMEK rate was comparable: 8 cases (8.3%) for manual dissection and 7 cases (6.7%) for liquid bubble dissection (p = 0.69, chi-square test). Regarding the clinical outcome, we did not find a statistically significant difference between manual dissection and liquid bubble graft preparation. Both preparation techniques lead to an equivalent clinical outcome after DMEK surgery.

  18. WE-A-17A-06: Evaluation of An Automatic Interstitial Catheter Digitization Algorithm That Reduces Treatment Planning Time and Provides a Means for Adaptive Re-Planning in HDR Brachytherapy of Gynecologic Cancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dise, J; Liang, X; Lin, L

    Purpose: To evaluate an automatic interstitial catheter digitization algorithm that reduces treatment planning time and provides a means for adaptive re-planning in HDR brachytherapy of gynecologic cancers. Methods: The semi-automatic catheter digitization tool utilizes a region growing algorithm in conjunction with a spline model of the catheters. The CT images were first pre-processed to enhance the contrast between the catheters and soft tissue. Several seed locations were selected in each catheter for the region growing algorithm. The spline model of the catheters assisted the region growing by preventing inter-catheter cross-over caused by air or metal artifacts. Source dwell positions from day one CT scans were applied to subsequent CTs and forward calculated using the automatically digitized catheter positions. This method was applied to 10 patients who had received HDR interstitial brachytherapy on an IRB-approved image-guided radiation therapy protocol. The prescribed dose was 18.75 or 20 Gy delivered in 5 fractions, twice daily, over 3 consecutive days. Dosimetric comparisons were made between automatic and manual digitization on day two CTs. Results: The region growing algorithm, assisted by the spline model of the catheters, was able to digitize all catheters. The difference between automatically and manually digitized positions was 0.8±0.3 mm. The digitization time ranged from 34 to 43 minutes, with a mean of 37 minutes; the bulk of the time was spent on manual selection of initial seed positions and spline parameter adjustments. There was no significant difference in dosimetric parameters between the automatically and manually digitized plans: D90% to the CTV was 91.5±4.4% for manual digitization versus 91.4±4.4% for automatic digitization (p=0.56). Conclusion: A region growing algorithm was developed to semi-automatically digitize interstitial catheters in HDR brachytherapy using the Syed-Neblett template. This automatic digitization tool was shown to be accurate compared to manual digitization.

  19. Load-Dependent Interference of Deep Brain Stimulation of the Subthalamic Nucleus with Switching from Automatic to Controlled Processing During Random Number Generation in Parkinson's Disease.

    PubMed

    Williams, Isobel Anne; Wilkinson, Leonora; Limousin, Patricia; Jahanshahi, Marjan

    2015-01-01

    Deep brain stimulation of the subthalamic nucleus (STN DBS) ameliorates the motor symptoms of Parkinson's disease (PD). However, some aspects of executive control are impaired with STN DBS. We tested the predictions that (i) STN DBS interferes with switching from automatic to controlled processing during fast-paced random number generation (RNG) and (ii) STN DBS-induced cognitive control changes are load-dependent. Fifteen PD patients with bilateral STN DBS performed paced RNG under three levels of cognitive load, synchronised with a pacing stimulus presented at 1, 0.5 and 0.33 Hz (faster rates require greater cognitive control), with DBS on or off. Measures of output randomness were calculated: countscore 1 (CS1) indicates habitual counting in steps of one, while countscore 2 (CS2) indicates a more controlled strategy of counting in twos. The fastest rate was associated with an increased CS1 score with STN DBS on compared to off. At the slowest rate, patients had higher CS2 scores with DBS off than on, such that the differences between CS1 and CS2 scores disappeared. We provide evidence for a load-dependent effect of STN DBS on paced RNG in PD. Patients could switch to more controlled RNG strategies during conditions of low cognitive load at slower rates only when the STN stimulators were off; when STN stimulation was on, they engaged in more automatic habitual counting under increased cognitive load. These findings are consistent with the proposal that the STN implements a switch signal from the medial frontal cortex which enables a shift from automatic to controlled processing.
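    The countscores used here can be given a simplified formalization: the fraction of adjacent responses that continue a counting series in steps of one (CS1) or steps of two (CS2). The sketch below is a hedged illustration of that general idea, not necessarily the exact index computed in the study.

    ```python
    def count_score(responses, step):
        """Fraction of adjacent response pairs that continue a counting
        series (ascending or descending) in the given step size."""
        pairs = list(zip(responses, responses[1:]))
        hits = sum(1 for a, b in pairs if abs(b - a) == step)
        return hits / len(pairs)
    ```

    For the sequence 1, 2, 3, 5, 7, 4, two of the five adjacent pairs count in ones and two count in twos, so both scores come out to 0.4.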

  20. Load-Dependent Interference of Deep Brain Stimulation of the Subthalamic Nucleus with Switching from Automatic to Controlled Processing During Random Number Generation in Parkinson’s Disease

    PubMed Central

    Williams, Isobel Anne; Wilkinson, Leonora; Limousin, Patricia; Jahanshahi, Marjan

    2015-01-01

    Background: Deep brain stimulation of the subthalamic nucleus (STN DBS) ameliorates the motor symptoms of Parkinson’s disease (PD). However, some aspects of executive control are impaired with STN DBS. Objective: We tested the predictions that (i) STN DBS interferes with switching from automatic to controlled processing during fast-paced random number generation (RNG) and (ii) STN DBS-induced cognitive control changes are load-dependent. Methods: Fifteen PD patients with bilateral STN DBS performed paced RNG under three levels of cognitive load, synchronised with a pacing stimulus presented at 1, 0.5 and 0.33 Hz (faster rates require greater cognitive control), with DBS on or off. Measures of output randomness were calculated: countscore 1 (CS1) indicates habitual counting in steps of one, while countscore 2 (CS2) indicates a more controlled strategy of counting in twos. Results: The fastest rate was associated with an increased CS1 score with STN DBS on compared to off. At the slowest rate, patients had higher CS2 scores with DBS off than on, such that the differences between CS1 and CS2 scores disappeared. Conclusions: We provide evidence for a load-dependent effect of STN DBS on paced RNG in PD. Patients could switch to more controlled RNG strategies during conditions of low cognitive load at slower rates only when the STN stimulators were off; when STN stimulation was on, they engaged in more automatic habitual counting under increased cognitive load. These findings are consistent with the proposal that the STN implements a switch signal from the medial frontal cortex which enables a shift from automatic to controlled processing. PMID:25720447

  1. Pancreas and cyst segmentation

    NASA Astrophysics Data System (ADS)

    Dmitriev, Konstantin; Gutenko, Ievgeniia; Nadeem, Saad; Kaufman, Arie

    2016-03-01

    Accurate segmentation of abdominal organs from medical images is an essential part of surgical planning and computer-aided disease diagnosis. Many existing algorithms are specialized for the segmentation of healthy organs. Cystic pancreas segmentation is especially challenging due to its low-contrast boundaries and its variability in shape and location, as well as the stage of the pancreatic cancer. We present a semi-automatic segmentation algorithm for pancreata with cysts. In contrast to existing automatic approaches for healthy pancreas segmentation, which are amenable to atlas/statistical shape approaches, a pancreas with cysts can have even higher shape variability due to the size and shape of the cyst(s). Hence, fine results are better attained with semi-automatic steerable approaches. We use a novel combination of random walker and region growing approaches to delineate the boundaries of the pancreas and cysts, with respective best Dice coefficients of 85.1% and 86.7% and respective best volumetric overlap errors of 26.0% and 23.5%. Results show that the proposed algorithm for pancreas and pancreatic cyst segmentation is accurate and stable.
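    One half of the combination used here, region growing, can be illustrated with a minimal 2-D, 4-connected, intensity-tolerance version. The actual algorithm (and its coupling with the random walker) is more elaborate, so this is only a sketch with assumed function and parameter names.

    ```python
    import numpy as np
    from collections import deque

    def region_grow(image, seed, tol):
        """Grow a region from `seed` over 4-connected pixels whose intensity
        differs from the seed intensity by at most `tol`."""
        mask = np.zeros(image.shape, dtype=bool)
        base = float(image[seed])
        queue = deque([seed])
        mask[seed] = True
        while queue:
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < image.shape[0] and 0 <= nx < image.shape[1]
                        and not mask[ny, nx]
                        and abs(float(image[ny, nx]) - base) <= tol):
                    mask[ny, nx] = True
                    queue.append((ny, nx))
        return mask
    ```

    A steerable workflow would let the user place seeds inside pancreas and cyst and adjust the tolerance interactively.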

  2. Simulation in Metallurgical Processing: Recent Developments and Future Perspectives

    NASA Astrophysics Data System (ADS)

    Ludwig, Andreas; Wu, Menghuai; Kharicha, Abdellah

    2016-08-01

    This article briefly addresses the most important topics concerning numerical simulation of metallurgical processes, namely, multiphase issues (particle and bubble motion and flotation/sedimentation of equiaxed crystals during solidification), multiphysics issues (electromagnetic stirring, electro-slag remelting, Cu-electro-refining, fluid-structure interaction, and mushy zone deformation), process simulations on graphical processing units, integrated computational materials engineering, and automatic optimization via simulation. The present state-of-the-art as well as requirements for future developments are presented and briefly discussed.

  3. Cell type classifiers for breast cancer microscopic images based on fractal dimension texture analysis of image color layers.

    PubMed

    Jitaree, Sirinapa; Phinyomark, Angkoon; Boonyaphiphat, Pleumjit; Phukpattaranont, Pornchai

    2015-01-01

    Having a classifier of cell types in a breast cancer microscopic image (BCMI), obtained with immunohistochemical staining, is required as part of a computer-aided system that counts the cancer cells in such BCMI. Such quantitation by cell counting is very useful in supporting decisions and planning of the medical treatment of breast cancer. This study proposes and evaluates features based on texture analysis by fractal dimension (FD), for the classification of histological structures in a BCMI into either cancer cells or non-cancer cells. The cancer cells include positive cells (PC) and negative cells (NC), while the normal cells comprise stromal cells (SC) and lymphocyte cells (LC). The FD feature values were calculated with the box-counting method from binarized images, obtained by automatic thresholding with Otsu's method of the grayscale images for various color channels. A total of 12 color channels from four color spaces (RGB, CIE-L*a*b*, HSV, and YCbCr) were investigated, and the FD feature values from them were used with decision tree classifiers. The BCMI data consisted of 1,400, 1,200, and 800 images with pixel resolutions 128 × 128, 192 × 192, and 256 × 256, respectively. The best cross-validated classification accuracy was 93.87%, for distinguishing between cancer and non-cancer cells, obtained using the Cr color channel with window size 256. The results indicate that the proposed algorithm, based on fractal dimension features extracted from a color channel, performs well in the automatic classification of the histology in a BCMI. This might support accurate automatic cell counting in a computer-assisted system for breast cancer diagnosis. © Wiley Periodicals, Inc.
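    The box-counting estimate of fractal dimension behind these features can be sketched as follows: cover the binarized image with boxes of decreasing side length, count occupied boxes at each scale, and fit the slope of log(count) against log(1/size). The function name and the dyadic choice of scales are assumptions; a non-empty foreground is assumed so all counts are positive.

    ```python
    import numpy as np

    def box_counting_dimension(binary):
        """Estimate the fractal dimension of a square 2-D binary image:
        count occupied boxes over dyadic box sizes and fit the slope of
        log(count) against log(1/size)."""
        n = binary.shape[0]
        sizes, counts = [], []
        size = n // 2
        while size >= 1:
            # Trim so the image tiles exactly into size x size boxes.
            trimmed = binary[:n - n % size, :n - n % size]
            blocks = trimmed.reshape(trimmed.shape[0] // size, size,
                                     trimmed.shape[1] // size, size)
            counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
            sizes.append(size)
            size //= 2
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                              np.log(np.asarray(counts)), 1)
        return slope
    ```

    A filled region yields a dimension near 2 and a thin line near 1, which is what makes the measure useful as a texture feature per color channel.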

  4. "Compacted" procedures for adults' simple addition: A review and critique of the evidence.

    PubMed

    Chen, Yalin; Campbell, Jamie I D

    2018-04-01

    We review recent empirical findings and arguments proffered as evidence that educated adults solve elementary addition problems (3 + 2, 4 + 1) using so-called compacted procedures (e.g., unconscious, automatic counting); a conclusion that could have significant pedagogical implications. We begin with the large-sample experiment reported by Uittenhove, Thevenot and Barrouillet (2016, Cognition, 146, 289-303), which tested 90 adults on the 81 single-digit addition problems from 1 + 1 to 9 + 9. They identified the 12 very-small addition problems with different operands both ≤ 4 (e.g., 4 + 3) as a distinct subgroup of problems solved by unconscious, automatic counting: These items yielded a near-perfectly linear increase in answer response time (RT) yoked to the sum of the operands. Using the data reported in the article, however, we show that there are clear violations of the sum-counting model's predictions among the very-small addition problems, and that there is no real RT boundary associated with addends ≤4. Furthermore, we show that a well-known associative retrieval model of addition facts-the network interference theory (Campbell, 1995)-predicts the results observed for these problems with high precision. We also review the other types of evidence adduced for the compacted procedure theory of simple addition and conclude that these findings are unconvincing in their own right and only distantly consistent with automatic counting. We conclude that the cumulative evidence for fast compacted procedures for adults' simple addition does not justify revision of the long-standing assumption that direct memory retrieval is ultimately the most efficient process of simple addition for nonzero problems, let alone sufficient to recommend significant changes to basic addition pedagogy.

  5. Modified Right Heart Contrast Echocardiography Versus Traditional Method in Diagnosis of Right-to-Left Shunt: A Comparative Study.

    PubMed

    Wang, Yi; Zeng, Jie; Yin, Lixue; Zhang, Mei; Hou, Dailun

    2016-01-01

    The purpose of this study was to evaluate the reliability, effectiveness, and safety of modified right heart contrast transthoracic echocardiography (cTTE) in comparison with the traditional method. We performed modified right heart cTTE using saline mixed with a small sample of the patient's own blood, with samples agitated at varying intensity. The study protocol involved microscopic analysis and patient evaluation. 1. Microscopic analysis: after two contrast samples had been agitated 10 or 20 times, they were compared for bubble size, bubble number, and red blood cell morphology. 2. Patient analysis: 40 patients with suspected right-to-left shunt (RLS) were enrolled, and all underwent right heart contrast echocardiography. Oxygen saturation, transit time and duration, presence of RLS, and changes in indirect bilirubin and urobilinogen concentrations were compared afterward. The modified method generated more bubbles (P<0.05), but the differences in bubble size were not significant (P>0.05). Twenty-four patients were diagnosed with RLS (60%) using the modified method, compared to 16 patients (40%) with the traditional method. The transit time of the ASb20 group was the shortest (P<0.05); however, the duration time in this group was much longer (P<0.05). Also, in the semi-quantitative analysis, the mean rank of RLS was higher after injecting the modified contrast agent agitated 20 times (P<0.05). Modified right heart contrast echocardiography is a reliable, effective and safe method of detecting cardiovascular RLS.

  6. In vitro fertilization (IVF) using semi-defined culture conditions from low or high antral follicle count pubertal beef heifers

    USDA-ARS?s Scientific Manuscript database

    To compare the in vitro fertilization (IVF) and production (IVP) of embryos from low and high antral follicle count (AFC) heifers, AFC were determined on 106 heifers using transrectal ultrasonography. Ten heifers with the lowest AFC (avg. 13.2) and 10 heifers with the highest AFC (avg. 27.4) with ev...

  7. Multiple sclerosis lesion segmentation using an automatic multimodal graph cuts.

    PubMed

    García-Lorenzo, Daniel; Lecoeur, Jeremy; Arnold, Douglas L; Collins, D Louis; Barillot, Christian

    2009-01-01

    Graph Cuts have been shown to be a powerful interactive segmentation technique in several medical domains. We propose to automate Graph Cuts in order to segment Multiple Sclerosis (MS) lesions in MRI automatically. We replace the manual interaction with a robust EM-based approach that discriminates between MS lesions and the Normal Appearing Brain Tissues (NABT). Evaluation on synthetic and real images shows good agreement between the automatic segmentation and the target segmentation. We compare our algorithm with state-of-the-art techniques and with several manual segmentations. An advantage of our algorithm over previously published ones is the possibility of semi-automatically improving the segmentation thanks to the interactive feature of Graph Cuts.
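    The EM step that replaces the manual interaction can be illustrated with a minimal one-dimensional, two-component Gaussian mixture fitted by expectation-maximization. The paper's robust, multimodal version is considerably more involved; all names below are assumptions.

    ```python
    import numpy as np

    def em_two_gaussians(x, iters=100):
        """Fit a two-component 1-D Gaussian mixture by EM and return
        (means, standard deviations, weights)."""
        x = np.asarray(x, dtype=float)
        mu = np.percentile(x, [25, 75])              # crude initialisation
        sd = np.array([x.std(), x.std()]) + 1e-9
        w = np.array([0.5, 0.5])
        for _ in range(iters):
            # E-step: responsibility of each component for each sample.
            pdf = (w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2)
                   / (sd * np.sqrt(2.0 * np.pi)))
            resp = pdf / pdf.sum(axis=1, keepdims=True)
            # M-step: re-estimate parameters from the responsibilities.
            nk = resp.sum(axis=0)
            mu = (resp * x[:, None]).sum(axis=0) / nk
            sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
            w = nk / len(x)
        return mu, sd, w
    ```

    In the segmentation setting, the fitted class statistics would then seed the Graph Cuts data term instead of user-drawn strokes.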

  8. New non-invasive automatic cough counting program based on 6 types of classified cough sounds.

    PubMed

    Murata, Akira; Ohota, Nao; Shibuya, Atsuo; Ono, Hiroshi; Kudoh, Shoji

    2006-01-01

    Cough, consisting of an initial deep inspiration, glottal closure, and an explosive expiration accompanied by a sound, is one of the most common symptoms of respiratory disease. Despite its clinical importance, standard methods for objective cough analysis have yet to be established. We investigated the characteristics of cough sounds acoustically, designed a program to discriminate cough sounds from other sounds, and finally developed a new objective method of non-invasive cough counting, whose clinical efficacy we then evaluated. We recorded cough sounds in free field using a memory stick IC recorder from 2 patients and analyzed the intensity of 534 recorded coughs acoustically in the time domain. First, we squared the sound waveform of the recorded cough sounds and smoothed it over a 20 ms window. Five parameters and accompanying definitions for discriminating cough sounds from other noise were identified, and the cough sounds were classified into 6 groups. Next, we applied this method to develop a new automatic cough count program. Finally, to evaluate the accuracy and clinical usefulness of the program, we counted cough sounds collected from another 10 patients using both our program and conventional manual counting, and analyzed the program's sensitivity, specificity and discriminative rate. The program successfully discriminated recorded cough sounds out of 1902 sound events collected from 10 patients at a rate of 93.1%; the sensitivity was 90.2% and the specificity was 96.5%. Our new cough counting program should be sufficiently useful for clinical studies.
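    The first processing stage described above, squaring the waveform, smoothing the energy over a 20 ms window and segmenting candidate sound events, can be sketched as follows. The threshold choice and all names are assumptions; the published program applies its five further parameters to classify the detected events.

    ```python
    import numpy as np

    def detect_sound_events(signal, fs, win_ms=20, thresh_ratio=0.1):
        """Square the waveform, smooth the energy over a win_ms window and
        return (start, end) sample indices of segments whose smoothed
        energy exceeds thresh_ratio of the peak energy."""
        win = max(1, int(fs * win_ms / 1000))
        energy = np.convolve(signal.astype(float) ** 2,
                             np.ones(win) / win, mode="same")
        above = energy > thresh_ratio * energy.max()
        # Pad so that every event has both a rising and a falling edge.
        padded = np.concatenate(([False], above, [False]))
        d = np.diff(padded.astype(int))
        starts = np.flatnonzero(d == 1)    # indices where the mask turns on
        ends = np.flatnonzero(d == -1)     # first index after it turns off
        return list(zip(starts, ends))
    ```

    Each returned segment would then be passed to the classification stage that separates coughs from speech and other noise.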

  9. Total reduction of distorted echelle spectrograms - An automatic procedure. [for computer controlled microdensitometer

    NASA Technical Reports Server (NTRS)

    Peterson, R. C.; Title, A. M.

    1975-01-01

    A total reduction procedure, notable for its use of a computer-controlled microdensitometer for semi-automatically tracing curved spectra, is applied to distorted high-dispersion echelle spectra recorded by an image tube. Microdensitometer specifications are presented and the FORTRAN programs TRACEN and SPOTS are outlined. The intensity spectrum of the photographic or electrographic plate is plotted on a graphic display. The time requirements are discussed in detail.

  10. 10 CFR Appendix J2 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... minutes with a minimum fill of 20 gallons of soft water (17 ppm hardness or less) using 27.0 grams + 4.0 grams per pound of cloth load of AHAM Standard detergent Formula 3. The wash temperature is to be... stain resistant finishes shall not be applied to the test cloth. The absence of such finishes shall be...

  11. System for sensing droplet formation time delay in a flow cytometer

    DOEpatents

    Van den Engh, Ger; Esposito, Richard J.

    1997-01-01

    A droplet flow cytometer system that optimizes the droplet formation time delay based on conditions actually experienced includes an automatic droplet sampler which rapidly moves a plurality of containers stepwise through the droplet stream while simultaneously adjusting the droplet time delay. Through this system, sampling of the actual substance to be processed can be used to minimize the effect of the substance's variations on the determination of which time delay is optimal. Analysis such as cell counting and the like may be conducted manually or automatically and fed into a time delay adjustment, which may then act with analysis equipment to revise the time delay estimate actually applied during processing. The automatic sampler can be controlled through a microprocessor and appropriate programming to bracket an initial droplet formation time delay estimate. When a maximum in counts, determined through volume, weight, or other types of analysis of the containers, is found, the increment may then be reduced for a more accurate ultimate setting. This may be accomplished while actually processing the sample, without interruption.

  12. Determinants of wood dust exposure in the Danish furniture industry.

    PubMed

    Mikkelsen, Anders B; Schlunssen, Vivi; Sigsgaard, Torben; Schaumburg, Inger

    2002-11-01

    This paper investigates the relation between wood dust exposure in the furniture industry and occupational hygiene variables. During the winter of 1997-98, 54 factories were visited and 2362 personal, passive inhalable dust samples were obtained; the geometric mean was 0.95 mg/m³ and the geometric standard deviation was 2.08. In a first measuring round, 1685 dust concentrations were obtained. For some of the workers, repeated measurements were carried out 1 week (351 samples) and 2 weeks (326 samples) after the first measurement. Hygiene variables such as job, exhaust ventilation and cleaning procedures were documented. A multivariate analysis based on mixed effects models was used, with hygiene variables as fixed effects and worker, machine, department and factory as random effects. A modified stepwise strategy of model building was adopted, taking into account the hierarchically structured variables and making possible the exclusion of non-influential random as well as fixed effects. For woodworking, the following determinants increased the dust concentration: manual and automatic sanding, and use of compressed air with fully automatic and semi-automatic machines and for cleaning of work pieces. Decreased dust exposure resulted from the use of compressed air with manual machines, working at fully automatic or semi-automatic machines, functioning exhaust ventilation, work on the night shift, daily cleaning of rooms, cleaning of work pieces with a brush, vacuum cleaning of machines, supplementary fresh air intake, and a safety representative elected within the last 2 yr. For handling and assembling, increased exposure resulted from work at automatic machines and the presence of wood dust on the work pieces; work on the evening shift, supplementary fresh air intake, work in a chair factory and special cleaning staff were associated with decreased exposure. The implications of the results for the prevention of wood dust exposure are discussed.

  13. A stable and convenient protein electrophoresis titration device with bubble removing system.

    PubMed

    Zhang, Qiang; Fan, Liu-Yin; Li, Wen-Lin; Cong, Feng-Song; Zhong, Ran; Chen, Jing-Jing; He, Yu-Chen; Xiao, Hua; Cao, Cheng-Xi

    2017-07-01

    Moving reaction boundary titration (MRBT) has potential application to immunoassay and protein content analysis with high selectivity. However, air bubbles often impair the accuracy of MRBT, and leakage of electrolyte greatly decreases the safety and convenience of electrophoretic titration. To address these two issues, a reliable MRBT device with a modified electrolyte chamber for protein titration was designed, and multiphysics computer simulation based on two-phase flow was conducted for optimization. The single chamber was made of two perpendicular cylinders with different diameters. After placing the electrophoretic tube, the air resident in the junction next to the gel could be eliminated by a simple, fast electrolyte flow. Removing the electrophoretic tube automatically prevented electrolyte leakage at the junction, owing to the gravity-induced negative pressure within the chamber. Moreover, the numerical simulation and experiments showed that the improved MRBT device has the following advantages: (i) easy and rapid setup of the electrophoretic tube within 20 s; (ii) simple and quick bubble dissipation from the titration chamber within 2 s; (iii) no electrolyte leakage from the two chambers; and (iv) accurate protein titration and safe instrumental operation. The developed technique and apparatus greatly improve the performance of the previous MRBT device, providing a new route toward practical application. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A mass and momentum conserving unsplit semi-Lagrangian framework for simulating multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owkes, Mark, E-mail: mark.owkes@montana.edu; Desjardins, Olivier

    In this work, we present a computational methodology for convection and advection that handles discontinuities with second order accuracy and maintains conservation to machine precision. This method can transport a variety of discontinuous quantities and is used in the context of an incompressible gas–liquid flow to transport the phase interface, momentum, and scalars. The proposed method provides a modification to the three-dimensional, unsplit, second-order semi-Lagrangian flux method of Owkes & Desjardins (JCP, 2014). The modification adds a refined grid that provides consistent fluxes of mass and momentum defined on a staggered grid and discrete conservation of mass and momentum, even for flows with large density ratios. Additionally, the refined grid doubles the resolution of the interface without significantly increasing the computational cost over previous non-conservative schemes. This is possible due to a novel partitioning of the semi-Lagrangian fluxes into a small number of simplices. The proposed scheme is tested using canonical verification tests, rising bubbles, and an atomizing liquid jet.
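    The flux method itself is three-dimensional, unsplit and conservative. As a much simpler illustration of the underlying semi-Lagrangian idea, a one-dimensional, non-conservative advection step traces characteristics back from each grid point and interpolates at the departure points (a periodic domain and all names are assumptions of this sketch, not the paper's scheme):

    ```python
    import numpy as np

    def semi_lagrangian_step(phi, u, dt, dx):
        """One step of phi_t + u * phi_x = 0: trace characteristics back
        from each grid point and interpolate phi at the departure points
        (periodic domain of length len(phi) * dx)."""
        n = len(phi)
        x = np.arange(n) * dx
        departure = (x - u * dt) % (n * dx)   # backtracked departure points
        return np.interp(departure, x, phi, period=n * dx)
    ```

    When u * dt is an exact multiple of dx, the step reduces to a circular shift of the profile, which makes the scheme easy to sanity-check.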

  15. CONCH: A Visual Basic program for interactive processing of ion-microprobe analytical data

    NASA Astrophysics Data System (ADS)

    Nelson, David R.

    2006-11-01

    A Visual Basic program for flexible, interactive processing of ion-microprobe data acquired for quantitative trace element, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni and U-Th-Pb geochronology applications is described. Default but editable run-tables enable software identification of the secondary ion species analyzed and characterization of the standard used. Counts obtained for each species may be displayed in plots against analysis time and edited interactively. Count outliers can be automatically identified via a set of editable count-rejection criteria and displayed for assessment. Standard analyses are distinguished from Unknowns by matching the analysis label with a string specified in the Set-up dialog, and are processed separately. A generalized routine writes background-corrected count rates, ratios and uncertainties, plus weighted means and uncertainties for Standards and Unknowns, to a spreadsheet that may be saved as a text-delimited file. Specialized routines process trace-element concentration, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni and Th-U disequilibrium analysis types, and U-Th-Pb isotopic data obtained for zircon, titanite, perovskite, monazite, xenotime and baddeleyite. Correction of measured Pb-isotopic, Pb/U and Pb/Th ratios for the presence of common Pb may be made using measured 204Pb counts, or the 207Pb or 208Pb counts following subtraction of the radiogenic component. Common-Pb corrections may be made automatically, using a user-specified common-Pb isotopic composition appropriate either for that on the sample surface or for that incorporated within the mineral at the time of its crystallization, depending on whether the 204Pb count rate determined for the Unknown is substantially higher than the average 204Pb count rate for all session standards. Pb/U inter-element fractionation corrections are determined using an interactive log-log plot of common-Pb-corrected 206Pb/238U ratios against any nominated fractionation-sensitive species pair (commonly 238U16O+/238U+) for session standards. Also displayed with this plot are the calculated Pb/U and Pb/Th calibration-line regression slopes, y-intercepts, calibration uncertainties, standard 204Pb- and 208Pb-corrected 207Pb/206Pb dates and other parameters useful for assessment of the calibration-line data. Calibrated data for Unknowns may be automatically grouped according to calculated date and displayed in color on interactive Wetherill Concordia, Tera-Wasserburg Concordia, Linearized Gaussian ("Probability Paper") and Gaussian-summation probability density diagrams.
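The automatic count-outlier rejection described above can be illustrated with a simple iterative sigma-clipping pass. This is a hypothetical sketch in Python rather than CONCH's Visual Basic implementation; the `n_sigma` threshold and the pass limit stand in for the editable count-rejection criteria:

```python
import statistics

def reject_outliers(counts, n_sigma=2.0, max_passes=5):
    """Iteratively drop counts lying more than n_sigma standard
    deviations from the mean of the retained values."""
    kept = list(counts)
    for _ in range(max_passes):
        if len(kept) < 3:
            break  # too few values to estimate a spread
        mean = statistics.mean(kept)
        sd = statistics.stdev(kept)
        filtered = [c for c in kept if abs(c - mean) <= n_sigma * sd]
        if len(filtered) == len(kept):
            break  # converged: nothing more to reject
        kept = filtered
    return kept

# A spurious count of 350 among ~100-count measurements is removed.
clean = reject_outliers([102, 98, 100, 101, 350, 99])
```

In an interactive tool like CONCH the rejected points would be displayed for assessment rather than silently discarded.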

  16. ELSA: An integrated, semi-automated nebular abundance package

    NASA Astrophysics Data System (ADS)

    Johnson, Matthew D.; Levitt, Jesse S.; Henry, Richard B. C.; Kwitter, Karen B.

    We present ELSA, a new modular software package, written in C, to analyze and manage spectroscopic data from emission-line objects. In addition to calculating plasma diagnostics and abundances from nebular emission lines, the software provides a number of convenient features, including the ability to ingest logs produced by IRAF's splot task, to semi-automatically merge spectra in different wavelength ranges, and to automatically generate various data tables in machine-readable or LaTeX format. ELSA features a highly sophisticated interstellar reddening correction scheme that takes into account temperature and density effects as well as He II contamination of the hydrogen Balmer lines. Abundance calculations are performed using a 5-level atom approximation with recent atomic data, based on R. Henry's ABUN program. Downloads and detailed documentation for all aspects of ELSA are available at the following URL:

  17. Detection of microbial concentration in ice-cream using the impedance technique.

    PubMed

    Grossi, M; Lanzoni, M; Pompei, A; Lazzarini, R; Matteuzzi, D; Riccò, B

    2008-06-15

    The detection of microbial concentration, essential for safe, high-quality food products, is traditionally performed with the plate count technique, which is reliable but slow and not easily automated, as required for direct use in industrial machines. The method based on impedance measurements represents an attractive alternative for this purpose, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can easily be realized in automatic form. In this paper such a method has been experimentally studied in the case of ice-cream products. In particular, all main ice-cream compositions of real interest have been considered, and no nutrient medium was used to dilute the samples. A measurement set-up was realized using benchtop instruments for impedance measurements on samples whose bacterial concentration was independently measured by means of standard plate counts. The results clearly indicate that impedance measurement is a feasible and reliable technique for detecting total microbial concentration in ice-cream, suitable for implementation as an embedded system in industrial machines.
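The impedance technique infers the initial concentration from how quickly the growing culture produces a measurable impedance change. A minimal sketch of that logic, assuming a simple detection threshold and a hypothetical linear calibration between detection time and log concentration (the slope and intercept below are illustrative placeholders, not values from the paper):

```python
def detect_time(times_h, impedance_change, threshold):
    """Return the first time (hours) at which the relative impedance
    change crosses the detection threshold, or None if never reached."""
    for t, dz in zip(times_h, impedance_change):
        if dz >= threshold:
            return t
    return None

def estimate_log_cfu(t_detect, slope=-0.8, intercept=8.0):
    """Hypothetical linear calibration: higher initial concentration
    means earlier detection, so log10(CFU/ml) decreases with t_detect."""
    return slope * t_detect + intercept

# Hourly readings of relative impedance change for one sample:
t = detect_time([1, 2, 3, 4, 5, 6],
                [0.001, 0.002, 0.004, 0.011, 0.03, 0.08],
                threshold=0.01)
```

In practice the calibration line would be fitted against plate counts, exactly as the paper's benchtop set-up does.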

  18. Metadata Wizard: an easy-to-use tool for creating FGDC-CSDGM metadata for geospatial datasets in ESRI ArcGIS Desktop

    USGS Publications Warehouse

    Ignizio, Drew A.; O'Donnell, Michael S.; Talbert, Colin B.

    2014-01-01

    Creating compliant metadata for scientific data products is mandated for all federal Geographic Information Systems professionals and is a best practice for members of the geospatial data community. However, the complexity of the Federal Geographic Data Committee's Content Standard for Digital Geospatial Metadata, the limited availability of easy-to-use tools, and recent changes in the ESRI software environment continue to make metadata creation a challenge. Staff at the U.S. Geological Survey Fort Collins Science Center have developed a Python toolbox for ESRI ArcDesktop to facilitate a semi-automated workflow for creating and updating metadata records in ESRI's 10.x software. The U.S. Geological Survey Metadata Wizard tool automatically populates several metadata elements: the spatial reference, spatial extent, geospatial presentation format, vector feature count or raster column/row count, native system/processing environment, and the metadata creation date. Once the software auto-populates these elements, users can easily add attribute definitions and other relevant information in a simple Graphical User Interface. The tool, which offers a simple design free of esoteric metadata language, has the potential to save many government and non-government organizations a significant amount of time and cost by facilitating the development of standard-compliant metadata for ESRI software users. A working version of the tool is now available for ESRI ArcDesktop, versions 10.0, 10.1, and 10.2 (downloadable at http:/www.sciencebase.gov/metadatawizard).

  19. Automatic Counting of Pedestrians and Cyclists.

    DOT National Transportation Integrated Search

    2016-01-01

    In September 2014, U.S. Secretary of Transportation Anthony Foxx unveiled a new Transportation Action Plan [1] to increase active, non-motorized transportation. In it, the Department of Transportation reinforced its commitment to making safe ...

  20. Automated pedestrian counter : final report, February 2010.

    DOT National Transportation Integrated Search

    2010-02-01

    Emerging sensor technologies have accelerated the shift toward automatic pedestrian counting methods to acquire reliable long-term data for transportation design, planning, and safety studies. Although a number of commercial pedestrian sensors are ava...

  1. AISLE: an automatic volumetric segmentation method for the study of lung allometry.

    PubMed

    Ren, Hongliang; Kazanzides, Peter

    2011-01-01

    We developed a fully automatic segmentation method for volumetric CT (computed tomography) datasets to support construction of a statistical atlas for the study of allometric laws of the lung. The proposed segmentation method, AISLE (Automated ITK-Snap based on Level-set), is based on the level-set implementation from an existing semi-automatic segmentation program, ITK-Snap. AISLE can segment the lung field without human interaction and provide intermediate graphical results as desired. The preliminary experimental results show that the proposed method can achieve accurate segmentation, in terms of the volumetric overlap metric, compared with ground-truth segmentation performed by a radiologist.

  2. Method for stitching microbial images using a neural network

    NASA Astrophysics Data System (ADS)

    Semenishchev, E. A.; Voronin, V. V.; Marchuk, V. I.; Tolstova, I. V.

    2017-05-01

    Analog microscopes are currently widely used in medicine, animal husbandry, monitoring of technological objects, oceanography, agriculture and other fields. An automatic method is preferred because it greatly reduces the manual work involved. Stepper motors move the microscope slide and allow the focus to be adjusted in semi-automatic or automatic mode, while images of microbiological objects are transferred from the eyepiece of the microscope to the computer screen. Scene analysis locates regions with pronounced abnormalities, focusing the specialist's attention on them. This paper considers a method for stitching microbial images obtained from a semi-automatic microscope. The method preserves the boundaries of objects located in the capture area of the optical system. Object search is based on analysis of the data within the camera's field of view, and we propose using a neural network to find object boundaries. The stitching boundary between images is derived from the analysis of these object borders. For autofocus, we use the criterion of minimum thickness of the object boundary lines, analyzing the object located on the focal axis of the camera. A border-recovery method and a projective transform are applied to the boundaries of objects shifted relative to the focal axis. Several examples considered in this paper show the effectiveness of the proposed approach on test images.

  3. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Y; Huang, H; Su, T

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic in imaging studies in recent years. As previous studies have mainly focused on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure was developed to perform texture analysis for measuring image heterogeneity, and clinical data were used to evaluate the preliminary performance of the method. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into texture analysis with our open-source software, the Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity, as well as the area under the curve (AUC). These indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we achieved good discrimination, with accuracy of 74%, sensitivity of 73%, specificity of 77% and an AUC of 0.82. This performance is similar to that of the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of myocardial ischemia.
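The accuracy, sensitivity and specificity figures reported above follow from standard confusion-matrix arithmetic, which can be sketched as:

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity (TPR) and specificity (TNR) from binary
    ground-truth labels and binary predictions (1 = ischemic)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
        "specificity": tn / (tn + fp) if tn + fp else float("nan"),
    }

# Predictions come from thresholding a heterogeneity score per patient:
scores = [0.9, 0.2, 0.8, 0.1, 0.7]
y_pred = [1 if s >= 0.5 else 0 for s in scores]
m = binary_metrics([1, 1, 1, 0, 0], y_pred)
```

Sweeping the threshold over all score values and plotting TPR against 1 - TNR yields the ROC curve whose area is the reported AUC.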

  4. Numerical investigations on unstable direct contact condensation of cryogenic fluids

    NASA Astrophysics Data System (ADS)

    Jayachandran, K. N.; Roy, Arnab; Ghosh, Parthasarathi

    2017-02-01

    A typical problem of Direct Contact Condensation (DCC) occurs at the liquid oxygen (LOX) booster turbopump exit of oxidiser-rich staged combustion cycle semi-cryogenic rocket engines, where the hot gas mixture (predominantly oxygen with small amounts of combustion products) that runs the turbine mixes with LOX from the pump exit. This complex multiphase phenomenon leads to the formation of solid CO2 and H2O, which is undesirable for the functioning of the main LOX turbopump. As a starting point for solving this complex problem, in this study the hot gas mixture is taken as pure oxygen, and DCC of pure oxygen vapour jets in subcooled liquid oxygen is simulated using the commercial CFD package ANSYS CFX®. A two-fluid model along with a thermal phase change model is employed for capturing the heat and mass transfer effects. The study mainly focuses on the subsonic DCC bubbling regime, which is reported to be unstable, with bubble formation, elongation, necking and collapse. The heat transfer coefficients over a period of time have been computed, and the various stages of bubbling have been analysed with the help of vapour volume fraction and pressure profiles. The results obtained for DCC of oxygen vapour-liquid mixtures are in qualitative agreement with experimental results on DCC of steam-water mixtures.

  5. Using SAR Interferograms and Coherence Images for Object-Based Delineation of Unstable Slopes

    NASA Astrophysics Data System (ADS)

    Friedl, Barbara; Holbling, Daniel

    2015-05-01

    This study uses synthetic aperture radar (SAR) interferometric products for the semi-automated identification and delineation of unstable slopes and active landslides. Single-pair interferograms and coherence images are therefore segmented and classified in an object-based image analysis (OBIA) framework. The rule-based classification approach has been applied to landslide-prone areas located in Taiwan and Southern Germany. The semi-automatically obtained results were validated against landslide polygons derived from manual interpretation.

  6. Analysis of the Parameters Required for Performance Monitoring and Assessment of Military Communications Systems by Military Technical Controller

    DTIC Science & Technology

    1975-12-01

    APPENDIX A: BASIC CONCEPT OF MILITARY TECHNICAL CONTROL ... 139. APPENDIX E: TEST EQUIPMENT REQUIRED FOR MEASUREMENT OF PARAMETERS ... 142. ... Technical Control (SATEC) Automatic Facilities Report, Army Automated Quality Monitoring Reporting System (AQMPS), Army Automated Technical Control-Semi (ATC-Semi) ... technical control then becomes equipment status monitoring. All the major equipment in a system would have internal sensors with properly selected parameters

  7. Breadboard activities for advanced protein crystal growth

    NASA Technical Reports Server (NTRS)

    Rosenberger, Franz; Banish, Michael

    1993-01-01

    The proposed work entails the design, assembly, testing, and delivery of a turn-key system for the semi-automated determination of protein solubilities as a function of temperature. The system will utilize optical scintillation as a means of detecting and monitoring nucleation and crystallite growth during temperature lowering (or raising, with retrograde solubility systems). The deliverables of this contract are: (1) turn-key scintillation system for the semi-automatic determination of protein solubilities as a function of temperature, (2) instructions and software package for the operation of the scintillation system, and (3) one semi-annual and one final report including the test results obtained for ovostatin with the above scintillation system.

  8. Computer Vision Assisted Virtual Reality Calibration

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1999-01-01

    A computer vision assisted semi-automatic virtual reality (VR) calibration technology has been developed that can accurately match a virtual environment of graphically simulated three-dimensional (3-D) models to the video images of the real task environment.

  9. Optimizing automatic traffic recorders network in Minnesota.

    DOT National Transportation Integrated Search

    2016-01-01

    Accurate traffic counts are important for budgeting, traffic planning, and roadway design. With thousands of centerline miles of roadways, it is not possible to install continuous counters at all locations of interest (e.g., intersections). There...

  10. Preliminary Investigation on the Effects of Shockwaves on Water Samples Using a Portable Semi-Automatic Shocktube

    NASA Astrophysics Data System (ADS)

    Wessley, G. Jims John

    2017-10-01

    The propagation of shock waves through any medium results in an instantaneous increase in pressure and temperature behind the shock wave. The scope for utilizing this sudden rise in pressure and temperature in new industrial, biological and commercial areas has been explored, and the opportunities are tremendous. This paper presents the design and testing of a portable semi-automatic shock tube on water samples mixed with salt. The preliminary analysis shows encouraging results, as the salinity of the water samples was reduced by up to 5% when bombarded with 250 shocks generated using a pressure ratio of 2.5. Ordinary printing paper was used as the diaphragm to generate the shocks. Shocks of much higher intensity, obtained using different diaphragms, should lead to a greater reduction in the salinity of sea water, and thus to the production of potable water from saline water, which is the need of the hour.

  11. A novel semi-automatic snake robot for natural orifice transluminal endoscopic surgery: preclinical tests in animal and human cadaver models (with video).

    PubMed

    Son, Jaebum; Cho, Chang Nho; Kim, Kwang Gi; Chang, Tae Young; Jung, Hyunchul; Kim, Sung Chun; Kim, Min-Tae; Yang, Nari; Kim, Tae-Yun; Sohn, Dae Kyung

    2015-06-01

    Natural orifice transluminal endoscopic surgery (NOTES) is an emerging surgical technique. We aimed to design, create, and evaluate a new semi-automatic snake robot for NOTES. The snake robot employs the characteristics of both a manual endoscope and a multi-segment snake robot. This robot is inserted and retracted manually, like a classical endoscope, while its shape is controlled using embedded robot technology. The feasibility of a prototype robot for NOTES was evaluated in animals and human cadavers. The transverse stiffness and maneuverability of the snake robot appeared satisfactory. It could be advanced through the anus as far as the peritoneal cavity without any injury to adjacent organs. Preclinical tests showed that the device could navigate the peritoneal cavity. The snake robot has advantages of high transverse force and intuitive control. This new robot may be clinically superior to conventional tools for transanal NOTES.

  12. Sub-micron particles in northwest Atlantic shelf water

    NASA Astrophysics Data System (ADS)

    Longhurst, A. R.; Koike, I.; Li, W. K. W.; Rodriguez, J.; Dickie, P.; Kepay, P.; Partensky, F.; Bautista, B.; Ruiz, J.; Wells, M.; Bird, D. F.

    1992-01-01

    The existence of numerous (1.0 × 10^7 ml^-1) sub-micron particles has been confirmed in northwest Atlantic shelf water. These particles were counted independently by two different resistive-pulse instruments, and their existence confirmed by our ability to reduce their numbers by ultracentrifugation, serial dilution and surface coagulation in a bubbling column. There are important implications for the dynamics of DOM in seawater if, as seems probable, these particles represent a fraction of "dissolved" organic material in seawater.

  13. A New Bibliographical Feature for SIMBAD: Highlighting the Most Relevant Papers for One Astronomical Object

    NASA Astrophysics Data System (ADS)

    Oberto, A.; Lesteven, S.; Derriere, S.; Bonnin, C.; Buga, M.; Brouty, M.; Bruneau, C.; Brunet, C.; Eisele, A.; Genova, F.; Guéhenneux, S.; Neuville, M.; Ochsenbein, F.; Perret, E.; Son, E.; Vannier, P.; Vonflie, P.; Wenger, M.; Woelfel, F.

    2015-04-01

    The number of bibliographical references attached to an astronomical object in SIMBAD has been growing continuously over the years. It is important for astronomers to retrieve the most relevant papers, those that give important information about the object of study. This is not easy, since many references can be attached to one object: in 2014, more than 15,000 objects were attached to more than 50 references each. The location of the object's citations inside a paper and their number of occurrences are important criteria for extracting the most relevant papers. Since 2008, thanks to the DJIN application (a semi-automatic tool that searches for object names in full text), this information has been collected: for each article associated with an astronomical object, we know where it is cited, how many times, and under which name it appears. Since September 2013, users of the SIMBAD web site can choose to retrieve the most relevant references for an astronomical object depending on its location in the publication. A new formula to sort references by combining all locations, number of occurrences, total number of objects studied, citation count, and year is presented in this paper.
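The abstract does not give the formula itself, but a relevance score combining citation locations, occurrence counts, the number of objects a paper studies, its citation count and its year could look like the following hypothetical sketch (all weights and the location names are invented for illustration):

```python
def relevance_score(occurrences_by_location, n_objects_in_paper,
                    citation_count, year, current_year=2015):
    """Hypothetical relevance score for one (object, paper) pair:
    weight occurrences by where the object is cited, dilute by how
    many objects the paper studies, and add small boosts for
    citations and recency."""
    weights = {"title": 5.0, "abstract": 3.0, "caption": 2.0, "body": 1.0}
    location_score = sum(weights.get(place, 1.0) * n
                         for place, n in occurrences_by_location.items())
    score = location_score / max(n_objects_in_paper, 1)
    score += 0.1 * citation_count ** 0.5   # diminishing citation boost
    score -= 0.05 * (current_year - year)  # mild recency preference
    return score

# A paper naming the object in its title ranks above one that only
# mentions it in the body text:
s = relevance_score({"title": 1, "body": 4}, 2, 0, 2015)
```

Sorting a reference list by such a score, descending, would surface the papers most likely to be dedicated studies of the object.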

  14. A new user-assisted segmentation and tracking technique for an object-based video editing system

    NASA Astrophysics Data System (ADS)

    Yu, Hong Y.; Hong, Sung-Hoon; Lee, Mike M.; Choi, Jae-Gark

    2004-03-01

    This paper presents a semi-automatic segmentation method which can be used to generate video object planes (VOPs) for object-based coding schemes and multimedia authoring environments. Semi-automatic segmentation can be considered a user-assisted segmentation technique. A user initially marks objects of interest around their boundaries, and the selected objects are then continuously separated from the unselected areas through time evolution in the image sequence. The proposed method consists of two processing steps: partially manual intra-frame segmentation and fully automatic inter-frame segmentation. The intra-frame segmentation incorporates user assistance to define the complete visual object of interest to be segmented and determines a precise object boundary. The inter-frame segmentation involves boundary and region tracking to obtain temporal coherence of the moving object, based on the object boundary information of the previous frame. The proposed method shows stable, efficient results that are suitable for many digital video applications, such as multimedia content authoring, content-based coding and indexing. Based on these results, we have developed an object-based video editing system with several convenient editing functions.

  15. Semi-automatic mapping of linear-trending bedforms using 'Self-Organizing Maps' algorithm

    NASA Astrophysics Data System (ADS)

    Foroutan, M.; Zimbelman, J. R.

    2017-09-01

    Increased application of high resolution spatial data, such as high resolution satellite or Unmanned Aerial Vehicle (UAV) images from Earth, as well as High Resolution Imaging Science Experiment (HiRISE) images from Mars, makes it necessary to develop automated techniques capable of extracting detailed geomorphologic elements from such large data sets. Model validation by repeated images in environmental management studies, such as those of climate-related change, along with increasing access to high-resolution satellite images, underlines the demand for detailed automatic image-processing techniques in remote sensing. This study presents a methodology based on an unsupervised Artificial Neural Network (ANN) algorithm, known as Self-Organizing Maps (SOM), to achieve the semi-automatic extraction of linear features with small footprints from satellite images. SOM is based on competitive learning and is efficient for handling huge data sets. We applied the SOM algorithm to high resolution satellite images of Earth and Mars (Quickbird, Worldview and HiRISE) in order to facilitate and speed up image analysis and improve the accuracy of the results. An overall accuracy of about 98% and a quantization error of 0.001 in the recognition of small linear-trending bedforms demonstrate a promising framework.
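A Self-Organizing Map of the kind used here can be written in a few dozen lines: each input vector is assigned to its best-matching unit (BMU), and that unit and its grid neighbours are pulled toward the input under a Gaussian neighbourhood that shrinks over training. A minimal pure-Python sketch (the grid size, learning-rate and neighbourhood schedules are illustrative, not the paper's settings):

```python
import math
import random

def train_som(data, grid_w=4, grid_h=4, epochs=200,
              lr0=0.5, sigma0=2.0, seed=0):
    """Minimal SOM: competitive learning on a grid_w x grid_h lattice."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)]
             for _ in range(grid_w * grid_h)]
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)            # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5  # shrinking neighbourhood
        for x in data:
            # best matching unit: node closest to x in feature space
            bmu = min(range(len(nodes)),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(nodes[i], x)))
            bx, by = bmu % grid_w, bmu // grid_w
            for i, w in enumerate(nodes):
                gx, gy = i % grid_w, i // grid_w
                d2 = (gx - bx) ** 2 + (gy - by) ** 2
                h = math.exp(-d2 / (2.0 * sigma ** 2))
                for k in range(dim):
                    w[k] += lr * h * (x[k] - w[k])
    return nodes

def bmu_index(nodes, x):
    """Index of the node closest to feature vector x."""
    return min(range(len(nodes)),
               key=lambda i: sum((a - b) ** 2
                                 for a, b in zip(nodes[i], x)))
```

In the bedform application the feature vectors would be per-pixel image descriptors, and pixels mapping to the same region of the lattice form one candidate class.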

  16. Trust, control strategies and allocation of function in human-machine systems.

    PubMed

    Lee, J; Moray, N

    1992-10-01

    As automated controllers supplant human intervention in controlling complex systems, the operators' role often changes from that of an active controller to that of a supervisory controller. Acting as supervisors, operators can choose between automatic and manual control. Improperly allocating function between automatic and manual control can have negative consequences for the performance of a system. Previous research suggests that the decision to perform the job manually or automatically depends, in part, upon the trust the operators invest in the automatic controllers. This paper reports an experiment to characterize the changes in operators' trust during an interaction with a semi-automatic pasteurization plant, and investigates the relationship between changes in operators' control strategies and trust. A regression model identifies the causes of changes in trust, and a 'trust transfer function' is developed using time series analysis to describe the dynamics of trust. Based on a detailed analysis of operators' strategies in response to system faults we suggest a model for the choice between manual and automatic control, based on trust in automatic controllers and self-confidence in the ability to control the system manually.
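The qualitative model above, trust in the automatic controller weighed against self-confidence in manual control, can be caricatured as a first-order update rule. This is an illustrative sketch, not the regression or transfer-function model fitted in the paper (`alpha` and `fault_penalty` are invented parameters):

```python
def update_trust(trust, performance, fault,
                 alpha=0.8, fault_penalty=0.3):
    """Illustrative first-order trust dynamic: trust drifts toward the
    automatic controller's observed performance, with an extra drop
    whenever a system fault occurs. Values are clamped to [0, 1]."""
    trust = alpha * trust + (1 - alpha) * performance
    if fault:
        trust -= fault_penalty
    return max(0.0, min(1.0, trust))

def choose_mode(trust, self_confidence):
    """Allocation rule from the paper's qualitative model: prefer
    automatic control when trust exceeds self-confidence in manual
    control, otherwise take over manually."""
    return "automatic" if trust > self_confidence else "manual"

# A fault knocks trust below self-confidence, prompting manual takeover:
t = update_trust(0.9, performance=0.9, fault=True)
mode = choose_mode(t, self_confidence=0.7)
```

The paper's actual "trust transfer function" was identified from time-series data of operator ratings; this sketch only conveys the shape of the feedback loop.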

  17. Far-Ultraviolet Number Counts of Field Galaxies

    NASA Technical Reports Server (NTRS)

    Voyer, Elysse N.; Gardner, Jonathan P.; Teplitz, Harry I.; Siana, Brian D.; deMello, Duilia F.

    2010-01-01

    The number counts of far-ultraviolet (FUV) galaxies as a function of magnitude provide a direct statistical measure of the density and evolution of star-forming galaxies. We report on the results of measurements of the rest-frame FUV number counts computed from data of several fields, including the Hubble Ultra Deep Field, the Hubble Deep Field North, and the GOODS-North and -South fields. These data were obtained from the Hubble Space Telescope Solar Blind Channel of the Advanced Camera for Surveys. The number counts cover an AB magnitude range from 20-29 magnitudes, covering a total area of 15.9 arcmin². We show that the number counts are lower than those in previous studies using smaller areas. The differences in the counts are likely the result of cosmic variance; our new data cover more area and more lines of sight than the previous studies. The slope of our number counts connects well with local FUV counts, and the counts show good agreement with recent semi-analytical models based on dark matter "merger trees".
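Differential number counts of this kind are simply detected sources binned by magnitude and normalized by survey area and bin width. A minimal sketch (the bin range mirrors the 20-29 AB magnitude span quoted above; the bin width is an assumption):

```python
def number_counts(mags, area_arcmin2, m_lo=20.0, m_hi=29.0, dm=1.0):
    """Differential galaxy number counts N(m), per magnitude per
    arcmin^2, from a list of source magnitudes."""
    nbins = int(round((m_hi - m_lo) / dm))
    counts = [0] * nbins
    for m in mags:
        i = int((m - m_lo) // dm)
        if 0 <= i < nbins:  # sources outside the range are dropped
            counts[i] += 1
    centers = [m_lo + (i + 0.5) * dm for i in range(nbins)]
    density = [c / (area_arcmin2 * dm) for c in counts]
    return centers, density

# Three detected sources over the paper's 15.9 arcmin^2 total area:
centers, density = number_counts([20.5, 20.6, 25.2], 15.9)
```

Comparing `density` across surveys of different depth and area is what exposes field-to-field (cosmic variance) differences.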

  18. Supporting the annotation of chronic obstructive pulmonary disease (COPD) phenotypes with text mining workflows.

    PubMed

    Fu, Xiao; Batista-Navarro, Riza; Rak, Rafal; Ananiadou, Sophia

    2015-01-01

    Chronic obstructive pulmonary disease (COPD) is a life-threatening lung disorder whose recent prevalence has led to an increasing burden on public healthcare. Phenotypic information in electronic clinical records is essential in providing suitable personalised treatment to patients with COPD. However, as phenotypes are often "hidden" within free text in clinical records, clinicians could benefit from text mining systems that facilitate their prompt recognition. This paper reports on a semi-automatic methodology for producing a corpus that can ultimately support the development of text mining tools that, in turn, will expedite the process of identifying groups of COPD patients. A corpus of 30 full-text papers was formed based on selection criteria informed by the expertise of COPD specialists. We developed an annotation scheme that is aimed at producing fine-grained, expressive and computable COPD annotations without burdening our curators with a highly complicated task. This was implemented in the Argo platform by means of a semi-automatic annotation workflow that integrates several text mining tools, including a graphical user interface for marking up documents. When evaluated using gold standard (i.e., manually validated) annotations, the semi-automatic workflow was shown to obtain a micro-averaged F-score of 45.70% (with relaxed matching). Utilising the gold standard data to train new concept recognisers, we demonstrated that our corpus, although still a work in progress, can foster the development of significantly better performing COPD phenotype extractors. We describe in this work the means by which we aim to eventually support the process of COPD phenotype curation, i.e., by the application of various text mining tools integrated into an annotation workflow. 
Although the corpus being described is still under development, our results thus far are encouraging and show great potential in stimulating the development of further automatic COPD phenotype extractors.
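The micro-averaged F-score quoted above pools true positives, false positives and false negatives across all annotation classes before computing precision and recall:

```python
def micro_f1(per_class_counts):
    """Micro-averaged F1: sum TP/FP/FN over all annotation classes,
    then compute precision, recall and their harmonic mean."""
    tp = sum(c["tp"] for c in per_class_counts)
    fp = sum(c["fp"] for c in per_class_counts)
    fn = sum(c["fn"] for c in per_class_counts)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Two hypothetical phenotype classes with their match counts:
f = micro_f1([{"tp": 3, "fp": 1, "fn": 2},
              {"tp": 1, "fp": 1, "fn": 0}])
```

Under the relaxed matching used in the evaluation, a predicted span overlapping a gold span would count as a TP; the pooling itself is unchanged.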

  19. Impact of 10% SF6 Gas Compared to 100% Air Tamponade in Descemet's Membrane Endothelial Keratoplasty.

    PubMed

    Rickmann, Annekatrin; Szurman, Peter; Jung, Sacha; Boden, Karl Thomas; Wahl, Silke; Haus, Arno; Boden, Katrin; Januschowski, Kai

    2018-04-01

    To compare the clinical outcomes following Descemet's membrane endothelial keratoplasty (DMEK) with 100% air tamponade versus 10% sulfur hexafluoride (SF6) tamponade. Retrospective analysis of 108 consecutive DMEK cases subdivided by anterior chamber tamponade, with 54 eyes receiving 10% SF6 and 54 eyes receiving 100% air injection. A post-hoc matched analysis revealed no statistically significant differences between the groups. The main outcome measurements were the complication rate, including intra- and postoperative complications, and the graft detachment rate requiring re-bubbling. Clinical outcome included best-corrected visual acuity (BCVA), endothelial cell count (ECC), and central corneal thickness (CCT) measured 1, 3, and 6 months after DMEK surgery. The graft detachment rate with consecutive re-bubbling was 18.5% in the air group and 22.2% in the SF6 group (p = 0.2). Remaining small peripheral graft detachments with a clear cornea occurred more often in the 100% air group (air: 22.2%, 12/54, 6/12 inferior; SF6: 7.4%, 4/54, 2/4 inferior; p = 0.06). The primary graft failure rate was comparable between the two groups. No complete graft detachment occurred. Outcome results for BCVA, ECC, and CCT at all follow-up time points were comparable between the two groups. The clinical outcomes (including re-bubbling rate, primary graft failure rate, and endothelial cell loss) were comparable with 100% air versus 10% SF6 tamponade, whereas other studies suggest that a higher SF6 concentration (20%) may result in a lower re-bubbling rate.

  20. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    NASA Astrophysics Data System (ADS)

    Sharifi, Hamid; Larouche, Daniel

    2015-09-01

The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during casting. Predicting the evolution of these stresses with accuracy over the solidification interval would be highly helpful in avoiding the formation of defects like hot tearing. This task is however very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique for the heterogeneous semi-solid material, enabling finite element analysis at the microscopic level. This is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected into the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium-copper alloy (Al-5.8 wt% Cu) at a solid fraction of 0.92. Using the finite element method and the Mie-Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges formed during the tensile loading have been detected.
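The random grain-placement step described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the neighbour distance of 0.2 and the 15-degree merge angle are invented for the sketch.

```python
import math
import random

def generate_granular_structure(n_grains, domain=(1.0, 1.0),
                                merge_angle_deg=15.0, max_dist=0.2, seed=0):
    """Place basic grains with random position, orientation and scale;
    neighbouring grains with similar orientations are merged, while
    dissimilar neighbours stay separated by a liquid film."""
    rng = random.Random(seed)
    grains = [{
        "id": i,
        "x": rng.uniform(0.0, domain[0]),
        "y": rng.uniform(0.0, domain[1]),
        "theta": rng.uniform(0.0, 180.0),   # grain orientation in degrees
        "scale": rng.uniform(0.5, 1.5),     # random grain scale factor
    } for i in range(n_grains)]
    merged_pairs = []
    for a in grains:                        # naive O(n^2) neighbour test
        for b in grains:
            if a["id"] >= b["id"]:
                continue
            dist = math.hypot(a["x"] - b["x"], a["y"] - b["y"])
            dtheta = abs(a["theta"] - b["theta"]) % 180.0
            dtheta = min(dtheta, 180.0 - dtheta)  # orientation mismatch
            if dist < max_dist and dtheta < merge_angle_deg:
                merged_pairs.append((a["id"], b["id"]))
    return grains, merged_pairs
```

Varying the basic grain shapes and the merge criterion then yields different mushy-zone structures for the subsequent finite element meshing step.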

  1. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    PubMed

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents.
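The F-measures compared above are the standard harmonic mean of precision and recall. A minimal sketch; the term counts below are hypothetical, not figures from the study:

```python
def f_measure(tp, fp, fn):
    """F1 score: harmonic mean of precision and recall."""
    precision = tp / (tp + fp)  # fraction of extracted terms that are correct
    recall = tp / (tp + fn)     # fraction of gold terms that were extracted
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for one entity-extraction run:
# 500 correctly extracted terms, 60 spurious, 70 missed.
score = f_measure(tp=500, fp=60, fn=70)
```

A merged pipeline improves F-measure whenever the extra true positives it recovers outweigh the spurious terms it introduces.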

  2. StandFood: Standardization of Foods Using a Semi-Automatic System for Classifying and Describing Foods According to FoodEx2

    PubMed Central

    Eftimov, Tome; Korošec, Peter; Koroušić Seljak, Barbara

    2017-01-01

    The European Food Safety Authority has developed a standardized food classification and description system called FoodEx2. It uses facets to describe food properties and aspects from various perspectives, making it easier to compare food consumption data from different sources and perform more detailed data analyses. However, both food composition data and food consumption data, which need to be linked, are lacking in FoodEx2 because the process of classification and description has to be manually performed—a process that is laborious and requires good knowledge of the system and also good knowledge of food (composition, processing, marketing, etc.). In this paper, we introduce a semi-automatic system for classifying and describing foods according to FoodEx2, which consists of three parts. The first involves a machine learning approach and classifies foods into four FoodEx2 categories, with two for single foods: raw (r) and derivatives (d), and two for composite foods: simple (s) and aggregated (c). The second uses a natural language processing approach and probability theory to describe foods. The third combines the result from the first and the second part by defining post-processing rules in order to improve the result for the classification part. We tested the system using a set of food items (from Slovenia) manually-coded according to FoodEx2. The new semi-automatic system obtained an accuracy of 89% for the classification part and 79% for the description part, or an overall result of 79% for the whole system. PMID:28587103

  3. StandFood: Standardization of Foods Using a Semi-Automatic System for Classifying and Describing Foods According to FoodEx2.

    PubMed

    Eftimov, Tome; Korošec, Peter; Koroušić Seljak, Barbara

    2017-05-26

The European Food Safety Authority has developed a standardized food classification and description system called FoodEx2. It uses facets to describe food properties and aspects from various perspectives, making it easier to compare food consumption data from different sources and perform more detailed data analyses. However, both food composition data and food consumption data, which need to be linked, are lacking in FoodEx2 because the process of classification and description has to be manually performed, a process that is laborious and requires good knowledge of the system and also good knowledge of food (composition, processing, marketing, etc.). In this paper, we introduce a semi-automatic system for classifying and describing foods according to FoodEx2, which consists of three parts. The first involves a machine learning approach and classifies foods into four FoodEx2 categories, with two for single foods: raw (r) and derivatives (d), and two for composite foods: simple (s) and aggregated (c). The second uses a natural language processing approach and probability theory to describe foods. The third combines the result from the first and the second part by defining post-processing rules in order to improve the result for the classification part. We tested the system using a set of food items (from Slovenia) manually-coded according to FoodEx2. The new semi-automatic system obtained an accuracy of 89% for the classification part and 79% for the description part, or an overall result of 79% for the whole system.

  4. Semi-Automatic Electronic Stent Register: a novel approach to preventing ureteric stents lost to follow up.

    PubMed

    Macneil, James W H; Michail, Peter; Patel, Manish I; Ashbourne, Julie; Bariol, Simon V; Ende, David A; Hossack, Tania A; Lau, Howard; Wang, Audrey C; Brooks, Andrew J

    2017-10-01

Ureteric stents are indispensable tools in modern urology; however, the risk of them not being followed up once inserted poses medical and medico-legal risks. Stent registers are a common solution to mitigate this risk; however, manual registers are logistically challenging, especially for busy units. Western Sydney Local Health District developed a novel Semi-Automatic Electronic Stent Register (SAESR) utilizing billing information to track stent insertions. To determine the utility of this system, an audit was conducted comparing the 6 months before the introduction of the register to the first 6 months of the register. In the first 6 months of the register, 457 stents were inserted. At the time of writing, two of these are severely delayed for removal, representing a rate of 0.4%. In the 6 months immediately preceding the introduction of the register, 497 stents were inserted, and six were either missed completely or severely delayed in their removal, representing a rate of 1.2%. A non-inferiority analysis found this to be no worse than the results achieved before the introduction of the register. The SAESR allowed us to improve upon our better-than-expected rate of stents lost to follow up or severely delayed. We demonstrated non-inferiority in the rate of lost or severely delayed stents, and a number of other advantages including savings in personnel costs. The semi-automatic register represents an effective way of reducing the risk associated with a common urological procedure. We believe that this methodology could be implemented elsewhere. © 2017 Royal Australasian College of Surgeons.

  5. Real-time people counting system using a single video camera

    NASA Astrophysics Data System (ADS)

    Lefloch, Damien; Cheikh, Faouzi A.; Hardeberg, Jon Y.; Gouton, Pierre; Picot-Clemente, Romain

    2008-02-01

There is growing interest in video-based solutions for people monitoring and counting in business and security applications. Compared to classic sensor-based solutions, video-based ones allow more versatile functionality and improved performance at lower cost. In this paper, we propose a real-time system for people counting based on a single low-end, non-calibrated video camera. The two main challenges addressed in this paper are: robust estimation of the scene background and of the number of real persons in merge-split scenarios. The latter is likely to occur whenever multiple persons move closely together, e.g. in shopping centers. Several persons may be considered a single person by automatic segmentation algorithms, due to occlusions or shadows, leading to under-counting. Therefore, to account for noise, illumination changes, and changes in static objects, background subtraction is performed using an adaptive background model (updated over time based on motion information) and automatic thresholding. Furthermore, post-processing of the segmentation results is performed in the HSV color space to remove shadows. Moving objects are tracked using an adaptive Kalman filter, allowing a robust estimation of the objects' future positions even under heavy occlusion. The system is implemented in Matlab and gives encouraging results even at high frame rates. Experimental results obtained on the PETS2006 datasets are presented at the end of the paper.
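The adaptive background model and difference thresholding described above can be sketched roughly as follows. This is a minimal illustration with a fixed threshold standing in for the paper's automatic thresholding; the HSV shadow removal and Kalman tracking stages are omitted.

```python
def foreground_mask(background, frame, threshold=25.0):
    """Segment moving objects by thresholding |frame - background|.
    Images are nested lists of grayscale values; a fixed threshold
    stands in for the paper's automatic thresholding."""
    return [[abs(f - b) > threshold for b, f in zip(b_row, f_row)]
            for b_row, f_row in zip(background, frame)]

def update_background(background, frame, motion_mask, alpha=0.05):
    """Running-average background model: pixels flagged as moving are
    left unchanged; static pixels blend slowly toward the new frame."""
    return [[b if moving else (1.0 - alpha) * b + alpha * f
             for b, f, moving in zip(b_row, f_row, m_row)]
            for b_row, f_row, m_row in zip(background, frame, motion_mask)]
```

Each new frame first produces a mask, which then gates the background update, so the model adapts to illumination and scene changes without absorbing moving people.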

  6. Assessment of local pulse wave velocity distribution in mice using k-t BLAST PC-CMR with semi-automatic area segmentation.

    PubMed

    Herold, Volker; Herz, Stefan; Winter, Patrick; Gutjahr, Fabian Tobias; Andelovic, Kristina; Bauer, Wolfgang Rudolf; Jakob, Peter Michael

    2017-10-16

Local aortic pulse wave velocity (PWV) is a measure of vascular stiffness and has predictive value for cardiovascular events. Ultra-high-field CMR scanners allow the quantification of local PWV in mice; however, these systems have so far been unable to monitor the distribution of local elasticities. In the present study we provide a new accelerated method to quantify local aortic PWV in mice with phase-contrast cardiovascular magnetic resonance imaging (PC-CMR) at 17.6 T. Based on a k-t BLAST (Broad-use Linear Acquisition Speed-up Technique) undersampling scheme, total measurement time could be reduced by a factor of 6. The fast data acquisition makes it possible to quantify the local PWV at several locations along the aortic blood vessel based on the evaluation of local temporal changes in blood flow and vessel cross-sectional area. To speed up post-processing and to eliminate operator bias, we introduce a new semi-automatic segmentation algorithm to quantify cross-sectional areas of the aortic vessel. The new methods were applied in 10 eight-month-old mice (4 C57BL/6J mice and 6 ApoE(-/-) mice) at 12 adjacent locations along the abdominal aorta. Accelerated data acquisition and semi-automatic post-processing delivered reliable measures of the local PWV, similar to those obtained with full data sampling and manual segmentation. No statistically significant differences of the mean values could be detected between the different measurement approaches. Mean PWV values were elevated for the ApoE(-/-) group compared to the C57BL/6J group (3.5 ± 0.7 m/s vs. 2.2 ± 0.4 m/s, p < 0.01). A more heterogeneous PWV distribution could be observed in the ApoE(-/-) animals compared to the C57BL/6J mice, reflecting the local character of lesion development in atherosclerosis. In the present work, we showed that k-t BLAST PC-CMR enables the measurement of the local PWV distribution in the mouse aorta. The semi-automatic segmentation method based on PC-CMR data allowed rapid determination of local PWV. The findings of this study demonstrate the ability of the proposed methods to non-invasively quantify the spatial variations in local PWV along the aorta of ApoE(-/-) mice as a relevant model of atherosclerosis.
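Local PWV from simultaneous flow and area measurements is commonly computed with the flow-area (QA) method, in which PWV is the slope dQ/dA over reflection-free early-systolic samples. The abstract does not state which estimator the study used, so the sketch below is illustrative only:

```python
def pwv_qa(flow, area):
    """Flow-area (QA) estimate of local pulse wave velocity: the
    least-squares slope dQ/dA of flow versus lumen area, evaluated
    over reflection-free early-systolic samples."""
    n = len(flow)
    mean_q = sum(flow) / n
    mean_a = sum(area) / n
    cov = sum((q - mean_q) * (a - mean_a) for q, a in zip(flow, area))
    var = sum((a - mean_a) ** 2 for a in area)
    return cov / var  # slope of the Q-A regression line
```

Conveniently, flow in ml/s divided by area in mm^2 is already m/s, and repeating the fit at each measured location yields the local PWV distribution along the aorta.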

  7. Utilization of Automatic Tagging Using Web Information to Datamining

    NASA Astrophysics Data System (ADS)

    Sugimura, Hiroshi; Matsumoto, Kazunori

This paper proposes a data annotation system using an automatic tagging approach. Although annotations are useful for deep analysis and mining of data, the cost of providing them becomes huge in most cases. In order to solve this problem, we develop a semi-automatic method that consists of two stages. In the first stage, it searches the Web for related information and discovers candidate annotations. The second stage uses the knowledge of a human user: the candidates are investigated and refined by the user, and then they become annotations. In this paper we focus on time-series data and show the effectiveness of a GUI tool that supports the above process.

  8. Application of Semantic Tagging to Generate Superimposed Information on a Digital Encyclopedia

    NASA Astrophysics Data System (ADS)

    Garrido, Piedad; Tramullas, Jesus; Martinez, Francisco J.

Several works can be found in the literature regarding the automatic or semi-automatic processing of textual documents with historic information using free software technologies. However, more research is needed to integrate the analysis of context and to cover the peculiarities of the Spanish language from a semantic point of view. This research work proposes a novel knowledge-based strategy combining subject-centric computing, a topic-oriented approach, and superimposed information. Its subsequent combination with artificial intelligence techniques led to an automatic analysis after implementing a made-to-measure interpreted algorithm which, in turn, produced a good number of associations and events with 90% reliability.

  9. Automatic Threshold Design for a Bound Document Scanner.

    DTIC Science & Technology

    1982-12-01

[Garbled OCR of a DTIC thesis record on automatic threshold control (ATC) for a bound document scanner. The recoverable fragments attribute residual errors to data uncertainty and other shortcomings in the scanner rather than in the ATC scheme, and give a page count of 224.]

  10. Citrus Inventory

    NASA Technical Reports Server (NTRS)

    1994-01-01

    An aerial color infrared (CIR) mapping system developed by Kennedy Space Center enables Florida's Charlotte County to accurately appraise its citrus groves while reducing appraisal costs. The technology was further advanced by development of a dual video system making it possible to simultaneously view images of the same area and detect changes. An image analysis system automatically surveys and photo interprets grove images as well as automatically counts trees and reports totals. The system, which saves both time and money, has potential beyond citrus grove valuation.

  11. Improved Visualization of Hydroacoustic Plumes Using the Split-Beam Aperture Coherence.

    PubMed

    Blomberg, Ann E A; Weber, Thomas C; Austeng, Andreas

    2018-06-25

Natural seepage of methane into the oceans is considerable and plays a role in the global carbon cycle. Estimating the amount of this greenhouse gas entering the water column is important in order to understand its environmental impact. In addition, leakage from man-made structures such as gas pipelines may have environmental and economic consequences and should be promptly detected. Split-beam echo sounders (SBES) detect hydroacoustic plumes due to the significant contrast in acoustic impedance between water and free gas. SBES are also powerful tools for plume characterization, with the ability to provide absolute acoustic measurements, estimate bubble trajectories, and capture the frequency-dependent response of bubbles. However, under challenging conditions such as deep water and considerable background noise, it can be difficult to detect the presence of gas seepage from the acoustic imagery alone. The spatial coherence of the wavefield measured across the split-beam sectors, quantified by the coherence factor (CF), is a computationally simple, easily available quantity which complements the acoustic imagery and may ease automatic or visual detection of bubbles in the water column. We demonstrate the benefits of CF processing using SBES data from the Hudson Canyon, acquired using the Simrad EK80 SBES. We observe that hydroacoustic plumes appear more clearly defined and are easier to detect in the CF imagery than in the acoustic backscatter images.
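In its standard beamforming form, the coherence factor is the ratio of coherently to incoherently summed energy across the receiving sectors; the paper's exact normalization may differ from this sketch:

```python
def coherence_factor(signals):
    """Coherence factor across split-beam sectors: coherent sum energy
    divided by N times the incoherent sum energy. Equals 1.0 for
    perfectly aligned signals and tends toward 1/N for uncorrelated
    noise. `signals` holds one complex sample per sector."""
    n = len(signals)
    coherent = abs(sum(signals)) ** 2
    incoherent = n * sum(abs(s) ** 2 for s in signals)
    return coherent / incoherent
```

A point-like bubble echo arrives nearly in phase across the sectors, keeping CF close to 1, while diffuse background noise decorrelates between sectors and drives CF down, which is why the CF imagery sharpens plume visibility.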

  12. Real-time yield estimation based on deep learning

    NASA Astrophysics Data System (ADS)

    Rahnemoonfar, Maryam; Sheppard, Clay

    2017-05-01

Crop yield estimation is an important task in product management and marketing. Accurate yield prediction helps farmers make better decisions on cultivation practices, plant disease prevention, and the size of the harvest labor force. The current practice of yield estimation based on manual counting of fruits is a very time-consuming and expensive process, and it is not practical for big fields. Robotic systems, including unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs), provide an efficient, cost-effective, flexible, and scalable solution for product management and yield prediction. Recently, huge amounts of data have been gathered from agricultural fields; however, efficient analysis of those data is still a challenging task. Computer vision approaches currently face different challenges in the automatic counting of fruits or flowers, including occlusion caused by leaves, branches or other fruits, variance in natural illumination, and scale. In this paper a novel deep convolutional network algorithm was developed to facilitate accurate yield prediction and automatic counting of fruits and vegetables in images. Our method is robust to occlusion, shadow, uneven illumination and scale. Experimental results in comparison to the state-of-the-art show the effectiveness of our algorithm.
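One common formulation for learned counting is counting-by-density-estimation, where the network regresses a density map whose integral is the object count; whether this particular network uses that formulation or regresses counts directly is not stated in the abstract. A sketch of the ground-truth construction:

```python
import math

def gaussian_density_map(points, shape, sigma=1.0):
    """Ground-truth density map: a 2-D Gaussian, renormalized to unit
    mass, is placed at each annotated object centre, so the integral
    of the map equals the number of annotations."""
    h, w = shape
    dmap = [[0.0] * w for _ in range(h)]
    for cy, cx in points:
        bump = [[math.exp(-((y - cy) ** 2 + (x - cx) ** 2)
                          / (2.0 * sigma * sigma))
                 for x in range(w)] for y in range(h)]
        total = sum(map(sum, bump))
        for y in range(h):
            for x in range(w):
                dmap[y][x] += bump[y][x] / total  # each bump sums to 1
    return dmap

def count_from_density(dmap):
    """The predicted count is the integral (sum) of the density map."""
    return sum(map(sum, dmap))
```

Training a network to reproduce such maps makes the count robust to partial occlusion, since overlapping objects still contribute separate unit masses.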

  13. Symbolic and non symbolic numerical representation in adults with and without developmental dyscalculia

    PubMed Central

    2012-01-01

Background The question whether Developmental Dyscalculia (DD; a deficit in the ability to process numerical information) is the result of deficiencies in the non symbolic numerical representation system (e.g., a group of dots) or in the symbolic numerical representation system (e.g., Arabic numerals) has been debated in scientific literature. It is accepted that the non symbolic system is divided into two different ranges, the subitizing range (i.e., quantities from 1-4) which is processed automatically and quickly, and the counting range (i.e., quantities larger than 4) which is an attention demanding procedure and is therefore processed serially and slowly. However, so far no study has tested the automaticity of symbolic and non symbolic representation in DD participants separately for the subitizing and the counting ranges. Methods DD and control participants underwent a novel version of the Stroop task, i.e., the Enumeration Stroop. They were presented with a random series of between one and nine written digits, and were asked to name either the relevant written digit (in the symbolic task) or the relevant quantity of digits (in the non symbolic task) while ignoring the irrelevant aspect. Results DD participants, unlike the control group, did not show any congruency effect in the subitizing range of the non symbolic task. Conclusion These findings suggest that DD may be impaired in the ability to process symbolic numerical information or in the ability to automatically associate the two systems (i.e., the symbolic vs. the non symbolic). Additionally, DD participants have deficiencies in the non symbolic counting range. PMID:23190433

  14. Symbolic and non symbolic numerical representation in adults with and without developmental dyscalculia.

    PubMed

    Furman, Tamar; Rubinsten, Orly

    2012-11-28

The question whether Developmental Dyscalculia (DD; a deficit in the ability to process numerical information) is the result of deficiencies in the non symbolic numerical representation system (e.g., a group of dots) or in the symbolic numerical representation system (e.g., Arabic numerals) has been debated in scientific literature. It is accepted that the non symbolic system is divided into two different ranges, the subitizing range (i.e., quantities from 1-4) which is processed automatically and quickly, and the counting range (i.e., quantities larger than 4) which is an attention demanding procedure and is therefore processed serially and slowly. However, so far no study has tested the automaticity of symbolic and non symbolic representation in DD participants separately for the subitizing and the counting ranges. DD and control participants underwent a novel version of the Stroop task, i.e., the Enumeration Stroop. They were presented with a random series of between one and nine written digits, and were asked to name either the relevant written digit (in the symbolic task) or the relevant quantity of digits (in the non symbolic task) while ignoring the irrelevant aspect. DD participants, unlike the control group, did not show any congruency effect in the subitizing range of the non symbolic task. These findings suggest that DD may be impaired in the ability to process symbolic numerical information or in the ability to automatically associate the two systems (i.e., the symbolic vs. the non symbolic). Additionally, DD participants have deficiencies in the non symbolic counting range.

  15. A repeated-measures analysis of the effects of soft tissues on wrist range of motion in the extant phylogenetic bracket of dinosaurs: Implications for the functional origins of an automatic wrist folding mechanism in Crocodilia.

    PubMed

    Hutson, Joel David; Hutson, Kelda Nadine

    2014-07-01

A recent study hypothesized that avian-like wrist folding in quadrupedal dinosaurs could have aided their distinctive style of locomotion with semi-pronated and therefore medially facing palms. However, soft tissues that automatically guide avian wrist folding rarely fossilize, and automatic wrist folding of unknown function in extant crocodilians has not been used to test this hypothesis. Therefore, an investigation of the relative contributions of soft tissues to wrist range of motion (ROM) in the extant phylogenetic bracket of dinosaurs, and the quadrupedal function of crocodilian wrist folding, could inform these questions. Here, we repeatedly measured wrist ROM in degrees through fully fleshed, skinned, minus muscles/tendons, minus ligaments, and skeletonized stages in the American alligator Alligator mississippiensis and the ostrich Struthio camelus. The effects of dissection treatment and observer were statistically significant for alligator wrist folding and ostrich wrist flexion, but not ostrich wrist folding. Final skeletonized wrist folding ROM was higher than (ostrich) or equivalent to (alligator) initial fully fleshed ROM, while final ROM was lower than initial ROM for ostrich wrist flexion. These findings suggest that, unlike the hinge/ball and socket-type elbow and shoulder joints in these archosaurs, ROM within gliding/planar diarthrotic joints is more restricted to the extent of articular surfaces. The alligator data indicate that the crocodilian wrist mechanism functions to automatically lock their semi-pronated palms into a rigid column, which supports the hypothesis that this palmar orientation necessitated soft tissue stiffening mechanisms in certain dinosaurs, although ROM-restricted articulations argue against the presence of an extensive automatic mechanism. Anat Rec, 297:1228-1249, 2014. © 2014 Wiley Periodicals, Inc.

  16. Implementation and evaluation of a new workflow for registration and segmentation of pulmonary MRI data for regional lung perfusion assessment.

    PubMed

    Böttger, T; Grunewald, K; Schöbinger, M; Fink, C; Risse, F; Kauczor, H U; Meinzer, H P; Wolf, Ivo

    2007-03-07

Recently it has been shown that regional lung perfusion can be assessed using time-resolved contrast-enhanced magnetic resonance (MR) imaging. Quantification of the perfusion images has been attempted, based on the definition of small regions of interest (ROIs). Use of complete lung segmentations instead of ROIs could possibly increase quantification accuracy. Due to the low signal-to-noise ratio, automatic segmentation algorithms cannot be applied. On the other hand, manual segmentation of the lung tissue is very time consuming and can become inaccurate, as the borders of the lung to adjacent tissues are not always clearly visible. We propose a new workflow for semi-automatic segmentation of the lung from additionally acquired morphological HASTE MR images. First the lung is delineated semi-automatically in the HASTE image. Next the HASTE image is automatically registered with the perfusion images. Finally, the transformation resulting from the registration is used to align the lung segmentation from the morphological dataset with the perfusion images. We evaluated rigid, affine and locally elastic transformations, suitable optimizers and different implementations of mutual information (MI) metrics to determine the best possible registration algorithm. We identified the shortcomings of the registration procedure and the conditions under which automatic registration will succeed or fail. Segmentation results were evaluated using overlap and distance measures. Integration of the new workflow reduces the time needed for post-processing of the data, simplifies the perfusion quantification and reduces interobserver variability in the segmentation process. In addition, the matched morphological data set can be used to identify morphologic changes as the source for the perfusion abnormalities.
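Mutual information, the registration metric compared above, can be estimated from a joint intensity histogram. Registration toolkits use more elaborate estimators (e.g. Parzen windowing), so the following is only a toy illustration of the quantity being maximized:

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, bins=8):
    """Mutual information between two equal-sized images (nested
    lists), estimated from a joint intensity histogram. Registration
    seeks the transform that maximizes this value."""
    flat_a = [v for row in img_a for v in row]
    flat_b = [v for row in img_b for v in row]
    lo_a, hi_a = min(flat_a), max(flat_a)
    lo_b, hi_b = min(flat_b), max(flat_b)

    def bin_of(v, lo, hi):
        if hi == lo:
            return 0
        return min(bins - 1, int((v - lo) / (hi - lo) * bins))

    n = len(flat_a)
    joint = Counter((bin_of(a, lo_a, hi_a), bin_of(b, lo_b, hi_b))
                    for a, b in zip(flat_a, flat_b))
    pa = Counter(k[0] for k in joint.elements())  # marginal of image A
    pb = Counter(k[1] for k in joint.elements())  # marginal of image B
    mi = 0.0
    for (i, j), c in joint.items():
        p_ij = c / n
        # p_ij * n * n / (pa * pb) == p_ij / (p_i * p_j)
        mi += p_ij * math.log(p_ij * n * n / (pa[i] * pb[j]))
    return mi
```

MI peaks when intensities in one image predict intensities in the other, which is why it tolerates the differing contrasts of HASTE and perfusion series.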

  17. Exaggerated, mispredicted, and misplaced: when "it's the thought that counts" in gift exchanges.

    PubMed

    Zhang, Yan; Epley, Nicholas

    2012-11-01

Gift-giving involves both the objective value of a gift and the symbolic meaning of the exchange. The objective value is sometimes considered of secondary importance as when people claim, "It's the thought that counts." We evaluated when and how mental state inferences count in gift exchanges. Because considering another's thoughts requires motivation and deliberation, we predicted gift givers' thoughts would increase receivers' appreciation only when triggered to consider a giver's thoughts, such as when a friend gives a bad gift. Because gift givers do not experience this trigger, we expected they would mispredict when their thoughts count and when they do not. Three experiments support these predictions. A final experiment demonstrated that thoughts "count" for givers by increasing social connection to the receiver. These results suggest that mental state inferences are not automatic in social interactions and that inferences about how much thoughts count are systematically miscalibrated.

  18. Localization accuracy from automatic and semi-automatic rigid registration of locally-advanced lung cancer targets during image-guided radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Scott P.; Weiss, Elisabeth; Hugo, Geoffrey D.

    2012-01-15

Purpose: To evaluate localization accuracy resulting from rigid registration of locally-advanced lung cancer targets using fully automatic and semi-automatic protocols for image-guided radiation therapy. Methods: Seventeen lung cancer patients, fourteen also presenting with involved lymph nodes, received computed tomography (CT) scans once per week throughout treatment under active breathing control. A physician contoured both lung and lymph node targets for all weekly scans. Various automatic and semi-automatic rigid registration techniques were then performed for both individual and simultaneous alignments of the primary gross tumor volume (GTV_P) and involved lymph nodes (GTV_LN) to simulate the localization process in image-guided radiation therapy. Techniques included "standard" (direct registration of weekly images to a planning CT), "seeded" (manual prealignment of targets to guide standard registration), "transitive-based" (alignment of pretreatment and planning CTs through one or more intermediate images), and "rereferenced" (designation of a new reference image for registration). Localization error (LE) was assessed as the residual centroid and border distances between targets from planning and weekly CTs after registration. Results: Initial bony alignment resulted in centroid LE of 7.3 ± 5.4 mm and 5.4 ± 3.4 mm for the GTV_P and GTV_LN, respectively. Compared to bony alignment, transitive-based and seeded registrations significantly reduced GTV_P centroid LE to 4.7 ± 3.7 mm (p = 0.011) and 4.3 ± 2.5 mm (p < 1 x 10^-3), respectively, but the smallest GTV_P LE of 2.4 ± 2.1 mm was provided by rereferenced registration (p < 1 x 10^-6). Standard registration significantly reduced GTV_LN centroid LE to 3.2 ± 2.5 mm (p < 1 x 10^-3) compared to bony alignment, with little additional gain offered by the other registration techniques.
For simultaneous target alignment, centroid LE as low as 3.9 ± 2.7 mm and 3.8 ± 2.3 mm were achieved for the GTV_P and GTV_LN, respectively, using rereferenced registration. Conclusions: Target shape, volume, and configuration changes during radiation therapy limited the accuracy of standard rigid registration for image-guided localization in locally-advanced lung cancer. Significant error reductions were possible using other rigid registration techniques, with LE approaching the lower limit imposed by interfraction target variability throughout treatment.

  19. Automatic spatiotemporal matching of detected pleural thickenings

    NASA Astrophysics Data System (ADS)

    Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas

    2014-01-01

Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis, including CT imaging, can detect aggressive malignant pleural mesothelioma in its early stage. In order to create quantitative documentation of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening via visual comparison, which provides neither quantitative nor qualitative measures. In this work, techniques for automatic spatiotemporal matching of the detected pleural thickenings at two time points, based on semi-automatic registration, have been developed, implemented, and tested, so that the same thickening can be compared fully automatically. As a result, the mapping technique using principal component analysis turns out to be more advantageous than the feature-based mapping using the centroid and mean Hounsfield units of each thickening, since sensitivity improved from 42.19% to 98.46%, while the accuracy of the feature-based mapping is only slightly higher (84.38% vs. 76.19%).
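The feature-based mapping that the PCA approach outperformed can be sketched as a greedy nearest-neighbour match in a (centroid, mean HU) feature space. The HU scaling weight below is an assumption for the sketch, not a value from the paper:

```python
def match_thickenings(scan1, scan2, hu_scale=10.0):
    """Greedy feature-based matching: pair each thickening from the
    first scan with the closest unmatched one from the second scan in
    a (centroid x, centroid y, mean HU) feature space. `hu_scale` is
    an invented weighting that balances HU against distance."""
    def dist(a, b):
        return ((a["cx"] - b["cx"]) ** 2
                + (a["cy"] - b["cy"]) ** 2
                + ((a["hu"] - b["hu"]) / hu_scale) ** 2) ** 0.5

    matches = {}
    unused = list(range(len(scan2)))
    for i, a in enumerate(scan1):
        if not unused:
            break
        j = min(unused, key=lambda k: dist(a, scan2[k]))
        matches[i] = j        # thickening i at time 1 -> j at time 2
        unused.remove(j)
    return matches
```

Greedy matching on a few hand-picked features is fragile when thickenings grow or shift between scans, which is consistent with the low sensitivity the paper reports for this baseline.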

  20. Patient-specific semi-supervised learning for postoperative brain tumor segmentation.

    PubMed

    Meier, Raphael; Bauer, Stefan; Slotboom, Johannes; Wiest, Roland; Reyes, Mauricio

    2014-01-01

    In contrast to preoperative brain tumor segmentation, the problem of postoperative brain tumor segmentation has rarely been approached so far. We present a fully-automatic segmentation method using multimodal magnetic resonance image data and patient-specific semi-supervised learning. The idea behind our semi-supervised approach is to effectively fuse information from both pre- and postoperative image data of the same patient to improve segmentation of the postoperative image. We pose image segmentation as a classification problem and solve it by adopting a semi-supervised decision forest. The method is evaluated on a cohort of 10 high-grade glioma patients, with segmentation performance and computation time comparable or superior to a state-of-the-art brain tumor segmentation method. Moreover, our results confirm that the inclusion of preoperative MR images leads to better performance in postoperative brain tumor segmentation.

  1. Shared Decisions That Count.

    ERIC Educational Resources Information Center

    Schlechty, Phillip C.

    1993-01-01

    Advocates of participatory leadership, site-based management, and decentralization often assume that changing decision-making group composition will automatically improve the quality of decisions being made. Stakeholder satisfaction does not guarantee quality results. This article offers a framework for moving the decision-making discussion from…

  2. Sources of error in estimating truck traffic from automatic vehicle classification data

    DOT National Transportation Integrated Search

    1998-10-01

    Truck annual average daily traffic estimation errors resulting from sample classification counts are computed in this paper under two scenarios. One scenario investigates an improper factoring procedure that may be used by highway agencies. The study...

  3. A semi-automated technique for labeling and counting of apoptosing retinal cells

    PubMed Central

    2014-01-01

    Background Retinal ganglion cell (RGC) loss is one of the earliest and most important cellular changes in glaucoma. The DARC (Detection of Apoptosing Retinal Cells) technology enables in vivo real-time non-invasive imaging of single apoptosing retinal cells in animal models of glaucoma and Alzheimer’s disease. To date, apoptosing RGCs imaged using DARC have been counted manually. This is time-consuming, labour-intensive, vulnerable to bias, and has considerable inter- and intra-operator variability. Results A semi-automated algorithm was developed which enabled automated identification of apoptosing RGCs labeled with fluorescent Annexin-5 on DARC images. Automated analysis included a pre-processing stage involving local-luminance and local-contrast “gain control”, a “blob analysis” step to differentiate between cells, vessels and noise, and a method to exclude non-cell structures using specific combined ‘size’ and ‘aspect ratio’ criteria. Apoptosing retinal cells were counted by 3 masked operators, generating ‘gold-standard’ mean manual cell counts, and were also counted using the newly developed automated algorithm. Comparison between automated cell counts and the mean manual cell counts on 66 DARC images showed significant correlation between the two methods (Pearson’s correlation coefficient = 0.978, p < 0.001; R squared = 0.956). The intraclass correlation coefficient was 0.986 (95% CI 0.977-0.991, p < 0.001), and Cronbach’s alpha measure of consistency = 0.986, confirming excellent correlation and consistency. No significant difference (p = 0.922, 95% CI: −5.53 to 6.10) was detected between the cell counts of the two methods. Conclusions The novel automated algorithm enabled accurate quantification of apoptosing RGCs that is highly comparable to manual counting, and appears to minimise operator bias, whilst being both fast and reproducible. 
This may prove to be a valuable method of quantifying apoptosing retinal cells, with particular relevance to translation in the clinic, where a Phase I clinical trial of DARC in glaucoma patients is due to start shortly. PMID:24902592
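The "blob analysis" step with combined size and aspect-ratio criteria can be illustrated with a simple filter over candidate blobs. The thresholds and blob-dictionary fields below are hypothetical, chosen only to show the shape of such a filter, and are not the published parameters.

```python
def is_cell(blob, min_area=5, max_area=200, max_aspect=2.5):
    """Heuristic blob filter in the spirit of combined 'size' and 'aspect
    ratio' criteria: elongated blobs are treated as vessel fragments and
    very small or large blobs as noise. All thresholds are illustrative."""
    h, w = blob["bbox_height"], blob["bbox_width"]
    aspect = max(h, w) / max(1, min(h, w))
    return min_area <= blob["area"] <= max_area and aspect <= max_aspect

blobs = [
    {"area": 40,  "bbox_height": 7,  "bbox_width": 6},   # plausible cell
    {"area": 120, "bbox_height": 60, "bbox_width": 3},   # elongated -> vessel
    {"area": 2,   "bbox_height": 2,  "bbox_width": 1},   # too small -> noise
]
cells = [b for b in blobs if is_cell(b)]
print(len(cells))  # 1
```

In a full pipeline, blob properties like these would typically come from a connected-component labelling step after the gain-control pre-processing.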

  4. Cavitation induced Becquerel effect.

    PubMed

    Prevenslik, T V

    2003-06-01

    The observation of an electrical current upon the ultraviolet (UV) illumination of one of a pair of identical electrodes in liquid water, called the Becquerel effect, was made over 150 years ago. More recently, an electrical current was found if the water surrounding one electrode was made to cavitate by focused acoustic radiation, the phenomenon called the cavitation induced Becquerel effect. Since cavitation is known to produce UV light, the electrode may simply absorb the UV light and produce the current by the photo-emission theory of photoelectrochemistry. But the current was found to be semi-logarithmic with the standard electrode potential which is characteristic of the oxidation of the electrode surface in the photo-decomposition theory, and not the photo-emission theory. High bubble collapse temperatures may oxidize the electrode, but this is unlikely because melting was not observed on the electrode surfaces. At ambient temperature, oxidation may proceed by chemical reaction provided a source of vacuum ultraviolet (VUV) radiation is available to produce the excited OH* states of water to react with the electrode. The source of VUV radiation is shown to be the spontaneous emission of coherent infrared (IR) radiation from water molecules in particles that form in bubbles because of surface tension, the spontaneous IR emission induced by cavity quantum electrodynamics. The excited OH* states are produced as the IR radiation accumulates to VUV levels in the bubble wall molecules.

  5. Parametric Study of Flow Patterns behind the Standing Accretion Shock Wave for Core-Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Iwakami, Wakana; Nagakura, Hiroki; Yamada, Shoichi

    2014-05-01

    In this study, we conduct three-dimensional hydrodynamic simulations systematically to investigate the flow patterns behind the accretion shock waves that are commonly formed in the post-bounce phase of core-collapse supernovae. Adding small perturbations to spherically symmetric, steady, shocked accretion flows, we compute the subsequent evolutions to find what flow pattern emerges as a consequence of hydrodynamical instabilities such as convection and standing accretion shock instability for different neutrino luminosities and mass accretion rates. Depending on these two controlling parameters, various flow patterns are indeed realized. We classify them into three basic patterns and two intermediate ones; the former includes sloshing motion (SL), spiral motion (SP), and multiple buoyant bubble formation (BB); the latter consists of spiral motion with buoyant-bubble formation (SPB) and spiral motion with pulsationally changing rotational velocities (SPP). Although the post-shock flow is highly chaotic, there is a clear trend in the pattern realization. The sloshing and spiral motions tend to be dominant for high accretion rates and low neutrino luminosities, and multiple buoyant bubbles prevail for low accretion rates and high neutrino luminosities. It is interesting that the dominant pattern is not always identical between the semi-nonlinear and nonlinear phases near the critical luminosity; the intermediate cases are realized in the latter case. Running several simulations with different random perturbations, we confirm that the realization of flow pattern is robust in most cases.

  6. Tuned grid generation with ICEM CFD

    NASA Technical Reports Server (NTRS)

    Wulf, Armin; Akdag, Vedat

    1995-01-01

    ICEM CFD is a CAD-based grid generation package that supports multiblock structured, unstructured tetrahedral, and unstructured hexahedral grids. Major development efforts have been spent to extend ICEM CFD's multiblock structured and hexahedral unstructured grid generation capabilities. The modules added are a parametric grid generation module and a semi-automatic hexahedral grid generation module. A fully automatic version of the hexahedral grid generation module, for grids around a set of predefined objects in rectilinear enclosures, has been developed. These modules will be presented, the procedures used will be described, and examples will be discussed.

  7. MedSynDiKATe--design considerations for an ontology-based medical text understanding system.

    PubMed Central

    Hahn, U.; Romacker, M.; Schulz, S.

    2000-01-01

    MedSynDiKATe is a natural language processor for automatically acquiring knowledge from medical finding reports. The content of these documents is transferred to formal representation structures which constitute a corresponding text knowledge base. The general system architecture we present integrates requirements from the analysis of single sentences, as well as those of referentially linked sentences forming cohesive texts. The strong demands MedSynDiKATe poses to the availability of expressive knowledge sources are accounted for by two alternative approaches to (semi)automatic ontology engineering. PMID:11079899

  8. Vegetation survey in Amazonia using LANDSAT data. [Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Shimabukuro, Y. E.; Dossantos, J. R.; Deaquino, L. C. S.

    1982-01-01

    Automatic Image-100 analysis of LANDSAT data was performed using the MAXVER classification algorithm. In the pilot area, four vegetation units were mapped automatically, in addition to the areas occupied by agricultural activities. The Image-100 classification results, together with a soil map and information from RADAR images, permitted the establishment of the final legend with six classes: semi-deciduous tropical forest; lowland evergreen tropical forest; secondary vegetation; tropical forest of humid areas; predominant pastureland; and flood plains. Two water types were identified based on their sediments, indicating different geological and geomorphological aspects.

  9. Evaluation of an automatic MR-based gold fiducial marker localisation method for MR-only prostate radiotherapy

    NASA Astrophysics Data System (ADS)

    Maspero, Matteo; van den Berg, Cornelis A. T.; Zijlstra, Frank; Sikkes, Gonda G.; de Boer, Hans C. J.; Meijer, Gert J.; Kerkmeijer, Linda G. W.; Viergever, Max A.; Lagendijk, Jan J. W.; Seevinck, Peter R.

    2017-10-01

    An MR-only radiotherapy planning (RTP) workflow would reduce the cost, radiation exposure and uncertainties introduced by CT-MRI registrations. In the case of prostate treatment, one of the remaining challenges currently holding back the implementation of an RTP workflow is the MR-based localisation of intraprostatic gold fiducial markers (FMs), which is crucial for accurate patient positioning. Currently, MR-based FM localisation is clinically performed manually. This is sub-optimal, as manual interaction increases the workload. Attempts to perform automatic FM detection often rely on being able to detect signal voids induced by the FMs in magnitude images. However, signal voids may not always be sufficiently specific, hampering accurate and robust automatic FM localisation. Here, we present an approach that aims at automatic MR-based FM localisation. This method is based on template matching using a library of simulated complex-valued templates, and exploiting the behaviour of the complex MR signal in the vicinity of the FM. Clinical evaluation was performed on seventeen prostate cancer patients undergoing external beam radiotherapy treatment. Automatic MR-based FM localisation was compared to manual MR-based and semi-automatic CT-based localisation (the current gold standard) in terms of detection rate and the spatial accuracy and precision of localisation. The proposed method correctly detected all three FMs in 15/17 patients. The spatial accuracy (mean) and precision (STD) were 0.9 mm and 0.5 mm respectively, which is below the voxel size of 1.1 × 1.1 × 1.2 mm³ and comparable to MR-based manual localisation. FM localisation failed (3/51 FMs) in the presence of bleeding or calcifications in the direct vicinity of the FM. The method was found to be spatially accurate and precise, which is essential for clinical use. To overcome any missed detection, we envision the use of the proposed method along with verification by an observer. 
This will result in a semi-automatic workflow facilitating the introduction of an MR-only workflow.
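The template-matching core of such a method can be sketched with zero-mean normalised cross-correlation. The paper matches against a library of simulated complex-valued templates; this minimal sketch uses a single real-valued template on a toy image, purely to illustrate the sliding-window mechanics.

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalised cross-correlation between two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_template(image, template):
    """Slide the template over the image and return the top-left position of
    the best NCC score, plus the score itself (brute-force for clarity)."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = ncc(image[r:r + th, c:c + tw], template)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best

# Toy image with a diagonal 2x2 marker pattern starting at (3, 4):
img = np.zeros((8, 8))
img[3, 4] = 1.0
img[4, 5] = 1.0
pos, score = match_template(img, np.array([[1.0, 0.0], [0.0, 1.0]]))
# pos == (3, 4), score ≈ 1.0
```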

  10. Semi-Automatic Segmentation Software for Quantitative Clinical Brain Glioblastoma Evaluation

    PubMed Central

    Zhu, Y; Young, G; Xue, Z; Huang, R; You, H; Setayesh, K; Hatabu, H; Cao, F; Wong, S.T.

    2012-01-01

    Rationale and Objectives Quantitative measurement provides essential information about disease progression and treatment response in patients with glioblastoma multiforme (GBM). The goal of this paper is to present and validate a software pipeline for semi-automatic GBM segmentation, called AFINITI (Assisted Follow-up in NeuroImaging of Therapeutic Intervention), using clinical data from GBM patients. Materials and Methods Our software adopts the current state-of-the-art tumor segmentation algorithms and combines them into one clinically usable pipeline. Both the advantages of traditional voxel-based and deformable shape-based segmentation are embedded into the software pipeline. The former provides an automatic tumor segmentation scheme based on T1- and T2-weighted MR brain data, and the latter refines the segmentation results with minimal manual input. Results Twenty-six clinical MR brain images of GBM patients were processed and compared with manual results. The results can be visualized using the embedded graphical user interface (GUI). Conclusion Validation results using clinical GBM data showed high correlation between the AFINITI results and manual annotation. Compared to voxel-wise segmentation, AFINITI yielded more accurate results in segmenting the enhanced GBM from multimodality MRI data. The proposed pipeline could be used as additional information to interpret MR brain images in neuroradiology. PMID:22591720

  11. Advanced and standardized evaluation of neurovascular compression syndromes

    NASA Astrophysics Data System (ADS)

    Hastreiter, Peter; Vega Higuera, Fernando; Tomandl, Bernd; Fahlbusch, Rudolf; Naraghi, Ramin

    2004-05-01

    Caused by contact between vascular structures and the root entry or exit zone of cranial nerves, neurovascular compression syndromes are associated with different neurological diseases (trigeminal neuralgia, hemifacial spasm, vertigo, glossopharyngeal neuralgia) and show a relation with essential arterial hypertension. As presented previously, the semi-automatic segmentation and 3D visualization of strongly T2-weighted MR volumes has proven to be an effective strategy for a better spatial understanding prior to operative microvascular decompression. After explicit segmentation of coarse structures, the tiny target nerves and vessels contained in the area of cerebrospinal fluid are segmented implicitly using direct volume rendering. However, with this strategy the delineation of vessels in the vicinity of the brainstem, and of those at the border of the segmented CSF subvolume, is critical. Therefore, we suggest registration with MR angiography and introduce consecutive fusion after semi-automatic labeling of the vascular information. Additionally, we present an approach for automatic 3D visualization and video generation based on predefined flight paths. Thereby, a standardized evaluation of the fused image data is supported and the visualization results are optimally prepared for intraoperative application. Overall, our new strategy contributes to a significantly improved 3D representation and evaluation of neurovascular compression syndromes. Its value for diagnosis and surgery is demonstrated with various clinical examples.

  12. Antibiogramj: A tool for analysing images from disk diffusion tests.

    PubMed

    Alonso, C A; Domínguez, C; Heras, J; Mata, E; Pascual, V; Torres, C; Zarazaga, M

    2017-05-01

    Disk diffusion testing, known as the antibiogram, is widely applied in microbiology to determine the antimicrobial susceptibility of microorganisms. The measurement of the diameter of the zone of growth inhibition of microorganisms around the antimicrobial disks in the antibiogram is frequently performed manually by specialists using a ruler. This is a time-consuming and error-prone task that might be simplified using automated or semi-automated inhibition zone readers. However, most readers are usually expensive instruments with embedded software that require significant changes in laboratory design and workflow. Based on the workflow employed by specialists to determine the antimicrobial susceptibility of microorganisms, we have designed a software tool that semi-automatises the process from images of disk diffusion tests. Standard computer vision techniques are employed to achieve this automatisation. We present AntibiogramJ, a user-friendly and open-source software tool to semi-automatically determine, measure and categorise inhibition zones in images from disk diffusion tests. AntibiogramJ is implemented in Java and deals with images captured with any device that incorporates a camera, including digital cameras and mobile phones. The fully automatic procedure of AntibiogramJ for measuring inhibition zones achieves an overall agreement of 87% with an expert microbiologist; moreover, AntibiogramJ includes features to easily detect when the automatic reading is not correct and to fix it manually to obtain the correct result. AntibiogramJ is a user-friendly, platform-independent, open-source, and free tool that, to the best of our knowledge, is the most complete software tool for antibiogram analysis without requiring any investment in new equipment or changes in the laboratory. Copyright © 2017 Elsevier B.V. All rights reserved.
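Turning a segmented inhibition zone into a diameter and a susceptibility category can be sketched as below. The breakpoint values are purely illustrative (real ones are antibiotic-specific and come from standards such as CLSI/EUCAST), and the pixel scale is an assumed calibration, not anything from AntibiogramJ itself.

```python
import math

def zone_diameter_mm(zone_area_px, mm_per_px):
    """Equivalent circular diameter of a segmented inhibition zone, derived
    from its pixel area and the image scale (mm per pixel)."""
    return 2.0 * math.sqrt(zone_area_px / math.pi) * mm_per_px

def categorise(diameter_mm, resistant_lt, susceptible_ge):
    """Categorise a zone diameter against disk-specific breakpoints.
    The breakpoint numbers used below are purely illustrative."""
    if diameter_mm < resistant_lt:
        return "resistant"
    if diameter_mm >= susceptible_ge:
        return "susceptible"
    return "intermediate"

# A segmented zone of ~7854 px at 0.1 mm/px corresponds to a ~10 mm diameter.
d = zone_diameter_mm(zone_area_px=7854, mm_per_px=0.1)
print(categorise(d, resistant_lt=12, susceptible_ge=18))  # resistant
```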

  13. Volumetric glioma quantification: comparison of manual and semi-automatic tumor segmentation for the quantification of tumor growth.

    PubMed

    Odland, Audun; Server, Andres; Saxhaug, Cathrine; Breivik, Birger; Groote, Rasmus; Vardal, Jonas; Larsson, Christopher; Bjørnerud, Atle

    2015-11-01

    Volumetric magnetic resonance imaging (MRI) is now widely available and routinely used in the evaluation of high-grade gliomas (HGGs). Ideally, volumetric measurements should be included in this evaluation. However, manual tumor segmentation is time-consuming and suffers from inter-observer variability. Thus, tools for semi-automatic tumor segmentation are needed. To present a semi-automatic method (SAM) for segmentation of HGGs and to compare this method with manual segmentation performed by experts. The inter-observer variability among experts manually segmenting HGGs using volumetric MRIs was also examined. Twenty patients with HGGs were included. All patients underwent surgical resection prior to inclusion. Each patient underwent several MRI examinations during and after adjuvant chemoradiation therapy. Three experts performed manual segmentation. The results of tumor segmentation by the experts and by the SAM were compared using Dice coefficients and kappa statistics. A relatively close agreement was seen among two of the experts and the SAM, while the third expert disagreed considerably with the other experts and the SAM. An important reason for this disagreement was a different interpretation of contrast enhancement as either surgically-induced or glioma-induced. The time required for manual tumor segmentation was an average of 16 min per scan. Editing of the tumor masks produced by the SAM required an average of less than 2 min per sample. Manual segmentation of HGG is very time-consuming and using the SAM could increase the efficiency of this process. However, the accuracy of the SAM ultimately depends on the expert doing the editing. Our study confirmed a considerable inter-observer variability among experts defining tumor volume from volumetric MRIs. © The Foundation Acta Radiologica 2014.
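The Dice coefficient used above to compare expert and SAM segmentations is straightforward to compute from binary masks; a minimal sketch with toy masks (not study data):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks:
    2|A∩B| / (|A| + |B|). Returns 1.0 for two empty masks by convention."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / total

expert = np.array([[0, 1, 1],
                   [0, 1, 1],
                   [0, 0, 0]])
sam    = np.array([[0, 1, 1],
                   [0, 1, 0],
                   [0, 0, 0]])
print(round(dice(expert, sam), 3))  # 0.857
```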

  14. Semi-Automatic Normalization of Multitemporal Remote Images Based on Vegetative Pseudo-Invariant Features

    PubMed Central

    Garcia-Torres, Luis; Caballero-Novella, Juan J.; Gómez-Candón, David; De-Castro, Ana Isabel

    2014-01-01

    A procedure to achieve the semi-automatic relative image normalization of multitemporal remote images of an agricultural scene, called ARIN, was developed using the following procedures: 1) defining the same parcel of selected vegetative pseudo-invariant features (VPIFs) in each multitemporal image; 2) extracting data concerning the VPIF spectral bands from each image; 3) calculating the correction factors (CFs) for each image band to fit each image band to the average value of the image series; and 4) obtaining the normalized images by linear transformation of each original image band through the corresponding CF. ARIN software was developed to semi-automatically perform the ARIN procedure. We have validated ARIN using seven GeoEye-1 satellite images taken over the same location in Southern Spain from early April to October 2010 at intervals of approximately 3 to 4 weeks. The following three VPIFs were chosen: citrus orchards (CIT), olive orchards (OLI) and poplar groves (POP). In the ARIN-normalized images, the range, standard deviation (SD) and root mean square error (RMSE) of the spectral bands and vegetation indices were considerably reduced compared to the original images, regardless of the VPIF or the combination of VPIFs selected for normalization, which demonstrates the method’s efficacy. The correlation coefficients between the CFs among VPIFs for any spectral band (and all bands overall) were calculated to be at least 0.85 and were significant at P = 0.95, indicating that the normalization procedure was performed comparably regardless of the VPIF chosen. The ARIN method was designed only for agricultural and forestry landscapes where VPIFs can be identified. PMID:24604031
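Steps 3 and 4 of the procedure above (per-band correction factors fitted to the series average, then a linear transformation through the CF) can be sketched as follows. This is a minimal reading of the description, assuming one VPIF mean per image per band; the paper's handling of multiple VPIFs is simplified away.

```python
import numpy as np

def correction_factors(vpif_band_means):
    """CF for each image in a multitemporal series, for one spectral band:
    the series-average VPIF value divided by each image's VPIF value, so that
    multiplying a band by its CF pulls it toward the series average."""
    means = np.asarray(vpif_band_means, dtype=float)
    return means.mean() / means

def normalize_band(band, cf):
    """Step 4: linear transformation of an original image band through its CF."""
    return np.asarray(band, dtype=float) * cf

# Red-band VPIF means from three acquisition dates:
vpif_means = [80.0, 100.0, 120.0]
cfs = correction_factors(vpif_means)                      # [1.25, 1.0, 0.833...]
normalized = [m * cf for m, cf in zip(vpif_means, cfs)]   # all ≈ 100.0
```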

  15. Quantitative analysis of hyperpolarized 129Xe ventilation imaging in healthy volunteers and subjects with chronic obstructive pulmonary disease

    PubMed Central

    Virgincar, Rohan S.; Cleveland, Zackary I.; Kaushik, S. Sivaram; Freeman, Matthew S.; Nouls, John; Cofer, Gary P.; Martinez-Jimenez, Santiago; He, Mu; Kraft, Monica; Wolber, Jan; McAdams, H. Page; Driehuys, Bastiaan

    2013-01-01

    In this study, hyperpolarized (HP) 129Xe MR ventilation and 1H anatomical images were obtained from 3 subject groups: young healthy volunteers (HV), subjects with chronic obstructive pulmonary disease (COPD), and age-matched control subjects (AMC). Ventilation images were quantified by 2 methods: an expert reader-based ventilation defect score percentage (VDS%) and a semi-automatic segmentation-based ventilation defect percentage (VDP). Reader-based values were assigned by two experienced radiologists and resolved by consensus. In the semi-automatic analysis, 1H anatomical images and 129Xe ventilation images were both segmented following registration, to obtain the thoracic cavity volume (TCV) and ventilated volume (VV), respectively, which were then expressed as a ratio to obtain the VDP. Ventilation images were also characterized by generating signal intensity histograms from voxels within the TCV, and heterogeneity was analyzed using the coefficient of variation (CV). The reader-based VDS% correlated strongly with the semi-automatically generated VDP (r = 0.97, p < 0.0001), and with CV (r = 0.82, p < 0.0001). Both 129Xe ventilation defect scoring metrics readily separated the 3 groups from one another and correlated significantly with FEV1 (VDS%: r = -0.78, p = 0.0002; VDP: r = -0.79, p = 0.0003; CV: r = -0.66, p = 0.0059) and other pulmonary function tests. In the healthy subject groups (HV and AMC), the prevalence of ventilation defects also increased with age (VDS%: r = 0.61, p = 0.0002; VDP: r = 0.63, p = 0.0002). Moreover, ventilation histograms and their associated CVs distinguished between COPD subjects with similar ventilation defect scores but visibly different ventilation patterns. PMID:23065808
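The semi-automatic VDP (unventilated fraction of the thoracic cavity volume) and the CV heterogeneity metric described above can be sketched from registered binary masks. The toy arrays are illustrative only; the study's actual segmentation and registration steps are not reproduced here.

```python
import numpy as np

def ventilation_defect_percentage(tcv_mask, ventilated_mask):
    """VDP: the fraction of the thoracic cavity volume (TCV) that is *not*
    ventilated, from two registered binary masks, expressed in percent."""
    tcv = np.asarray(tcv_mask, dtype=bool)
    vv = np.logical_and(np.asarray(ventilated_mask, dtype=bool), tcv)
    return 100.0 * (1.0 - vv.sum() / tcv.sum())

def coefficient_of_variation(signal, tcv_mask):
    """Heterogeneity of the 129Xe signal within the TCV: std / mean of the
    voxel intensities inside the thoracic cavity."""
    vals = np.asarray(signal, dtype=float)[np.asarray(tcv_mask, dtype=bool)]
    return float(vals.std() / vals.mean())

tcv = np.ones((4, 4), dtype=bool)
vent = np.ones((4, 4), dtype=bool)
vent[0, :] = False                               # top row unventilated
vdp = ventilation_defect_percentage(tcv, vent)   # -> 25.0
cv = coefficient_of_variation(np.where(vent, 1.0, 0.0), tcv)
```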

  16. Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist

    NASA Astrophysics Data System (ADS)

    Tummala, Sudhakar; Dam, Erik B.

    2010-03-01

    Fully automatic imaging biomarkers may allow quantification of patho-physiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method used on tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.

  17. Automatic analysis of microscopic images of red blood cell aggregates

    NASA Astrophysics Data System (ADS)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted for routine use in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  18. Quality and shelf life evaluation of fermented sausages of buffalo meat with different levels of heart and fat.

    PubMed

    Ahmad, S; Srivastava, P K

    2007-04-01

    Investigations were carried out to study the effect of heart incorporation (0%, 15% and 20%) and increasing levels of fat (20% and 25%) on the physicochemical (pH, moisture content and thiobarbituric acid (TBA) number) and microbiological (total plate count and yeast and mold count) quality and shelf life of semi-dry sausages of buffalo meat during refrigerated storage (4°C). Different levels of fat significantly (p<0.05) increased the pH of the sausage samples. However, different levels of heart incorporation did not significantly affect the pH, moisture content or TBA number of the sausage samples. Fresh samples had pH, moisture content and TBA number in the ranges of 5.15-5.28, 42.4-47.4% and 0.073-0.134, respectively. Refrigerated storage significantly (p<0.05) increased the TBA number of control samples, while storage did not significantly increase the TBA number of sodium ascorbate (SA) treated samples. Total plate counts of the twelve sausage samples were under the TFTC (too few to count) limit at the initial stage. Incorporation of different levels of heart, and also increasing levels of fat, did not significantly increase the log TPC/g values. Yeasts and molds were not detected in the twelve samples of semi-dry fermented sausages in their fresh condition. Storage revealed a consistent decrease in pH and moisture content. Refrigerated storage significantly (p<0.05) reduced both pH and moisture content. The TBA number, total plate counts, and yeast and mold counts of controls increased significantly (p<0.05) during refrigerated storage. However, in SA-treated sausages, only the TPC and yeast and mold counts increased significantly (p<0.05) during refrigerated storage. The shelf life of the sausages was found to be 60 days under refrigerated storage (4°C).

  19. A technique for automatically extracting useful field of view and central field of view images.

    PubMed

    Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar

    2016-01-01

    It is essential to ensure the uniform response of the single photon emission computed tomography gamma camera system before using it for clinical studies, by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending both on the activity of the Cobalt-57 flood source and on the prespecified counts in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total counts, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.
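The extraction of the central field of view and the integral-uniformity computation that follows can be sketched as below. The central-75% CFOV convention and the (max−min)/(max+min) formula are standard NEMA-style choices; the smoothing step NEMA prescribes is omitted, and this is not necessarily how the MATLAB implementation described above proceeds.

```python
import numpy as np

def crop_cfov(ufov):
    """Central field of view as the central 75% of the UFOV along each axis
    (the usual NEMA convention: trim 12.5% from each edge)."""
    h, w = ufov.shape
    dh, dw = int(round(h * 0.125)), int(round(w * 0.125))
    return ufov[dh:h - dh, dw:w - dw]

def integral_uniformity(counts):
    """NEMA-style integral uniformity: 100 * (max - min) / (max + min)
    over the pixel counts of the region (smoothing omitted for brevity)."""
    c = np.asarray(counts, dtype=float)
    return 100.0 * (c.max() - c.min()) / (c.max() + c.min())

flood = np.full((8, 8), 1000.0)
flood[4, 4] = 900.0                 # a mild cold spot
cfov = crop_cfov(flood)             # 6x6 central region
iu = integral_uniformity(cfov)      # ≈ 5.26
```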

  20. MEASURING PROJECTOR

    DOEpatents

    Franck, J.V.; Broadhead, P.S.; Skiff, E.W.

    1959-07-14

    A semiautomatic measuring projector particularly adapted for measurement of the coordinates of photographic images of particle tracks, as produced in a bubble or cloud chamber, is presented. A viewing screen aids the operator in selecting a particle track for measurement. After approximate manual alignment, an image scanning system coupled to a servo control provides automatic exact alignment of a track image with a reference point. The apparatus can follow along a track with a continuous motion while recording coordinate data at various selected points along the track. The coordinate data are recorded on punched cards for subsequent computer calculation of particle trajectory, momentum, etc.

  1. Optical instrumentation engineering in science, technology and society; Proceedings of the Sixteenth Annual Technical Meeting, San Mateo, Calif., October 16-18, 1972

    NASA Technical Reports Server (NTRS)

    Katz, Y. H.

    1973-01-01

    Visual tracking performance in instrumentation is discussed together with photographic pyrometry in an aeroballistic range, optical characteristics of spherical vapor bubbles in liquids, and the automatic detection and control of surface roughness by coherent diffraction patterns. Other subjects explored are related to instruments, sensors, systems, holography, and pattern recognition. Questions of data handling are also investigated, taking into account minicomputer image storage for holographic interferometry analysis, the design of a video amplifier for a 90 MHz bandwidth, and autostereoscopic screens. Individual items are announced in this issue.

  2. Cardiopulmonary Changes with Moderate Decompression in Rats

    NASA Technical Reports Server (NTRS)

    Robinson, R.; Little, T.; Doursout, M.-F.; Butler, B. D.; Chelly, J. E.

    1996-01-01

Sprague-Dawley rats were compressed to 616 kPa for 120 min and then decompressed at 38 kPa/min to assess the cardiovascular and pulmonary responses to moderate decompression stress. In one series of experiments the rats were chronically instrumented with Doppler ultrasonic probes for simultaneous measurement of blood pressure, cardiac output, heart rate, left and right ventricular wall thickening fraction, and venous bubble detection. Data were collected at baseline, throughout the compression/decompression protocol, and for 120 min post decompression. In a second series of experiments the pulmonary responses to the decompression protocol were evaluated in non-instrumented rats. Analyses included blood gases, pleural and bronchoalveolar lavage (BAL) protein and hemoglobin concentration, pulmonary edema, BAL and lung tissue phospholipids, lung compliance, and cell counts. Venous bubbles were directly observed in 90% of the rats in which immediate post-decompression autopsy was performed and in 37% using implanted Doppler monitors. Cardiac output, stroke volume, and right ventricular wall thickening fractions were significantly decreased post decompression, whereas systemic vascular resistance was increased, suggesting a decrease in venous return. BAL Hb and total protein levels were increased 0 and 60 min post decompression, whereas pleural and plasma levels were unchanged. BAL white blood cell and neutrophil percentages were increased 0 and 60 min post decompression, and pulmonary edema was detected. Venous bubbles produced with moderate decompression profiles thus give detectable cardiovascular and pulmonary responses in the rat.

  3. Use of Semi-Autonomous Tools for ISS Commanding and Monitoring

    NASA Technical Reports Server (NTRS)

    Brzezinski, Amy S.

    2014-01-01

    As the International Space Station (ISS) has moved into a utilization phase, operations have shifted to become more ground-based with fewer mission control personnel monitoring and commanding multiple ISS systems. This shift to fewer people monitoring more systems has prompted use of semi-autonomous console tools in the ISS Mission Control Center (MCC) to help flight controllers command and monitor the ISS. These console tools perform routine operational procedures while keeping the human operator "in the loop" to monitor and intervene when off-nominal events arise. Two such tools, the Pre-positioned Load (PPL) Loader and Automatic Operators Recorder Manager (AutoORM), are used by the ISS Communications RF Onboard Networks Utilization Specialist (CRONUS) flight control position. CRONUS is responsible for simultaneously commanding and monitoring the ISS Command & Data Handling (C&DH) and Communications and Tracking (C&T) systems. PPL Loader is used to uplink small pieces of frequently changed software data tables, called PPLs, to ISS computers to support different ISS operations. In order to uplink a PPL, a data load command must be built that contains multiple user-input fields. Next, a multiple step commanding and verification procedure must be performed to enable an onboard computer for software uplink, uplink the PPL, verify the PPL has incorporated correctly, and disable the computer for software uplink. PPL Loader provides different levels of automation in both building and uplinking these commands. In its manual mode, PPL Loader automatically builds the PPL data load commands but allows the flight controller to verify and save the commands for future uplink. In its auto mode, PPL Loader automatically builds the PPL data load commands for flight controller verification, but automatically performs the PPL uplink procedure by sending commands and performing verification checks while notifying CRONUS of procedure step completion. 
If an off-nominal condition occurs during procedure execution, PPL Loader notifies CRONUS through popup messages, allowing CRONUS to examine the situation and choose an option for how PPL Loader should proceed with the procedure. The use of PPL Loader to perform frequent, routine PPL uplinks offloads CRONUS to better monitor two ISS systems. It also reduces procedure performance time and decreases the risk of command errors. AutoORM identifies ISS communication outage periods and builds commands to lock, playback, and unlock ISS Operations Recorder files. Operations Recorder files are circular-buffer files of continually recorded ISS telemetry data. Sections of these files can be locked from further writing, played back to capture telemetry data that occurred during an ISS loss of signal (LOS) period, and then unlocked for future recording use. Downlinked Operations Recorder files are used by mission support teams for data analysis, especially if failures occur during LOS. The commands to lock, playback, and unlock Operations Recorder files are encompassed in three different operational procedures and contain multiple user-input fields. AutoORM provides different levels of automation for building and uplinking the commands to lock, playback, and unlock Operations Recorder files. In its automatic mode, AutoORM automatically detects ISS LOS periods, then generates and uplinks the commands to lock, playback, and unlock Operations Recorder files when MCC regains signal with the ISS. AutoORM also features semi-autonomous and manual modes which integrate CRONUS more into the command verification and uplink process. AutoORM's ability to automatically detect ISS LOS periods and build the necessary commands to preserve, playback, and release recorded telemetry data greatly offloads CRONUS to perform more high-level cognitive tasks, such as mission planning and anomaly troubleshooting. 
Additionally, since Operations Recorder commands contain numerical time input fields which are tedious for a human to manually build, AutoORM's ability to automatically build commands reduces operational command errors. PPL Loader and AutoORM demonstrate principles of semi-autonomous operational tools that will benefit future space mission operations. Both tools employ different levels of automation to perform simple and routine procedures, thereby offloading human operators to perform higher-level cognitive tasks. Because both tools provide procedure execution status and highlight off-nominal indications, the flight controller is able to intervene during procedure execution if needed. Semi-autonomous tools and systems that can perform routine procedures, yet keep human operators informed of execution, will be essential in future long-duration missions where the onboard crew will be solely responsible for spacecraft monitoring and control.

  4. An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations

    PubMed Central

    Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario

    2016-01-01

Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identifying the graphlets that a vertex of a graph touches can provide useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As the graphlets grow in size, more distinct graphlets emerge, and the time needed to find each graphlet also scales up. If it is not necessary to find each instance of each graphlet, and knowing the number of graphlets touching each node of the graph suffices, the problem is less hard. Previous research shows a way to simplify counting the graphlets: instead of looking for the graphlets themselves, smaller graphlets are searched for, as well as the number of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. However, until now, equations existed only for counting graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first allows the needed equations to be generated automatically. This eliminates the tedious work needed to derive them manually each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering which is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming. PMID:26797021
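    As a toy illustration of the common-neighbour idea behind these equation systems (not the paper's equation generator), the number of triangles touching each vertex can be recovered purely from common-neighbour counts:

```python
def triangles_per_vertex(adj):
    """Count triangles touching each vertex of an undirected graph.

    `adj` maps each vertex to its set of neighbours. For a vertex v, summing
    the common-neighbour counts |N(v) & N(u)| over all neighbours u counts
    each triangle at v twice (once per incident edge), hence the // 2.
    """
    triangles = {}
    for v, nv in adj.items():
        total = sum(len(nv & adj[u]) for u in nv)
        triangles[v] = total // 2
    return triangles
```

    The triangle is the smallest non-trivial graphlet; the paper's equations generalise this kind of relation to larger graphlets.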

  5. Semi-automated potentiometric titration method for uranium characterization.

    PubMed

    Cristiano, B F G; Delgado, J U; da Silva, J W S; de Barros, P D; de Araújo, R M S; Lopes, R T

    2012-07-01

The manual version of the potentiometric titration method has been used for the certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed at the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Soil Moisture Estimate Under Forest Using a Semi-Empirical Model at P-Band

    NASA Technical Reports Server (NTRS)

    Truong-Loi, My-Linh; Saatchi, Sassan; Jaruwatanadilok, Sermsak

    2013-01-01

Here we present the results of a semi-empirical inversion model for soil moisture retrieval using the three backscattering coefficients: sigma(sub HH), sigma(sub VV) and sigma(sub HV). In this paper we focus on the soil moisture estimate; the biomass is an ancillary parameter estimated automatically by the algorithm and used for validation. We first review the model's analytical formulation, then show some results obtained with real SAR data and compare them to ground estimates.

  7. ASSIST: User's manual

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1986-01-01

Semi-Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. The ASSIST program allows the user to describe the semi-Markov model in a high-level language. Instead of specifying the individual states of the model, the user specifies the rules governing the behavior of the system, and these are used by ASSIST to automatically generate the model. The ASSIST program is described and illustrated by examples.
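    A minimal sketch of the rule-to-model idea, assuming a toy rule format rather than ASSIST's actual input language: states are expanded breadth-first from a start state by applying every rule whose guard holds.

```python
from collections import deque

def generate_model(start, rules):
    """Expand a state space from behaviour rules.

    Each rule is (guard, step, label): if guard(state) holds, the transition
    state -> step(state) with rate label `label` is added to the model.
    This is a toy analogue of rule-based model generation, not ASSIST itself.
    """
    states, transitions = {start}, []
    frontier = deque([start])
    while frontier:
        s = frontier.popleft()
        for guard, step, label in rules:
            if guard(s):
                t = step(s)
                transitions.append((s, t, label))
                if t not in states:
                    states.add(t)
                    frontier.append(t)
    return states, transitions

# Example: a triplex processor whose working units fail one at a time;
# state = (working, failed), one rule instead of an explicit state list.
rules = [
    (lambda s: s[0] > 0,
     lambda s: (s[0] - 1, s[1] + 1),
     "lambda_fail"),
]
states, transitions = generate_model((3, 0), rules)
```

    One rule here expands into four states and three transitions automatically, which is the labour ASSIST saves at much larger scale.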

  8. A semi-automatic method for extracting thin line structures in images as rooted tree network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brazzini, Jacopo; Dillard, Scott; Soille, Pierre

    2010-01-01

This paper addresses the problem of semi-automatic extraction of line networks in digital images - e.g., road or hydrographic networks in satellite images, or blood vessels in medical images. For that purpose, we improve a generic method derived from morphological and hydrological concepts, consisting of minimum cost path estimation and flow simulation. While this approach fully exploits the local contrast and shape of the network, as well as its arborescent nature, we further incorporate local directional information about the structures in the image. Namely, an appropriate anisotropic metric is designed using both the characteristic features of the target network and the eigen-decomposition of the gradient structure tensor of the image. The geodesic propagation from a given seed with this metric is then combined with hydrological operators for overland flow simulation to extract the line network. The algorithm is demonstrated on the extraction of blood vessels in a retina image and of a river network in a satellite image.
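    The minimum-cost-path step can be sketched with a plain Dijkstra search over pixel costs; the paper's anisotropic, structure-tensor-based metric is replaced here by a simple isotropic per-pixel cost, so this only illustrates the propagation idea:

```python
import heapq

def min_cost_path(cost, start, goal):
    """Dijkstra minimum-cost path on a 4-connected pixel grid.

    `cost[y][x]` is the price of stepping onto pixel (y, x); a line
    structure appears as a corridor of cheap pixels. Returns the path as a
    list of (y, x) pixels and its total cost (including the start pixel).
    """
    h, w = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (y, x) = heapq.heappop(heap)
        if (y, x) == goal:
            break
        if d > dist[(y, x)]:
            continue  # stale heap entry
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                nd = d + cost[ny][nx]
                if nd < dist.get((ny, nx), float("inf")):
                    dist[(ny, nx)] = nd
                    prev[(ny, nx)] = (y, x)
                    heapq.heappush(heap, (nd, (ny, nx)))
    # walk back from goal to recover the path
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]
```

    On a grid where a dark vessel is a strip of low-cost pixels, the path hugs the strip; the anisotropic metric of the paper additionally makes steps along the local structure direction cheaper than steps across it.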

  9. Semi-automatic recognition of marine debris on beaches

    NASA Astrophysics Data System (ADS)

    Ge, Zhenpeng; Shi, Huahong; Mei, Xuefei; Dai, Zhijun; Li, Daoji

    2016-05-01

An increasing amount of anthropogenic marine debris is pervading the Earth's environmental systems, posing an enormous threat to living organisms. Moreover, marine debris around the world has so far been surveyed mostly through tedious manual methods. Therefore, we propose the use of a new technique, light detection and ranging (LIDAR), for the semi-automatic recognition of marine debris on a beach, because it is substantially more efficient than other, more laborious methods. Our results revealed that LIDAR can be used to classify marine debris into plastic, paper, cloth and metal. Additionally, we reconstructed a 3-dimensional model of different types of debris on a beach, with high fidelity of the reconstructed debris, using LIDAR-based individual separation. These findings demonstrate that this new technique enables detailed observations of debris on a large beach that were previously not possible. We strongly suggest that LIDAR could be adopted by researchers and governments worldwide as a monitoring tool for marine debris.

  10. Conceptual design of semi-automatic wheelbarrow to overcome ergonomics problems among palm oil plantation workers

    NASA Astrophysics Data System (ADS)

    Nawik, N. S. M.; Deros, B. M.; Rahman, M. N. A.; Sukadarin, E. H.; Nordin, N.; Tamrin, S. B. M.; Bakar, S. A.; Norzan, M. L.

    2015-12-01

An ergonomics problem is one of the main issues faced by palm oil plantation workers, especially during harvesting and collecting of fresh fruit bunches (FFB). The intensive manual handling and labor activities involved have been associated with a high prevalence of musculoskeletal disorders (MSDs) among palm oil plantation workers. New and safe technology for machines and equipment in palm oil plantations is very important to help workers reduce risks and injuries while working. The aim of this research is to improve the design of a wheelbarrow so that it is suitable for workers and for small oil palm plantations. The wheelbarrow design was drawn using CATIA ergonomic features, and an ergonomics assessment was performed by comparison with the existing wheelbarrow design. The conceptual design was developed based on the problems reported by workers. From the analysis of these problems, a concept design of an ergonomic semi-automatic wheelbarrow, safe and suitable for palm oil plantation workers, was finally produced.

  11. Model reduction by trimming for a class of semi-Markov reliability models and the corresponding error bound

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Palumbo, Daniel L.

    1991-01-01

Semi-Markov processes have proved to be an effective and convenient tool for constructing models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a modeling and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs have been written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement, and the error bound is easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.

  12. High-frequency ultrasound M-mode monitoring of HIFU ablation in cardiac tissue

    NASA Astrophysics Data System (ADS)

    Kumon, R. E.; Gudur, M. S. R.; Zhou, Y.; Deng, C. X.

    2012-10-01

    Effective real-time HIFU lesion detection is important for expanded use of HIFU in interventional electrophysiology (e.g., epicardial ablation of cardiac arrhythmia). The goal of this study was to investigate rapid, high-frequency M-mode ultrasound imaging for monitoring spatiotemporal changes in tissue during HIFU application. The HIFU application (4.33 MHz, 1000 Hz PRF, 50% duty cycle, 1 s exposure, 6100 W/cm2) was perpendicularly applied to porcine cardiac tissue with a high-frequency imaging system (Visualsonics Vevo 770, 55 MHz, 4.5 mm focal distance) confocally aligned. Radiofrequency (RF) M-mode data (1 kHz PRF, 4 s × 7 mm) was acquired before, during, and after HIFU treatment. Gross lesions were compared with M-mode data to correlate lesion and cavity formation. Integrated backscatter, echo-decorrelation parameters, and their cumulative extrema over time were analyzed for automatically identifying lesion width and bubble formation. Cumulative maximum integrated backscatter showed the best results for identifying the final lesion width, and a criterion based on line-to-line decorrelation was proposed for identification of transient bubble activity.
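    Integrated backscatter per RF line and its cumulative maximum over time, two of the parameters analyzed above, can be sketched as follows (a simplification of the authors' RF analysis: depth windowing and calibration are omitted):

```python
import math

def integrated_backscatter_db(rf_line):
    """Integrated backscatter of one RF A-line: mean squared amplitude,
    expressed in decibels (relative, uncalibrated)."""
    power = sum(s * s for s in rf_line) / len(rf_line)
    return 10.0 * math.log10(power)

def cumulative_max(series):
    """Running maximum over an M-mode time series, used to track the
    largest change seen so far (e.g., for final lesion-width estimates)."""
    out, best = [], float("-inf")
    for v in series:
        best = max(best, v)
        out.append(best)
    return out
```

    Applying `cumulative_max` to the per-frame integrated backscatter gives a monotone trace whose final value reflects the strongest echogenicity change during HIFU exposure.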

  13. The Electronic McPhail Trap

    PubMed Central

    Potamitis, Ilyas; Rigakis, Iraklis; Fysarakis, Konstantinos

    2014-01-01

Certain insects affect cultivations in a detrimental way. A notable case is the olive fruit fly (Bactrocera oleae (Rossi)), which in Europe alone causes billions of euros in crop loss per year. Pests can be controlled with aerial and ground bait pesticide sprays, the efficiency of which depends on knowing the time and location of insect infestations as early as possible. The inspection of traps is currently carried out manually. Automatic monitoring traps can make the monitoring of flying pests more efficient by identifying and counting targeted pests as they enter the trap. This work deals with the hardware setup of an insect trap with an embedded optoelectronic sensor that automatically records insects as they fly into the trap. The sensor responsible for detecting the insect is an array of phototransistors receiving light from an infrared LED. The wing-beat recording is based on the interruption of the emitted light due to partial occlusion by the insect's wings as it flies in the trap. We show that the recordings are of high quality, paving the way for automatic recognition and transmission of insect detections from the field to a smartphone. This work emphasizes the hardware implementation of the sensor and the detection/counting module, giving all implementation details needed to construct it. PMID:25429412
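    A much-simplified stand-in for such a detection/counting module is a threshold-crossing counter over the digitised photosensor waveform (the real device records full wing-beat waveforms for later species recognition):

```python
def wingbeat_frequency(samples, fs, threshold=0.0):
    """Estimate wing-beat frequency from a photosensor waveform.

    Counts rising crossings of `threshold` in `samples` (one value per
    sample at rate `fs` in Hz) and divides by the recording duration.
    A toy sketch of occlusion-based detection, not the trap's firmware.
    """
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if a < threshold <= b)
    duration_s = len(samples) / fs
    return crossings / duration_s
```

    The same crossing counter, applied to the envelope of the signal rather than the raw waveform, can also serve to count individual fly-through events.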

  14. [Advances in automatic detection technology for images of thin blood film of malaria parasite].

    PubMed

    Juan-Sheng, Zhang; Di-Qiang, Zhang; Wei, Wang; Xiao-Guang, Wei; Zeng-Guo, Wang

    2017-05-05

This paper reviews computer vision and image analysis studies aiming at automated diagnosis or screening of malaria in microscope images of thin blood film smears. After introducing the background and significance of automatic detection technology, the existing detection technologies are summarized and divided into several steps, including image acquisition, pre-processing, morphological analysis, segmentation, counting, and pattern classification. The principles and implementation methods of each step are then given in detail. In addition, the promotion and application of automatic detection technology to thick blood film smears are put forward as questions worthy of study, and a perspective on future work toward automated microscopy diagnosis of malaria is provided.
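    In the simplest case, the segmentation and counting steps above reduce to thresholding followed by connected-component labelling; a stdlib-only sketch (real pipelines add staining normalisation and splitting of touching cells):

```python
from collections import deque

def count_blobs(image, threshold):
    """Count 4-connected foreground blobs (e.g., stained cells).

    `image` is a 2D list of intensities; pixels >= threshold are foreground.
    Each blob is flood-filled breadth-first so it is counted exactly once.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                count += 1
                queue = deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and image[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count
```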

  15. Mapping the acquisition of the number word sequence in the first year of school

    NASA Astrophysics Data System (ADS)

    Gould, Peter

    2017-03-01

    Learning to count and to produce the correct sequence of number words in English is not a simple process. In NSW government schools taking part in Early Action for Success, over 800 students in each of the first 3 years of school were assessed every 5 weeks over the school year to determine the highest correct oral count they could produce. Rather than displaying a steady increase in the accurate sequence of the number words produced, the kindergarten data reported here identified clear, substantial hurdles in the acquisition of the counting sequence. The large-scale, longitudinal data also provided evidence of learning to count through the teens being facilitated by the semi-regular structure of the number words in English. Instead of occurring as hurdles to starting the next counting sequence, number words corresponding to some multiples of ten (10, 20 and 100) acted as if they were rest points. These rest points appear to be artefacts of how the counting sequence is acquired.

  16. Analysis of Archaeological, Geological and Historical Artifacts and Documents Pertaining to the Midden Excavation at ’Woodville’ - Unit 4, James G. Fulton, Flood Protection Project, Chartiers Creek, Pennsylvania,

    DTIC Science & Technology

    1985-03-01

[Abstract not available; the record consists of table-of-contents fragments, including "Introduction to Analyzed Ceramics" by John Eddins, "Decorated Porcelain" by Noel Strattan (with photographs of selected porcelain patterns), sections on whiteware and semi-vitreous ceramics, and figures of potential date curves by count for Features 20, 21 and 25.]

  17. Single Polygon Counting on Cayley Tree of Order 3

    NASA Astrophysics Data System (ADS)

    Pah, Chin Hee

    2010-07-01

We showed that one form of generalized Catalan numbers is the solution to the problem of counting the distinct connected components with finitely many vertices that contain a fixed root in the semi-infinite Cayley tree of order 3. We give the formula for the full graph, the Cayley tree of order 3, which is derived from the generalized Catalan numbers. Using ratios of Gamma functions, two upper bounds are given for the problem defined on the semi-infinite Cayley tree of order 3 as well as for the full graph.
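    One standard family of "generalized Catalan numbers" is the Fuss-Catalan sequence, binom(pn, n)/((p-1)n + 1); whether this is exactly the form used in the paper is an assumption, but for p = 3 it counts ternary-tree-like structures of the kind that arise on an order-3 tree:

```python
from math import comb

def fuss_catalan(n, p=3):
    """Fuss-Catalan number: comb(p*n, n) // ((p-1)*n + 1).

    p = 2 recovers the ordinary Catalan numbers; the division is always
    exact, so integer floor division is safe here.
    """
    return comb(p * n, n) // ((p - 1) * n + 1)
```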

  18. The casting and mechanism of formation of semi-permeable polymer membranes in a microgravity environment

    NASA Astrophysics Data System (ADS)

    Vera, I.

The National Electric Company of Venezuela, C.A.D.A.F.E., is sponsoring the development of this experiment, which represents Venezuela's first scientific experiment in space. The apparatus for the automatic casting of polymer thin films will be contained in NASA's payload No. G-559 of the Get Away Special program for a future orbital space flight on the U.S. Space Shuttle. Semi-permeable polymer membranes have important applications in a variety of fields, such as medicine, energy, and pharmaceuticals, and in general fluid separation processes such as reverse osmosis, ultra-filtration, and electro-dialysis. The casting of semi-permeable membranes in space will help to identify the role of convection in determining the structure of these membranes.

  19. Bureau of Mines method of calibrating a primary radon-measuring apparatus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holub, R.R.; Stroud, W.P.

    1990-01-01

One important requirement for accurate monitoring of radon in working environments, dwellings, and outdoors is to ensure that the measurement instrumentation is properly calibrated against a recognized standard. To achieve this goal, the U.S. Department of the Interior Bureau of Mines (BoM) Radiation Laboratory has participated since 1983 in a program to establish international radon measurement standards. While the National Institute of Standards and Technology (NIST) radium solution ampules are acceptable to all participating laboratories as a primary standard, a method of transferring radon from the NIST source into each laboratory's primary counting apparatus is a critical problem. The Bureau's method transfers radon from the primary solution by bubbling 3 L of air through it into a steel cylinder. After homogenizing the radon concentration in the cylinder, eight alpha-scintillation cells are filled consecutively and measured in a standard counting system. The resulting efficiency is 81.7 ± 1.2%.

  20. Prior automatic posture and activity identification improves physical activity energy expenditure prediction from hip-worn triaxial accelerometry.

    PubMed

    Garnotel, M; Bastian, T; Romero-Ugalde, H M; Maire, A; Dugas, J; Zahariev, A; Doron, M; Jallon, P; Charpentier, G; Franc, S; Blanc, S; Bonnet, S; Simon, C

    2018-03-01

Accelerometry is increasingly used to quantify physical activity (PA) and related energy expenditure (EE). Linear regression models designed to derive PAEE from accelerometry counts have shown their limits, mostly due to the lack of consideration of the nature of activities performed. Here we tested whether a model coupling an automatic activity/posture recognition (AAR) algorithm with an activity-specific count-based model, developed in 61 subjects in laboratory conditions, improved PAEE and total EE (TEE) predictions from a hip-worn triaxial accelerometer (Actigraph GT3X+) in free-living conditions. Data from two independent subject groups of varying body mass index and age were considered: 20 subjects engaged in a 3-h urban circuit, with activity-by-activity reference PAEE from combined heart-rate and accelerometry monitoring (Actiheart); and 56 subjects involved in a 14-day trial, with PAEE and TEE measured using the doubly-labeled water method. PAEE was estimated from accelerometry using the activity-specific model coupled to the AAR algorithm (AAR model), a simple linear model (SLM), and equations provided by the companion software of the activity devices used (Freedson and Actiheart models). AAR-model predictions were in closer agreement with selected references than those from other count-based models, both for PAEE during the urban circuit (RMSE = 6.19 vs 7.90 for SLM and 9.62 kJ/min for Freedson) and for EE over the 14-day trial, reaching Actiheart performance in the latter (PAEE: RMSE = 0.93 vs. 1.53 for SLM, 1.43 for Freedson, 0.91 MJ/day for Actiheart; TEE: RMSE = 1.05 vs. 1.57 for SLM, 1.70 for Freedson, 0.95 MJ/day for Actiheart). Overall, the AAR model resulted in a 43% increase in the daily PAEE variance explained by accelerometry predictions. 
NEW & NOTEWORTHY Although triaxial accelerometry is widely used in free-living conditions to assess the impact of physical activity energy expenditure (PAEE) on health, its precision and accuracy are often debated. Here we developed and validated an activity-specific model which, coupled with an automatic activity-recognition algorithm, improved the variance explained by the predictions from accelerometry counts by 43% of daily PAEE compared with models relying on a simple relationship between accelerometry counts and EE.
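    The activity-specific count-based model described above pairs each recognised activity class with its own linear count-to-EE equation; a sketch with made-up placeholder coefficients (the study's calibrated values are not reproduced here):

```python
def paee_activity_specific(counts_per_min, activity, coeffs=None):
    """Activity-specific linear count model: PAEE = a * counts + b, with a
    separate (a, b) pair per recognised activity/posture class.

    The default coefficient values are illustrative placeholders only, not
    the calibrated coefficients from the study.
    """
    default_coeffs = {
        "sitting": (0.0001, 1.0),
        "walking": (0.0008, 2.0),
        "running": (0.0012, 4.0),
    }
    a, b = (coeffs or default_coeffs)[activity]
    return a * counts_per_min + b
```

    The point of the coupling is that an activity/posture recogniser first selects the class, so the same count value maps to different EE estimates when sitting versus walking.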

  1. Determination of land use in Minnesota by automatic interpretation of ERTS MSS data

    NASA Technical Reports Server (NTRS)

    Zirkle, R. E.; Pile, D. R.

    1973-01-01

    This program aims to determine the feasibility of identifying land use in Minnesota by automatic interpretation of ERTS-MSS data. Ultimate objectives include establishment of land use delineation and quantification by computer processing with a minimum of human operator interaction. This implies not only that reflectivity as a function of calendar time can be catalogued effectively, but also that the effects of uncontrolled variables can be identified and compensated. Clouds are the major uncontrollable data pollutant, so part of the initial effort is devoted to determining their effect and the construction of a model to help correct or justifiably ignore affected data. Other short range objectives are to identify and verify measurements giving results of importance to land managers. Lake-counting is a prominent example. Open water is easily detected in band 7 data with some support from either band 4 or band 5 to remove ambiguities. Land managers and conservationists commission studies periodically to measure water bodies and total water count within specified areas.
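    The band-7 water test with a band-4 consistency check described above can be sketched as a pixel-wise mask; the thresholds and area bookkeeping below are illustrative, not calibrated values:

```python
def water_area(band7, band4, t7, t4, pixel_area):
    """Total water area from two co-registered spectral bands.

    A pixel is counted as open water when its band-7 (near-IR) reflectance
    is below `t7`, confirmed by a band-4 reading below `t4` to remove
    ambiguities. `pixel_area` converts the pixel count to ground area.
    Thresholds here are hypothetical, not ERTS-calibrated values.
    """
    area = 0.0
    for row7, row4 in zip(band7, band4):
        for v7, v4 in zip(row7, row4):
            if v7 < t7 and v4 < t4:
                area += pixel_area
    return area
```

    Lake counting then amounts to connected-component labelling of the same mask, with each component one water body.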

  2. Refining comparative proteomics by spectral counting to account for shared peptides and multiple search engines

    PubMed Central

    Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J.; Li, Ming

    2013-01-01

    Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables. PMID:22552787
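    A toy rendering of the vote-counting combination (not the authors' exact model): every (engine, spectrum) identification contributes one vote to its peptide's count, so peptides confirmed by more engines accumulate more votes:

```python
from collections import Counter

def vote_spectral_counts(engine_results):
    """Combine spectrum identifications from several search engines.

    `engine_results` maps engine name -> {spectrum_id: peptide}. Each
    identification is one vote; a peptide identified for the same spectrum
    by two engines therefore receives two votes. This scales naturally as
    engines are added, which is the appeal of vote counting.
    """
    votes = Counter()
    for identifications in engine_results.values():
        for peptide in identifications.values():
            votes[peptide] += 1
    return votes
```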

  3. Refining comparative proteomics by spectral counting to account for shared peptides and multiple search engines.

    PubMed

    Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J; Li, Ming; Tabb, David L

    2012-09-01

    Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables.

  4. Novel semi-automated kidney volume measurements in autosomal dominant polycystic kidney disease.

    PubMed

    Muto, Satoru; Kawano, Haruna; Isotani, Shuji; Ide, Hisamitsu; Horie, Shigeo

    2018-06-01

    We assessed the effectiveness and convenience of a novel semi-automatic kidney volume (KV) measuring high-speed 3D-image analysis system, SYNAPSE VINCENT® (Fuji Medical Systems, Tokyo, Japan), for autosomal dominant polycystic kidney disease (ADPKD) patients. We developed novel semi-automated KV measurement software for patients with ADPKD, included in the imaging analysis software SYNAPSE VINCENT®. The software extracts renal regions using image recognition and measures KV (VINCENT KV). The algorithm was designed to work with the manual designation of a long axis of a kidney including cysts. After using the software to assess the predictive accuracy of the VINCENT method, we performed an external validation study and compared accurate KV and ellipsoid KV based on geometric modeling by linear regression analysis and Bland-Altman analysis. Median eGFR was 46.9 ml/min/1.73 m². Median accurate KV, VINCENT KV and ellipsoid KV were 627.7 ml, 619.4 ml (IQR 431.5-947.0) and 694.0 ml (IQR 488.1-1107.4), respectively. Compared with ellipsoid KV (r = 0.9504), VINCENT KV correlated strongly with accurate KV (r = 0.9968), without systematic underestimation or overestimation (ellipsoid KV: 14.2 ± 22.0%; VINCENT KV: -0.6 ± 6.0%). There were no significant slice-thickness-specific differences (p = 0.2980). The VINCENT method is an accurate and convenient semi-automatic method to measure KV in patients with ADPKD compared with the conventional ellipsoid method.
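    The conventional ellipsoid KV mentioned above follows from geometric modeling of the kidney as an ellipsoid; a minimal sketch, where the axis lengths are illustrative inputs rather than study data:

```python
import math

def ellipsoid_kv(length_cm, width_cm, depth_cm):
    """Ellipsoid approximation of kidney volume: V = (pi/6) * L * W * D.
    With all three axes in cm, the result is in ml."""
    return math.pi / 6.0 * length_cm * width_cm * depth_cm

# A hypothetical enlarged ADPKD kidney, 18 x 10 x 10 cm.
kv_ml = ellipsoid_kv(18.0, 10.0, 10.0)
```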

  5. Asteroid (21) Lutetia: Semi-Automatic Impact Craters Detection and Classification

    NASA Astrophysics Data System (ADS)

    Jenerowicz, M.; Banaszkiewicz, M.

    2018-05-01

    The need for an automated method, independent of lighting and surface conditions, for the identification and measurement of impact craters, and for a reliable and efficient tool, motivated our studies. This paper presents a methodology for the detection of impact craters based on their spectral and spatial features. The analysis aims at evaluating the algorithm's ability to determine the spatial parameters of impact craters presented in a time series, so that time-consuming visual interpretation of images is reduced to special cases. The developed algorithm is tested on a set of OSIRIS high-resolution images of the surface of asteroid Lutetia, which is characterized by varied landforms and an abundance of craters created by collisions with smaller bodies of the solar system. The proposed methodology consists of three main steps: characterisation of objects of interest on a limited set of data; semi-automatic extraction of impact craters performed on the total set of data by applying Mathematical Morphology image processing (Serra, 1988; Soille, 2003); and finally, creation of libraries of spatial and spectral parameters for the extracted impact craters, i.e. the coordinates of the crater center, semi-major and semi-minor axes, shadow length and cross-section. The overall accuracy of the proposed method is 98%, the Kappa coefficient is 0.84, the correlation coefficient is ∼0.80, the omission error is 24.11% and the commission error is 3.45%. The obtained results show that methods based on Mathematical Morphology operators are effective even with a limited number of data and low-contrast images.

  6. Drone transportation of blood products.

    PubMed

    Amukele, Timothy; Ness, Paul M; Tobian, Aaron A R; Boyd, Joan; Street, Jeff

    2017-03-01

    Small civilian unmanned aerial vehicles (drones) are a novel way to transport small goods. To the best of our knowledge there are no studies examining the impact of drone transport on blood products, describing approaches to maintaining temperature control, or reporting component physical characteristics during drone transport. Six leukoreduced red blood cell (RBC) and six apheresis platelet (PLT) units were split using sterile techniques. The larger parent RBC and PLT units, as well as six unthawed plasma units frozen within 24 hours of collection (FP24), were placed in a cooler, attached to the drone, and flown for up to 26.5 minutes with temperature logging. Ambient temperatures during the experimental window ranged between -1 and 18°C across 2 days. The difference between the ambient and unit temperatures was approximately 20°C for PLT and FP24 units. After flight, the RBC parent units were centrifuged and visually checked for hemolysis; the PLTs were checked for changes in mean PLT volumes (MPVs), pH, and PLT count; and the frozen air bubbles on the back of the FP24 units were examined for any changes in size or shape as evidence of thawing. There was no evidence of RBC hemolysis; no significant changes in PLT count, pH, or MPVs; and no changes in the FP24 bubbles. The temperature of all units was maintained during transport and flight. There was no adverse impact of drone transport on RBC, PLT, or FP24 units. These findings suggest that drone transportation systems are a viable option for the transportation of blood products. © 2016 AABB.

  7. Clustering method for counting passengers getting in a bus with single camera

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Zhang, Yanning; Shao, Dapei; Li, Ying

    2010-03-01

    Automatic counting of passengers is very important for both business and security applications. We present a single-camera-based vision system that is able to count passengers in a highly crowded situation at the entrance of a traffic bus. The unique characteristics of the proposed system include: first, a novel feature-point-tracking and online-clustering-based passenger counting framework, which performs much better than background-modeling- and foreground-blob-tracking-based methods; second, a simple and highly accurate clustering algorithm that projects the high-dimensional feature-point trajectories into a 2-D feature space by their appearance and disappearance times and counts the number of people through online clustering; finally, all test video sequences in the experiment were captured from a real traffic bus in Shanghai, China. The results show that the system can process two 320×240 video sequences at a frame rate of 25 fps simultaneously, and can count passengers reliably in various difficult scenarios with complex interaction and occlusion among people. The method achieves accuracy rates of up to 96.5%.
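    The projection-and-clustering step described above can be sketched as follows; the greedy time-tolerance rule is an illustrative stand-in for the paper's online clustering algorithm, not its actual implementation:

```python
def count_passengers(trajectories, tol=0.5):
    """Project each tracked feature point to its (appearance, disappearance)
    time pair and greedily merge pairs that agree within tol seconds; the
    number of resulting clusters is taken as the passenger count."""
    clusters = []  # one representative (appear, disappear) pair per person
    for appear, disappear in trajectories:
        for rep in clusters:
            if abs(rep[0] - appear) < tol and abs(rep[1] - disappear) < tol:
                break  # feature point belongs to an existing passenger
        else:
            clusters.append((appear, disappear))
    return len(clusters)

# Three feature-point trajectories: two from one passenger, one from another.
n = count_passengers([(0.0, 2.0), (0.1, 2.1), (3.0, 5.0)])
```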

  8. Automated detection of retinal landmarks for the identification of clinically relevant regions in fundus photography

    NASA Astrophysics Data System (ADS)

    Ometto, Giovanni; Calivá, Francesco; Al-Diri, Bashir; Bek, Toke; Hunter, Andrew

    2016-03-01

    Automatic, quick and reliable identification of retinal landmarks from fundus photography is key for measurements used in research, diagnosis, screening and treatment of common diseases affecting the eyes. This study presents a fast method for the detection of the centre of mass of the vascular arcades, the optic nerve head (ONH) and the fovea, used in the definition of five clinically relevant areas in use in screening programmes for diabetic retinopathy (DR). Thirty-eight fundus photographs showing 7203 DR lesions were analysed to find the landmarks, manually by two retina experts and automatically by the proposed method. The automatic identification of the ONH and fovea was performed using template matching based on normalised cross-correlation. The centre of mass of the arcades was obtained by fitting an ellipse to sample coordinates of the main vessels; the coordinates were obtained by processing the image with Hessian filtering, followed by shape analyses, and finally sampling the results. The regions obtained manually and automatically were used to count the retinal lesions falling within them, and so to evaluate the method. 92.7% of the lesions fell within the same regions based on the landmarks selected by the two experts; 91.7% and 89.0% were counted in the same areas identified by the method and by the first and second expert, respectively. The inter-repeatability of the proposed method and the experts is comparable, while its 100% intra-repeatability makes the algorithm a valuable tool for tasks such as real-time analyses, analyses of large datasets, and analyses of intra-patient variability.
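    Template matching by normalised cross-correlation, as used above for the ONH and fovea, can be sketched in textbook form; this is a minimal zero-mean NCC search, not the authors' implementation:

```python
import numpy as np

def ncc_match(image, template):
    """Slide template over image and return the (row, col) position of the
    best zero-mean normalized cross-correlation score."""
    th, tw = template.shape
    t = template.astype(float) - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw].astype(float)
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * tnorm
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```

An exact copy of the template embedded in an otherwise flat image scores 1.0 at its true location, which is why NCC is robust to uniform brightness changes.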

  9. Automated high-performance cIMT measurement techniques using patented AtheroEdge™: a screening and home monitoring system.

    PubMed

    Molinari, Filippo; Meiburger, Kristen M; Suri, Jasjit

    2011-01-01

    The evaluation of the carotid artery wall is fundamental for the assessment of cardiovascular risk. This paper presents the general architecture of an automatic strategy that segments the lumen-intima and media-adventitia borders, classified under a class of patented AtheroEdge™ systems (Global Biomedical Technologies, Inc., CA, USA). Guidelines for producing accurate and repeatable measurements of the intima-media thickness are provided, and the problem of the different distance metrics one can adopt is addressed. We compared the results of a completely automatic algorithm that we developed with those of a semi-automatic algorithm, and showed final segmentation results for both techniques. The overall rationale is to provide user-independent, high-performance techniques suitable for screening and remote monitoring.

  10. Rapid and automated enumeration of viable bacteria in compost using a micro-colony auto counting system.

    PubMed

    Wang, Xiaodan; Yamaguchi, Nobuyasu; Someya, Takashi; Nasu, Masao

    2007-10-01

    The micro-colony method was used to enumerate viable bacteria in composts. Cells were vacuum-filtered onto polycarbonate filters and incubated for 18 h on LB medium at 37 degrees C. Bacteria on the filters were stained with SYBR Green II, and enumerated using a newly developed micro-colony auto counting system which can automatically count micro-colonies on half the area of the filter within 90 s. A large number of bacteria in samples retained physiological activity and formed micro-colonies within 18 h, whereas most could not form large colonies on conventional media within 1 week. The results showed that this convenient technique can enumerate viable bacteria in compost rapidly for its efficient quality control.

  11. A Semi-Automatic Approach to Construct Vietnamese Ontology from Online Text

    ERIC Educational Resources Information Center

    Nguyen, Bao-An; Yang, Don-Lin

    2012-01-01

    An ontology is an effective formal representation of knowledge used commonly in artificial intelligence, semantic web, software engineering, and information retrieval. In open and distance learning, ontologies are used as knowledge bases for e-learning supplements, educational recommenders, and question answering systems that support students with…

  12. Semi-Automatic Assembly of Learning Resources

    ERIC Educational Resources Information Center

    Verbert, K.; Ochoa, X.; Derntl, M.; Wolpers, M.; Pardo, A.; Duval, E.

    2012-01-01

    Technology Enhanced Learning is a research field that has matured considerably over the last decade. Many technical solutions to support design, authoring and use of learning activities and resources have been developed. The first datasets that reflect the tracking of actual use of these tools in real-life settings are beginning to become…

  13. 78 FR 37520 - Order Denying Export Privileges

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... DEPARTMENT OF COMMERCE Bureau of Industry and Security Order Denying Export Privileges In the... Molina, Jr. (``Molina'') was convicted of violating Section 38 of the Arms Export Control Act (22 U.S.C... attempting to export and causing to be exported from the United States to Mexico two AK47 semi-automatic...

  14. Semi-Automatic Determination of Citation Relevancy: User Evaluation.

    ERIC Educational Resources Information Center

    Huffman, G. David

    1990-01-01

    Discussion of online bibliographic database searches focuses on a software system, SORT-AID/SABRE, that ranks retrieved citations in terms of relevance. Results of a comprehensive user evaluation of the relevance ranking procedure to determine its effectiveness are presented, and implications for future work are suggested. (10 references) (LRW)

  15. A video-based real-time adaptive vehicle-counting system for urban roads.

    PubMed

    Liu, Fei; Zeng, Zhiyuan; Jiang, Rong

    2017-01-01

    In developing nations, many expanding cities are facing challenges that result from the overwhelming numbers of people and vehicles. Collecting real-time, reliable and precise traffic flow information is crucial for urban traffic management. The main purpose of this paper is to develop an adaptive model that can assess the real-time vehicle counts on urban roads using computer vision technologies. This paper proposes an automatic real-time background update algorithm for vehicle detection and an adaptive pattern for vehicle counting based on the virtual loop and detection line methods. In addition, a new robust detection method is introduced to monitor the real-time traffic congestion state of a road section. A prototype system has been developed and installed on an urban road for testing. The results show that the system is robust, with a real-time counting accuracy exceeding 99% in most field scenarios.
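    The adaptive background update at the heart of such systems is commonly a running average; a generic sketch under that assumption (the learning rate and threshold are illustrative, not the paper's tuned values):

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background update: each pixel drifts toward the
    current frame, so slow lighting changes are absorbed into the model
    while fast-moving vehicles remain distinct."""
    return (1 - alpha) * background + alpha * frame.astype(float)

def foreground_mask(background, frame, threshold=30):
    """Candidate vehicle pixels are those differing from the background
    by more than threshold gray levels."""
    return np.abs(frame.astype(float) - background) > threshold
```

In a virtual-loop counter, the mask would then be sampled only inside the loop region, and a count registered when the occupied fraction crosses a detection line.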

  16. A video-based real-time adaptive vehicle-counting system for urban roads

    PubMed Central

    2017-01-01

    In developing nations, many expanding cities are facing challenges that result from the overwhelming numbers of people and vehicles. Collecting real-time, reliable and precise traffic flow information is crucial for urban traffic management. The main purpose of this paper is to develop an adaptive model that can assess the real-time vehicle counts on urban roads using computer vision technologies. This paper proposes an automatic real-time background update algorithm for vehicle detection and an adaptive pattern for vehicle counting based on the virtual loop and detection line methods. In addition, a new robust detection method is introduced to monitor the real-time traffic congestion state of a road section. A prototype system has been developed and installed on an urban road for testing. The results show that the system is robust, with a real-time counting accuracy exceeding 99% in most field scenarios. PMID:29135984

  17. Active transportation monitoring plan : initial evaluation of bicycling and walking baseline & planned traffic counts through 2015.

    DOT National Transportation Integrated Search

    2011-09-06

    Active Transportation in the form of bicycle and pedestrian traffic, is monitored for the Austin-Round Rock-San Marcos, TX region by the Capital Area Metropolitan Planning Organization using a variety of methods: automatic, manual and surveyed. This ...

  18. An ERTS-1 investigation for Lake Ontario and its basin

    NASA Technical Reports Server (NTRS)

    Polcyn, F. C.; Falconer, A. (Principal Investigator); Wagner, T. W.; Rebel, D. L.

    1975-01-01

    The author has identified the following significant results. Methods of manual, semi-automatic, and automatic (computer) data processing were evaluated, as were the requirements for spatial physiographic and limnological information. The coupling of specially processed ERTS data with simulation models of the watershed precipitation/runoff process provides potential for water resources management. Optimal and full use of the data requires a mix of data processing and analysis techniques, including single band editing, two band ratios, and multiband combinations. A combination of maximum likelihood ratio and near-IR/red band ratio processing was found to be particularly useful.

  19. Public knowledge of how to use an automatic external defibrillator in out-of-hospital cardiac arrest in Hong Kong.

    PubMed

    Fan, K L; Leung, L P; Poon, H T; Chiu, H Y; Liu, H L; Tang, W Y

    2016-12-01

    The survival rate of out-of-hospital cardiac arrest in Hong Kong is low. A long delay between collapse and defibrillation is a contributing factor. Public access to defibrillation may shorten this delay. It is unknown, however, whether Hong Kong's public is willing or able to use an automatic external defibrillator. This study aimed to evaluate public knowledge of how to use an automatic external defibrillator in out-of-hospital cardiac arrest. A face-to-face semi-structured questionnaire survey of the public was conducted in six locations with a high pedestrian flow in Hong Kong. In this study, 401 members of the public were interviewed. Most had no training in first aid (65.8%) or in use of an automatic external defibrillator (85.3%). Nearly all (96.5%) would call for help for a victim of out-of-hospital cardiac arrest but only 18.0% would use an automatic external defibrillator. Public knowledge of automatic external defibrillator use was low: 77.6% did not know the location of an automatic external defibrillator in the vicinity of their home or workplace. People who had ever been trained in both first aid and use of an automatic external defibrillator were more likely to respond to and help a victim of cardiac arrest, and to use an automatic external defibrillator. Public knowledge of automatic external defibrillator use is low in Hong Kong. A combination of training in first aid and in the use of an automatic external defibrillator is better than either one alone.

  20. Rapid enumeration of viable bacteria by image analysis

    NASA Technical Reports Server (NTRS)

    Singh, A.; Pyle, B. H.; McFeters, G. A.

    1989-01-01

    A direct viable counting method for enumerating viable bacteria was modified and made compatible with image analysis. A comparison was made between viable cell counts determined by the spread plate method and direct viable counts obtained using epifluorescence microscopy either manually or by automatic image analysis. Cultures of Escherichia coli, Salmonella typhimurium, Vibrio cholerae, Yersinia enterocolitica and Pseudomonas aeruginosa were incubated at 35 degrees C in a dilute nutrient medium containing nalidixic acid. Filtered samples were stained for epifluorescence microscopy and analysed manually as well as by image analysis. Cells enlarged after incubation were considered viable. The viable cell counts determined using image analysis were higher than those obtained by either the direct manual count of viable cells or spread plate methods. The volume of sample filtered or the number of cells in the original sample did not influence the efficiency of the method. However, the optimal concentration of nalidixic acid (2.5-20 micrograms ml-1) and length of incubation (4-8 h) varied with the culture tested. The results of this study showed that under optimal conditions, the modification of the direct viable count method in combination with image analysis microscopy provided an efficient and quantitative technique for counting viable bacteria in a short time.
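    The "enlarged cells are viable" criterion above can be sketched as a simple area threshold; the growth factor here is an illustrative assumption, not the study's calibrated value:

```python
def count_viable(areas_before, areas_after, growth_factor=1.5):
    """Count cells as viable when their measured area after incubation
    with nalidixic acid exceeds growth_factor times the mean
    pre-incubation area (elongated/enlarged cells are metabolically
    active; nalidixic acid blocks division so they grow without splitting)."""
    baseline = sum(areas_before) / len(areas_before)
    return sum(1 for a in areas_after if a > growth_factor * baseline)

# Hypothetical image-analysis areas: two of three cells enlarged.
viable = count_viable([1.0, 1.0, 1.0], [1.0, 2.0, 1.6])
```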

  1. Improved constraints on the estimated size and volatile content of the Mount St. Helens magma system from the 2004-2008 history of dome growth and deformation

    USGS Publications Warehouse

    Mastin, Larry G.; Lisowski, Mike; Roeloffs, Evelyn; Beeler, Nick

    2009-01-01

    The history of dome growth and geodetic deflation during the 2004-2008 Mount St. Helens eruption can be fit to theoretical curves with parameters such as reservoir volume, bubble content, initial overpressure, and magma rheology, here assumed to be Newtonian viscous, with or without a solid plug in the conduit center. Data from 2004-2008 are consistent with eruption from a 10-25 km3 reservoir containing 0.5-2% bubbles, an initial overpressure of 10-20 MPa, and no significant, sustained recharge. During the eruption we used curve fits to project the eruption's final duration and volume. Early projections predicted a final volume only about half of the actual value; but projections increased with each measurement, implying a temporal increase in reservoir volume or compressibility. A simple interpretation is that early effusion was driven by a 5-10 km3, integrated core of fluid magma. This core expanded with time through creep of semi-solid magma and host rock.
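    In the simplest closed-reservoir case, the theoretical curves referred to above reduce to an exponential approach to the final erupted volume; a first-order sketch under that assumption, not the authors' full viscous-conduit model:

```python
import math

def erupted_volume(t, v_final, tau):
    """First-order effusion model: eruption rate is proportional to the
    remaining reservoir overpressure, which falls linearly with erupted
    volume, giving V(t) = V_final * (1 - exp(-t / tau)). An early fit to
    such a curve underestimates V_final if the effective reservoir volume
    or compressibility grows with time, as the abstract infers."""
    return v_final * (1.0 - math.exp(-t / tau))
```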

  2. Improved constraints on the estimated size and volatile content of the Mount St. Helens magma system from the 2004-2008 history of dome growth and deformation

    USGS Publications Warehouse

    Mastin, L.G.; Lisowski, M.; Roeloffs, E.; Beeler, N.

    2009-01-01

    The history of dome growth and geodetic deflation during the 2004-2008 Mount St. Helens eruption can be fit to theoretical curves with parameters such as reservoir volume, bubble content, initial overpressure, and magma rheology, here assumed to be Newtonian viscous, with or without a solid plug in the conduit center. Data from 2004-2008 are consistent with eruption from a 10-25 km3 reservoir containing 0.5-2% bubbles, an initial overpressure of 10-20 MPa, and no significant, sustained recharge. During the eruption we used curve fits to project the eruption's final duration and volume. Early projections predicted a final volume only about half of the actual value; but projections increased with each measurement, implying a temporal increase in reservoir volume or compressibility. A simple interpretation is that early effusion was driven by a 5-10 km3, integrated core of fluid magma. This core expanded with time through creep of semi-solid magma and host rock. Copyright 2009 by the American Geophysical Union.

  3. Heat transfer and hydrodynamic investigations of a baffled slurry bubble column

    NASA Astrophysics Data System (ADS)

    Saxena, S. C.; Chen, Z. D.

    1992-09-01

    Heat transfer and hydrodynamic investigations have been conducted in a 0.108 m internal diameter bubble column at ambient conditions. The column is equipped with seven 19 mm diameter tubes arranged in an equilateral triangular pitch of 36.5 mm. A Monsanto synthetic heat transfer fluid, Therminol-66, having a viscosity of 39.8 cP at 303 K, is used as the liquid medium. Magnetite powders of average diameters 27.7 and 36.6 µm, in five concentrations up to 50 weight percent in the slurry, are used. As the gas phase, industrial-grade nitrogen of 99.6 percent purity is employed. Gas holdup in different operating modes and regimes has been measured for the two- and three-phase systems over a superficial gas velocity range up to 0.20 m/s in the semi-batch mode. Heat transfer coefficients are measured at different tube locations in the bundle and at different radial and vertical positions over a range of operating conditions. All these data are compared with existing literature correlations and models, and new correlations are proposed.

  4. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    PubMed

    Torney, Colin J; Dobson, Andrew P; Borner, Felix; Lloyd-Jones, David J; Moyer, David; Maliti, Honori T; Mwita, Machoke; Fredrick, Howard; Borner, Markus; Hopcraft, J Grant C

    2016-01-01

    Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck, since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per-image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over- or under-count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research, and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future.
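    A minimal example of a rotation-invariant descriptor of the kind combined with machine learning above is the first Hu moment; this sketch illustrates the idea only and is not the authors' feature set:

```python
import numpy as np

def rotation_invariant_descriptor(patch):
    """First Hu moment, eta20 + eta02, of an image patch: a quantity that
    is unchanged when the patch is rotated, so an animal produces the same
    feature value regardless of its orientation in the aerial image."""
    patch = patch.astype(float)
    m00 = patch.sum()
    if m00 == 0:
        return 0.0
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    xbar = (xs * patch).sum() / m00
    ybar = (ys * patch).sum() / m00
    mu20 = (((xs - xbar) ** 2) * patch).sum()
    mu02 = (((ys - ybar) ** 2) * patch).sum()
    # normalized central moments: eta_pq = mu_pq / m00**((p+q)/2 + 1)
    return (mu20 + mu02) / m00 ** 2
```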

  5. Investigation and classification of spume droplets production mechanisms at hurricane winds

    NASA Astrophysics Data System (ADS)

    Troitskaya, Yuliya; Kandaurov, Alexander; Ermakova, Olga; Kozlov, Dmitry; Sergeev, Daniil; Zilitinkevich, Sergey

    2016-04-01

    Sea sprays are a typical element of the marine atmospheric boundary layer with important environmental effects. Significant uncertainties remain in estimates of these effects owing to insufficient knowledge of the sea spray generation function; the reasons are the difficulty of direct measurements and insufficient knowledge of the mechanisms of spume droplet formation. This study concerns laboratory experiments, performed at the high-speed wind-wave flume of IAP RAS, to identify the mechanisms by which a strong wind tears water off the crests of waves. To obtain statistics for the surface events leading to spray generation, high-speed video filming was performed using horizontal and vertical shadow methods at rates of up to 10,000 fps over a wide range of wind speeds (20-35 m/s). A classification of the phenomena responsible for the generation of spume droplets was made. For friction velocities from 0.8 to 1.5 m/s, the generation of spume droplets was observed to be caused by three types of local phenomena: breaking of "projections" (see e.g. [1]), bursting of submerged bubbles [2, 3], and bag breakup, which begins with the growth of a small-scale elevation of the surface into a small "sail" that inflates into a water film bordered by a thicker rim and finally blows up, so that droplets are produced by rupture of the water film and fragmentation of the rim (the first report of this mechanism of spume droplet production, similar to the bag-breakup regime, was made in [4]). Statistical analysis of the number of these phenomena at different wind speeds showed that bag breakup is the major mechanism of spume droplet generation at strong and hurricane winds. Statistical distributions of the geometrical parameters of the observed "bags" at different airflow velocities were retrieved from the video records using specially developed software allowing semi-automatic registration of image features.
Acknowledgements: The work was supported by RFBR (Projects No. 16-05-00839, 15-35-20953, 14-05-91767); Yu. Troitskaya, D. Sergeev and A. Kandaurov were partially supported by FP7 collaborative project No. 612610; experimental studies of spray generation mechanisms were supported by the Russian Science Foundation (Grant No. 15-17-20009); post-processing was supported by the Russian Science Foundation (Grant No. 14-17-00667). References: 1. Koga M. Direct production of droplets from breaking wind-waves - its observation by a multi-colored overlapping exposure photographing technique. Tellus. 1981. V.33, Issue 6. P. 552-563. 2. Blanchard D.C. The electrification of the atmosphere by particles from bubbles in the sea. Progr. Oceanogr. 1963. V.1. P. 71-202. 3. Spiel D.E. On the birth of jet drops from bubbles bursting on water surfaces. J. Geophys. Res. 1995. V.100. P. 4995-5006. 4. Villermaux E. Fragmentation. Annu. Rev. Fluid Mech. 2007. V.39. P. 419-446.

  6. Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography.

    PubMed

    Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T

    2013-12-01

    Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged, and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of the algorithms devised to detect and quantify the coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with expert's manual annotation. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards are described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second-reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Semi-automatic forensic approach using mandibular midline lingual structures as fingerprint: a pilot study.

    PubMed

    Shaheen, E; Mowafy, B; Politis, C; Jacobs, R

    2017-12-01

    Previous research proposed the use of the mandibular midline neurovascular canal structures as a forensic fingerprint; in that observer study, an average correct identification rate of 95% was reached, which triggered this study. The aim here is to present a semi-automatic computer recognition approach to replace the observers and to validate the accuracy of this newly proposed method. Imaging data from computed tomography (CT) and cone beam computed tomography (CBCT) of mandibles scanned at two different moments were collected to simulate AM and PM situations, where the first scan represented AM and the second scan was used to simulate PM. Ten cases with 20 scans were used to build a classifier which relies on voxel-based matching and classifies each case into one of two groups: "Unmatched" and "Matched". This protocol was then tested using five other scans from the database. Unpaired t-testing was applied and the accuracy of the computerized approach was determined. A significant difference was found between the "Unmatched" and "Matched" classes, with means of 0.41 and 0.86 respectively. Furthermore, the testing phase showed an accuracy of 100%. The validation of this method pushes the protocol further towards a fully automatic victim identification procedure based on the mandibular midline canal structures alone, in cases where AM and PM CBCT/CT data are available.
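    The resulting two-class decision can be sketched as a threshold on the voxel-matching similarity score; the midpoint threshold below is derived from the class means reported in the abstract (0.41 and 0.86) purely for illustration, and is not the authors' trained classifier:

```python
def classify_match(similarity, threshold=0.635):
    """Assign a voxel-based matching similarity score to 'Matched' or
    'Unmatched'. The default threshold is simply the midpoint of the two
    reported class means, chosen for illustration."""
    return "Matched" if similarity >= threshold else "Unmatched"
```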

  8. Parametric study of flow patterns behind the standing accretion shock wave for core-collapse supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwakami, Wakana; Nagakura, Hiroki; Yamada, Shoichi, E-mail: wakana@heap.phys.waseda.ac.jp

    2014-05-10

    In this study, we conduct three-dimensional hydrodynamic simulations systematically to investigate the flow patterns behind the accretion shock waves that are commonly formed in the post-bounce phase of core-collapse supernovae. Adding small perturbations to spherically symmetric, steady, shocked accretion flows, we compute the subsequent evolutions to find what flow pattern emerges as a consequence of hydrodynamical instabilities such as convection and standing accretion shock instability for different neutrino luminosities and mass accretion rates. Depending on these two controlling parameters, various flow patterns are indeed realized. We classify them into three basic patterns and two intermediate ones; the former includes sloshing motion (SL), spiral motion (SP), and multiple buoyant bubble formation (BB); the latter consists of spiral motion with buoyant-bubble formation (SPB) and spiral motion with pulsationally changing rotational velocities (SPP). Although the post-shock flow is highly chaotic, there is a clear trend in the pattern realization. The sloshing and spiral motions tend to be dominant for high accretion rates and low neutrino luminosities, and multiple buoyant bubbles prevail for low accretion rates and high neutrino luminosities. It is interesting that the dominant pattern is not always identical between the semi-nonlinear and nonlinear phases near the critical luminosity; the intermediate cases are realized in the latter case. Running several simulations with different random perturbations, we confirm that the realization of flow pattern is robust in most cases.

  9. Manufacturing polymer thin films in a micro-gravity environment

    NASA Technical Reports Server (NTRS)

    Vera, Ivan

    1987-01-01

    This project represents Venezuela's first scientific experiment in space. The apparatus for the automatic casting of two polymer thin films will be contained in NASA's Payload No. G-559 of the Get Away Special program for a future orbital space flight on the U.S. Space Shuttle. Semi-permeable polymer membranes have important applications in a variety of fields, such as medicine, energy, and pharmaceuticals, and in fluid separation processes in general, such as reverse osmosis, ultrafiltration, and electrodialysis. The casting of semi-permeable membranes in space will help to identify the role of convection in determining the structure of these membranes.

  10. Integration of the shallow water equations on the sphere using a vector semi-Lagrangian scheme with a multigrid solver

    NASA Technical Reports Server (NTRS)

    Bates, J. R.; Semazzi, F. H. M.; Higgins, R. W.; Barros, Saulo R. M.

    1990-01-01

    A vector semi-Lagrangian semi-implicit two-time-level finite-difference integration scheme for the shallow water equations on the sphere is presented. A C-grid is used for the spatial differencing. The trajectory-centered discretization of the momentum equation in vector form eliminates pole problems and, at comparable cost, gives greater accuracy than a previous semi-Lagrangian finite-difference scheme which used a rotated spherical coordinate system. In terms of the insensitivity of the results to increasing timestep, the new scheme is as successful as recent spectral semi-Lagrangian schemes. In addition, the use of a multigrid method for solving the elliptic equation for the geopotential allows efficient integration with an operation count which, at high resolution, is of lower order than in the case of the spectral models. The properties of the new scheme should allow finite-difference models to compete with spectral models more effectively than has previously been possible.
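
    The essence of a two-time-level semi-Lagrangian step — trace each grid point backward along the flow and interpolate the field at the departure point — can be sketched in one dimension. The paper's scheme is vector-valued on a spherical C-grid with semi-implicit coupling and a multigrid elliptic solver; this toy omits all of that and uses a constant velocity with linear interpolation:

```python
def semi_lagrangian_step(f, u, dt, dx):
    """One two-time-level semi-Lagrangian advection step on a 1-D
    periodic grid: trace each grid point back along the (constant)
    velocity u and linearly interpolate at the departure point."""
    n = len(f)
    out = []
    for i in range(n):
        xd = i - u * dt / dx      # departure point, in grid units
        j = int(xd // 1)          # lower neighbour index (floor)
        w = xd - j                # interpolation weight in [0, 1)
        out.append((1 - w) * f[j % n] + w * f[(j + 1) % n])
    return out

# A pulse advected one full period returns to its start; note the
# step is stable even at CFL = 2, the scheme's key property.
f0 = [1.0 if i == 4 else 0.0 for i in range(8)]
f = f0
for _ in range(4):
    f = semi_lagrangian_step(f, u=2.0, dt=1.0, dx=1.0)  # CFL = 2
print(f)  # pulse displaced by 8 cells = one full period, back at i == 4
```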

  11. Calibration of automatic performance measures - speed and volume data : volume 1, evaluation of the accuracy of traffic volume counts collected by microwave sensors.

    DOT National Transportation Integrated Search

    2015-09-01

    Over the past few years, the Utah Department of Transportation (UDOT) has developed a system called the Signal Performance Metrics System (SPMS) to evaluate the performance of signalized intersections. This system currently provides data summarie...

  12. An automatic multi-atlas prostate segmentation in MRI using a multiscale representation and a label fusion strategy

    NASA Astrophysics Data System (ADS)

    Álvarez, Charlens; Martínez, Fabio; Romero, Eduardo

    2015-01-01

    Pelvic magnetic resonance images (MRI) are used in prostate cancer radiotherapy (RT) as part of radiation planning. Modern protocols require a manual delineation, a tedious and variable activity that may take about 20 minutes per patient, even for trained experts. That considerable time is an important workflow burden in most radiological services. Automatic or semi-automatic methods might improve efficiency by decreasing measurement times while conserving the required accuracy. This work presents a fully automatic atlas-based segmentation strategy that selects the most similar templates for a new MRI using a robust multi-scale SURF analysis. A new segmentation is then achieved by a linear combination of the selected templates, which are previously non-rigidly registered towards the new image. The proposed method shows reliable segmentations, obtaining an average DICE coefficient of 79% when compared with the expert manual segmentation, under a leave-one-out scheme with the training database.
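
    The fusion step — a linear (weighted) combination of the selected, registered templates — and the DICE coefficient used for validation can be sketched as follows. The masks, weights and 0.5 vote threshold are illustrative assumptions; the paper's SURF-based template selection and non-rigid registration are omitted:

```python
def fuse_labels(atlases, weights, threshold=0.5):
    """Weighted-vote label fusion of registered binary atlas masks,
    a simple form of the linear combination of selected templates."""
    total = sum(weights)
    fused = []
    for i in range(len(atlases[0])):
        vote = sum(w * a[i] for a, w in zip(atlases, weights)) / total
        fused.append(1 if vote >= threshold else 0)
    return fused

def dice(a, b):
    """DICE coefficient between two binary masks."""
    inter = sum(x and y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))

# Three hypothetical registered atlas masks and an expert reference.
atlases = [[0, 1, 1, 1, 0], [0, 1, 1, 0, 0], [0, 0, 1, 1, 0]]
weights = [0.5, 0.3, 0.2]      # e.g. similarity scores of each template
expert  = [0, 1, 1, 1, 0]
seg = fuse_labels(atlases, weights)
print(seg, dice(seg, expert))  # fused mask matches the expert here (Dice = 1.0)
```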

  13. Association between right-to-left shunts and brain lesions in sport divers.

    PubMed

    Gerriets, Tibo; Tetzlaff, Kay; Hutzelmann, Alfred; Liceni, Thomas; Kopiske, Gerrit; Struck, Niklas; Reuter, Michael; Kaps, Manfred

    2003-10-01

    Recent studies suggest that healthy sport divers may develop clinically silent brain damage, based on the association between a finding of multiple brain lesions on MRI and the presence of right-to-left shunt, a pathway for venous gas bubbles to enter the arterial system. We performed echocontrast transcranial Doppler sonography in 42 sport divers to determine the presence of a right-to-left shunt. Cranial MRI was carried out using a 1.5 T magnet. A lesion was counted if it was hyperintense on both T2-weighted and T2-weighted fluid attenuated inversion recovery sequences. To test the hypothesis that the occurrence of postdive arterial gas emboli is related to brain lesions on MRI, we measured postdive intravascular bubbles in a subset of 15 divers 30 min after open water scuba dives. Echocontrast transcranial Doppler sonography revealed a right-to-left shunt in 16 of the divers (38%). Only one hyperintense lesion of the central white matter was found and that was in a diver with no evidence of a right-to-left shunt. Postdive arterial gas emboli were detected in 3 out of 15 divers; they had a right-to-left shunt, but no pathologic findings on cranial magnetic resonance imaging. Our data support the theory that right-to-left shunts can serve as a pathway for venous gas bubbles into the arterial circulation. However, we could not confirm an association between brain lesions and the presence of a right-to-left shunt in sport divers.

  14. Further evidence of gaseous embolic material in patients with artificial heart valves.

    PubMed

    Georgiadis, D; Baumgartner, R W; Karatschai, R; Lindner, A; Zerkowski, H R

    1998-04-01

    We undertook this study to evaluate the hypothesis that most microembolic signals in patients with artificial heart valves are gaseous, assuming that microemboli counts in cerebral arteries would progressively decline with increasing distance from the generating heart valve. A total of 10 outpatients with CarboMedics (Sulzer Carbomedics Inc., n = 5) and ATS prosthetic heart valves (n = 5) in the aortic (n = 8), mitral (n = 1), or both aortic and mitral positions (n = 1) were recruited. Monitoring was performed simultaneously over the middle and anterior cerebral arteries and the common carotid artery for 30 minutes with the 2-MHz transducers of a color duplex scanner (common carotid artery) and pulsed-wave Doppler ultrasonography (intracranial arteries). All data were recorded on an eight-channel digital audio tape recorder, and microembolic signal counts were evaluated online by two separate observers. Significantly higher microembolic signal counts were recorded in the common carotid artery (112 [75 to 175]) compared with the middle and anterior cerebral arteries (30 [18 to 36], p < 0.0001). Interobserver variability was satisfactory (κ = 0.81). Our results strongly argue that the underlying embolic material in patients with artificial heart valves is gaseous, because bubbles are bound to implode with time.

  15. Brain extraction in partial volumes T2*@7T by using a quasi-anatomic segmentation with bias field correction.

    PubMed

    Valente, João; Vieira, Pedro M; Couto, Carlos; Lima, Carlos S

    2018-02-01

    Poor brain extraction in Magnetic Resonance Imaging (MRI) has negative consequences for several types of post-extraction processing, such as tissue segmentation and related statistical measures or pattern recognition algorithms. Current state-of-the-art algorithms for brain extraction work on weighted T1 and T2 images and are not adequate for non-whole-brain images such as T2*FLASH@7T partial volumes. This paper proposes two new methods that work directly on T2*FLASH@7T partial volumes. The first is an improvement of the semi-automatic threshold-with-morphology approach, adapted to incomplete volumes. The second method uses an improved version of a current implementation of the fuzzy c-means algorithm with bias correction for brain segmentation. Under high inhomogeneity conditions the performance of the first method degrades, requiring user intervention, which is unacceptable. The second method performed well for all volumes and is entirely automatic. State-of-the-art algorithms for brain extraction are mainly semi-automatic, requiring correct initialization by the user and knowledge of the software. These methods cannot deal with partial volumes and/or need atlas information, which is not available for T2*FLASH@7T. Also, combined volumes suffer from manipulations such as re-sampling, which significantly deteriorates voxel intensity structures, making segmentation tasks difficult. The proposed method can overcome all these difficulties, reaching good results for brain extraction using only T2*FLASH@7T volumes. This work will lead to an improvement of automatic brain lesion segmentation in T2*FLASH@7T volumes, which becomes more important when lesions such as cortical multiple sclerosis lesions need to be detected. Copyright © 2017 Elsevier B.V. All rights reserved.
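
    The core of the second method is the fuzzy c-means iteration: alternating membership and centroid updates until convergence. A plain scalar-intensity sketch, without the paper's bias-field correction term; the intensity values and initialization are invented for illustration:

```python
def fcm(xs, c=2, m=2.0, iters=20):
    """Plain fuzzy c-means on scalar intensities. m is the fuzzifier;
    memberships u[k][i] give the degree to which sample k belongs to
    cluster i, and each row of u sums to 1."""
    centers = [min(xs), max(xs)]  # simple two-cluster initialization
    for _ in range(iters):
        # membership update: u_i = 1 / sum_j (d_i/d_j)^(2/(m-1))
        u = []
        for x in xs:
            d = [abs(x - v) + 1e-12 for v in centers]  # avoid div by 0
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c)) for i in range(c)])
        # centroid update: fuzzy (u^m)-weighted mean of the samples
        centers = [sum(u[k][i] ** m * xs[k] for k in range(len(xs))) /
                   sum(u[k][i] ** m for k in range(len(xs)))
                   for i in range(c)]
    return centers, u

# Two intensity clusters (background ~10, brain ~100, arbitrary units).
xs = [8, 10, 12, 95, 100, 105]
centers, u = fcm(xs)
print(sorted(round(v, 1) for v in centers))  # centres settle near the two modes
```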

  16. Towards natural language question generation for the validation of ontologies and mappings.

    PubMed

    Ben Abacha, Asma; Dos Reis, Julio Cesar; Mrabet, Yassine; Pruski, Cédric; Da Silveira, Marcos

    2016-08-08

    The increasing number of open-access ontologies and their key role in several applications, such as decision-support systems, highlight the importance of their validation. Human expertise is crucial for the validation of ontologies from a domain point of view. However, the growing number of ontologies and their fast evolution over time make manual validation challenging. We propose a novel semi-automatic approach based on the generation of natural language (NL) questions to support the validation of ontologies and their evolution. The proposed approach includes the automatic generation, factorization and ordering of NL questions from medical ontologies. The final validation and correction is performed by submitting these questions to domain experts and automatically analyzing their feedback. We also propose a second approach for the validation of mappings impacted by ontology changes. The method exploits the context of the changes to propose correction alternatives presented as multiple-choice questions. This research provides a question optimization strategy to maximize the validation of ontology entities with a reduced number of questions. We evaluate our approach on the validation of three medical ontologies. We also evaluate the feasibility and efficiency of our mapping validation approach in the context of ontology evolution. These experiments are performed with different versions of SNOMED-CT and ICD-9. The obtained experimental results suggest the feasibility and adequacy of our approach to support the validation of interconnected and evolving ontologies. Results also suggest that taking RDFS and OWL entailment into account helps reduce the number of questions and the validation time. The application of our approach to validating mapping evolution also shows the difficulty of adapting mappings over time and highlights the importance of semi-automatic validation.

  17. Temporal evolution of micro-eruptions within the crater lake of White Island (Whakaari) during January/February 2013

    NASA Astrophysics Data System (ADS)

    Edwards, Matt; Kennedy, Ben; Jolly, Art; Scheu, Bettina; Taddeucci, Jacopo; Jousset, Philippe; Schmid, Di

    2015-04-01

    Micro-eruptions are potentially modulated by hydrothermal systems and crater lakes but to date have not been well studied. In January/February 2013 White Island (Whakaari), New Zealand, experienced an approximately three-week period of atypical, frequent micro-eruptions within its crater lake. Many of these micro-eruptions were recorded by tour operators and GNS personnel monitoring the lake activity. Analysis of this video footage reveals an increasingly energetic eruption style. Deformation of the muddy lake surface by ascending bubbles begins as irregularly shaped bursts, producing liquid strings of mud ejected to heights of less than 10m at 10-15m/s. As the episode progresses, eruption frequency is maintained at semi-regular <10s intervals. Each eruption, however, starts with an increasingly hemispheric surface deformation ~6m in diameter, and bursts occur as "star-bursts" with ejection of less fluidal ash/mud clots. In addition, these bursts are commonly followed within 2s by a more vertical and energetic secondary ejection of material, which occasionally ejects through the deformed hemispheric surface up to >100m high and reaches ejection velocities up to 45m/s. The period of frequent "star-bursts" is then followed by a two-day phase of constant ~30-75m high ash ejection, resulting in the formation of a tuff cone with a central open conduit of 6m within the former crater lake. We theorise that this behaviour is influenced by evolving bubble overpressure/volume, including the presence or absence of a trailing wake of smaller bubbles, and is modulated over the eruption episode by the viscosity of the crater lake. In the early stages of the episode a lower-viscosity lake provides little resistance to rising gas/ash mixtures. Bubble coalescence and/or overpressure development is therefore minimised, resulting in low-energy bursts.
Over the course of this episode the viscosity of the lake increases due to the addition of ash from ash-carrying gas flux and to fluid loss by boiling. Thus, gas bubbles at higher pressure can form within the conduit and burst with increasing explosivity. Two experiments are planned to simulate this evolving eruption style. In the first, controlled cold volumes of pressurized gas will be released from a vertical pipe into an overlying chamber filled with fluids of varying viscosity, to investigate the energy and acoustics of bubble bursts. The second will involve sudden depressurisation of a mud-filled autoclave at elevated temperature (>100°C) to provide eruption metrics. Comparing the eruption styles generated in the lab with those identified at White Island through video analysis will allow us to investigate the dominant controls on eruption style.

  18. Modified mosquito landing boxes dispensing transfluthrin provide effective protection against Anopheles arabiensis mosquitoes under simulated outdoor conditions in a semi-field system.

    PubMed

    Andrés, Marta; Lorenz, Lena M; Mbeleya, Edgar; Moore, Sarah J

    2015-06-24

    Efforts to control malaria vectors have primarily focused on scaling-up of long-lasting insecticidal nets (LLINs) and indoor residual spraying. Although highly efficient against indoor-biting and indoor-resting vectors, these interventions have lower impact on outdoor-biting mosquitoes. Innovative vector control tools are required to prevent outdoor human-mosquito contacts. In this work, the potential of spatial repellents, delivered in an active system that requires minimal user compliance, to provide personal protection against exophagic mosquitoes active in the early evening was explored. A device previously used as an odour-baited lure and kill apparatus, the mosquito landing box (MLB), was modified to dispense the volatile synthetic pyrethroid, transfluthrin, as a spatial repellent. The MLB has an active odour-dispensing mechanism that uses a solar-powered fan and switches on at dusk to provide long duration dispensing of volatile compounds without the need for the user to remember to employ it. Two MLBs were located 5 m from a human volunteer to investigate the repellent effects of a transfluthrin 'bubble' created between the MLBs. Transfluthrin was emanated from polyester strips, hanging inside the MLB odour-dispensing unit. A fully randomized cross-over design was performed in a large, semi-field, screened cage to assess the effect of the repellent against laboratory-reared Anopheles arabiensis mosquitoes under ambient outdoor conditions. The knock-down capacity of the transfluthrin-treated strips was also evaluated at different time points up to 3 weeks after being impregnated to measure duration of efficacy. The protective transfluthrin bubble provided 68.9% protection against An. arabiensis bites under these simulated outdoor conditions. Volatile transfluthrin caused low mortality among mosquitoes in the semi-field system. 
Transfluthrin-treated strips continued to knock down mosquitoes in laboratory tests, 3 weeks after impregnation, although this effect diminished with time. Modified MLBs can be used as efficient and long-lasting dispensers of volatile spatial repellents such as transfluthrin, thereby providing high levels of protection against outdoor-biting mosquitoes in the peri-domestic space. They have a potential role in combatting outdoor malaria transmission without interfering with effective indoor interventions such as LLINs.

  19. Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer

    NASA Astrophysics Data System (ADS)

    Stryjewski, Wieslaw J.

    1991-08-01

    A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.

  20. Functional-to-form mapping for assembly design automation

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Liu, W. M.; Shen, W. D.; Yang, D. Y.; Liu, T. T.

    2017-11-01

    Assembly-level function-to-form mapping is the most effective procedure towards design automation. The research work mainly includes the assembly-level function definitions, the product network model and the two-step mapping mechanism. The function-to-form mapping is divided into two steps: the first-step mapping from function to behavior, and the second-step mapping from behavior to structure. After the first-step mapping, the three-dimensional transmission chain (or 3D sketch) is studied, and feasible design computing tools are developed. The mapping procedure is relatively easy to implement interactively but quite difficult to automate fully, so manual, semi-automatic, automatic and interactive modification of the mapping model are studied. A function-to-form mapping process for a mechanical hand is used to verify the design methodologies.

  1. Semi-Automatic Grading of Students' Answers Written in Free Text

    ERIC Educational Resources Information Center

    Escudeiro, Nuno; Escudeiro, Paula; Cruz, Augusto

    2011-01-01

    The correct grading of free text answers to exam questions during an assessment process is time consuming and subject to fluctuations in the application of evaluation criteria, particularly when the number of answers is high (in the hundreds). In consequence of these fluctuations, inherent to human nature, and largely determined by emotional…

  2. Semi-Automatic Methods of Knowledge Enhancement

    DTIC Science & Technology

    1988-12-05

    Response was patchy. Apparently awed by the complexity of the problem, only 3 GMs responded, and all asked for no public use to be made of their... by the SERC. Thanks are due to the Turing Institute and the Edinburgh University AI department for resources and facilities. We would also like to thank

  3. Resolving carbonate platform geometries on the Island of Bonaire, Caribbean Netherlands through semi-automatic GPR facies classification

    NASA Astrophysics Data System (ADS)

    Bowling, R. D.; Laya, J. C.; Everett, M. E.

    2018-07-01

    The study of exposed carbonate platforms provides observational constraints on regional tectonics and sea-level history. In this work Miocene-aged carbonate platform units of the Seroe Domi Formation are investigated on the island of Bonaire, located in the Southern Caribbean. Ground penetrating radar (GPR) was used to probe near-surface structural geometries associated with these lithologies. The single cross-island transect described herein allowed for continuous mapping of geologic structures on kilometre length scales. Numerical analysis was applied to the data in the form of k-means clustering of structure-parallel vectors derived from image structure tensors. This methodology enables radar facies along the survey transect to be semi-automatically mapped. The results provide subsurface evidence to support previous surficial and outcrop observations, and reveal complex stratigraphy within the platform. From the GPR data analysis, progradational clinoform geometries were observed on the northeast side of the island which support the tectonics and depositional trends of the region. Furthermore, several leeward-side radar facies are identified which correlate to environments of deposition conducive to dolomitization via reflux mechanisms.
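
    The classification step — k-means clustering of structure-parallel (orientation) vectors — can be sketched with a plain k-means on synthetic unit vectors. The deterministic initialization and the example orientations (sub-horizontal vs. dipping, clinoform-like reflectors) are assumptions for illustration, not details from the paper:

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means with a deterministic init (first and last points;
    supports k <= 2 here), standing in for the clustering of
    structure-parallel vectors used to map radar facies."""
    centers = [points[0], points[-1]][:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:  # assign each vector to its nearest centre
            j = min(range(k), key=lambda i: sum((a - b) ** 2
                                                for a, b in zip(p, centers[i])))
            groups[j].append(p)
        # recompute centres as the mean of each group
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return [min(range(k), key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centers[i])))
            for p in points]

# Hypothetical unit orientation vectors: near-horizontal reflectors
# versus dipping (clinoform-like) reflectors, as angles in radians.
flat = [(math.cos(t), math.sin(t)) for t in (0.0, 0.05, -0.04, 0.02)]
dipping = [(math.cos(t), math.sin(t)) for t in (0.5, 0.55, 0.48, 0.52)]
labels = kmeans(flat + dipping, k=2)
print(labels)  # the first four points share one label, the last four the other
```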

  4. Semi-automatic recognition of marine debris on beaches

    PubMed Central

    Ge, Zhenpeng; Shi, Huahong; Mei, Xuefei; Dai, Zhijun; Li, Daoji

    2016-01-01

    An increasing amount of anthropogenic marine debris is pervading the earth’s environmental systems, resulting in an enormous threat to living organisms. Additionally, the large amount of marine debris around the world has been investigated mostly through tedious manual methods. Therefore, we propose the use of a new technique, light detection and ranging (LIDAR), for the semi-automatic recognition of marine debris on a beach because of its substantially more efficient role in comparison with other more laborious methods. Our results revealed that LIDAR should be used for the classification of marine debris into plastic, paper, cloth and metal. Additionally, we reconstructed a 3-dimensional model of different types of debris on a beach with a high validity of debris revivification using LIDAR-based individual separation. These findings demonstrate that the availability of this new technique enables detailed observations to be made of debris on a large beach that was previously not possible. It is strongly suggested that LIDAR could be implemented as an appropriate monitoring tool for marine debris by global researchers and governments. PMID:27156433

  5. Semi-automatic segmentation of brain tumors using population and individual information.

    PubMed

    Wu, Yao; Yang, Wei; Jiang, Jun; Li, Shuanqian; Feng, Qianjin; Chen, Wufan

    2013-08-01

    Efficient segmentation of tumors in medical images is of great practical importance for early diagnosis and radiation planning. This paper proposes a novel semi-automatic segmentation method based on population and individual statistical information to segment brain tumors in magnetic resonance (MR) images. First, high-dimensional image features are extracted. Neighborhood components analysis is used to learn two optimal distance metrics, which contain population and patient-specific information, respectively. The probability of each pixel belonging to the foreground (tumor) or the background is estimated by a k-nearest-neighbor classifier under the learned optimal distance metrics. A cost function for segmentation is constructed from these probabilities and optimized using graph cuts. Finally, some morphological operations are performed to improve the achieved segmentation results. Our dataset consists of 137 brain MR images, including 68 for training and 69 for testing. The proposed method overcomes segmentation difficulties caused by the uneven gray-level distribution of the tumors and can obtain satisfactory results even when the tumors have fuzzy edges. Experimental results demonstrate that the proposed method is robust for brain tumor segmentation.
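
    The probability-estimation step can be sketched as a k-nearest-neighbor vote. This toy uses plain Euclidean distance in place of the paper's learned metrics, the features and labels are invented, and the subsequent graph-cut optimization is omitted:

```python
def knn_prob(pixel_feat, samples, k=3):
    """Probability that a pixel is tumor, estimated as the fraction of
    its k nearest labelled samples that are foreground (label 1)."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(pixel_feat, f)), y)
                   for f, y in samples)
    return sum(y for _, y in dists[:k]) / k

# Hypothetical 2-D features (intensity, texture) with labels:
# 1 = tumor, 0 = background.
samples = [((0.9, 0.8), 1), ((0.85, 0.75), 1), ((0.8, 0.9), 1),
           ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.15, 0.15), 0)]
print(knn_prob((0.88, 0.82), samples))  # 1.0: all 3 nearest are tumor
print(knn_prob((0.12, 0.18), samples))  # 0.0: all 3 nearest are background
```

    In the full method these per-pixel probabilities would define the unary terms of a graph-cut energy rather than being thresholded directly.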

  6. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected with the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready for structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
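
    The central conversion — turning a point cloud into a set of occupied voxel elements — can be sketched as follows. The meshing of voxels into actual finite elements is beyond this sketch, and the geometry below is an invented illustration:

```python
def voxelize(points, voxel_size):
    """Map a 3-D point cloud onto the set of occupied voxel indices,
    the core step of turning stacked point sections into a solid of
    voxel elements."""
    occupied = set()
    for x, y, z in points:
        occupied.add((int(x // voxel_size),
                      int(y // voxel_size),
                      int(z // voxel_size)))
    return occupied

# A tiny synthetic "wall": two vertical surfaces 0.5 m apart, sampled
# every 0.25 m; with 0.5 m voxels each surface fills its own cells.
cloud = [(x * 0.25, y * 0.25, z * 0.25)
         for x in (0, 2) for y in range(4) for z in range(4)]
vox = voxelize(cloud, voxel_size=0.5)
print(len(vox))  # 2 x-slabs * 2 y-cells * 2 z-cells = 8 occupied voxels
```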

  7. Semi-automatic delineation of the spino-laminar junction curve on lateral x-ray radiographs of the cervical spine

    NASA Astrophysics Data System (ADS)

    Narang, Benjamin; Phillips, Michael; Knapp, Karen; Appelboam, Andy; Reuben, Adam; Slabaugh, Greg

    2015-03-01

    Assessment of the cervical spine using x-ray radiography is an important task when providing emergency room care to trauma patients suspected of a cervical spine injury. In routine clinical practice, a physician will inspect the alignment of the cervical spine vertebrae by mentally tracing three alignment curves along the anterior and posterior sides of the cervical vertebral bodies, as well as one along the spinolaminar junction. In this paper, we propose an algorithm to semi-automatically delineate the spinolaminar junction curve, given a single reference point and the corners of each vertebral body. From the reference point, our method extracts a region of interest, and performs template matching using normalized cross-correlation to find matching regions along the spinolaminar junction. Matching points are then fit to a third order spline, producing an interpolating curve. Experimental results demonstrate promising results, on average producing a modified Hausdorff distance of 1.8 mm, validated on a dataset consisting of 29 patients including those with degenerative change, retrolisthesis, and fracture.
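
    The matching step — sliding a template and scoring each position by normalized cross-correlation — can be sketched in one dimension. The paper works on 2-D image patches and then fits a third-order spline through the matches; the signals below are invented:

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length signals
    (mean-subtracted, so it is invariant to brightness and contrast)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_match(signal, template):
    """Slide the template over the signal and return the offset with
    the highest NCC score."""
    scores = [ncc(signal[i:i + len(template)], template)
              for i in range(len(signal) - len(template) + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

template = [1.0, 3.0, 1.0]                       # intensity profile to find
signal = [0.0, 0.2, 0.1, 2.0, 6.0, 2.0, 0.1, 0.0]
print(best_match(signal, template))  # 3: the scaled copy sits at index 3
```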

  8. Resolving Carbonate Platform Geometries on the Island of Bonaire, Caribbean Netherlands through Semi-Automatic GPR Facies Classification

    NASA Astrophysics Data System (ADS)

    Bowling, R. D.; Laya, J. C.; Everett, M. E.

    2018-05-01

    The study of exposed carbonate platforms provides observational constraints on regional tectonics and sea-level history. In this work Miocene-aged carbonate platform units of the Seroe Domi Formation are investigated, on the island of Bonaire, located in the Southern Caribbean. Ground penetrating radar (GPR) was used to probe near-surface structural geometries associated with these lithologies. The single cross-island transect described herein allowed for continuous mapping of geologic structures on kilometer length scales. Numerical analysis was applied to the data in the form of k-means clustering of structure-parallel vectors derived from image structure tensors. This methodology enables radar facies along the survey transect to be semi-automatically mapped. The results provide subsurface evidence to support previous surficial and outcrop observations, and reveal complex stratigraphy within the platform. From the GPR data analysis, progradational clinoform geometries were observed on the northeast side of the island which supports the tectonics and depositional trends of the region. Furthermore, several leeward-side radar facies are identified which correlate to environments of deposition conducive to dolomitization via reflux mechanisms.

  9. Detecting and Analyzing Multiple Moving Objects in Crowded Environments with Coherent Motion Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheriyadat, Anil M.

    Understanding the world around us from large-scale video data requires vision systems that can perform automatic interpretation. While human eyes can unconsciously perceive independent objects in crowded scenes and other challenging operating environments, automated systems have difficulty detecting, counting, and understanding the behavior of objects in similar scenes. Computer scientists at ORNL have developed a technology termed "Coherent Motion Region Detection" that involves identifying multiple independent moving objects in crowded scenes by aggregating low-level motion cues extracted from moving objects. Humans and other species exploit such low-level motion cues seamlessly to perform perceptual grouping for visual understanding. The algorithm detects and tracks feature points on moving objects, resulting in partial trajectories that span coherent 3D regions in the space-time volume defined by the video. In the case of multi-object motion, many possible coherent motion regions can be constructed around the set of trajectories. The unique approach of the algorithm is to identify all possible coherent motion regions, then extract a subset of motion regions based on an innovative measure to automatically locate moving objects in crowded environments. The software reports a snapshot of each object, a count, and derived statistics (count over time) from input video streams. The software can directly process video streamed over the internet or directly from a hardware device (camera).

  10. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, M; Woo, B; Kim, J

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (deformable model and grow-cut method) were used by two independent observers to segment contrast enhancement, necrosis and edema regions. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate the reproducibility. Results: Inter-observer correlations and coefficients of variation of imaging features ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, with the deformable model, and from 0.799 to 0.976 and 3.5% to 26.6%, respectively, with the grow-cut method. Coefficients of variation for features previously reported as predictive of patient survival were: 3.4% with the deformable model and 7.4% with the grow-cut method for the proportion of contrast-enhanced tumor region; 5.5% with the deformable model and 25.7% with the grow-cut method for the proportion of necrosis; and 2.1% with the deformable model and 4.4% with the grow-cut method for edge sharpness of tumor on CE-T1WI. Conclusion: Comparison of two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric brain MRI.
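
The COV metric used here is straightforward to compute; a minimal sketch with invented observer measurements (the study's actual features and values differ):

```python
import statistics

def cov_percent(values):
    """Coefficient of variation: sample standard deviation over mean, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical paired measurements of one imaging feature (e.g. the proportion
# of contrast-enhanced tumor) by two observers across four patients:
obs_a = [0.42, 0.55, 0.61, 0.38]
obs_b = [0.44, 0.53, 0.63, 0.40]

# Per-case COV between the two observers, then its mean across cases.
per_case = [cov_percent(pair) for pair in zip(obs_a, obs_b)]
mean_cov = statistics.mean(per_case)
```

A smaller mean COV indicates that the segmentation method yields more reproducible feature values across observers.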

  11. A dorsolateral prefrontal cortex semi-automatic segmenter

    NASA Astrophysics Data System (ADS)

    Al-Hakim, Ramsey; Fallon, James; Nain, Delphine; Melonakos, John; Tannenbaum, Allen

    2006-03-01

    Structural, functional, and clinical studies in schizophrenia have, for several decades, consistently implicated dysfunction of the prefrontal cortex in the etiology of the disease. Functional and structural imaging studies, combined with clinical, psychometric, and genetic analyses in schizophrenia, have confirmed the key roles played by the prefrontal cortex and closely linked "prefrontal system" structures such as the striatum, amygdala, mediodorsal thalamus, substantia nigra-ventral tegmental area, and anterior cingulate cortices. The nodal structure of the prefrontal system circuit is the dorsolateral prefrontal cortex (DLPFC), or Brodmann area 46, which also appears to be the most commonly studied and cited brain area with respect to schizophrenia.1,2,3,4 In 1986, Weinberger et al. tied cerebral blood flow in the DLPFC to schizophrenia.1 In 2001, Perlstein et al. demonstrated that DLPFC activation is essential for working memory tasks commonly deficient in schizophrenia.2 More recently, groups have linked morphological changes due to gene deletion and increased DLPFC glutamate concentration to schizophrenia.3,4 Despite the experimental and clinical focus on the DLPFC in structural and functional imaging, the variability of the location of this area, differences in opinion on exactly what constitutes the DLPFC, and inherent difficulties in segmenting this highly convoluted cortical region have contributed to a lack of widely used standards for manual or semi-automated segmentation programs. Given these considerations, we developed a semi-automatic tool to segment the DLPFC from brain MRI scans in a reproducible way, to enable further morphological and statistical studies. The segmenter is based on expert neuroanatomist rules (Fallon-Kindermann rules), inspired by cytoarchitectonic data and reconstructions presented by Rajkowska and Goldman-Rakic.5 It is semi-automated to provide essential user interactivity. We present our results and provide details on our open-source DLPFC tool.

  12. Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning.

    PubMed

    Norouzzadeh, Mohammad Sadegh; Nguyen, Anh; Kosmala, Margaret; Swanson, Alexandra; Palmer, Meredith S; Packer, Craig; Clune, Jeff

    2018-06-19

    Having accurate, detailed, and up-to-date information about the location and behavior of animals in the wild would improve our ability to study and conserve ecosystems. We investigate the ability to automatically, accurately, and inexpensively collect such data, which could help catalyze the transformation of many fields of ecology, wildlife biology, zoology, conservation biology, and animal behavior into "big data" sciences. Motion-sensor "camera traps" enable collecting wildlife pictures inexpensively, unobtrusively, and frequently. However, extracting information from these pictures remains an expensive, time-consuming, manual task. We demonstrate that such information can be automatically extracted by deep learning, a cutting-edge type of artificial intelligence. We train deep convolutional neural networks to identify, count, and describe the behaviors of 48 species in the 3.2 million-image Snapshot Serengeti dataset. Our deep neural networks automatically identify animals with >93.8% accuracy, and we expect that number to improve rapidly in years to come. More importantly, if our system classifies only images it is confident about, our system can automate animal identification for 99.3% of the data while still performing at the same 96.6% accuracy as that of crowdsourced teams of human volunteers, saving >8.4 y (i.e., >17,000 h at 40 h/wk) of human labeling effort on this 3.2 million-image dataset. Those efficiency gains highlight the importance of using deep neural networks to automate data extraction from camera-trap images, reducing a roadblock for this widely used technology. Our results suggest that deep learning could enable the inexpensive, unobtrusive, high-volume, and even real-time collection of a wealth of information about vast numbers of animals in the wild. Copyright © 2018 the Author(s). Published by PNAS.
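
The confidence-based triage the authors describe (automate only the images the network is sure about, route the rest to human volunteers) can be sketched as follows; the threshold and per-image predictions are hypothetical:

```python
def triage(predictions, threshold=0.9):
    """Split model outputs into auto-accepted and human-review sets based on
    the network's top-class confidence."""
    auto = [p for p in predictions if p["conf"] >= threshold]
    manual = [p for p in predictions if p["conf"] < threshold]
    return auto, manual

# Hypothetical per-image outputs (top label, top-class confidence):
preds = [
    {"label": "zebra", "conf": 0.99},
    {"label": "wildebeest", "conf": 0.97},
    {"label": "empty", "conf": 0.95},
    {"label": "gazelle", "conf": 0.60},  # uncertain: goes to volunteers
]
auto, manual = triage(preds)
automated_fraction = len(auto) / len(preds)  # share of labeling effort saved
```

Raising the threshold trades a smaller automated fraction for higher accuracy on the automated subset, which is exactly the 99.3%-of-data / 96.6%-accuracy operating point reported above.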

  13. An Automated Microfluidic Multiplexer for Fast Delivery of C. elegans Populations from Multiwells

    PubMed Central

    Ghorashian, Navid; Gökçe, Sertan Kutal; Guo, Sam Xun; Everett, William Neil; Ben-Yakar, Adela

    2013-01-01

    Automated biosorter platforms, including recently developed microfluidic devices, enable and accelerate high-throughput and/or high-resolution bioassays on small animal models. However, time-consuming delivery of different organism populations to these systems introduces a major bottleneck to executing large-scale screens. Current population delivery strategies rely on suction from conventional well plates through tubing periodically exposed to air, leading to certain disadvantages: 1) bubble introduction to the sample, interfering with analysis in the downstream system, 2) substantial time drain from added bubble-cleaning steps, and 3) the need for complex mechanical systems to manipulate well plate position. To address these concerns, we developed a multiwell-format microfluidic platform that can deliver multiple distinct animal populations from on-chip wells using multiplexed valve control. This Population Delivery Chip could operate autonomously as part of a relatively simple setup that did not require any of the major mechanical moving parts typical of plate-handling systems to address a given well. We demonstrated automatic serial delivery of 16 distinct C. elegans worm populations to a single outlet without introducing any bubbles to the samples, causing cross-contamination, or damaging the animals. The device achieved delivery of more than 90% of the population preloaded into a given well in 4.7 seconds; an order of magnitude faster than delivery modalities in current use. This platform could potentially handle other similarly sized model organisms, such as zebrafish and drosophila larvae or cellular micro-colonies. The device’s architecture and microchannel dimensions allow simple expansion for processing larger numbers of populations. PMID:24069313

  14. A semi-implicit augmented IIM for Navier–Stokes equations with open, traction, or free boundary conditions

    PubMed Central

    Li, Zhilin; Xiao, Li; Cai, Qin; Zhao, Hongkai; Luo, Ray

    2016-01-01

    In this paper, a new Navier–Stokes solver based on a finite difference approximation is proposed to solve incompressible flows on irregular domains with open, traction, and free boundary conditions, which can be applied to simulations of fluid structure interaction, implicit solvent models for biomolecular applications, and other free boundary or interface problems. For some problems of this type, the projection method and the augmented immersed interface method (IIM) do not work well or do not work at all. The proposed new Navier–Stokes solver is based on the local pressure boundary method and a semi-implicit augmented IIM. A fast Poisson solver can be used in our algorithm, which gives us the potential for developing fast overall solvers in the future. The time discretization is based on a second-order multi-step method. Numerical tests with exact solutions are presented to validate the accuracy of the method. An application to fluid structure interaction between an incompressible fluid and a compressible gas bubble is also presented. PMID:27087702

  15. A semi-implicit augmented IIM for Navier-Stokes equations with open, traction, or free boundary conditions.

    PubMed

    Li, Zhilin; Xiao, Li; Cai, Qin; Zhao, Hongkai; Luo, Ray

    2015-08-15

    In this paper, a new Navier-Stokes solver based on a finite difference approximation is proposed to solve incompressible flows on irregular domains with open, traction, and free boundary conditions, which can be applied to simulations of fluid structure interaction, implicit solvent models for biomolecular applications, and other free boundary or interface problems. For some problems of this type, the projection method and the augmented immersed interface method (IIM) do not work well or do not work at all. The proposed new Navier-Stokes solver is based on the local pressure boundary method and a semi-implicit augmented IIM. A fast Poisson solver can be used in our algorithm, which gives us the potential for developing fast overall solvers in the future. The time discretization is based on a second-order multi-step method. Numerical tests with exact solutions are presented to validate the accuracy of the method. An application to fluid structure interaction between an incompressible fluid and a compressible gas bubble is also presented.

  16. Radiative transfer model for contaminated rough slabs.

    PubMed

    Andrieu, François; Douté, Sylvain; Schmidt, Frédéric; Schmitt, Bernard

    2015-11-01

    We present a semi-analytical model to simulate the bidirectional reflectance distribution function (BRDF) of a rough slab layer containing impurities. This model has been optimized for fast computation in order to analyze massive hyperspectral data by a Bayesian approach. We designed it for planetary surface ice studies, but it could be used for other purposes. It estimates the bidirectional reflectance of a rough slab of material containing inclusions, overlying an optically thick medium (a semi-infinite medium or stratified media, for instance granular material). The inclusions are assumed to be close to spherical and may consist of any material other than the ice matrix: any other type of ice, mineral, or even bubbles, defined by their optical constants. We assume low roughness and geometrical optics conditions. This model is thus applicable for inclusions larger than the considered wavelength. The scattering on the inclusions is assumed to be isotropic. This model has a fast computational implementation and thus is suitable for high-resolution hyperspectral data analysis.

  17. Investigating helmet promotion for cyclists: results from a randomised study with observation of behaviour, using a semi-automatic video system.

    PubMed

    Constant, Aymery; Messiah, Antoine; Felonneau, Marie-Line; Lagarde, Emmanuel

    2012-01-01

    Half of fatal injuries among bicyclists are head injuries. While helmets are likely to provide protection, their use often remains rare. We assessed the influence of strategies for promotion of helmet use with direct observation of behaviour by a semi-automatic video system. We performed a single-centre randomised controlled study, with 4 balanced randomisation groups. Participants were non-helmet users, aged 18-75 years, recruited at a loan facility in the city of Bordeaux, France. After completing a questionnaire investigating their attitudes towards road safety and helmet use, participants were randomly assigned to three groups with the provision of "helmet only", "helmet and information" or "information only", and to a fourth control group. Bikes were labelled with a colour code designed to enable observation of helmet use by participants while cycling, using a 7-spot semi-automatic video system located in the city. A total of 1557 participants were included in the study. Between October 15th 2009 and September 28th 2010, 2621 cyclists' movements, made by 587 participants, were captured by the video system. Participants seen at least once with a helmet amounted to 6.6% of all observed participants, with higher rates in the two groups that received a helmet at baseline. The likelihood of observed helmet use was significantly increased among participants of the "helmet only" group (OR = 7.73 [2.09-28.5]) and this impact faded within six months following the intervention. No effect of information delivery was found. Providing a helmet may be of value, but will not be sufficient to achieve high rates of helmet wearing among adult cyclists. Integrated and repeated prevention programmes will be needed, including free provision of helmets, but also information on the protective effect of helmets and strategies to increase peer and parental pressure.
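
The reported effect size (OR = 7.73 [2.09-28.5]) is an odds ratio with a 95% confidence interval. Its computation can be sketched from a 2x2 table; the counts below are invented for illustration and are not the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a/b = outcome yes/no among exposed, c/d = outcome yes/no among unexposed."""
    return (a * d) / (b * c)

def woolf_ci95(a, b, c, d):
    """Approximate 95% CI for the OR via Woolf's method on the log scale."""
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_or = math.log(odds_ratio(a, b, c, d))
    return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)

# Hypothetical counts: helmet observed (yes/no) in the "helmet only" arm
# versus the control arm.
or_est = odds_ratio(20, 130, 5, 250)
ci_lo, ci_hi = woolf_ci95(20, 130, 5, 250)
```

The study itself models repeated observations per participant, so its interval will not match this simple table-based sketch.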

  18. Semi-automatic spray pyrolysis deposition of thin, transparent, titania films as blocking layers for dye-sensitized and perovskite solar cells.

    PubMed

    Krýsová, Hana; Krýsa, Josef; Kavan, Ladislav

    2018-01-01

    For proper function of the negative electrode of dye-sensitized and perovskite solar cells, the deposition of a nonporous blocking film is required on the surface of F-doped SnO2 (FTO) glass substrates. Such a blocking film can minimise undesirable parasitic processes, for example, the back reaction of photoinjected electrons with the oxidized form of the redox mediator or with the hole-transporting medium. In the present work, thin, transparent, blocking TiO2 films are prepared by semi-automatic spray pyrolysis of precursors consisting of titanium diisopropoxide bis(acetylacetonate) as the main component. The layer thickness of the sprayed films is varied by changing the number of spray cycles. The parameters investigated in this work were deposition temperature (150, 300 and 450 °C), number of spray cycles (20-200), precursor composition (with/without deliberately added acetylacetone), concentration (0.05 and 0.2 M) and subsequent post-calcination at 500 °C. The photo-electrochemical properties were evaluated in aqueous electrolyte solution under UV irradiation. The blocking properties were tested by cyclic voltammetry with a model redox probe undergoing a simple one-electron-transfer reaction. Semi-automatic spraying resulted in the formation of transparent, homogeneous TiO2 films, and the technique allows for easy upscaling to large electrode areas. A deposition temperature of 450 °C was necessary for the fabrication of highly photoactive TiO2 films. The blocking properties of the as-deposited TiO2 films (at 450 °C) were impaired by post-calcination at 500 °C, but this problem could be addressed by increasing the number of spray cycles. Modification of the precursor by adding acetylacetone resulted in TiO2 films exhibiting perfect blocking properties that were not influenced by post-calcination. These results should prove useful in the fabrication of large-scale dye-sensitized and perovskite solar cells.

  19. Investigating Helmet Promotion for Cyclists: Results from a Randomised Study with Observation of Behaviour, Using a Semi-Automatic Video System

    PubMed Central

    Constant, Aymery; Messiah, Antoine; Felonneau, Marie-Line; Lagarde, Emmanuel

    2012-01-01

    Introduction Half of fatal injuries among bicyclists are head injuries. While helmets are likely to provide protection, their use often remains rare. We assessed the influence of strategies for promotion of helmet use with direct observation of behaviour by a semi-automatic video system. Methods We performed a single-centre randomised controlled study, with 4 balanced randomisation groups. Participants were non-helmet users, aged 18–75 years, recruited at a loan facility in the city of Bordeaux, France. After completing a questionnaire investigating their attitudes towards road safety and helmet use, participants were randomly assigned to three groups with the provision of “helmet only”, “helmet and information” or “information only”, and to a fourth control group. Bikes were labelled with a colour code designed to enable observation of helmet use by participants while cycling, using a 7-spot semi-automatic video system located in the city. A total of 1557 participants were included in the study. Results Between October 15th 2009 and September 28th 2010, 2621 cyclists' movements, made by 587 participants, were captured by the video system. Participants seen at least once with a helmet amounted to 6.6% of all observed participants, with higher rates in the two groups that received a helmet at baseline. The likelihood of observed helmet use was significantly increased among participants of the “helmet only” group (OR = 7.73 [2.09–28.5]) and this impact faded within six months following the intervention. No effect of information delivery was found. Conclusion Providing a helmet may be of value, but will not be sufficient to achieve high rates of helmet wearing among adult cyclists. Integrated and repeated prevention programmes will be needed, including free provision of helmets, but also information on the protective effect of helmets and strategies to increase peer and parental pressure. PMID:22355384

  20. Comparison of SAM and OBIA as Tools for Lava Morphology Classification - A Case Study in Krafla, NE Iceland

    NASA Astrophysics Data System (ADS)

    Aufaristama, Muhammad; Hölbling, Daniel; Höskuldsson, Ármann; Jónsdóttir, Ingibjörg

    2017-04-01

    The Krafla volcanic system is part of the Icelandic North Volcanic Zone (NVZ). During the Holocene, two eruptive events occurred in Krafla: 1724-1729 and 1975-1984. The last eruptive episode (1975-1984), known as the "Krafla Fires", comprised nine volcanic eruption episodes. The total area covered by the lavas from this eruptive episode is 36 km2 and the volume is about 0.25-0.3 km3. Lava morphology refers to the characteristics of the surface of a lava flow after solidification. The typical morphology of lava can be used as a primary basis for classifying lava flows when rheological properties cannot be directly observed during emplacement, and also for better understanding the behavior of lava flow models. Although mapping of lava flows in the field is relatively accurate, such traditional methods are time-consuming, especially when the lava covers large areas, as is the case in Krafla. Semi-automatic mapping methods that make use of satellite remote sensing data allow for efficient and fast mapping of lava morphology. In this study, two semi-automatic methods for lava morphology classification are presented and compared using Landsat 8 (30 m spatial resolution) and SPOT-5 (10 m spatial resolution) satellite images. For assessing the classification accuracy, the results from semi-automatic mapping were compared to the respective results from visual interpretation. On the one hand, the Spectral Angle Mapper (SAM) classification method was used. With this method an image is classified according to the spectral similarity between the image reflectance spectra and reference reflectance spectra. SAM successfully produced detailed lava surface morphology maps; however, the pixel-based approach partly leads to a salt-and-pepper effect. On the other hand, we applied the Random Forest (RF) classification method within an object-based image analysis (OBIA) framework. 
    This statistical classifier uses randomly selected subsets of training samples to produce multiple decision trees. For the final classification of pixels or, in the present case, image objects, the average of the class-assignment probabilities predicted by the different decision trees is used. While the resulting OBIA classification of lava morphology types shows a high coincidence with the reference data, the approach is sensitive to the segmentation-derived image objects that constitute the base units for classification. Both semi-automatic methods produce reasonable results in the Krafla lava field, even though identifying the different pahoehoe and aa lava types proved difficult. The use of satellite remote sensing data shows high potential for fast and efficient classification of lava morphology, particularly over large and inaccessible areas.
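
The SAM rule (assign each pixel to the reference spectrum with the smallest spectral angle, which makes the classifier insensitive to overall brightness) can be sketched as follows; the band reflectances and endmember names are hypothetical:

```python
import math

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference
    spectrum; a smaller angle means more similar spectral shape."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp for floating-point safety before acos.
    return math.acos(max(-1.0, min(1.0, dot / (norm_p * norm_r))))

def classify(pixel, references):
    """Assign the pixel to the reference class with the smallest angle."""
    return min(references, key=lambda name: spectral_angle(pixel, references[name]))

# Hypothetical 4-band reflectance spectra for two lava endmembers:
refs = {
    "aa": [0.10, 0.12, 0.15, 0.20],
    "pahoehoe": [0.30, 0.28, 0.26, 0.25],
}
# A pixel roughly twice as bright as the "aa" endmember but with the same shape:
label = classify([0.21, 0.25, 0.30, 0.41], refs)
```

Because the angle ignores vector magnitude, the brighter pixel still matches the "aa" shape; this brightness invariance is SAM's main appeal, while its per-pixel decisions explain the salt-and-pepper effect noted above.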

  1. White blood cell subsets are associated with carotid intima-media thickness and pulse wave velocity in an older Chinese population: the Guangzhou Biobank Cohort Study.

    PubMed

    Phillips, A C; Jiang, C Q; Thomas, G N; Lin, J M; Yue, X J; Cheng, K K; Jin, Y L; Zhang, W S; Lam, T H

    2012-08-01

    Cross-sectional associations between white blood cell (WBC) count, lymphocyte and granulocyte numbers, and carotid intima-media thickness (IMT) and brachial-ankle pulse wave velocity (PWV) were examined in a novel older Chinese community sample. A total of 817 men and 760 women from a sub-study of the Guangzhou Biobank Cohort Study had a full blood count measured by an automated hematology analyzer, carotid IMT by B-mode ultrasonography and brachial-ankle PWV by a non-invasive automatic waveform analyzer. Following adjustment for confounders, WBC count (β=0.07, P<0.001) and granulocyte number (β=0.07, P<0.001) were significantly positively related to PWV, but lymphocyte number was not. Similarly, WBC count (β=0.08, P=0.03), lymphocyte (β=0.08, P=0.002) and granulocyte (β=0.03, P=0.04) numbers were significantly positively associated with carotid IMT, but only the association with lymphocyte count survived correction for other cardiovascular risk factors. In conclusion, higher WBC, particularly lymphocyte and granulocyte, counts could be used as markers of cardiovascular disease risk, measured through indicators of atherosclerosis and arterial stiffness. The associations for WBC count previously observed by others were likely driven by higher granulocytes, an index of systemic inflammation.

  2. 47 CFR 90.633 - Conventional systems loading requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... automatically. For purposes of this section, a base station is not considered to be in operation unless at least... loading purposes only for the base station facility in the geographic area in which it primarily operates. If this cannot be determined, it will be counted fractionally over the number of base station...

  3. Automatic image analysis and spot classification for detection of fruit fly infestation in hyperspectral images of mangoes

    USDA-ARS?s Scientific Manuscript database

    An algorithm has been developed to identify spots generated in hyperspectral images of mangoes infested with fruit fly larvae. The algorithm incorporates background removal, application of a Gaussian blur, thresholding, and particle count analysis to identify locations of infestations. Each of the f...
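
The thresholding and particle-count steps of such a pipeline can be sketched with a toy connected-component counter (background removal and the Gaussian blur are omitted; the image values below are invented):

```python
def count_spots(img, thresh):
    """Threshold a grayscale image (list of rows) and count 4-connected
    bright components, a stand-in for the particle-count step."""
    h, w = len(img), len(img[0])
    mask = [[v > thresh for v in row] for row in img]
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                count += 1  # new component found; flood-fill it
                stack = [(y, x)]
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

# Toy 5x6 image with two bright spots on a dark background:
img = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 0, 0, 8, 0],
    [0, 0, 0, 0, 8, 0],
    [0, 0, 0, 0, 0, 0],
]
n_spots = count_spots(img, thresh=5)
```

In the hyperspectral setting, each component would additionally be classified by its spectral signature before being counted as an infestation site.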

  4. Calibration of automatic performance measures - speed and volume data: volume 2, evaluation of the accuracy of approach volume counts and speeds collected by microwave sensors.

    DOT National Transportation Integrated Search

    2016-05-01

    This study evaluated the accuracy of approach volumes and free flow approach speeds collected by the Wavetronix SmartSensor Advance sensor for the Signal Performance Metrics system of the Utah Department of Transportation (UDOT), using the field ...

  5. HVAC System Automatic Controls and Indoor Air Quality in Schools. Technical Bulletin.

    ERIC Educational Resources Information Center

    Wheeler, Arthur E.

    Fans, motors, coils, and other control components enable a heating, ventilating, and air-conditioning (HVAC) system to function smoothly. An explanation of these control components and how they make school HVAC systems work is provided. Different systems may be compared by counting the number of controlled devices that are required. Control…

  6. Operator Priming and Generalization of Practice in Adults' Simple Arithmetic

    ERIC Educational Resources Information Center

    Chen, Yalin; Campbell, Jamie I. D.

    2016-01-01

    There is a renewed debate about whether educated adults solve simple addition problems (e.g., 2 + 3) by direct fact retrieval or by fast, automatic counting-based procedures. Recent research testing adults' simple addition and multiplication showed that a 150-ms preview of the operator (+ or ×) facilitated addition, but not multiplication,…

  7. ASSIST user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1995-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.

  8. A semi-analytic theory for the motion of a close-earth artificial satellite with drag

    NASA Technical Reports Server (NTRS)

    Liu, J. J. F.; Alford, R. L.

    1979-01-01

    A semi-analytic method is used to estimate the decay history and lifetime and to generate orbital ephemerides for close-earth satellites perturbed by atmospheric drag and earth oblateness due to the spherical harmonics J2, J3, and J4. The theory maintains efficiency through the application of the method of averaging and employs sufficient numerical emphasis to include a rather sophisticated atmospheric density model. The averaged drag effects with respect to mean anomaly are evaluated by a Gauss-Legendre quadrature, while the averaged variational equations of motion are integrated numerically with automatic step-size and error control.
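
The averaging step can be sketched numerically: a Gauss-Legendre quadrature evaluating the mean of a perturbation over one revolution in mean anomaly. The integrand below is a toy function with a known average, not the paper's drag model:

```python
import math

def gauss_legendre_5(f, a, b):
    """5-point Gauss-Legendre quadrature of f on [a, b]."""
    nodes = [0.0, 0.5384693101056831, -0.5384693101056831,
             0.9061798459386640, -0.9061798459386640]
    weights = [0.5688888888888889, 0.4786286704993665, 0.4786286704993665,
               0.2369268850561891, 0.2369268850561891]
    mid, half = 0.5 * (a + b), 0.5 * (b - a)
    return half * sum(w * f(mid + half * x) for w, x in zip(weights, nodes))

def orbit_average(f):
    """Average of f over one revolution in mean anomaly M, as in the
    method of averaging: (1 / 2*pi) * integral of f over [0, 2*pi]."""
    return gauss_legendre_5(f, 0.0, 2.0 * math.pi) / (2.0 * math.pi)

# Sanity check: a secular term plus a periodic term averages to the secular term.
avg = orbit_average(lambda M: 3.0 + math.cos(M))
```

In the actual theory the averaged rates feed the variational equations, which are then integrated over many revolutions with adaptive step-size control.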

  9. Air-guided manual deep lamellar keratoplasty.

    PubMed

    Caporossi, A; Simi, C; Licignano, R; Traversi, C; Balestrazzi, A

    2004-01-01

    To evaluate the efficacy of a new modified technique of deep lamellar keratoplasty (DLK). Nine eyes of eight patients with keratoconus of moderate degree were included. All patients underwent DLK with manual dissection from a limbal side port after an air bubble injection in the anterior chamber. The patients underwent a complete ophthalmologic examination 6 months after the suture removal, evaluating best-corrected visual acuity, corneal thickness, endothelial cell count, and topographic astigmatism. One case (11.1%) was converted to penetrating keratoplasty because of microperforation. In the eight successful cases, 7 eyes (77.8%) achieved 20/30 or better visual acuity 6 months after suture removal. Mean postoperative pachymetry was 604.76 microm (SD 46.76). Specular microscopy 6 months after suture removal revealed average endothelial cell count of 2273/mm2 (SD 229). This modified DLK technique is a safe and effective procedure and could facilitate, after a short learning curve, this kind of surgery with a low risk of conversion to penetrating keratoplasty.

  10. Processor architectures utilizing magnetic bubble and semi-conductor memories. [for the Omega navigation system

    NASA Technical Reports Server (NTRS)

    Parrish, E. A., Jr.; Aylor, J. H.

    1975-01-01

    To aid work being conducted on the feasibility of a low cost Omega navigational receiver, a control panel was designed and constructed according to supplied specifications. Since the proposed Omega receiver is designed around a microprocessor, software engineering necessary for control panel operation is included in the design. The control panel is to be used as an operational model for use in the design of a prototype receiver. A detailed description of the hardware design is presented along with a description of the software needed to operate the panel. A complete description of the operating procedures for the panel is also included.

  11. Optimizing the 3D-reconstruction technique for serial block-face scanning electron microscopy.

    PubMed

    Wernitznig, Stefan; Sele, Mariella; Urschler, Martin; Zankel, Armin; Pölt, Peter; Rind, F Claire; Leitinger, Gerd

    2016-05-01

    Elucidating the anatomy of neuronal circuits and localizing the synaptic connections between neurons can give us important insights into how neuronal circuits work. We are using serial block-face scanning electron microscopy (SBEM) to investigate the anatomy of a collision-detection circuit including the Lobula Giant Movement Detector (LGMD) neuron in the locust, Locusta migratoria. For this, thousands of serial electron micrographs are produced that allow us to trace the neuronal branching pattern. Reconstruction of neurons was previously done manually by drawing the cell outlines in each image separately, an approach that was very time-consuming and troublesome. To make the process more efficient, new interactive software was developed. It uses the contrast between the neuron under investigation and its surroundings for semi-automatic segmentation. The user sets starting regions manually, and the algorithm automatically selects a volume within the neuron until the edges corresponding to the neuronal outline are reached. Internally, the algorithm optimizes a 3D active-contour segmentation model formulated as a cost function that takes the SEM image edges into account. This reduced the reconstruction time while staying close to the manual reference segmentation result. Our algorithm is easy to use for fast segmentation; unlike previous methods, it requires neither image training nor extended computing capacity. Our semi-automatic segmentation algorithm led to a dramatic reduction in processing time for the 3D reconstruction of identified neurons. Copyright © 2016 Elsevier B.V. All rights reserved.
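
The seed-and-grow interaction can be sketched with a simple intensity-tolerance region grower. This is a stand-in for the paper's 3D active-contour cost optimization, applied to one 2D slice with invented pixel values:

```python
def grow_region(img, seed, tol):
    """Grow a region from a user-placed seed, adding 4-connected pixels
    whose intensity stays within tol of the seed value."""
    h, w = len(img), len(img[0])
    sy, sx = seed
    target = img[sy][sx]
    region = {seed}
    stack = [seed]
    while stack:
        y, x = stack.pop()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in region
                    and abs(img[ny][nx] - target) <= tol):
                region.add((ny, nx))
                stack.append((ny, nx))
    return region

# Toy slice: a bright neuron profile (~200) on a dark background (~50).
img = [
    [50, 50, 50, 50, 50],
    [50, 200, 205, 50, 50],
    [50, 198, 202, 50, 50],
    [50, 50, 50, 50, 50],
]
neuron = grow_region(img, seed=(1, 1), tol=20)
```

The real method replaces the hard tolerance with an edge-aware cost function, so the contour settles on the membrane even when intensities drift within the neuron.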

  12. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  13. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  14. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  15. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  16. RootScan: Software for high-throughput analysis of root anatomical traits

    USDA-ARS?s Scientific Manuscript database

    RootScan is a program for semi-automated image analysis of anatomical phenes in root cross-sections. RootScan uses pixel value thresholds to separate the cross-section from its background and to visually dissect it into tissue regions. Area measurements and object counts are performed within various...

  17. Counting abilities in autism: possible implications for central coherence theory.

    PubMed

    Jarrold, C; Russell, J

    1997-02-01

    We examined the claim that children with autism have a "weak drive for central coherence" which biases them towards processing information at an analytic rather than a global level. This was done by investigating whether children with autism would rapidly and automatically enumerate a number of dots presented in a canonical form, or count each dot individually to obtain the total. The time taken to count stimuli was compared across three participant groups: children with autism, children with moderate learning difficulties, and normally developing children. There were 22 children in each group, and individuals were matched across groups on the basis of verbal mental age. Results implied that children with autism did show a tendency towards an analytic level of processing. However, though the groups differed on measures of counting speed, the number of children showing patterns of global or analytic processing did not differ significantly across the groups. Whether these results implicate a weak drive for central coherence in autism that is both specific to, and pervasive in, the disorder is discussed.

  18. Magsat investigation. [Canadian shield

    NASA Technical Reports Server (NTRS)

    Hall, D. H. (Principal Investigator)

    1980-01-01

    A computer program was prepared for modeling segments of the Earth's crust allowing for heterogeneity in magnetization in calculating the Earth's field at Magsat heights. This permits investigation of a large number of possible models in assessing the magnetic signatures of subprovinces of the Canadian shield. The fit between the model field and observed fields is optimized in a semi-automatic procedure.

  19. POPCORN: a Supervisory Control Simulation for Workload and Performance Research

    NASA Technical Reports Server (NTRS)

    Hart, S. G.; Battiste, V.; Lester, P. T.

    1984-01-01

    A multi-task simulation of a semi-automatic supervisory control system was developed to provide an environment in which training, operator strategy development, failure detection and resolution, levels of automation, and operator workload can be investigated. The goal was to develop a well-defined, but realistically complex, task that would lend itself to model-based analysis. The name of the task (POPCORN) reflects the visual display, which depicts different task elements milling around waiting to be released and "pop" out to be performed. The operator's task was to complete each of 100 task elements, represented by different symbols, by selecting a target task and entering the desired command. The simulated automatic system then completed the selected function automatically. Highly significant differences in performance, strategy, and rated workload were found as a function of all experimental manipulations (except reward/penalty).

  20. Taguchi Experimental Design for Cleaning PWAs with Ball Grid Arrays

    NASA Technical Reports Server (NTRS)

    Bonner, J. K.; Mehta, A.; Walton, S.

    1997-01-01

    Ball grid arrays (BGAs), and other area array packages, are becoming more prominent as a way to increase component pin count while avoiding the manufacturing difficulties inherent in processing quad flat packs (QFPs)...Cleaning printed wiring assemblies (PWAs) with BGA components mounted on the surface is problematic...Currently, a low flash point semi-aqueous material, in conjunction with a batch cleaning unit, is being used to clean PWAs. The approach taken at JPL was to investigate the use of (1) semi-aqueous materials having a high flash point and (2) aqueous cleaning involving a saponifier.

  1. Transition scenario and transition control of the flow over a semi-infinite square leading-edge plate

    NASA Astrophysics Data System (ADS)

    Huang, Yadong; Zhou, Benmou; Tang, Zhaolie; Zhang, Fei

    2017-07-01

    In recent investigations of the flow over a square leading-edge flat plate, elliptic instability and transient growth of perturbations are proposed to explain the turbulent transition mechanism of the separating and reattaching flow reported in early experimental visualizations. An original transition scenario as well as a transition control method is presented by a detailed numerical study in this paper. The transient growth of perturbations in the separation bubble induces the primary instability that causes the 2D unsteady flow consisting of Kelvin-Helmholtz (KH) vortices. The pairing instability of the KH vortices induces the subharmonic secondary instability, and then resonance transition occurs. The streamwise Lorentz force as the control input is applied in the recirculation region where the separation bubble generates. The maximum energy amplification magnitude of perturbations takes a linear attenuation with the interaction number; thus, the primary instability is reduced under control. The interaction number represents the strength of the streamwise Lorentz force relative to the inertial force of the fluid. The reduced primary instability is not strong enough to induce the secondary instability, so the flow is globally stable under control. Three-dimensional direct numerical simulation confirms the results of the linear stability analysis. Although the growth rate of the convectively unstable secondary instability is limited by the flow field scale, the feedback loop of the energy transfer promotes the resonance transition. However, as the separation bubble scale is reduced and the feedback loop is broken by the streamwise Lorentz force, the three-dimensional transition is suppressed and a skin-friction drag reduction is achieved.

  2. Semi Automated Land Cover Layer Updating Process Utilizing Spectral Analysis and GIS Data Fusion

    NASA Astrophysics Data System (ADS)

    Cohen, L.; Keinan, E.; Yaniv, M.; Tal, Y.; Felus, A.; Regev, R.

    2018-04-01

    Technological improvements of recent years in mass data gathering and analysis have influenced the traditional methods of updating and forming the national topographic database, bringing a significant increase in the number of use cases and in the demand for detailed geo-information. Processes intended to replace traditional data collection methods have been developed in many National Mapping and Cadaster Agencies, and there has been significant progress in semi-automated methodologies aiming to facilitate the updating of a national topographic geodatabase. Their implementation is expected to allow a considerable reduction in updating costs and operation times. Our previous activity focused on automatic building extraction (Keinan, Zilberstein et al., 2015). Before semi-automatic updating methods, it was common for interpreter identification to be as detailed as possible so as to maintain the most reliable database. When using semi-automatic updating methodologies, the ability to insert knowledge based on human insight is limited. Our motivation was therefore to narrow this gap by allowing end users to add their data inputs to the basic geometric database. In this article, we present a simple land cover database updating method which combines insights extracted from the analyzed image with given spatial data from vector layers. The main stages of the practice are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while keeping the required shape editing to a minimum. All coding was done using open source software components.

  3. Semi-automated intra-operative fluoroscopy guidance for osteotomy and external-fixator.

    PubMed

    Lin, Hong; Samchukov, Mikhail L; Birch, John G; Cherkashin, Alexander

    2006-01-01

    This paper outlines a semi-automated intra-operative fluoroscopy guidance and monitoring approach for osteotomy and external-fixator application in orthopedic surgery. The Intra-operative Guidance module is one component of the "LegPerfect Suite" developed for assisting the surgical correction of lower-extremity angular deformity. The module uses information from the preoperative surgical planning module as a guideline to overlay (register) its bone outline semi-automatically with the bone edge from the real-time fluoroscopic C-Arm X-ray image in the operating room. In the registration process, the scaling factor is obtained automatically by matching a fiducial template in the fluoroscopic image with a marker in the planning module. A triangular metal plate placed on the operating table is used as the fiducial template. The area of the template image within the viewing area of the fluoroscopy machine is obtained by image processing techniques such as edge detection and the Hough transformation, which extract the template from other objects in the fluoroscopy image. The area of the fiducial template from the fluoroscopic image is then compared with the area of the marker from the planning module to obtain the scaling factor. Once the scaling factor is obtained, the user can shift and rotate the preoperative plan with simple mouse operations to overlay the bone outline from the plan with the bone edge from the fluoroscopy image. In this way, osteotomy levels and external-fixator positioning on the limb can be guided by the computerized preoperative plan.
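    The area comparison step above determines a linear scale between the fluoroscopic image and the preoperative plan. Since area scales with the square of linear size, the linear factor is the square root of the area ratio; the abstract does not give its exact formula, so the function and sample areas below are an illustrative assumption.

```python
import math

def scaling_factor(template_area_px, marker_area_px):
    """Linear scale from fluoroscopic image to preoperative plan,
    derived from the measured fiducial-template area and the planning
    marker area (area ratio -> square root for the linear factor)."""
    return math.sqrt(marker_area_px / template_area_px)

# hypothetical numbers: triangular template covers 400 px^2 in the
# fluoroscopy image, the planning marker covers 1600 px^2
print(scaling_factor(400.0, 1600.0))  # 2.0
```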

  4. Simulation of HLNC and NCC measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ming-Shih; Teichmann, T.; De Ridder, P.

    1994-03-01

    This report discusses an automatic method of simulating the results of High Level Neutron Coincidence Counting (HLNC) and Neutron Collar Coincidence Counting (NCC) measurements to facilitate the safeguards inspectors' understanding and use of these instruments under realistic conditions. This would otherwise be expensive and time-consuming, except at sites designed to handle radioactive materials and having the necessary variety of fuel elements and other samples. The simulation must thus include the behavior of the instruments for variably constituted and composed fuel elements (including poison rods and Gd loading), and must display the changes in the count rates as a function of these characteristics as well as of various instrumental parameters. Such a simulation is an efficient way of accomplishing the required familiarization and training of the inspectors by providing a realistic reproduction of the results of such measurements.

  5. Hydrological Response of Semi-arid Degraded Catchments in Tigray, Northern Ethiopia

    NASA Astrophysics Data System (ADS)

    Teka, Daniel; Van Wesemael, Bas; Vanacker, Veerle; Hallet, Vincent

    2013-04-01

    To address water scarcity in the arid and semi-arid parts of developing countries, accurate estimation of surface runoff is an essential task. In semi-arid catchments, runoff data are scarce, and runoff estimation using hydrological models therefore becomes an alternative. This research was initiated to characterize the runoff response of semi-arid catchments in Tigray, northern Ethiopia, and to evaluate the SCS-CN method for various catchments. Ten sub-catchments were selected in different river basins, and rainfall and runoff were measured with automatic hydro-monitoring equipment for 2-3 years. The Curve Number was estimated for each Hydrological Response Unit (HRU) in the sub-catchments, and runoff was modeled using the SCS-CN method at λ = 0.05 and λ = 0.20. The results showed a significant difference between the two abstraction ratios (P = 0.05, df = 1, n = 132), and reasonably good results were obtained for predicted runoff at λ = 0.05 (NSE = -0.69; PBIAS = 18.1%). When CN values from the literature were used, runoff was overestimated compared to the measured values (e = -11.53). This research showed the importance of using measured runoff data to characterize semi-arid catchments and accurately estimate the scarce water resource. Key words: Hydrological response, rainfall-runoff, degraded environments, semi-arid, Ethiopia, Tigray
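    The two abstraction ratios compared in this study can be reproduced with the standard SCS Curve Number equations (S = 25400/CN − 254 in mm, Ia = λS, Q = (P − Ia)² / (P − Ia + S) for P > Ia). The storm depth and CN value below are hypothetical, chosen only to show how λ changes the estimate.

```python
def scs_cn_runoff(p_mm, cn, lam=0.05):
    """SCS-CN direct runoff Q in mm for storm depth p_mm and Curve
    Number cn, with initial abstraction ratio lam (lambda)."""
    s = 25400.0 / cn - 254.0        # potential maximum retention, mm
    ia = lam * s                    # initial abstraction, mm
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# a 40 mm storm on a catchment with CN = 80, at the two ratios tested
q_low = scs_cn_runoff(40.0, 80, lam=0.05)
q_std = scs_cn_runoff(40.0, 80, lam=0.20)
print(round(q_low, 2), round(q_std, 2))  # 13.52 8.21
```

    As in the study, the lower abstraction ratio yields more runoff for the same storm, which is why the choice of λ matters when calibrating against measured data.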

  6. Grading Multiple Choice Exams with Low-Cost and Portable Computer-Vision Techniques

    NASA Astrophysics Data System (ADS)

    Fisteus, Jesus Arias; Pardo, Abelardo; García, Norberto Fernández

    2013-08-01

    Although technology for automatic grading of multiple choice exams has existed for several decades, it is not yet as widely available or affordable as it should be. The main reasons preventing this adoption are the cost and the complexity of the setup procedures. In this paper, Eyegrade, a system for automatic grading of multiple choice exams is presented. While most current solutions are based on expensive scanners, Eyegrade offers a truly low-cost solution requiring only a regular off-the-shelf webcam. Additionally, Eyegrade performs both mark recognition as well as optical character recognition of handwritten student identification numbers, which avoids the use of bubbles in the answer sheet. When compared with similar webcam-based systems, the user interface in Eyegrade has been designed to provide a more efficient and error-free data collection procedure. The tool has been validated with a set of experiments that show the ease of use (both setup and operation), the reduction in grading time, and an increase in the reliability of the results when compared with conventional, more expensive systems.

  7. Seismo-Geochemical Variations in SW Taiwan: Multi-Parameter Automatic Gas Monitoring Results

    NASA Astrophysics Data System (ADS)

    Yang, T. F.; Fu, C.-C.; Walia, V.; Chen, C.-H.; Chyi, L. L.; Liu, T.-K.; Song, S.-R.; Lee, M.; Lin, C.-W.; Lin, C.-C.

    2006-04-01

    Gas variations of many mud volcanoes and hot springs distributed along the tectonic sutures in southwestern Taiwan are considered to be sensitive to earthquake activity. Therefore, a multi-parameter automatic gas station was built on the bank of one of the largest mud pools in an active fault zone of southwestern Taiwan, for continuous monitoring of CO2, CH4, N2 and H2O, the major constituents of its bubbling gases. During year-round monitoring from October 2001 to October 2002, the gas composition of the mud pool, especially CH4 and CO2, showed significant variations. Taking the CO2/CH4 ratio as the main indicator, anomalous variations can be recognized from a few days to a few weeks before earthquakes and correlated well with events of local magnitude >4.0 and local intensity >2. It is concluded that the gas composition in the area is sensitive to local crustal stress/strain and warrants real-time monitoring for seismo-geochemical precursors.
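    Flagging anomalous CO2/CH4 ratios against a recent baseline can be sketched as a moving-window deviation test. The criterion below (k standard deviations from the mean of the preceding window) and the sample series are illustrative assumptions, not the study's actual detection rule.

```python
from statistics import mean, stdev

def flag_anomalies(ratios, window=5, k=2.0):
    """Indices whose CO2/CH4 ratio deviates more than k standard
    deviations from the mean of the preceding `window` readings."""
    anomalies = []
    for i in range(window, len(ratios)):
        base = ratios[i - window:i]
        m, s = mean(base), stdev(base)
        if s > 0 and abs(ratios[i] - m) > k * s:
            anomalies.append(i)
    return anomalies

# hypothetical daily CO2/CH4 readings; the jump at index 6 mimics a
# pre-earthquake anomaly against a quiet baseline
series = [1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 2.4, 1.0]
print(flag_anomalies(series))  # [6]
```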

  8. Rapid on-site monitoring of Legionella pneumophila in cooling tower water using a portable microfluidic system.

    PubMed

    Yamaguchi, Nobuyasu; Tokunaga, Yusuke; Goto, Satoko; Fujii, Yudai; Banno, Fumiya; Edagawa, Akiko

    2017-06-08

    Legionnaires' disease, predominantly caused by the bacterium Legionella pneumophila, has increased in prevalence worldwide. The most common mode of transmission of Legionella is inhalation of contaminated aerosols, such as those generated by cooling towers. Simple, rapid and accurate methods to enumerate L. pneumophila are required to prevent the spread of this organism. Here, we applied a microfluidic device for on-chip fluorescent staining and semi-automated counting of L. pneumophila in cooling tower water. We also constructed a portable system for rapid on-site monitoring and used it to enumerate target bacterial cells rapidly flowing in the microchannel. A fluorescently-labelled polyclonal antibody was used for the selective detection of L. pneumophila serogroup 1 in the samples. The counts of L. pneumophila in cooling tower water obtained using the system and fluorescence microscopy were similar. The detection limit of the system was 10^4 cells/ml, but lower numbers of L. pneumophila cells (10^1 to 10^3 cells/ml) could be detected following concentration of 0.5-3 L of the water sample by filtration. Our technique is rapid to perform (1.5 h), semi-automated (on-chip staining and counting), and portable for on-site measurement, and it may therefore be effective in the initial screening of Legionella contamination in freshwater.
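    The gain from the filtration step can be sketched as a simple dilution calculation: concentrating V litres onto a filter and resuspending in v millilitres improves the detection limit by the factor (1000·V)/v, assuming (idealistically) full recovery. The function name and figures are illustrative.

```python
def effective_detection_limit(limit_cells_per_ml, sample_volume_l,
                              resuspension_volume_ml):
    """Detection limit after concentrating a water sample by filtration,
    assuming all cells are recovered from the filter."""
    factor = (1000.0 * sample_volume_l) / resuspension_volume_ml
    return limit_cells_per_ml / factor

# 10^4 cells/ml instrument limit; concentrating 1 L into 1 ml brings the
# effective limit down by a factor of 1000
print(effective_detection_limit(1e4, 1.0, 1.0))  # 10.0
```

    This is consistent with the abstract's observation that concentrating 0.5-3 L lets the system detect 10^1 to 10^3 cells/ml despite a 10^4 cells/ml on-chip limit.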

  9. Semi-industrial experimental study on bauxite separation using a cell-column integration process

    NASA Astrophysics Data System (ADS)

    Zhang, Ning-ning; Zhou, Chang-chun; Cong, Long-fei; Cao, Wen-long; Zhou, You

    2016-01-01

    The cyclonic-static micro-bubble flotation column (FCSMC) is a highly efficient piece of mineral-processing equipment. In this study, a cell-column (FCSMC) integration process was investigated for the separation of bauxite, and its feasibility was analyzed on a theoretical basis. The properties of low-grade bauxite ore from Henan Province, China, were analyzed. Parameters such as reagent dosage, bubble-scraping time, and circulating-pump pressure during the sorting process were investigated and optimized to improve flotation efficiency. On the basis of these parameters, continuous separation experiments were conducted. Bauxite concentrate with an aluminum-to-silicon (A/S) mass ratio of 6.37 and a 77.63wt% recovery rate was achieved via a flow sheet consisting of "fast flotation using a flotation cell, one roughing flotation and one cleaning flotation using flotation columns". Compared with the full-flotation-cells process, the cell-column integration process increased the A/S ratio by 0.41 and the recovery rate by 17.58wt%. Cell-column integration separation technology represents a new approach for the separation of middle-to-low-grade bauxite ore.

  10. Membrane cleaning with ultrasonically driven bubbles.

    PubMed

    Reuter, Fabian; Lauterborn, Sonja; Mettin, Robert; Lauterborn, Werner

    2017-07-01

    A laboratory filtration plant for drinking water treatment is constructed to study the conditions for purely mechanical in situ cleaning of fouled polymeric membranes by the application of ultrasound. The filtration is done by suction of water with defined constant contamination through a membrane module, a stack of five pairs of flat-sheet ultrafiltration membranes. The short cleaning cycle to remove the cake layer from the membranes includes backwashing, the application of ultrasound and air flushing. A special geometry for sound irradiation of the membranes parallel to their surfaces is chosen. Two frequencies, 35kHz and 130kHz, and different driving powers are tested for their cleaning effectiveness. No cleaning is found for 35kHz, whereas good cleaning results are obtained for 130kHz, with an optimum cleaning effectiveness at moderate driving powers. Acoustic and optic measurements in space and time as well as analytical considerations and numerical calculations reveal the reasons and confirm the experimental results. The sound field is measured in high resolution and bubble structures are high-speed imaged on their nucleation sites as well as during their cleaning work at the membrane surface. The microscopic inspection of the membrane surface after cleaning shows distinct cleaning types in the cake layer that are related to specific bubble behaviour on the membrane. The membrane integrity and permeate quality are checked on-line by particle counting and turbidity measurement of the permeate. No signs of membrane damage or irreversible membrane degradation in permeability are detected and an excellent water permeate quality is retained. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Correlation and registration of ERTS multispectral imagery. [by a digital processing technique

    NASA Technical Reports Server (NTRS)

    Bonrud, L. O.; Henrikson, P. J.

    1974-01-01

    Examples of automatic digital processing demonstrate the feasibility of registering one ERTS multispectral scanner (MSS) image with another obtained on a subsequent orbit, and automatic matching, correlation, and registration of MSS imagery with aerial photography (multisensor correlation) is demonstrated. Excellent correlation was obtained with patch sizes exceeding 16 pixels square. Qualities which lead to effective control point selection are distinctive features, good contrast, and constant feature characteristics. Results of the study indicate that more than 300 degrees of freedom are required to register two standard ERTS-1 MSS frames covering 100 by 100 nautical miles to an accuracy of 0.6 pixel mean radial displacement error. An automatic strip processing technique demonstrates 600 to 1200 degrees of freedom over a quarter frame of ERTS imagery. Registration accuracies in the range of 0.3 pixel to 0.5 pixel mean radial error were confirmed by independent error analysis. Accuracies in the range of 0.5 pixel to 1.4 pixel mean radial error were demonstrated by semi-automatic registration over small geographic areas.
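    Patch correlation of the kind used for control-point matching is commonly scored with normalised cross-correlation, which is invariant to gain and offset between sensors. The abstract does not specify its similarity measure, so the sketch below is an assumption; patch values are hypothetical.

```python
import math

def ncc(a, b):
    """Normalised cross-correlation of two equal-sized patches given as
    flattened lists; 1.0 means a perfect (affine-intensity) match."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

patch = [1, 2, 3, 4, 5, 6, 7, 8, 9]          # 3x3 reference patch
rescanned = [2 * v + 10 for v in patch]       # same scene, new gain/offset
noise = [5, 1, 4, 1, 5, 9, 2, 6, 5]           # unrelated patch
print(round(ncc(patch, rescanned), 3))  # 1.0
print(round(ncc(patch, noise), 3))      # well below 1
```

    Sliding such a score over a search window and keeping the peak is the basic mechanism behind the distinctive-feature, good-contrast control points the abstract recommends.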

  12. Strategies for automatic processing of large aftershock sequences

    NASA Astrophysics Data System (ADS)

    Kvaerna, T.; Gibbons, S. J.

    2017-12-01

    Aftershock sequences following major earthquakes present great challenges to seismic bulletin generation. The analyst resources needed to locate events increase with increased event numbers as the quality of underlying, fully automatic, event lists deteriorates. While current pipelines, designed a generation ago, are usually limited to single passes over the raw data, modern systems also allow multiple passes. Processing the raw data from each station currently generates parametric data streams that are later subject to phase-association algorithms which form event hypotheses. We consider a major earthquake scenario and propose to define a region of likely aftershock activity in which we will detect and accurately locate events using a separate, specially targeted, semi-automatic process. This effort may use either pattern detectors or more general algorithms that cover wider source regions without requiring waveform similarity. An iterative procedure to generate automatic bulletins would incorporate all the aftershock event hypotheses generated by the auxiliary process, and filter all phases from these events from the original detection lists prior to a new iteration of the global phase-association algorithm.

  13. Analysis of Technique to Extract Data from the Web for Improved Performance

    NASA Astrophysics Data System (ADS)

    Gupta, Neena; Singh, Manish

    2010-11-01

    The World Wide Web is rapidly guiding the world into an amazing new electronic world, where anyone can publish anything in electronic form and extract almost any information. Extraction of information from semi-structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, extracts records from HTML files automatically. Ontologies can achieve a high degree of accuracy in data extraction. We analyze a method for data extraction, OBDE (Ontology-Based Data Extraction), which automatically extracts query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.

  14. Three-dimensional distribution of tyrosine hydroxylase, vasopressin and oxytocin neurones in the transparent postnatal mouse brain.

    PubMed

    Godefroy, D; Dominici, C; Hardin-Pouzet, H; Anouar, Y; Melik-Parsadaniantz, S; Rostène, W; Reaux-Le Goazigo, A

    2017-12-01

    Over the years, advances in immunohistochemistry techniques have been a critical step in detecting and mapping neuromodulatory substances in the central nervous system. The better quality and specificity of primary antibodies, new staining procedures and the spectacular development of imaging technologies have allowed such progress. Very recently, new methods permitting tissue transparency have been successfully used on brain tissues. In the present study, we combined whole-mount immunostaining for tyrosine hydroxylase (TH), oxytocin (OXT) and arginine vasopressin (AVP) with the iDISCO+ clearing method, light-sheet microscopy and semi-automated counting of three-dimensionally-labelled neurones to obtain a three-dimensional (3D) distribution of these neuronal populations in a 5-day postnatal (P5) mouse brain. The segmentation procedure and 3D reconstruction allowed us to map, at high resolution, the TH staining of the various catecholaminergic cell groups and their ascending and descending fibre pathways. We show that TH pathways are present in the whole P5 mouse brain, similar to that observed in the adult rat brain. We also provide new information on the postnatal distribution of OXT- and AVP-immunoreactive cells in the mouse hypothalamus, and show that, compared to AVP neurones, OXT neurones in the supraoptic (SON) and paraventricular (PVN) nuclei are not yet mature in the early postnatal period. 3D semi-automatic quantitative analysis of the PVN reveals that OXT cell bodies are more numerous than AVP neurones, although their immunoreactive somata have half the volume. More AVP than OXT nerve fibres were observed in the PVN and the retrochiasmatic area. In conclusion, the results of the present study demonstrate the utility and the power of imaging large brain tissues with clearing procedures coupled to novel 3D imaging technologies to study, localise and quantify neurotransmitter substances involved in brain and neuroendocrine functions. © 2017 British Society for Neuroendocrinology.

  15. Semi-supervised Learning for Phenotyping Tasks.

    PubMed

    Dligach, Dmitriy; Miller, Timothy; Savova, Guergana K

    2015-01-01

    Supervised learning is the dominant approach to automatic electronic health records-based phenotyping, but it is expensive due to the cost of manual chart review. Semi-supervised learning takes advantage of both scarce labeled and plentiful unlabeled data. In this work, we study a family of semi-supervised learning algorithms based on Expectation Maximization (EM) in the context of several phenotyping tasks. We first experiment with the basic EM algorithm. When the modeling assumptions are violated, basic EM leads to inaccurate parameter estimation. Augmented EM attenuates this shortcoming by introducing a weighting factor that downweights the unlabeled data. Cross-validation does not always lead to the best setting of the weighting factor and other heuristic methods may be preferred. We show that accurate phenotyping models can be trained with only a few hundred labeled (and a large number of unlabeled) examples, potentially providing substantial savings in the amount of the required manual chart review.
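    The weighting factor of augmented EM described above acts in the M-step, where expected counts from unlabeled data are downweighted before being pooled with labeled counts. The sketch below shows this for a class-prior estimate; the function name and counts are hypothetical, and a full phenotyping model would apply the same weighting to every parameter.

```python
def augmented_prior(labeled_counts, unlabeled_expected, weight):
    """M-step class-prior estimate in augmented EM: expected counts from
    unlabeled data are scaled by `weight` in [0, 1] before pooling.
    weight=1 recovers basic EM; weight=0 ignores the unlabeled data."""
    pooled = [l + weight * u
              for l, u in zip(labeled_counts, unlabeled_expected)]
    total = sum(pooled)
    return [p / total for p in pooled]

# 10 labeled positives / 30 labeled negatives; the E-step expects a
# 400/100 split among plentiful unlabeled examples
damped = augmented_prior([10, 30], [400, 100], weight=0.1)
basic = augmented_prior([10, 30], [400, 100], weight=1.0)
print(damped, basic)
```

    When the model's independence assumptions are violated, the unlabeled split can be badly biased; a small weight keeps the labeled data dominant, which is the shortcoming-attenuation the abstract describes.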

  16. Measurement results obtained from air quality monitoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turzanski, P.K.; Beres, R.

    1995-12-31

    An automatic system of air pollution monitoring has been operating in Cracow since 1991. The organization, assembly and start-up of the network are the result of joint efforts of the US Environmental Protection Agency and the Cracow environmental protection service. At present, the automatic monitoring network is operated by the Provincial Inspection of Environmental Protection. In total, seven stationary stations are situated in Cracow to measure air pollution. These stations are supported continuously by one semi-mobile (transportable) station. This allows the area under investigation to be modified periodically, so that the 3-dimensional picture of the creation and distribution of air pollutants within the Cracow area can be made more intelligible.

  17. Automatic Generation of Building Models with Levels of Detail 1-3

    NASA Astrophysics Data System (ADS)

    Nguatem, W.; Drauschke, M.; Mayer, H.

    2016-06-01

    We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start with orienting unsorted image sets employing (Mayer et al., 2012), we compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.

  18. The cleaning and disinfection by heat of bedpans in automatic and semi-automatic machines.

    PubMed Central

    Mostafa, A. B.; Chackett, K. F.

    1976-01-01

    This work is concerned with the cleaning and disinfection by heat of stainless-steel and polypropylene bedpans, which had been soiled with either a biological contaminant, human serum albumin (HSA) labelled with technetium-99m (99mTc), or a bacteriological contaminant, Streptococcus faecalis mixed with 99mTc-labelled HSA. Results of cleaning and disinfection achieved with a Test Machine and those achieved by procedures adopted in eight different wards of a general hospital are reported. Bedpan washers installed in wards were found to be less efficient than the Test Machine, at least partly because of inadequate maintenance. Stainless-steel and polypropylene bedpans gave essentially the same results. PMID:6591

  19. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time-consuming, requiring manual tuning, and lacking robustness and user-friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano-processes at the research and development level by accelerating access to knowledge and hence speeding up implementation in product lines.

  20. Generating Models of Surgical Procedures using UMLS Concepts and Multiple Sequence Alignment

    PubMed Central

    Meng, Frank; D’Avolio, Leonard W.; Chen, Andrew A.; Taira, Ricky K.; Kangarloo, Hooshang

    2005-01-01

    Surgical procedures can be viewed as a process composed of a sequence of steps performed on, by, or with the patient’s anatomy. This sequence is typically the pattern followed by surgeons when generating surgical report narratives for documenting surgical procedures. This paper describes a methodology for semi-automatically deriving a model of conducted surgeries, utilizing a sequence of derived Unified Medical Language System (UMLS) concepts for representing surgical procedures. A multiple sequence alignment was computed from a collection of such sequences and was used for generating the model. These models have the potential of being useful in a variety of informatics applications such as information retrieval and automatic document generation. PMID:16779094

  1. Serum bactericidal assay for the evaluation of typhoid vaccine using a semi-automated colony-counting method.

    PubMed

    Jang, Mi Seon; Sahastrabuddhe, Sushant; Yun, Cheol-Heui; Han, Seung Hyun; Yang, Jae Seung

    2016-08-01

    Typhoid fever, mainly caused by Salmonella enterica serovar Typhi (S. Typhi), is a life-threatening disease, mostly in developing countries. Enzyme-linked immunosorbent assay (ELISA) is widely used to quantify antibodies against S. Typhi in serum but does not provide information about functional antibody titers. Although the serum bactericidal assay (SBA) using an agar plate is often used to measure functional antibody titers against various bacterial pathogens in clinical specimens, it has rarely been used for typhoid vaccines because it is time-consuming and labor-intensive. In the present study, we established an improved SBA against S. Typhi using a semi-automated colony-counting system with a square agar plate harboring 24 samples. The semi-automated SBA efficiently measured bactericidal titers of sera from individuals immunized with S. Typhi Vi polysaccharide vaccines. The assay specifically responded to S. Typhi Ty2 but not to irrelevant enteric bacteria, including Vibrio cholerae and Shigella flexneri. Baby rabbit complement was a more appropriate source for the SBA against S. Typhi than complement from adult rabbit, guinea pig, or human. We also examined the correlation between SBA and ELISA for measuring antibody responses against S. Typhi using pre- and post-vaccination sera from 18 human volunteers. The SBA titer showed a good correlation with anti-Vi IgG quantity in the serum, with a Spearman correlation coefficient of 0.737 (P < 0.001). Taken together, the semi-automated SBA may be efficient, accurate, sensitive, and specific enough to measure functional antibody titers against S. Typhi in sera from human subjects immunized with typhoid vaccines. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
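
The reported Spearman coefficient is a rank correlation; a minimal, self-contained sketch of how it is computed from paired measurements (the SBA/ELISA values below are hypothetical, not data from the study):

```python
def ranks(values):
    """Assign 1-based ranks, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired post-vaccination values (SBA titer, ELISA IgG):
sba = [400, 100, 1600, 200, 800, 50]
elisa = [12.0, 3.1, 40.5, 2.8, 25.0, 1.9]
rho = spearman(sba, elisa)
```

Because rho is computed on ranks, it captures the monotone association between titers and IgG levels even when the relationship is non-linear.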

  2. Automatic vs. manual curation of a multi-source chemical dictionary: the impact on text mining.

    PubMed

    Hettne, Kristina M; Williams, Antony J; van Mulligen, Erik M; Kleinjans, Jos; Tkachenko, Valery; Kors, Jan A

    2010-03-23

    Previously, we developed a combined dictionary dubbed Chemlist for the identification of small molecules and drugs in text, based on a number of publicly available databases, and tested it on an annotated corpus. To achieve acceptable recall and precision we used a number of automatic and semi-automatic processing steps together with disambiguation rules. However, it remained to be investigated what impact an extensive manual curation of a multi-source chemical dictionary would have on chemical term identification in text. ChemSpider is a chemical database that has undergone extensive manual curation aimed at establishing valid chemical name-to-structure relationships. We acquired the component of ChemSpider containing only manually curated names and synonyms. Rule-based term filtering, semi-automatic manual curation, and disambiguation rules were applied. We tested the dictionary from ChemSpider on an annotated corpus and compared the results with those for the Chemlist dictionary. The ChemSpider dictionary of ca. 80 k names was only a third to a quarter of the size of Chemlist, at around 300 k. The ChemSpider dictionary had a precision of 0.43 and a recall of 0.19 before the application of filtering and disambiguation, and a precision of 0.87 and a recall of 0.19 after filtering and disambiguation. The Chemlist dictionary had a precision of 0.20 and a recall of 0.47 before the application of filtering and disambiguation, and a precision of 0.67 and a recall of 0.40 after filtering and disambiguation. We conclude the following: (1) the ChemSpider dictionary achieved the best precision, but the Chemlist dictionary had a higher recall and the best F-score; (2) rule-based filtering and disambiguation is necessary to achieve a high precision for both the automatically generated and the manually curated dictionary.
ChemSpider is available as a web service at http://www.chemspider.com/ and the Chemlist dictionary is freely available as an XML file in Simple Knowledge Organization System format on the web at http://www.biosemantics.org/chemlist.
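
As a sanity check on conclusion (1), the F-scores can be recomputed from the precision and recall figures reported above (a generic harmonic-mean calculation, not code from the study):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Figures after filtering and disambiguation, as reported:
chemspider = f1(0.87, 0.19)  # ~0.31
chemlist = f1(0.67, 0.40)    # ~0.50
assert chemlist > chemspider  # Chemlist's higher recall wins on F-score
```

The harmonic mean punishes the imbalanced ChemSpider profile (high precision, low recall), which is why Chemlist comes out ahead despite its lower precision.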

  3. Automatic vs. manual curation of a multi-source chemical dictionary: the impact on text mining

    PubMed Central

    2010-01-01

    Background Previously, we developed a combined dictionary dubbed Chemlist for the identification of small molecules and drugs in text, based on a number of publicly available databases, and tested it on an annotated corpus. To achieve acceptable recall and precision we used a number of automatic and semi-automatic processing steps together with disambiguation rules. However, it remained to be investigated what impact an extensive manual curation of a multi-source chemical dictionary would have on chemical term identification in text. ChemSpider is a chemical database that has undergone extensive manual curation aimed at establishing valid chemical name-to-structure relationships. Results We acquired the component of ChemSpider containing only manually curated names and synonyms. Rule-based term filtering, semi-automatic manual curation, and disambiguation rules were applied. We tested the dictionary from ChemSpider on an annotated corpus and compared the results with those for the Chemlist dictionary. The ChemSpider dictionary of ca. 80 k names was only a third to a quarter of the size of Chemlist, at around 300 k. The ChemSpider dictionary had a precision of 0.43 and a recall of 0.19 before the application of filtering and disambiguation, and a precision of 0.87 and a recall of 0.19 after filtering and disambiguation. The Chemlist dictionary had a precision of 0.20 and a recall of 0.47 before the application of filtering and disambiguation, and a precision of 0.67 and a recall of 0.40 after filtering and disambiguation. Conclusions We conclude the following: (1) the ChemSpider dictionary achieved the best precision, but the Chemlist dictionary had a higher recall and the best F-score; (2) rule-based filtering and disambiguation is necessary to achieve a high precision for both the automatically generated and the manually curated dictionary.
ChemSpider is available as a web service at http://www.chemspider.com/ and the Chemlist dictionary is freely available as an XML file in Simple Knowledge Organization System format on the web at http://www.biosemantics.org/chemlist. PMID:20331846

  4. An automatic method for segmentation of fission tracks in epidote crystal photomicrographs

    NASA Astrophysics Data System (ADS)

    de Siqueira, Alexandre Fioravante; Nakasuga, Wagner Massayuki; Pagamisse, Aylton; Tello Saenz, Carlos Alberto; Job, Aldo Eloizo

    2014-08-01

    Manual identification of fission tracks has practical problems, such as variation due to observer efficiency. An automatic processing method that could identify fission tracks in a photomicrograph could solve this problem and improve the speed of track counting. However, separation of nontrivial images is one of the most difficult tasks in image processing. Several commercial and free software packages are available, but they are designed for specific types of images. In this paper, an automatic method based on starlet wavelets is presented for separating fission tracks in mineral photomicrographs. Automation is achieved using the Matthews correlation coefficient, and results are evaluated by precision, recall and accuracy. This technique is an improvement of a method aimed at the segmentation of scanning electron microscopy images. The method is applied to photomicrographs of epidote phenocrysts, in which an accuracy higher than 89% was obtained for fission track segmentation, even for difficult images. Algorithms corresponding to the proposed method are available for download. Using the method presented here, a user can easily determine fission tracks in photomicrographs of mineral samples.
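
The Matthews correlation coefficient used to drive the automation compares a predicted binary mask against a ground-truth mask; a generic sketch of the metric itself (not the paper's wavelet algorithm), on flattened pixel lists:

```python
def mcc(pred, truth):
    """Matthews correlation coefficient for two binary masks (0/1 lists)."""
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum((not p) and (not t) for p, t in zip(pred, truth))
    fp = sum(p and not t for p, t in zip(pred, truth))
    fn = sum((not p) and t for p, t in zip(pred, truth))
    denom = ((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Perfect agreement gives 1.0; chance-level agreement gives 0.0:
perfect = mcc([1, 0, 1, 0], [1, 0, 1, 0])
chance = mcc([1, 1, 0, 0], [1, 0, 1, 0])
```

Unlike plain accuracy, MCC stays informative when the foreground (tracks) occupies only a small fraction of the image, which makes it a reasonable objective for selecting a segmentation automatically.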

  5. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the...

  6. Supporting Mediated Peer-Evaluation to Grade Answers to Open-Ended Questions

    ERIC Educational Resources Information Center

    De Marsico, Maria; Sciarrone, Filippo; Sterbini, Andrea; Temperini, Marco

    2017-01-01

    We show an approach to semi-automatic grading of answers given by students to open ended questions (open answers). We use both peer-evaluation and teacher evaluation. A learner is modeled by her Knowledge and her assessments quality (Judgment). The data generated by the peer- and teacher-evaluations, and by the learner models is represented by a…

  7. Relative Recency Judgments in Learning Disabled Children: A Semi-Automatic Process.

    ERIC Educational Resources Information Center

    Stein, Debra K.; And Others

    The ability of 20 learning disabled (LD) and 20 non-LD students (mean age of 9 years) to process temporal order information was assessed by employing a relative recency judgment task. Ss were administered lists composed of pictures of everyday objects and were then asked to indicate which item appeared latest on the list (that is, most recently).…

  8. Semi-automatic 10/20 Identification Method for MRI-Free Probe Placement in Transcranial Brain Mapping Techniques.

    PubMed

    Xiao, Xiang; Zhu, Hao; Liu, Wei-Jie; Yu, Xiao-Ting; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe

    2017-01-01

    The International 10/20 system is an important head-surface-based positioning system for transcranial brain mapping techniques, e.g., fNIRS and TMS. As guidance for probe placement, the 10/20 system permits both proper ROI coverage and spatial consistency among multiple subjects and experiments in an MRI-free context. However, the traditional manual approach to the identification of 10/20 landmarks faces problems in reliability and time cost. In this study, we propose a semi-automatic method to address these problems. First, a novel head surface reconstruction algorithm reconstructs head geometry from a set of points uniformly and sparsely sampled on the subject's head. Second, virtual 10/20 landmarks are determined on the reconstructed head surface in computational space. Finally, a visually guided real-time navigation system guides the experimenter to each of the identified 10/20 landmarks on the subject's physical head. Compared with the traditional manual approach, our proposed method provides a significant improvement in both reliability and time cost, and could thus contribute to improving both the effectiveness and efficiency of 10/20-guided MRI-free probe placement.
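
The 10/20 landmarks are defined at fixed fractions (10%, 20%, ...) of arc length along scalp reference curves such as the nasion-inion midline. A minimal sketch of that arc-length interpolation on a sampled polyline (hypothetical coordinates, not the paper's reconstruction algorithm):

```python
import math

def point_at_fraction(polyline, frac):
    """Point at a given fraction of cumulative arc length along a
    polyline given as a list of (x, y, z) tuples."""
    seglens = [math.dist(a, b) for a, b in zip(polyline, polyline[1:])]
    target = frac * sum(seglens)
    acc = 0.0
    for (a, b), length in zip(zip(polyline, polyline[1:]), seglens):
        if acc + length >= target:
            t = (target - acc) / length if length else 0.0
            return tuple(pa + t * (pb - pa) for pa, pb in zip(a, b))
        acc += length
    return polyline[-1]

# Hypothetical straight nasion-to-inion arc, for illustration only:
arc = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
cz = point_at_fraction(arc, 0.5)   # vertex Cz at 50% of the arc
fpz = point_at_fraction(arc, 0.1)  # Fpz at 10% of the arc
```

On a real head the polyline follows the curved scalp surface, but the interpolation step is the same.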

  9. Left ventricular endocardial surface detection based on real-time 3D echocardiographic data

    NASA Technical Reports Server (NTRS)

    Corsi, C.; Borsari, M.; Consegnati, F.; Sarti, A.; Lamberti, C.; Travaglini, A.; Shiota, T.; Thomas, J. D.

    2001-01-01

    OBJECTIVE: A new computerized semi-automatic method for left ventricular (LV) chamber segmentation is presented. METHODS: The LV is imaged by real-time three-dimensional echocardiography (RT3DE). The surface detection model, based on level set techniques, is applied to RT3DE data for image analysis. The modified level set partial differential equation we use is solved by applying numerical methods for conservation laws. The initial conditions are manually established on some slices of the entire volume. The solution obtained for each slice is a contour line corresponding to the boundary between the LV cavity and the LV endocardium. RESULTS: The mathematical model has been applied to sequences of frames of human hearts (volume range: 34-109 ml), both imaged in 2D and reconstructed off-line and acquired as RT3DE data. Volume estimates obtained by this new semi-automatic method show an excellent correlation with those obtained by manual tracing (r = 0.992). The dynamic change of LV volume during the cardiac cycle is also obtained. CONCLUSION: The volume estimation method is accurate; edge-based segmentation, image completion and volume reconstruction can be accomplished. The visualization technique also allows the user to navigate through the reconstructed volume and to display any section of it.
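
Once a boundary contour is available on each slice, a chamber volume can be estimated by disc summation: the area enclosed by each contour times the slice spacing. This is one common way to do it, not necessarily the authors' exact computation:

```python
def polygon_area(contour):
    """Area of a closed 2D contour via the shoelace formula."""
    n = len(contour)
    s = sum(contour[i][0] * contour[(i + 1) % n][1]
            - contour[(i + 1) % n][0] * contour[i][1] for i in range(n))
    return abs(s) / 2.0

def volume_from_slices(contours, slice_spacing_mm):
    """Disc-summation volume (mm^3) from a stack of slice contours."""
    return sum(polygon_area(c) for c in contours) * slice_spacing_mm

# Two identical 10 mm x 10 mm square slices, 1 mm apart:
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
vol = volume_from_slices([square, square], 1.0)  # 200.0 mm^3
```

Repeating this per frame over the cardiac cycle yields the LV volume curve mentioned in the abstract.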

  10. A system of regional agricultural land use mapping tested against small scale Apollo 9 color infrared photography of the Imperial Valley (California)

    USGS Publications Warehouse

    Johnson, Claude W.; Bowden, Leonard W.; Pease, Robert W.

    1969-01-01

    Interpretation results of the small-scale CIR photography of the Imperial Valley (California), taken on March 12, 1969 by the Apollo 9 earth-orbiting satellite, have shown that worldwide agricultural land use mapping can be accomplished from satellite CIR imagery if sufficient a priori information is available for the region being mapped. Correlation of results with actual data is encouraging, although the accuracy of identification of specific crops from the single image is poor. The poor results can be partly attributed to the fact that the single image was taken during mid-season, when the three major crops were reflecting approximately the same and their CIR images therefore appear to indicate the same crop type. However, some of the inaccuracy can be attributed to a lack of understanding of the subtle variations in visual and infrared color reflectance of vegetation and the surrounding environment. Analysis of integrated color variations of the vegetation and background environment recorded on CIR imagery is discussed. Problems associated with the color variations may be overcome by the development of a semi-automatic processing system which considers individual field units or cells. Design criteria for such a semi-automatic processing system are outlined.

  11. Methods for Ensuring High Quality of Coding of Cause of Death. The Mortality Register to Follow Southern Urals Populations Exposed to Radiation.

    PubMed

    Startsev, N; Dimov, P; Grosche, B; Tretyakov, F; Schüz, J; Akleyev, A

    2015-01-01

    To follow up populations exposed to several radiation accidents in the Southern Urals, a cause-of-death registry was established at the Urals Center, capturing deaths in the Chelyabinsk, Kurgan and Sverdlovsk regions since 1950. When registering deaths over such a long time period, measures need to be in place to maintain quality and reduce the impact of individual coders as well as of quality changes in death certificates. To ensure uniformity of coding, a method for semi-automatic coding was developed, which is described here. Briefly, the method is based on a dynamic thesaurus, database-supported coding, and parallel coding by two different individuals. A comparison of the proposed method for organizing the coding process with the common coding procedure showed good agreement, with 70-90% agreement at the end of the coding process for the three-digit ICD-9 rubrics. The semi-automatic method ensures a sufficiently high quality of coding while at the same time providing an opportunity to reduce the labor intensity inherent in the creation of large-volume cause-of-death registries.
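
The parallel-coding check described above can be sketched as follows (a generic illustration, not the registry's actual software; the ICD-9 codes are invented): two coders' codes are compared at the three-digit rubric level and the percentage agreement is reported.

```python
def rubric(code):
    """Three-digit ICD-9 rubric, e.g. '410.1' -> '410'."""
    return code.split('.')[0]

def agreement(coder_a, coder_b):
    """Percent of records where both coders chose the same rubric."""
    matches = sum(rubric(a) == rubric(b) for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

a = ['410.1', '162.9', '431', '250.0']
b = ['410.9', '162.3', '434', '250.0']
pct = agreement(a, b)  # 3 of 4 rubrics agree -> 75.0
```

Comparing at the rubric level rather than the full code tolerates fourth-digit disagreements, which matches the 70-90% three-digit agreement figure reported.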

  12. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
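
A minimal sketch of the core voxelization idea (an assumption-laden illustration, not the published procedure): points from the cloud are binned into a regular grid, and each occupied cell becomes a solid voxel element of the finite element mesh.

```python
def voxelize(points, voxel_size):
    """Return the set of (i, j, k) grid indices of occupied voxels."""
    return {tuple(int(c // voxel_size) for c in p) for p in points}

# Two nearby points fall in the same 0.5-unit cell, a third elsewhere:
cloud = [(0.1, 0.1, 0.1), (0.2, 0.3, 0.4), (1.1, 0.1, 0.1)]
cells = voxelize(cloud, 0.5)  # two occupied voxels
```

A real pipeline would additionally fill enclosed interior cells between the inner and outer surfaces and export the occupied cells as hexahedral elements, but the occupancy binning above is the starting point.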

  13. Semi-automatic image personalization tool for variable text insertion and replacement

    NASA Astrophysics Data System (ADS)

    Ding, Hengzhou; Bala, Raja; Fan, Zhigang; Eschbach, Reiner; Bouman, Charles A.; Allebach, Jan P.

    2010-02-01

    Image personalization is a widely used technique in personalized marketing [1], in which a vendor attempts to promote new products or retain customers by sending marketing collateral that is tailored to the customers' demographics, needs, and interests. With current solutions of which we are aware, such as XMPie [2], DirectSmile [3], and AlphaPicture [4], image templates need to be created manually by graphic designers in order to produce this tailored marketing collateral, involving complex grid manipulation and detailed geometric adjustments. Image template design is thus highly manual, skill-demanding and costly, and is essentially the bottleneck for image personalization. We present a semi-automatic image personalization tool for designing image templates. Two scenarios are considered: text insertion and text replacement, with the text replacement option not offered in current solutions. The graphical user interface (GUI) of the tool is described in detail. Unlike current solutions, the tool renders the text in 3-D, which allows easy adjustment of the text. The tool has been implemented in Java, which allows flexible deployment and eliminates the need for any special software or know-how on the part of the end user.

  14. Usefulness of model-based iterative reconstruction in semi-automatic volumetry for ground-glass nodules at ultra-low-dose CT: a phantom study.

    PubMed

    Maruyama, Shuki; Fukushima, Yasuhiro; Miyamae, Yuta; Koizumi, Koji

    2018-06-01

    This study aimed to investigate the effects of parameter presets of the forward projected model-based iterative reconstruction solution (FIRST) on the accuracy of pulmonary nodule volume measurement. A torso phantom with simulated nodules [diameter: 5, 8, 10, and 12 mm; computed tomography (CT) density: -630 HU] was scanned with a multi-detector CT at tube currents of 10 mA (ultra-low-dose: UL-dose) and 270 mA (standard-dose: Std-dose). Images were reconstructed with filtered back projection [FBP; standard-dose (Std-FBP), ultra-low-dose (UL-FBP)], FIRST Lung (UL-Lung), and FIRST Body (UL-Body), and analyzed with semi-automatic software. The error in the volume measurement was determined. The errors with UL-Lung and UL-Body were smaller than that with UL-FBP. The smallest error was 5.8% ± 0.3 for the 12-mm nodule with UL-Body (middle lung). Our results indicate that FIRST Body is superior to FIRST Lung in terms of the accuracy of nodule volume measurement with UL-dose CT.
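
The reported error metric is the percent deviation of a measured volume from the nominal volume of the phantom sphere. A worked sketch for the 12-mm nodule (the measured value below is assumed for illustration, not data from the study):

```python
import math

def sphere_volume(diameter_mm):
    """Nominal volume of a spherical phantom nodule, in mm^3."""
    r = diameter_mm / 2.0
    return 4.0 / 3.0 * math.pi * r ** 3

def percent_error(measured, nominal):
    return 100.0 * (measured - nominal) / nominal

nominal = sphere_volume(12.0)           # ~904.8 mm^3
measured = nominal * 1.058              # hypothetical +5.8% overestimate
err = percent_error(measured, nominal)  # ~5.8
```

Reporting the error as a percentage of the nominal volume lets errors be compared across the 5-12 mm nodule sizes despite their very different absolute volumes.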

  15. Semi-Automatic Building Models and FAÇADE Texture Mapping from Mobile Phone Images

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Kim, T.

    2016-06-01

    Research on 3D urban modelling has been actively carried out for a long time, and the need for it has recently increased rapidly due to improved geo-web services and the popularity of smart devices. 3D urban models such as those provided by Google Earth are built from aerial photos, but this approach has limitations: immediate updates for changed building models are difficult, many buildings lack 3D models and textures, and large resources for maintenance and updating are inevitable. To address these limitations, we propose a method for semi-automatic building modelling and façade texture mapping from mobile phone images, and analyze the modelling results against actual measurements. Our method consists of a camera geometry estimation step, an image matching step, and a façade mapping step. Models generated by this method were compared with actual measurements of real buildings by comparing the ratios of model edge lengths to measured edge lengths; the results showed a 5.8% average error in the length ratios. With this method, we could generate a simple building model with fine façade textures without expensive dedicated tools or datasets.

  16. MOLGENIS/connect: a system for semi-automatic integration of heterogeneous phenotype data with applications in biobanks.

    PubMed

    Pang, Chao; van Enckevort, David; de Haan, Mark; Kelpin, Fleur; Jetten, Jonathan; Hendriksen, Dennis; de Boer, Tommy; Charbon, Bart; Winder, Erwin; van der Velde, K Joeri; Doiron, Dany; Fortier, Isabel; Hillege, Hans; Swertz, Morris A

    2016-07-15

    While the size and number of biobanks, patient registries and other data collections are increasing, biomedical researchers still often need to pool data for statistical power, a task that requires time-intensive retrospective integration. To address this challenge, we developed MOLGENIS/connect, a semi-automatic system to find, match and pool data from different sources. The system shortlists relevant source attributes from thousands of candidates using ontology-based query expansion to overcome variations in terminology. It then generates algorithms that transform source attributes to a common target DataSchema. These include unit conversion, categorical value matching and complex conversion patterns (e.g. calculation of BMI). In comparison to human experts, MOLGENIS/connect was able to auto-generate 27% of the algorithms perfectly, with an additional 46% needing only minor editing, representing a reduction in the human effort and expertise needed to pool data. Source code, binaries and documentation are available as open-source under LGPLv3 from http://github.com/molgenis/molgenis and www.molgenis.org/connect. Contact: m.a.swertz@rug.nl. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
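
A sketch of the kind of transformation algorithm the system generates (the attribute names `WT_LBS` and `HT_IN` are invented for illustration): a source biobank stores weight in pounds and height in inches, while the target DataSchema wants BMI in kg/m^2, so the mapping combines unit conversion with the BMI "complex conversion pattern".

```python
def to_kg(pounds):
    return pounds * 0.45359237  # exact lb -> kg conversion factor

def to_m(inches):
    return inches * 0.0254      # exact in -> m conversion factor

def bmi(weight_lb, height_in):
    """Unit conversion plus the derived-attribute BMI calculation."""
    return to_kg(weight_lb) / to_m(height_in) ** 2

# Hypothetical source record mapped to the target BMI attribute:
source_record = {"WT_LBS": 154.0, "HT_IN": 67.0}
target_bmi = bmi(source_record["WT_LBS"], source_record["HT_IN"])  # ~24.1
```

Auto-generating such mappings is exactly the part reported as 27% perfect / 46% minor-edit: the structure is mechanical, but attribute matching and unit inference can still need human review.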

  17. MOLGENIS/connect: a system for semi-automatic integration of heterogeneous phenotype data with applications in biobanks

    PubMed Central

    Pang, Chao; van Enckevort, David; de Haan, Mark; Kelpin, Fleur; Jetten, Jonathan; Hendriksen, Dennis; de Boer, Tommy; Charbon, Bart; Winder, Erwin; van der Velde, K. Joeri; Doiron, Dany; Fortier, Isabel; Hillege, Hans

    2016-01-01

    Motivation: While the size and number of biobanks, patient registries and other data collections are increasing, biomedical researchers still often need to pool data for statistical power, a task that requires time-intensive retrospective integration. Results: To address this challenge, we developed MOLGENIS/connect, a semi-automatic system to find, match and pool data from different sources. The system shortlists relevant source attributes from thousands of candidates using ontology-based query expansion to overcome variations in terminology. It then generates algorithms that transform source attributes to a common target DataSchema. These include unit conversion, categorical value matching and complex conversion patterns (e.g. calculation of BMI). In comparison to human experts, MOLGENIS/connect was able to auto-generate 27% of the algorithms perfectly, with an additional 46% needing only minor editing, representing a reduction in the human effort and expertise needed to pool data. Availability and Implementation: Source code, binaries and documentation are available as open-source under LGPLv3 from http://github.com/molgenis/molgenis and www.molgenis.org/connect. Contact: m.a.swertz@rug.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153686

  18. Semi-Automatic Terminology Generation for Information Extraction from German Chest X-Ray Reports.

    PubMed

    Krebs, Jonathan; Corovic, Hamo; Dietrich, Georg; Ertl, Max; Fette, Georg; Kaspar, Mathias; Krug, Markus; Stoerk, Stefan; Puppe, Frank

    2017-01-01

    Extraction of structured data from textual reports is an important subtask for building medical data warehouses for research and care. Many medical and most radiology reports are written in a telegraphic style, as a concatenation of noun phrases describing the presence or absence of findings. Therefore a lexico-syntactical approach is promising, where key terms and their relations are recognized and mapped onto a predefined standard terminology (ontology). We propose a two-phase algorithm for terminology matching: in the first pass, a local terminology for recognition is derived as close as possible to the terms used in the radiology reports; in the second pass, the local terminology is mapped to a standard terminology. In this paper, we report on an algorithm for the first step, the semi-automatic generation of the local terminology, and evaluate the algorithm on radiology reports of chest X-ray examinations from Würzburg University Hospital. With an effort of about 20 hours of work by a radiologist as domain expert and 10 hours of meetings, a local terminology with about 250 attributes and various value patterns was built. In an evaluation with 100 randomly chosen reports, it achieved an F1-score of about 95% for information extraction.
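
A toy sketch of the lexico-syntactical matching such a local terminology enables (the terminology and report text are invented, and this is not the paper's algorithm): noun phrases in a report are matched greedily, longest-first, against the term list.

```python
def extract_terms(text, terminology):
    """Greedy longest-match of multi-word terms over a token list."""
    tokens = text.lower().split()
    found, i = [], 0
    max_len = max(len(t.split()) for t in terminology)
    while i < len(tokens):
        for n in range(min(max_len, len(tokens) - i), 0, -1):
            candidate = " ".join(tokens[i:i + n])
            if candidate in terminology:
                found.append(candidate)
                i += n
                break
        else:
            i += 1  # no term starts here; skip this token
    return found

terminology = {"pleural effusion", "cardiomegaly", "no focal infiltrate"}
report = "Cardiomegaly with small pleural effusion no focal infiltrate"
hits = extract_terms(report, terminology)
```

Trying longer candidates first matters for telegraphic radiology text: "no focal infiltrate" must win over a bare "infiltrate" match, since presence and absence of a finding are different facts.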

  19. Derivation of groundwater flow-paths based on semi-automatic extraction of lineaments from remote sensing data

    NASA Astrophysics Data System (ADS)

    Mallast, U.; Gloaguen, R.; Geyer, S.; Rödiger, T.; Siebert, C.

    2011-08-01

    In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxiliary information and finally evaluated in terms of hydro-geological significance. Using the example of the western catchment of the Dead Sea (Israel/Palestine), the orientation and location of the differentiated lineaments are compared to characteristics of known structural features. We demonstrate that a strong correlation between lineaments and structural features exists. Using Euclidean distances between lineaments and wells provides an assessment criterion to evaluate the hydraulic significance of detected lineaments. Based on this analysis, we suggest that the statistical analysis of lineaments allows a delineation of flow-paths and thus significant information on groundwater movements. To validate the flow-paths we compare them to existing results of groundwater models that are based on well data.
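
A sketch of the Euclidean assessment criterion (hypothetical coordinates, not the study's data): for each well, the distance to the nearest lineament, with each lineament modelled as a 2D line segment in map units.

```python
import math

def point_segment_dist(p, a, b):
    """Shortest distance from point p to segment a-b in 2D."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.dist(p, a)
    # Project p onto the segment, clamping to the endpoints:
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.dist(p, (ax + t * dx, ay + t * dy))

def nearest_lineament(well, lineaments):
    """Distance from a well to its closest lineament segment."""
    return min(point_segment_dist(well, a, b) for a, b in lineaments)

lineaments = [((0, 0), (10, 0)), ((20, 0), (20, 10))]
d = nearest_lineament((3, 4), lineaments)  # 4.0, to the first segment
```

Small well-to-lineament distances then serve as evidence that a detected lineament is hydraulically significant.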

  20. Platelet Counts in Insoluble Platelet-Rich Fibrin Clots: A Direct Method for Accurate Determination.

    PubMed

    Kitamura, Yutaka; Watanabe, Taisuke; Nakamura, Masayuki; Isobe, Kazushige; Kawabata, Hideo; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Tanaka, Takaaki; Kawase, Tomoyuki

    2018-01-01

    Platelet-rich fibrin (PRF) clots have most often been used in regenerative dentistry with the assumption that growth factor levels are concentrated in proportion to the platelet concentration. Platelet counts in PRF are generally determined indirectly by platelet counting in other liquid fractions. This study presents a method for the direct estimation of platelet counts in PRF. To validate this method by determining the recovery rate, whole-blood samples were obtained with an anticoagulant from healthy donors, and platelet-rich plasma (PRP) fractions were clotted with CaCl2 by centrifugation and digested with tissue plasminogen activator. Platelet counts were estimated before clotting and after digestion using an automatic hemocytometer. The method was then tested on PRF clots. The quality of platelets was examined by scanning electron microscopy and flow cytometry. In PRP-derived fibrin matrices, the recovery rates of platelets and white blood cells were 91.6% and 74.6%, respectively, after 24 h of digestion. In PRF clots associated with small and large red thrombi, platelet counts were 92.6% and 67.2% of the respective total platelet counts. These findings suggest that our direct method is sufficient for estimating the number of platelets trapped in an insoluble fibrin matrix and for determining that platelets are distributed in PRF clots and red thrombi roughly in proportion to their individual volumes. Therefore, we propose this direct digestion method for more accurate estimation of platelet counts in most types of platelet-enriched fibrin matrix.
