Sample records for automatic log sorting

  1. Is it time to revisit the log-sort yard?

    Treesearch

    John Dramm; Gerry Jackson

    2000-01-01

    Log-sort yards provide better utilization and marketing with improved value recovery of currently available timber resources in North America. Log-sort yards provide many services in marketing wood and fiber by concentrating, merchandising, manufacturing, sorting, and adding value to logs. Such operations supply forest products firms with desired raw materials, which...

  2. Review of log sort yards

    Treesearch

    John Rusty Dramm; Gerry L. Jackson; Jenny Wong

    2002-01-01

    This report provides a general overview of current log sort yard operations in the United States, including an extensive literature review and information collected during on-site visits to several operations throughout the nation. Log sort yards provide many services in marketing wood and fiber by concentrating, merchandising, processing, sorting, and adding value to...

  3. Log sort yard economics, planning, and feasibility

    Treesearch

    John Rusty Dramm; Robert Govett; Ted Bilek; Gerry L. Jackson

    2004-01-01

    This publication discusses basic marketing and economic concepts, planning approach, and feasibility methodology for assessing log sort yard operations. Special attention is given to sorting small diameter and underutilized logs from forest restoration, fuels reduction, and thinning operations. A planned programming approach of objectively determining the feasibility...

  4. Financial feasibility of a log sort yard handling small-diameter logs: A preliminary study

    Treesearch

    Han-Sup Han; E. M. (Ted) Bilek; John (Rusty) Dramm; Dan Loeffler; Dave Calkin

    2011-01-01

    The value and use of the trees removed in fuel reduction thinning and restoration treatments could be enhanced if the wood were effectively evaluated and sorted for quality and highest value before delivery to the next manufacturing destination. This article summarizes a preliminary financial feasibility analysis of a log sort yard that would serve as a log market to...

  5. Spin-the-bottle Sort and Annealing Sort: Oblivious Sorting via Round-robin Random Comparisons

    PubMed Central

    Goodrich, Michael T.

    2013-01-01

    We study sorting algorithms based on randomized round-robin comparisons. Specifically, we study Spin-the-bottle sort, where comparisons are unrestricted, and Annealing sort, where comparisons are restricted to a distance bounded by a temperature parameter. Both algorithms are simple, randomized, data-oblivious sorting algorithms, which are useful in privacy-preserving computations, but, as we show, Annealing sort is much more efficient. We show that there is an input permutation that causes Spin-the-bottle sort to require Ω(n² log n) expected time in order to succeed, and that in O(n² log n) time this algorithm succeeds with high probability for any input. We also show there is a specification of Annealing sort that runs in O(n log n) time and succeeds with very high probability. PMID:24550575
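
    For readers who want to experiment, the round-robin random-comparison idea can be written out in a short Python sketch. This is only a rough, hedged illustration of the flavour of Spin-the-bottle sort, not the paper's exact specification or its data-oblivious formulation.

```python
import random

def spin_the_bottle_sort(a):
    """Minimal sketch of a Spin-the-bottle-style sort: repeated round-robin
    passes in which each position is compare-exchanged with a random partner.
    Illustrative only; the record above analyzes a precise specification."""
    n = len(a)
    while a != sorted(a):                # keep making passes until sorted
        for i in range(n):               # round-robin over positions
            j = random.randrange(n)
            if j != i:
                lo, hi = min(i, j), max(i, j)
                if a[lo] > a[hi]:        # compare-exchange
                    a[lo], a[hi] = a[hi], a[lo]
    return a

print(spin_the_bottle_sort([5, 2, 9, 1, 7, 3]))
```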

  6. Automatic Color Sorting of Hardwood Edge-Glued Panel Parts

    Treesearch

    D. Earl Kline; Richard Conners; Qiang Lu; Philip A. Araman

    1997-01-01

    This paper describes an automatic color sorting system for red oak edge-glued panel parts. The color sorting system simultaneously examines both faces of a panel part and then determines which face has the "best" color, and sorts the part into one of a number of color classes at plant production speeds. Initial test results show that the system generated over...

  7. Structure Design and Realization of Rapid Medicine Dispensing System

    NASA Astrophysics Data System (ADS)

    Liu, Xiangquan

    In this paper, the main components and functions of a rapid medicine dispensing system are analyzed, and the structural design of the automatic feeding device, sloping storeroom, automatic dispensing device, and automatic sorting device is completed. The system uses a medicine conveyor working together with a manipulator to realize automatic batch supply of boxed medicine, a sloping storeroom as the medicine warehouse to realize dense storage, a dispensing mechanism comprising an elevator, turning panel, and electromagnet to realize rapid medicine dispensing, and a sorting conveyor belt with a sorting device to send medicine to the designated outlet.

  8. Design of mechanical arm for an automatic sorting system of recyclable cans

    NASA Astrophysics Data System (ADS)

    Resti, Y.; Mohruni, A. S.; Burlian, F.; Yani, I.; Amran, A.

    2018-04-01

    A mechanical arm for an automatic sorting system for used cans should be designed carefully: the right design results in a high-precision sorting rate and a short sorting time. The design involves, first, designing the manipulator; second, determining the link and joint specifications; and third, building the mechanical and control systems. This study aims to design the mechanical arm as the hardware for an automatic can-sorting system. The manipulator is made of aluminum plate and is designed with 6 links and 6 joints, where the 6th link is the end effector and the 6th joint is the gripper. Servo motors are used as the drive motors, and an Arduino Uno microcontroller is connected to the Matlab programming environment. Based on testing, the mechanical arm designed for this recyclable-can sorting system has a sorting precision of 93%, with an average total sorting time of 10.82 seconds.

  9. Port-of-entry advanced sorting system (PASS) operational test

    DOT National Transportation Integrated Search

    1998-12-01

    In 1992 the Oregon Department of Transportation undertook an operational test of the Port-of-Entry Advanced Sorting System (PASS), which uses a two-way communication automatic vehicle identification system, integrated with weigh-in-motion, automatic ...

  10. Fast and straightforward analysis approach of charge transport data in single molecule junctions.

    PubMed

    Zhang, Qian; Liu, Chenguang; Tao, Shuhui; Yi, Ruowei; Su, Weitao; Zhao, Cezhou; Zhao, Chun; Dappe, Yannick J; Nichols, Richard J; Yang, Li

    2018-08-10

    In this study, we introduce an efficient data sorting algorithm that includes filters for noisy signals and conductance mapping for analyzing the most dominant conductance group and sub-population groups. The capacity of our data analysis process has also been corroborated on real experimental data sets of Au-1,6-hexanedithiol-Au and Au-1,8-octanedithiol-Au molecular junctions. The fully automated and unsupervised program requires less than one minute on a standard PC to sort the data and generate histograms. The resulting one-dimensional and two-dimensional log histograms give conductance values in good agreement with previous studies. Our algorithm is a straightforward, fast and user-friendly tool for single molecule charge transport data analysis. We also analyze the data in the form of a conductance map, which can offer evidence for diversity in molecular conductance. The code for automatic data analysis is openly available, well-documented and ready to use, thereby offering a useful new tool for single molecule electronics.
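
    As a hedged illustration of the histogram step such analyses rely on (not the authors' full pipeline, which also includes noise filtering and conductance mapping), the following sketch builds the usual 1-D histogram of log10(G/G0) from a hypothetical set of conductance traces and reads off the most probable conductance.

```python
import numpy as np

# Hypothetical input: `traces` is a list of 1-D arrays holding conductance
# values (in units of the conductance quantum G0) recorded for each junction.
def one_d_log_histogram(traces, bins=200, log_range=(-6.0, 0.5)):
    """Standard 1-D histogram of log10(G/G0), pooled over all traces."""
    all_g = np.concatenate(traces)
    all_g = all_g[all_g > 0]                 # drop zero/negative readings
    counts, edges = np.histogram(np.log10(all_g), bins=bins, range=log_range)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, counts

def most_probable_conductance(centers, counts):
    """Crude estimate of the dominant conductance group: the histogram peak."""
    return 10.0 ** centers[np.argmax(counts)]

# Example with synthetic data standing in for measured traces.
rng = np.random.default_rng(0)
fake_traces = [10 ** rng.normal(-3.5, 0.4, size=500) for _ in range(50)]
centers, counts = one_d_log_histogram(fake_traces)
print(most_probable_conductance(centers, counts))
```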

  11. Port-of-entry Advanced Sorting System (PASS) operational test : final report

    DOT National Transportation Integrated Search

    1998-12-01

    In 1992 the Oregon Department of Transportation undertook an operational test of the Port-of-Entry Advanced Sorting System (PASS), which uses a two-way communication automatic vehicle identification system, integrated with weigh-in-motion, automatic ...

  12. Corner detection and sorting method based on improved Harris algorithm in camera calibration

    NASA Astrophysics Data System (ADS)

    Xiao, Ying; Wang, Yonghong; Dan, Xizuo; Huang, Anqi; Hu, Yue; Yang, Lianxiang

    2016-11-01

    In traditional Harris corner detection algorithm, the appropriate threshold which is used to eliminate false corners is selected manually. In order to detect corners automatically, an improved algorithm which combines Harris and circular boundary theory of corners is proposed in this paper. After detecting accurate corner coordinates by using Harris algorithm and Forstner algorithm, false corners within chessboard pattern of the calibration plate can be eliminated automatically by using circular boundary theory. Moreover, a corner sorting method based on an improved calibration plate is proposed to eliminate false background corners and sort remaining corners in order. Experiment results show that the proposed algorithms can eliminate all false corners and sort remaining corners correctly and automatically.
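
    A generic sketch of the two basic ingredients, Harris corner detection and row-major sorting of the detected corners, is shown below using OpenCV. The circular-boundary rejection of false corners and the improved calibration plate described in the record are not reproduced, and the parameter values are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_and_sort_corners(gray, rows, cols):
    """Detect chessboard-like corners with the Harris measure and return them
    sorted row by row (top-to-bottom, then left-to-right within each row).
    Generic sketch only; not the improved algorithm of the record above."""
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=rows * cols,
                                  qualityLevel=0.01, minDistance=10,
                                  useHarrisDetector=True, k=0.04)
    pts = pts.reshape(-1, 2)
    pts = pts[np.argsort(pts[:, 1])]                        # top-to-bottom
    row_groups = np.array_split(pts, rows)                  # assume rows*cols found
    ordered = [r[np.argsort(r[:, 0])] for r in row_groups]  # left-to-right
    return np.vstack(ordered)

# Usage (illustrative):
# gray = cv2.imread("calib.png", cv2.IMREAD_GRAYSCALE)
# corners = detect_and_sort_corners(gray, rows=6, cols=9)
```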

  13. Automatically assisting human memory: a SenseCam browser.

    PubMed

    Doherty, Aiden R; Moulin, Chris J A; Smeaton, Alan F

    2011-10-01

    SenseCams have many potential applications as tools for lifelogging, including the possibility of use as a memory rehabilitation tool. Given that a SenseCam can log hundreds of thousands of images per year, it is critical that these be presented to the viewer in a manner that supports the aims of memory rehabilitation. In this article we report a software browser constructed with the aim of using the characteristics of memory to organise SenseCam images into a form that makes the wealth of information stored on SenseCam more accessible. To enable a large amount of visual information to be easily and quickly assimilated by a user, we apply a series of automatic content analysis techniques to structure the images into "events", suggest their relative importance, and select representative images for each. This minimises effort when browsing and searching. We provide anecdotes on use of such a system and emphasise the need for SenseCam images to be meaningfully sorted using such a browser.

  14. Size-based cell sorting with a resistive pulse sensor and an electromagnetic pump in a microfluidic chip.

    PubMed

    Song, Yongxin; Li, Mengqi; Pan, Xinxiang; Wang, Qi; Li, Dongqing

    2015-02-01

    An electrokinetic microfluidic chip is developed to detect and sort target cells by size from human blood samples. Target-cell detection is achieved by a differential resistive pulse sensor (RPS) based on the size difference between the target cell and other cells. Once a target cell is detected, the detected RPS signal will automatically actuate an electromagnetic pump built in a microchannel to push the target cell into a collecting channel. This method was applied to automatically detect and sort A549 cells and T-lymphocytes from a peripheral fingertip blood sample. The viability of A549 cells sorted in the collecting well was verified by Hoechst33342 and propidium iodide staining. The results show that as many as 100 target cells per minute can be sorted out from the sample solution and thus is particularly suitable for sorting very rare target cells, such as circulating tumor cells. The actuation of the electromagnetic valve has no influence on RPS cell detection and the consequent cell-sorting process. The viability of the collected A549 cell is not impacted by the applied electric field when the cell passes the RPS detection area. The device described in this article is simple, automatic, and label-free and has wide applications in size-based rare target cell sorting for medical diagnostics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Acoustic sorting models for improved log segregation

    Treesearch

    Xiping Wang; Steve Verrill; Eini Lowell; Robert J. Ross; Vicki L. Herian

    2013-01-01

    In this study, we examined three individual log measures (acoustic velocity, log diameter, and log vertical position in a tree) for their ability to predict average modulus of elasticity (MOE) and grade yield of structural lumber obtained from Douglas-fir (Pseudotsuga menziesii [Mirb. Franco]) logs. We found that log acoustic velocity only had a...

  16. Depth optimal sorting networks resistant to k passive faults

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piotrow, M.

    In this paper, we study the problem of constructing a sorting network that is tolerant to faults and whose running time (i.e. depth) is as small as possible. We consider the scenario of worst-case comparator faults and follow the model of passive comparator failure proposed by Yao and Yao, in which a faulty comparator outputs directly its inputs without comparison. Our main result is the first construction of an N-input, k-fault-tolerant sorting network that is of an asymptotically optimal depth Θ(log N + k). That improves over the recent result of Leighton and Ma, whose network is of depth O(log N + k log log N / log k). Actually, we present a fault-tolerant correction network that can be added after any N-input sorting network to correct its output in the presence of at most k faulty comparators. Since the depth of the network is O(log N + k) and the constants hidden behind the "O" notation are not big, the construction can be of practical use. Developing the techniques necessary to show the main result, we construct a fault-tolerant network for the insertion problem. As a by-product, we get an N-input, O(log N)-depth INSERT-network that is tolerant to random faults, thereby answering a question posed by Ma in his PhD thesis. The results are based on a new notion of constant delay comparator networks, that is, networks in which each register is used (compared) only in a period of time of a constant length. Copies of such networks can be put one after another with only a constant increase in depth per copy.
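
    The passive-failure model referred to above is easy to state in code: a faulty comparator simply forwards its inputs unchanged. The sketch below is an illustration of that model, not the paper's construction; it applies an arbitrary comparator network to an input with a chosen set of comparators failing passively, which is enough to experiment with how faults corrupt a network's output.

```python
def apply_network(values, comparators, faulty=frozenset()):
    """Run a comparator network on `values`.

    Each comparator (i, j) with i < j places the smaller value at i and the
    larger at j.  A *passive* fault (the Yao-and-Yao model mentioned above)
    means the comparator outputs its inputs unchanged.  Sketch only."""
    v = list(values)
    for k, (i, j) in enumerate(comparators):
        if k in faulty:
            continue                      # passive fault: no comparison made
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

# Example: a 4-input odd-even transposition network; comparator 2 is broken,
# so the output is no longer guaranteed to be sorted.
network = [(0, 1), (2, 3), (1, 2), (0, 1), (2, 3), (1, 2)]
print(apply_network([4, 3, 2, 1], network))               # sorted
print(apply_network([4, 3, 2, 1], network, faulty={2}))   # possibly unsorted
```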

  17. Adaptable Computing Environment/Self-Assembling Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osbourn, Gordon C.; Bouchard, Ann M.; Bartholomew, John W.

    Complex software applications are difficult to learn to use and to remember how to use. Further, the user has no control over the functionality available in a given application. The software we use can be created and modified only by a relatively small group of elite, highly skilled artisans known as programmers. "Normal users" are powerless to create and modify software themselves, because the tools for software development, designed by and for programmers, are a barrier to entry. This software, when completed, will be a user-adaptable computing environment in which the user is really in control of his/her own software, able to adapt the system, make new parts of the system interactive, and even modify the behavior of the system itself. Some key features of the basic environment that have been implemented are (a) books in bookcases, where all data is stored, (b) context-sensitive compass menus (compass, because the buttons are located in compass directions relative to the mouse cursor position), (c) importing tabular data and displaying it in a book, (d) light-weight table querying/sorting, (e) a Reach&Get capability (sort of a "smart" copy/paste that prevents the user from copying invalid data), and (f) a LogBook that automatically logs all user actions that change data or the system itself. To bootstrap toward full end-user adaptability, we implemented a set of development tools. With the development tools, compass menus can be made and customized.

  18. The interactive electrode localization utility: software for automatic sorting and labeling of intracranial subdural electrodes

    PubMed Central

    Tang, Wei; Peled, Noam; Vallejo, Deborah I.; Borzello, Mia; Dougherty, Darin D.; Eskandar, Emad N.; Widge, Alik S.; Cash, Sydney S.; Stufflebeam, Steven M.

    2018-01-01

    Purpose Existing methods for sorting, labeling, registering, and across-subject localization of electrodes in intracranial encephalography (iEEG) may involve laborious work requiring manual inspection of radiological images. Methods We describe a new open-source software package, the interactive electrode localization utility which presents a full pipeline for the registration, localization, and labeling of iEEG electrodes from CT and MR images. In addition, we describe a method to automatically sort and label electrodes from subdural grids of known geometry. Results We validated our software against manual inspection methods in twelve subjects undergoing iEEG for medically intractable epilepsy. Our algorithm for sorting and labeling performed correct identification on 96% of the electrodes. Conclusions The sorting and labeling methods we describe offer nearly perfect performance and the software package we have distributed may simplify the process of registering, sorting, labeling, and localizing subdural iEEG grid electrodes by manual inspection. PMID:27915398

  19. Improved grading system for structural logs for log homes

    Treesearch

    D.W. Green; T.M. Gorman; J.W. Evans; J.F. Murphy

    2004-01-01

    Current grading standards for logs used in log home construction use visual criteria to sort logs into either “wall logs” or structural logs (round and sawn round timbers). The conservative nature of this grading system, and the grouping of stronger and weaker species for marketing purposes, probably results in the specification of logs with larger diameter than would...

  20. Sorting on STAR. [CDC computer algorithm timing comparison

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
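
    Batcher's network can be expressed so that every stage is a block of independent compare-exchanges, which is what makes it attractive for vector machines such as STAR. The following NumPy sketch (a bitonic variant of Batcher's approach, offered as an illustration rather than a reconstruction of the STAR code) applies each stage as whole-array operations; its comparison count grows as N(log N)², matching the complexity quoted above.

```python
import numpy as np

def bitonic_sort(values):
    """Batcher-style bitonic sorting network, with each stage applied as
    whole-array (vector) compare-exchanges.  Requires a power-of-two length.
    Sketch for illustration; not the STAR implementation from the record."""
    a = np.array(values, dtype=float)
    n = a.size
    assert n > 0 and (n & (n - 1)) == 0, "length must be a power of two"
    idx = np.arange(n)
    k = 2
    while k <= n:                         # merge stages of growing size
        j = k // 2
        while j >= 1:                     # compare-exchange distance
            partner = idx ^ j
            i = idx[idx < partner]        # lower index of each pair
            l = i ^ j                     # upper index of each pair
            ascending = (i & k) == 0      # direction of each sub-sequence
            swap = np.where(ascending, a[i] > a[l], a[i] < a[l])
            new_i = np.where(swap, a[l], a[i])
            new_l = np.where(swap, a[i], a[l])
            a[i], a[l] = new_i, new_l     # all exchanges applied at once
            j //= 2
        k *= 2
    return a

print(bitonic_sort([7, 3, 1, 8, 5, 2, 6, 4]))
```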

  1. Stress wave sorting of red maple logs for structural quality

    Treesearch

    Xiping Wang; Robert J. Ross; David W. Green; Brian Brashaw; Karl Englund; Michael Wolcott

    2004-01-01

    Existing log grading procedures in the United States make only visual assessments of log quality. These procedures do not incorporate estimates of the modulus of elasticity (MOE) of logs. It is questionable whether the visual grading procedures currently used for logs adequately assess the potential quality of structural products manufactured from them, especially...

  2. Nondestructive evaluation for sorting red maple logs

    Treesearch

    Xiping Wang; Robert J. Ross; David W. Green; Karl Englund; Michael Wolcott

    2000-01-01

    Existing log grading procedures in the United States make only visual assessments of log quality. These procedures do not incorporate estimates of the modulus of elasticity (MOE) of logs. It is questionable whether the visual grading procedures currently used for logs adequately assess the potential quality of structural products manufactured from them, especially...

  3. 1997 Hardwood Research Award Winner: "Automatic Color Sorting of Hardwood Edge-Glued Panel Parts"

    Treesearch

    D. Earl Kline; Richard Conners; Qiang Lu; Philip A. Araman

    1997-01-01

    The National Hardwood Lumber Association's 1997 Hardwood Research Award was presented to D. Earl Kline, Richard Conners, Qiang Lu and Philip Araman at the 25th Annual Hardwood Symposium for developing an automatic system for color sorting hardwood edge-glued panel parts. The researchers comprise a team from Virginia Tech University and the USDA Forest Service in...

  4. Machine Vision System for Color Sorting Wood Edge-Glued Panel Parts

    Treesearch

    Qiang Lu; S. Srikanteswara; W. King; T. Drayer; Richard Conners; D. Earl Kline; Philip A. Araman

    1997-01-01

    This paper describes an automatic color sorting system for hardwood edge-glued panel parts. The color sorting system simultaneously examines both faces of a panel part and then determines which face has the "better" color given specified color uniformity and priority defined by management. The real-time color sorting system software and hardware are briefly...

  5. Fast and automatic thermographic material identification for the recycling process

    NASA Astrophysics Data System (ADS)

    Haferkamp, Heinz; Burmester, Ingo

    1998-03-01

    Within the framework of the future closed-loop recycling process, the automatic and economical sorting of plastics is a decisive element. The identification and sorting systems available at present are not yet suitable for sorting technical plastics, since essential demands, such as high recognition reliability and identification rates across the variety of technical plastics, cannot be guaranteed. Therefore the Laser Zentrum Hannover e.V., in cooperation with the Hoerotron GmbH and the Preussag Noell GmbH, has carried out investigations on a rapid thermographic, laser-supported material-identification system for automatic material-sorting systems. The automatic identification of different engineering plastics coming from electronic or automotive waste is possible. Identification rates of up to 10 parts per second are achieved by using fast IR line scanners. The procedure is based on the following principle: within a few milliseconds a spot on the relevant sample is heated by a CO2 laser. The samples' different, material-specific chemical and physical properties cause different temperature distributions on their surfaces, which are measured by a fast IR-linescan system. This 'thermal impulse response' is then analyzed by a computer system. Investigations have shown that it is possible to analyze more than 18 different sorts of plastics at a frequency of 10 Hz. Crucial for the development of such a system are the rapid processing of imaging data, the minimization of interference caused by oscillating sample geometries, and the wide range of possible additives in the plastics in question. One possible application area is the sorting of plastics from automotive and electronic waste recycling.

  6. Technology to sort lumber by color and grain for furniture parts

    Treesearch

    D. Earl Kline; Richard Conners; Philip A. Araman

    2000-01-01

    This paper describes an automatic color and grain sorting system for wood edge-glued panel parts. The color sorting system simultaneously examines both faces of a panel part and then determines which face has the "best" color, and sorts the part into one of a number of color classes at plant production speeds. In-plant test results show that the system...

  7. Automatic Color Sorting System for Hardwood Edge-Glued Panel Parts

    Treesearch

    Richard W. Conners; D.Earl Kline; Philip A. Araman

    1996-01-01

    The color sorting of edge-glued panel parts is becoming more important in the manufacture of hardwood products. Consumers, while admiring the natural appearance of hardwoods, do not like excessive color variation across product surfaces. Color uniformity is particularly important today because of the popularity of lightly stained products. Unfortunately, color sorting...

  8. Automatic Spike Sorting Using Tuning Information

    PubMed Central

    Ventura, Valérie

    2011-01-01

    Current spike sorting methods focus on clustering neurons’ characteristic spike waveforms. The resulting spike-sorted data are typically used to estimate how covariates of interest modulate the firing rates of neurons. However, when these covariates do modulate the firing rates, they provide information about spikes’ identities, which thus far have been ignored for the purpose of spike sorting. This letter describes a novel approach to spike sorting, which incorporates both waveform information and tuning information obtained from the modulation of firing rates. Because it efficiently uses all the available information, this spike sorter yields lower spike misclassification rates than traditional automatic spike sorters. This theoretical result is verified empirically on several examples. The proposed method does not require additional assumptions; only its implementation is different. It essentially consists of performing spike sorting and tuning estimation simultaneously rather than sequentially, as is currently done. We used an expectation-maximization maximum likelihood algorithm to implement the new spike sorter. We present the general form of this algorithm and provide a detailed implementable version under the assumptions that neurons are independent and spike according to Poisson processes. Finally, we uncover a systematic flaw of spike sorting based on waveform information only. PMID:19548802

  9. Automatic spike sorting using tuning information.

    PubMed

    Ventura, Valérie

    2009-09-01

    Current spike sorting methods focus on clustering neurons' characteristic spike waveforms. The resulting spike-sorted data are typically used to estimate how covariates of interest modulate the firing rates of neurons. However, when these covariates do modulate the firing rates, they provide information about spikes' identities, which thus far have been ignored for the purpose of spike sorting. This letter describes a novel approach to spike sorting, which incorporates both waveform information and tuning information obtained from the modulation of firing rates. Because it efficiently uses all the available information, this spike sorter yields lower spike misclassification rates than traditional automatic spike sorters. This theoretical result is verified empirically on several examples. The proposed method does not require additional assumptions; only its implementation is different. It essentially consists of performing spike sorting and tuning estimation simultaneously rather than sequentially, as is currently done. We used an expectation-maximization maximum likelihood algorithm to implement the new spike sorter. We present the general form of this algorithm and provide a detailed implementable version under the assumptions that neurons are independent and spike according to Poisson processes. Finally, we uncover a systematic flaw of spike sorting based on waveform information only.
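
    The key idea, that tuning information acts as a spike-identity prior, can be sketched as follows. The snippet shows only the classification step under already-estimated parameters: a Gaussian waveform likelihood is combined with a Poisson-rate term, so a spike is assigned to the neuron most likely to have fired given both its waveform and the current covariate. The names here are hypothetical, and the joint EM estimation described in the letter is omitted.

```python
import numpy as np
from scipy.stats import multivariate_normal

def classify_spikes(waveform_feats, covariates, wf_means, wf_covs, rate_fns):
    """Assign each spike to the unit with the highest posterior, combining a
    Gaussian waveform likelihood with tuning information: under Poisson
    spiking, the prior that a spike observed at covariate value x came from
    unit c is proportional to that unit's firing rate lambda_c(x).

    wf_means/wf_covs/rate_fns are assumed to be already estimated; the letter
    above fits them jointly with an EM algorithm, which this sketch omits."""
    n_spikes, n_units = len(waveform_feats), len(wf_means)
    log_post = np.empty((n_spikes, n_units))
    for c in range(n_units):
        wf_ll = multivariate_normal.logpdf(waveform_feats, mean=wf_means[c],
                                           cov=wf_covs[c])
        rates = np.array([rate_fns[c](x) for x in covariates])
        log_post[:, c] = wf_ll + np.log(rates + 1e-12)
    return log_post.argmax(axis=1)
```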

  10. Influence of forest and rangeland management on anadromous fish habitat in Western North America: water transportation and storage of logs.

    Treesearch

    J.R. Sedell; W.S. Duval

    1985-01-01

    Environmental effects of water transportation of logs in western North America include the historical driving of logs in rivers and streams, and the current dumping, sorting, transportation, and storage of logs in rivers and estuaries in British Columbia and southeastern Alaska. The historical discussion focuses on habitat losses and volumes of...

  11. An overhead specimen handling system for variable workloads.

    PubMed

    Eggert, A A; Bowers, K L; Smulka, G J; Emmerich, K A; Iwanski, A L; Quam, E F

    1999-02-01

    This unique overhead specimen handling system requires virtually no floor space and only a minimal amount of bench space. It uses state-of-the-art conveyors suspended near the ceiling to transport, log-in and sort blood specimens in standard specimen containers. Specimens placed into the system at bench-level bins are automatically singulated and loaded onto cleated conveyors and lifted to the main conveyor belt near the ceiling. The barcoded labels are then read as the containers are rotated under an optical scanner. The specimens are then diverted to the appropriate branch conveyor and lowered back to the bench level by cleated conveyors. The specimen handling system is rapid and accurate, requires no special containers, allows laboratorians to move unimpeded below it, and is inexpensive by automation standards. Studies show no adverse effect upon the specimens.

  12. A simulation-based approach for evaluating logging residue handling systems.

    Treesearch

    B. Bruce Bare; Benjamin A. Jayne; Brian F. Anholt

    1976-01-01

    Describes a computer simulation model for evaluating logging residue handling systems. The flow of resources is traced through a prespecified combination of operations including yarding, chipping, sorting, loading, transporting, and unloading. The model was used to evaluate the feasibility of converting logging residues to chips that could be used, for example, to...

  13. Parts Color Matching Scanner for Edge Gluing - Research That Works

    Treesearch

    Richard W. Conners; D.Earl Kline; Philip A. Araman

    1996-01-01

    This paper presents an automatic color sorting system for hardwood edge-glued panel parts. The color sorting system simultaneously examines both faces of a panel part and then determines which face has the "best" color given specified color uniformity and priority defined by management. The real-time color sorting system hardware and color matching hardware...

  14. A Quality Sorting of Fruit Using a New Automatic Image Processing Method

    NASA Astrophysics Data System (ADS)

    Amenomori, Michihiro; Yokomizu, Nobuyuki

    This paper presents an innovative approach for quality sorting of objects, such as apples in an agricultural factory, using an image processing algorithm. The objectives of our approach are, first, to sort the objects precisely by their colors and, second, to efficiently detect any color irregularity on the surface of the apples. An experiment was conducted and the results were compared with those obtained by a human sorting process and by color-sensor sorting devices. The results demonstrate that our approach can sort the objects rapidly, with a valid classification rate of 100%.

  15. The algorithm for automatic detection of the calibration object

    NASA Astrophysics Data System (ADS)

    Artem, Kruglov; Irina, Ugfeld

    2017-06-01

    The problem of automatic image calibration is considered in this paper. The most challenging task in automatic calibration is proper detection of the calibration object. Solving this problem required digital image processing methods and algorithms such as morphology, filtering, edge detection, and shape approximation. The step-by-step development of the algorithm and its adaptation to the specific conditions of log cuts in the image background is presented. The automatic calibration module was tested under the production conditions of a logging enterprise. In these tests, the calibration object was automatically isolated in 86.1% of cases on average, with no type 1 errors. The algorithm was implemented in the automatic calibration module of mobile software for log deck volume measurement.
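
    The generic processing steps named in the record (filtering, morphology, edge/contour extraction, shape approximation) can be sketched with OpenCV as below. The colour range, circularity threshold, and the assumption of a bright circular calibration disc are illustrative guesses, not the authors' actual detector.

```python
import cv2
import numpy as np

def find_calibration_object(bgr, lo=(35, 60, 60), hi=(85, 255, 255)):
    """Locate a (hypothetical) brightly coloured circular calibration disc
    against a background of log cuts: colour filtering, morphology, contour
    extraction, and a circularity-based shape test.  Illustrative sketch."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    # OpenCV 4 return signature: (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for c in contours:
        area = cv2.contourArea(c)
        perim = cv2.arcLength(c, True)
        if area < 100 or perim == 0:
            continue
        circularity = 4 * np.pi * area / (perim * perim)
        if circularity > 0.8 and (best is None or area > cv2.contourArea(best)):
            best = c
    if best is None:
        return None
    (x, y), radius = cv2.minEnclosingCircle(best)
    return (x, y), radius
```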

  16. Extracting the Textual and Temporal Structure of Supercomputing Logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, S; Singh, I; Chandra, A

    2009-05-26

    Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
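
    A crude stand-in for the textual-clustering step is to reduce each message to a syntactic template by masking its variable fields and then grouping messages that share a template, as sketched below. The regular expressions are illustrative assumptions; the paper's online clustering and temporal-correlation analysis are not reproduced.

```python
import re
from collections import defaultdict

def message_template(msg):
    """Reduce a log message to a syntactic template by masking variable
    fields (hex ids, dotted addresses, plain numbers)."""
    msg = re.sub(r'0x[0-9a-fA-F]+', '<HEX>', msg)
    msg = re.sub(r'\d+(\.\d+)+', '<ADDR>', msg)
    msg = re.sub(r'\d+', '<NUM>', msg)
    return msg

def group_messages(messages):
    """Group messages that share the same template (one syntactic cluster)."""
    groups = defaultdict(list)
    for m in messages:
        groups[message_template(m)].append(m)
    return groups

logs = ["node 12 link 0x1f3a down", "node 7 link 0x2b10 down",
        "temp sensor 3 reading 81.5"]
print(list(group_messages(logs)))
```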

  17. Matching nuts and bolts in O(n log n) time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Komlos, J.; Ma, Yuan; Szemeredi, E.

    Given a set of n nuts of distinct widths and a set of n bolts such that each nut corresponds to a unique bolt of the same width, how should we match every nut with its corresponding bolt by comparing nuts with bolts (no comparison is allowed between two nuts or between two bolts)? The problem can be naturally viewed as a variant of the classic sorting problem as follows. Given two lists of n numbers each such that one list is a permutation of the other, how should we sort the lists by comparisons only between numbers in different lists? We give an O(n log n)-time deterministic algorithm for the problem. This is optimal up to a constant factor and answers an open question posed by Alon, Blum, Fiat, Kannan, Naor, and Ostrovsky. Moreover, when copies of nuts and bolts are allowed, our algorithm runs in optimal O(log n) time on n processors in Valiant's parallel comparison tree model. Our algorithm is based on the AKS sorting algorithm with substantial modifications.
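
    For contrast with the deterministic result above, the classical randomized solution is a quicksort-style recursion that only ever compares a nut with a bolt and runs in expected O(n log n) time. In the sketch below, widths are modelled as plain numbers purely so that a comparison can be written; it illustrates the problem, not the paper's AKS-based algorithm.

```python
import random

def match_nuts_and_bolts(nuts, bolts):
    """Randomized quicksort-style matching with nut-vs-bolt comparisons only.
    Widths are modelled as numbers; expected O(n log n) comparisons."""
    pairs = []

    def solve(nuts, bolts):
        if not nuts:
            return
        if len(nuts) == 1:
            pairs.append((nuts[0], bolts[0]))
            return
        pivot_nut = random.choice(nuts)
        # Partition the bolts with the pivot nut and find its matching bolt.
        smaller_b = [b for b in bolts if b < pivot_nut]
        larger_b = [b for b in bolts if b > pivot_nut]
        pivot_bolt = next(b for b in bolts if b == pivot_nut)
        # Partition the remaining nuts with the matching bolt.
        smaller_n = [x for x in nuts if x < pivot_bolt]
        larger_n = [x for x in nuts if x > pivot_bolt]
        pairs.append((pivot_nut, pivot_bolt))
        solve(smaller_n, smaller_b)
        solve(larger_n, larger_b)

    solve(list(nuts), list(bolts))
    return pairs

print(match_nuts_and_bolts([4, 1, 3, 2], [2, 4, 1, 3]))
```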

  18. MAIL LOG, program theory, volume 1. [Scout project automatic data system

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.

  19. GMP-conformant on-site manufacturing of a CD133+ stem cell product for cardiovascular regeneration.

    PubMed

    Skorska, Anna; Müller, Paula; Gaebel, Ralf; Große, Jana; Lemcke, Heiko; Lux, Cornelia A; Bastian, Manuela; Hausburg, Frauke; Zarniko, Nicole; Bubritzki, Sandra; Ruch, Ulrike; Tiedemann, Gudrun; David, Robert; Steinhoff, Gustav

    2017-02-10

    CD133+ stem cells represent a promising subpopulation for innovative cell-based therapies in cardiovascular regeneration. Several clinical trials have shown remarkable beneficial effects following their intramyocardial transplantation. Yet, the purification of CD133+ stem cells is typically performed in centralized clean room facilities using semi-automatic manufacturing processes based on magnetic cell sorting (MACS®). However, this requires time-consuming and cost-intensive logistics. CD133+ stem cells were purified from patient-derived sternal bone marrow using the recently developed automatic CliniMACS Prodigy® BM-133 System (Prodigy). The entire manufacturing process, as well as the subsequent quality control of the final cell product (CP), were realized on-site and in compliance with EU guidelines for Good Manufacturing Practice. The biological activity of automatically isolated CD133+ cells was evaluated and compared to manually isolated CD133+ cells via functional assays as well as immunofluorescence microscopy. In addition, the regenerative potential of purified stem cells was assessed 3 weeks after transplantation in immunodeficient mice which had been subjected to experimental myocardial infarction. We established for the first time an on-site manufacturing procedure for stem CPs intended for the treatment of ischemic heart diseases using an automatized system. On average, 0.88 × 10⁶ viable CD133+ cells with a mean log10 depletion of 3.23 ± 0.19 of non-target cells were isolated. Furthermore, we demonstrated that these automatically isolated cells bear proliferation and differentiation capacities comparable to manually isolated cells in vitro. Moreover, the automatically generated CP shows equal cardiac regeneration potential in vivo. Our results indicate that the Prodigy is a powerful system for automatic manufacturing of a CD133+ CP within few hours. Compared to conventional manufacturing processes, future clinical application of this system offers multiple benefits including stable CP quality and on-site purification under reduced clean room requirements. This will allow saving of time, reduced logistics and diminished costs.

  20. Moisture content and the properties of lodgepole pine logs in bending and compression parallel to the grain

    Treesearch

    David W. Green; Thomas M. Gorman; Joseph F. Murphy; Matthew B. Wheeler

    2007-01-01

    This study evaluates the effect of moisture content on the properties of 127- to 152.4-mm (5- to 6-in.-) diameter lodgepole pine (Pinus contorta Engelm.) logs that were tested either in bending or in compression parallel to the grain. Lodgepole pine logs were obtained from a dense stand near Seeley Lake, Montana, and sorted into four piles of 30 logs each. Two groups...

  1. Validation of neural spike sorting algorithms without ground-truth information.

    PubMed

    Barnett, Alex H; Magland, Jeremy F; Greengard, Leslie F

    2016-05-01

    The throughput of electrophysiological recording is growing rapidly, allowing thousands of simultaneous channels, and there is a growing variety of spike sorting algorithms designed to extract neural firing events from such data. This creates an urgent need for standardized, automatic evaluation of the quality of neural units output by such algorithms. We introduce a suite of validation metrics that assess the credibility of a given automatic spike sorting algorithm applied to a given dataset. By rerunning the spike sorter two or more times, the metrics measure stability under various perturbations consistent with variations in the data itself, making no assumptions about the internal workings of the algorithm, and minimal assumptions about the noise. We illustrate the new metrics on standard sorting algorithms applied to both in vivo and ex vivo recordings, including a time series with overlapping spikes. We compare the metrics to existing quality measures, and to ground-truth accuracy in simulated time series. We provide a software implementation. Metrics have until now relied on ground-truth, simulated data, internal algorithm variables (e.g. cluster separation), or refractory violations. By contrast, by standardizing the interface, our metrics assess the reliability of any automatic algorithm without reference to internal variables (e.g. feature space) or physiological criteria. Stability is a prerequisite for reproducibility of results. Such metrics could reduce the significant human labor currently spent on validation, and should form an essential part of large-scale automated spike sorting and systematic benchmarking of algorithms. Copyright © 2016 Elsevier B.V. All rights reserved.
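
    The flavour of such stability metrics can be illustrated with a small sketch: given the unit labels produced by two runs of a sorter on (nearly) the same events, best-match the units and report a per-unit agreement score. This is a simplified stand-in for the published metric suite, and it assumes the two runs label the same set of detected events.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def per_unit_stability(labels_a, labels_b):
    """Stability-style metric in the spirit of the record above: best-match
    the units found by two runs of a spike sorter and report, per unit, the
    overlap divided by the union of the matched clusters.  Sketch only."""
    labels_a, labels_b = np.asarray(labels_a), np.asarray(labels_b)
    units_a, units_b = np.unique(labels_a), np.unique(labels_b)
    conf = np.array([[np.sum((labels_a == ua) & (labels_b == ub))
                      for ub in units_b] for ua in units_a])
    row, col = linear_sum_assignment(-conf)          # maximize total overlap
    stability = {}
    for i, j in zip(row, col):
        union = np.sum((labels_a == units_a[i]) | (labels_b == units_b[j]))
        stability[int(units_a[i])] = conf[i, j] / union if union else 0.0
    return stability

print(per_unit_stability([0, 0, 1, 1, 2, 2], [1, 1, 0, 0, 2, 0]))
```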

  2. Lumber grades from Douglas-fir peeler logs.

    Treesearch

    E.E. Matson

    1952-01-01

    Sawmill companies often must decide whether it is more economical to sort and sell peeler logs than to cut them into lumber. If the mill owners have reliable data on the grade of lumber that can be expected from these logs, they will be better prepared to make the decision. The Pacific Northwest Forest and Range Experiment Station has made several lumber grade recovery...

  3. Design of monitoring system for mail-sorting based on the Profibus S7 series PLC

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Jia, S. H.; Wang, Y. H.; Liu, H.; Tang, G. C.

    2017-01-01

    With the rapid development of postal express services, the workload of mail sorting is increasing, but automatic mail-sorting technology is not yet mature. In view of this, the system uses a Siemens S7-300 PLC as the master station controller and Siemens S7-200/400 PLCs as slave station controllers, and combines the MCGS human-machine interface configuration software, PROFIBUS-DP communication, RFID technology, and a mechanical sorting hand to achieve monitored mail classification and sorting. Mail is distinguished by scanning the RFID electronic bar code (fixed code) attached to each item; the corresponding controller processes the acquired information and transmits it to the sorting manipulator via PROFIBUS-DP. The system achieves accurate and efficient mail sorting and will promote the development of mail-sorting technology.

  4. A computationally efficient method for incorporating spike waveform information into decoding algorithms.

    PubMed

    Ventura, Valérie; Todorova, Sonia

    2015-05-01

    Spike-based brain-computer interfaces (BCIs) have the potential to restore motor ability to people with paralysis and amputation, and have shown impressive performance in the lab. To transition BCI devices from the lab to the clinic, decoding must proceed automatically and in real time, which prohibits the use of algorithms that are computationally intensive or require manual tweaking. A common choice is to avoid spike sorting and treat the signal on each electrode as if it came from a single neuron, which is fast, easy, and therefore desirable for clinical use. But this approach ignores the kinematic information provided by individual neurons recorded on the same electrode. The contribution of this letter is a linear decoding model that extracts kinematic information from individual neurons without spike-sorting the electrode signals. The method relies on modeling sample averages of waveform features as functions of kinematics, which is automatic and requires minimal data storage and computation. In offline reconstruction of arm trajectories of a nonhuman primate performing reaching tasks, the proposed method performs as well as decoders based on expertly manually and automatically sorted spikes.

  5. Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering

    PubMed Central

    2012-01-01

    Background Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, the problem of accurate and fast online sorting still remains a challenging issue. Results Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or null human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-mean, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification the number of clusters, and quantitative quality assessment of resulting clusters independent on their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from premotor cortex of Macaque monkeys. The results of these tests showed an excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is competitive with respect to other robust spike sorting algorithms. Conclusions This new software provides neuroscience laboratories with a new tool for fast and robust online classification of single neuron activity. This feature could become crucial in situations when online spike detection from multiple electrodes is paramount, such as in human clinical recordings or in brain-computer interfaces. PMID:22871125

  6. Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering.

    PubMed

    Oliynyk, Andriy; Bonifazzi, Claudio; Montani, Fernando; Fadiga, Luciano

    2012-08-08

    Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, the problem of accurate and fast online sorting still remains a challenging issue. Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or null human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-mean, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification the number of clusters, and quantitative quality assessment of resulting clusters independent on their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from premotor cortex of Macaque monkeys. The results of these tests showed an excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is competitive with respect to other robust spike sorting algorithms. This new software provides neuroscience laboratories with a new tool for fast and robust online classification of single neuron activity. This feature could become crucial in situations when online spike detection from multiple electrodes is paramount, such as in human clinical recordings or in brain-computer interfaces.
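
    The two core ingredients named in the abstract, SVD-based feature extraction and fuzzy C-means clustering, can be sketched as follows. This is a plain-vanilla illustration (no spike alignment, automatic cluster counting, or cluster quality assessment, and the parameter values are arbitrary), not the FSPS implementation.

```python
import numpy as np

def svd_features(waveforms, n_components=3):
    """Project aligned spike waveforms onto their top right singular vectors."""
    centered = waveforms - waveforms.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

def fuzzy_c_means(x, n_clusters, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy C-means on feature vectors x of shape (n_samples, n_feats).
    Returns the membership matrix u (rows sum to 1) and the cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / (d ** (2.0 / (m - 1.0)))
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.max(np.abs(new_u - u)) < tol:
            u = new_u
            break
        u = new_u
    return u, centers

# Usage: feats = svd_features(waveforms); u, _ = fuzzy_c_means(feats, 3)
#        labels = u.argmax(axis=1)
```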

  7. Advanced sorting technologies for optimal wood products and woody biomass utilization

    Treesearch

    Xiping Wang

    2012-01-01

    Forest materials represent great potential for advancing our goals in the 21st century for sustainable building, energy independence, and carbon sequestration. A critical component of an improved system for producing bioproducts and bioenergy from forest materials is the ability to sort trees, stems, and logs into end-product categories that represent their highest...

  8. A new technology for automatic identification and sorting of plastics for recycling.

    PubMed

    Ahmad, S R

    2004-10-01

    A new technology for the automatic sorting of plastics is described, based upon optical identification of the fluorescence signatures of dyes incorporated in such materials in trace concentrations prior to product manufacturing. Three commercial tracers were selected primarily on the basis of their good absorbency in the 310-370 nm spectral band and their identifiable narrow-band fluorescence signatures in the visible band of the spectrum when present in binary combinations. This absorption band was selected because of the availability of strong emission lines in this band from a commercial Hg-arc lamp and the high fluorescence quantum yields of the tracers at this excitation wavelength band. The plastics chosen for tracing and identification are HDPE, LDPE, PP, EVA, PVC and PET; the tracers were compatible and chemically non-reactive with the host matrices and did not affect the transparency of the plastics. The design of the monochromatic, collimated excitation source and the sensor system is described, and their performance in identifying and sorting plastics doped with tracers at concentrations of a few parts per million is evaluated. In an industrial sorting system, the sensor was able to sort 300 mm long plastic bottles at a conveyor belt speed of 3.5 m/s with a sorting purity of about 95%. The limitation was imposed by mechanical singulation irregularities at high speed and the limited processing speed of the computer used.

  9. Challenges in automatic sorting of construction and demolition waste by hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Hollstein, Frank; Cacho, Íñigo; Arnaiz, Sixto; Wohllebe, Markus

    2016-05-01

    EU-28 countries currently generate 460 Mt/year of construction and demolition waste (C&DW), and the generation rate is expected to reach around 570 Mt/year between 2025 and 2030. There is great potential for recycling C&DW materials since they are produced in huge quantities and contain valuable resources. But new C&DW is more complex than older waste streams, and there is a need to shift from traditional recycling approaches to novel recycling solutions. One basic step toward this objective is an improvement in (automatic) sorting technology. Hyperspectral imaging is a promising candidate to support the process. However, its industrial adoption in the C&DW recycling branch is currently limited due to high investment costs, the still insufficient robustness of optical sensor hardware in harsh ambient conditions and, because sensor fusion is required, the lack of mature software methods to perform the (online) sorting tasks. Frame rates of over 300 Hz are needed for a successful sorting result. Currently the biggest challenges in C&DW detection concern the need for VIS, NIR and SWIR hyperspectral images that overlap in time and space, in particular for the selective recognition of contaminated particles. In the study at hand, a new approach for hyperspectral imagers is presented that exploits SWIR hyperspectral information in real time (at 300 Hz). The contribution describes laboratory results on the optical detection of the most important C&DW material composites as well as a development path toward industrial implementation in automatic sorting and separation lines. The main focus is placed on closing the two recycling circuits "grey to grey" and "red to red" because of their outstanding potential for sustainable conservation of construction resources.

  10. Ultrasonic Inspection of Wooden Pallet Parts for Grading and Sorting

    Treesearch

    Daniel L. Schmoldt; Michael Morrone; John C. Duke

    1994-01-01

    Wooden pallets are the largest single use of sawn hardwood logs in the USA. Unfortunately, millions of pallets are discarded into landfills annually. High quality wooden pallets, on the other hand, promote longevity and re-use. To build durable pallets requires high quality parts. Manual grading and sorting of pallet parts is not feasible, however, so we are developing...

  11. Diameter sensors for tree-length harvesting systems

    Treesearch

    T.P. McDonald; Robert B. Rummer; T.E. Grift

    2003-01-01

    Most cut-to-length (CTL) harvesters provide sensors for measuring diameter of trees as they are cut and processed. Among other uses, this capability provides a data collection tool for marketing of logs in real time. Logs can be sorted and stacked based on up-to-date market information, then transportation systems optimized to route wood to proper destinations at...

  12. Translations on USSR Military Affairs, Number 1280

    DTIC Science & Technology

    1977-06-17

    engineer, the conclusion was automatic: he is an undisciplined person. However, this idea was totally inconsistent with the image I had developed of V...projectors, trainers, all sorts of simulators, automatic devices, and so forth. As is known, the technical devices for the mass training and...in the equipment and assemblies. In possessing "feedback," within a few seconds they can record and automatically analyze the actions of the

  13. Automated spike sorting algorithm based on Laplacian eigenmaps and k-means clustering.

    PubMed

    Chah, E; Hok, V; Della-Chiesa, A; Miller, J J H; O'Mara, S M; Reilly, R B

    2011-02-01

    This study presents a new automatic spike sorting method based on feature extraction by Laplacian eigenmaps combined with k-means clustering. The performance of the proposed method was compared against previously reported algorithms such as principal component analysis (PCA) and amplitude-based feature extraction. Two types of classifier (namely k-means and classification expectation-maximization) were incorporated within the spike sorting algorithms, in order to find a suitable classifier for the feature sets. Simulated data sets and in-vivo tetrode multichannel recordings were employed to assess the performance of the spike sorting algorithms. The results show that the proposed algorithm yields significantly improved performance with mean sorting accuracy of 73% and sorting error of 10% compared to PCA which combined with k-means had a sorting accuracy of 58% and sorting error of 10%.
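
    A minimal version of this pipeline is easy to assemble with scikit-learn, whose SpectralEmbedding computes a Laplacian-eigenmaps embedding. The sketch below pairs it with k-means and assumes aligned spike waveforms and a known number of units; it is an illustration of the combination named above, not the authors' code.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.cluster import KMeans

def sort_spikes(waveforms, n_units, n_components=3):
    """Cluster spike waveforms using Laplacian-eigenmap features + k-means.

    waveforms : (n_spikes, n_samples) array of aligned spike snippets.
    Returns per-spike unit labels and the low-dimensional features."""
    embedding = SpectralEmbedding(n_components=n_components,
                                  affinity='nearest_neighbors')
    features = embedding.fit_transform(waveforms)
    labels = KMeans(n_clusters=n_units, n_init=10,
                    random_state=0).fit_predict(features)
    return labels, features

# Usage: labels, feats = sort_spikes(waveforms, n_units=3)
```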

  14. CLARET user's manual: Mainframe Logs. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frobose, R.H.

    1984-11-12

    CLARET (Computer Logging and RETrieval) is a stand-alone PDP 11/23 system that can support 16 terminals. It provides a forms-oriented front end by which operators enter online activity logs for the Lawrence Livermore National Laboratory's OCTOPUS computer network. The logs are stored on the PDP 11/23 disks for later retrieval, and hardcopy reports are generated both automatically and upon request. Online viewing of the current logs is provided to management. As each day's logs are completed, the information is automatically sent to a CRAY and included in an online database system. The terminal used for the CLARET system is a dual-port Hewlett Packard 2626 terminal that can be used as either the CLARET logging station or as an independent OCTOPUS terminal. Because this is a stand-alone system, it does not depend on the availability of the OCTOPUS network to run and, in the event of a power failure, can be brought up independently.

  15. Noise-robust unsupervised spike sorting based on discriminative subspace learning with outlier handling.

    PubMed

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2017-06-01

    Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. Most of the feature extraction and dimensionality reduction techniques that have been used for spike sorting give a projection subspace which is not necessarily the most discriminative one. Therefore, the clusters which appear inherently separable in some discriminative subspace may overlap if projected using conventional feature extraction approaches leading to a poor sorting accuracy especially when the noise level is high. In this paper, we propose a noise-robust and unsupervised spike sorting algorithm based on learning discriminative spike features for clustering. The proposed algorithm uses discriminative subspace learning to extract low dimensional and most discriminative features from the spike waveforms and perform clustering with automatic detection of the number of the clusters. The core part of the algorithm involves iterative subspace selection using linear discriminant analysis and clustering using Gaussian mixture model with outlier detection. A statistical test in the discriminative subspace is proposed to automatically detect the number of the clusters. Comparative results on publicly available simulated and real in vivo datasets demonstrate that our algorithm achieves substantially improved cluster distinction leading to higher sorting accuracy and more reliable detection of clusters which are highly overlapping and not detectable using conventional feature extraction techniques such as principal component analysis or wavelets. By providing more accurate information about the activity of more number of individual neurons with high robustness to neural noise and outliers, the proposed unsupervised spike sorting algorithm facilitates more detailed and accurate analysis of single- and multi-unit activities in neuroscience and brain machine interface studies.

  16. Noise-robust unsupervised spike sorting based on discriminative subspace learning with outlier handling

    NASA Astrophysics Data System (ADS)

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2017-06-01

    Objective. Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. Most of the feature extraction and dimensionality reduction techniques that have been used for spike sorting give a projection subspace which is not necessarily the most discriminative one. Therefore, the clusters which appear inherently separable in some discriminative subspace may overlap if projected using conventional feature extraction approaches, leading to poor sorting accuracy, especially when the noise level is high. In this paper, we propose a noise-robust and unsupervised spike sorting algorithm based on learning discriminative spike features for clustering. Approach. The proposed algorithm uses discriminative subspace learning to extract low-dimensional and most discriminative features from the spike waveforms and performs clustering with automatic detection of the number of clusters. The core part of the algorithm involves iterative subspace selection using linear discriminant analysis and clustering using a Gaussian mixture model with outlier detection. A statistical test in the discriminative subspace is proposed to automatically detect the number of clusters. Main results. Comparative results on publicly available simulated and real in vivo datasets demonstrate that our algorithm achieves substantially improved cluster distinction, leading to higher sorting accuracy and more reliable detection of clusters which are highly overlapping and not detectable using conventional feature extraction techniques such as principal component analysis or wavelets. Significance. By providing more accurate information about the activity of a greater number of individual neurons, with high robustness to neural noise and outliers, the proposed unsupervised spike sorting algorithm facilitates more detailed and accurate analysis of single- and multi-unit activities in neuroscience and brain-machine interface studies.

  17. CT Imaging of Hardwood Logs for Lumber Production

    Treesearch

    Daniel L. Schmoldt; Pei Li; A. Lynn Abbott

    1996-01-01

    Hardwood sawmill operators need to improve the conversion of raw material (logs) into lumber. Internal log scanning provides detailed information that can aid log processors in improving lumber recovery. However, scanner data (i.e. tomographic images) need to be analyzed prior to presentation to saw operators. Automatic labeling of computer tomography (CT) images is...

  18. PhySortR: a fast, flexible tool for sorting phylogenetic trees in R.

    PubMed

    Stephens, Timothy G; Bhattacharya, Debashish; Ragan, Mark A; Chan, Cheong Xin

    2016-01-01

    A frequent bottleneck in interpreting phylogenomic output is the need to screen often thousands of trees for features of interest, particularly robust clades of specific taxa, as evidence of monophyletic relationship and/or reticulated evolution. Here we present PhySortR, a fast, flexible R package for classifying phylogenetic trees. Unlike existing utilities, PhySortR allows for identification of both exclusive and non-exclusive clades uniting the target taxa based on tip labels (i.e., leaves) on a tree, with customisable options to assess clades within the context of the whole tree. Using simulated and empirical datasets, we demonstrate the potential and scalability of PhySortR in analysis of thousands of phylogenetic trees without a priori assumption of tree-rooting, and in yielding readily interpretable trees that unambiguously satisfy the query. PhySortR is a command-line tool that is freely available and easily automatable.

  19. Automatic Shifts of Attention in the Dimensional Change Card Sort Task: Subtle Changes in Task Materials Lead to Flexible Switching

    ERIC Educational Resources Information Center

    Fisher, Anna V.

    2011-01-01

    Two experiments tested a hypothesis that reducing demands on executive control in a Dimensional Change Card Sort task will lead to improved performance in 3-year-olds. In Experiment 1, the shape dimension was represented by two dissimilar values ("stars" and "flowers"), and the color dimension was represented by two similar values ("red" and…

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiely, J Blanco; Olszanski, A; Both, S

    Purpose: To develop a quantitative decision-making metric for automatically detecting irregular breathing using a large patient population that received phase-sorted 4DCT. Methods: This study employed two patient cohorts. Cohort#1 contained 256 patients who received a phase-sorted 4DCT. Cohort#2 contained 86 patients who received three weekly phase-sorted 4DCT scans. A previously published technique used a single abdominal surrogate to calculate the ratio of extreme inhalation tidal volume to normal inhalation tidal volume, referred to as the κ metric. Since a single surrogate is standard for phase-sorted 4DCT in radiation oncology clinical practice, tidal volume was not quantified. Without tidal volume, the absolute κ metric could not be determined, so a relative κ (κrel) metric was defined based on the measured surrogate amplitude instead of tidal volume. Receiver operating characteristic (ROC) curves were used to quantitatively determine the optimal cutoff value (jk) and efficiency cutoff value (τk) of κrel to automatically identify irregular breathing that would reduce the image quality of phase-sorted 4DCT. Discriminatory accuracy (area under the ROC curve) of κrel was calculated by a trapezoidal numeric integration technique. Results: The discriminatory accuracy of κrel was found to be 0.746. The key values of jk and τk were calculated to be 1.45 and 1.72, respectively. For values of κrel such that jk ≤ κrel ≤ τk, the decision to reacquire the 4DCT would be at the discretion of the physician. This accounted for only 11.9% of the patients in this study. The magnitude of κrel held consistent over 3 weeks for 73% of the patients in cohort#2. Conclusion: The decision-making metric, κrel, was shown to be an accurate classifier of irregular breathing patients in a large patient population. This work provided an automatic quantitative decision-making metric to quickly and accurately assess the extent to which irregular breathing is occurring during phase-sorted 4DCT.
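
    An illustrative sketch, not the authors' code, of how such a relative-amplitude metric could be evaluated as a classifier of irregular breathing with an ROC curve; the per-patient values, labels, and cutoff rule below are placeholders:

```python
# Illustrative sketch (not the authors' code): a relative-amplitude ratio per
# patient is evaluated as a classifier of irregular breathing via an ROC curve.
import numpy as np
from sklearn.metrics import roc_curve, auc

def kappa_rel(cycle_amplitudes):
    """Ratio of the extreme inhalation amplitude to a typical inhalation amplitude."""
    a = np.asarray(cycle_amplitudes, dtype=float)
    return a.max() / np.median(a)

# Hypothetical per-patient metric values and labels (1 = irregular breathing).
kappa = np.array([1.1, 1.3, 1.5, 1.8, 2.2, 1.2, 1.9, 2.5])
irregular = np.array([0, 0, 0, 1, 1, 0, 1, 1])

fpr, tpr, thresholds = roc_curve(irregular, kappa)
print("area under ROC curve:", auc(fpr, tpr))
# A candidate operating cutoff, e.g. the threshold maximising Youden's J.
print("candidate cutoff:", thresholds[np.argmax(tpr - fpr)])
```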

  1. Using recurrence plot analysis for software execution interpretation and fault detection

    NASA Astrophysics Data System (ADS)

    Mosdorf, M.

    2015-09-01

    This paper presents a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains executed assembly instructions. Results of this analysis are subject to further processing with the PCA (principal component analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
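
    A short sketch of the analysis chain described above, assuming each execution trace has already been reduced to a numeric encoding of executed instructions; the encoding and the recurrence threshold are assumptions for illustration only:

```python
# Short sketch: build a recurrence plot from a numerically encoded execution
# trace, then reduce the per-trace coefficients with PCA.
import numpy as np
from sklearn.decomposition import PCA

def recurrence_plot(trace, eps=2.0):
    x = np.asarray(trace, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])     # pairwise distances between states
    return (dist <= eps).astype(np.uint8)      # 1 wherever a state recurs

rng = np.random.default_rng(0)
traces = [rng.integers(0, 50, size=200) for _ in range(10)]   # fake instruction traces
plots = np.array([recurrence_plot(t).ravel() for t in traces])

# PCA condenses the recurrence-plot coefficients into a few features that a
# classifier could use to tell the underlying algorithms apart.
features = PCA(n_components=3).fit_transform(plots)
```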

  2. High-temperature geothermal cableheads

    NASA Astrophysics Data System (ADS)

    Coquat, J. A.; Eifert, R. W.

    1981-11-01

    Two high temperature, corrosion resistant logging cable heads which use metal seals and a stable fluid to achieve proper electrical terminations and cable sonde interfacings are described. A tensile bar provides a calibrated yield point, and a cone assembly anchors the cable armor to the head. Electrical problems of the sort generally ascribable to the cable sonde interface were absent during demonstration hostile environment loggings in which these cable heads were used.

  3. 36 CFR 223.160 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... an ocean-going vessel. An export yard or pond is an area where sorting and/or bundling of logs for..., or the minimum piece specification set forth in the timber sale contract, in material meeting the...

  4. Spike sorting based upon machine learning algorithms (SOMA).

    PubMed

    Horton, P M; Nicol, A U; Kendrick, K M; Feng, J F

    2007-02-15

    We have developed a spike sorting method, using a combination of various machine learning algorithms, to analyse electrophysiological data and automatically determine the number of sampled neurons from an individual electrode, and discriminate their activities. We discuss extensions to a standard unsupervised learning algorithm (Kohonen), as using a simple application of this technique would only identify a known number of clusters. Our extra techniques automatically identify the number of clusters within the dataset, and their sizes, thereby reducing the chance of misclassification. We also discuss a new pre-processing technique, which transforms the data into a higher dimensional feature space revealing separable clusters. Using principal component analysis (PCA) alone may not achieve this. Our new approach appends the features acquired using PCA with features describing the geometric shapes that constitute a spike waveform. To validate our new spike sorting approach, we have applied it to multi-electrode array datasets acquired from the rat olfactory bulb, and from the sheep infero-temporal cortex, and using simulated data. The SOMA software is available at http://www.sussex.ac.uk/Users/pmh20/spikes.
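
    A hedged sketch of the pre-processing idea described above: PCA scores are appended with simple geometric descriptors of each waveform. The particular shape features below are illustrative, not the ones used by SOMA:

```python
# Hedged sketch: append simple geometric shape descriptors of each waveform to
# its PCA scores. The specific descriptors are illustrative, not SOMA's.
import numpy as np
from sklearn.decomposition import PCA

def shape_features(w):
    peak, trough = w.max(), w.min()
    width = int(np.argmin(w) - np.argmax(w))   # signed sample distance, peak to trough
    area = float(np.abs(w).sum())              # crude measure of waveform "size"
    return [peak - trough, width, area]

rng = np.random.default_rng(0)
spikes = rng.normal(size=(300, 48))            # placeholder aligned waveforms
pca_scores = PCA(n_components=3).fit_transform(spikes)
geometry = np.array([shape_features(w) for w in spikes])
features = np.hstack([pca_scores, geometry])   # combined feature space for clustering
```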

  5. Agricultural produce grading and sorting system using color CCD and new color identification algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Dongsheng; Zou, Jizuo; Yang, Yunping; Dong, Jianhua; Zhang, Yuanxiang

    1996-10-01

    A high-speed automatic agricultural produce grading and sorting system using a color CCD and a new color identification algorithm has been developed. In a typical application, the system can sort almonds into two output grades according to their color. Almonds are rich in 18 kinds of amino acids and 13 kinds of micro minerals and vitamins and can be made into almond drink. In order to ensure the drink quality, almonds must be sorted carefully before being made into a drink. Using this system, almonds can be sorted into two grades: up-to-grade almonds, and below-grade almonds or foreign materials. A color CCD inspects the almonds passing on a conveyor of rotating rollers, and a color identification algorithm grades the almonds and distinguishes foreign materials from almonds. Employing an elaborately designed mechanism, the below-grade almonds and foreign materials can be removed effectively from the raw almonds. This system can be easily adapted for inspecting and sorting other kinds of agricultural produce such as peanuts, beans, tomatoes, and so on.
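
    An illustrative sketch of a simple color-identification rule of the kind described above, classifying an almond region by the fraction of pixels inside an acceptable HSV range; the range and threshold are assumptions, not the published algorithm:

```python
# Illustrative color-identification rule: grade an almond region by the
# fraction of pixels inside an acceptable HSV range (assumed values).
import cv2
import numpy as np

LOWER = np.array([5, 60, 60])       # assumed acceptable brown hues (OpenCV HSV)
UPPER = np.array([25, 255, 220])

def grade(bgr_roi, min_fraction=0.85):
    hsv = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    fraction = np.count_nonzero(mask) / mask.size
    return "up to grade" if fraction >= min_fraction else "reject"

# Synthetic brownish region for demonstration (BGR order).
roi = np.full((32, 32, 3), (30, 90, 140), dtype=np.uint8)
print(grade(roi))
```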

  6. Laser-aided material identification for the waste sorting process

    NASA Astrophysics Data System (ADS)

    Haferkamp, Heinz; Burmester, Ingo; Engel, Kai

    1994-03-01

    The LZH has carried out investigations in the field of rapid laser-supported material-identification systems for automatic material-sorting systems. The aim of this research is the fast identification of different sorts of plastics coming from recycled rubbish or electronic waste. Within a few milliseconds, a spot on the sample to be identified is heated with a CO2 laser. The different and specific chemical and physical material properties of the examined sample cause a different temperature distribution on the surface, which is measured with an IR thermographic system. This 'thermal impulse response' is then analyzed by means of a computer system. The results of previous investigations have shown that material identification of different sorts of plastics can potentially be performed at a frequency of 30 Hz. For economic efficiency, a high-velocity identification process is necessary to sort large waste streams.

  7. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  8. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  9. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  10. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  11. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  12. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  13. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  14. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  15. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  16. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  17. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  18. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  19. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  20. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  1. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  2. Computer vision for automatic inspection of agricultural produce

    NASA Astrophysics Data System (ADS)

    Molto, Enrique; Blasco, Jose; Benlloch, Jose V.

    1999-01-01

    Fruit and vegetables undergo various manipulations from the field to the final consumer, basically oriented towards cleaning and sorting the product into homogeneous categories. For this reason, several research projects aimed at fast, adequate produce sorting and quality control are currently under development around the world. Moreover, it is possible to find manual and semi-automatic commercial systems capable of reasonably performing these tasks. However, in many cases, their accuracy is incompatible with current European market demands, which are constantly increasing. IVIA, the Valencian Research Institute of Agriculture, located in Spain, has been involved in several European projects related to machine vision for real-time inspection of various agricultural produce. This paper focuses on the work related to two products that have different requirements: fruit and olives. In the case of fruit, the Institute has developed a vision system capable of providing an assessment of the external quality of single fruit to a robot that also receives information from other sensors. The system uses four different views of each fruit and has been tested on peaches, apples and citrus. Processing time of each image is under 500 ms using a conventional PC. The system provides information about primary and secondary color, blemishes and their extension, and stem presence and position, which allows further automatic orientation of the fruit in the final box using a robotic manipulator. Work carried out on olives was devoted to fast sorting of olives for consumption at table. A prototype has been developed to demonstrate the feasibility of a machine vision system capable of automatically sorting 2500 kg/h of olives using low-cost conventional hardware.

  3. GSE, data management system programmers/User' manual

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.; Dolerhie, B. D., Jr.; Ghiglieri, F. J.

    1974-01-01

    The GSE data management system is a computerized program which provides for a central storage source for key data associated with the mechanical ground support equipment (MGSE). Eight major sort modes can be requested by the user. Attributes that are printed automatically with each sort include the GSE end item number, description, class code, functional code, fluid media, use location, design responsibility, weight, cost, quantity, dimensions, and applicable documents. Multiple subsorts are available for the class code, functional code, fluid media, use location, design responsibility, and applicable document categories. These sorts and how to use them are described. The program and GSE data bank may be easily updated and expanded.

  4. Research and Development of Fully Automatic Alien Smoke Stack and Packaging System

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu

    2017-12-01

    To address the low efficiency of manual sorting and packaging at current tobacco distribution centers, a safe, efficient, fully automatic "alien smoke" (shaped cigarette) stacking and packaging system was developed. The functions of the fully automatic system are implemented with PLC control technology, servo control technology, robot technology, image recognition technology and human-computer interaction technology. The characteristics, principles, control process and key technology of the system are discussed in detail. Installation and commissioning show that the fully automatic alien smoke stacking and packaging system performs well and meets the requirements for handling shaped cigarettes.

  5. Permeability-porosity relationships in sedimentary rocks

    USGS Publications Warehouse

    Nelson, Philip H.

    1994-01-01

    In many consolidated sandstone and carbonate formations, plots of core data show that the logarithm of permeability (k) is often linearly proportional to porosity (φ). The slope, intercept, and degree of scatter of these log(k)-φ trends vary from formation to formation, and these variations are attributed to differences in initial grain size and sorting, diagenetic history, and compaction history. In unconsolidated sands, better sorting systematically increases both permeability and porosity. In sands and sandstones, an increase in gravel and coarse grain size content causes k to increase even while decreasing φ. Diagenetic minerals in the pore space of sandstones, such as cement and some clay types, tend to decrease log(k) proportionately as φ decreases. Models to predict permeability from porosity and other measurable rock parameters fall into three classes based on either grain, surface area, or pore dimension considerations. (Models that directly incorporate well log measurements but have no particular theoretical underpinnings form a fourth class.) Grain-based models show permeability proportional to the square of grain size times porosity raised to (roughly) the fifth power, with grain sorting as an additional parameter. Surface-area models show permeability proportional to the inverse square of pore surface area times porosity raised to (roughly) the fourth power; measures of surface area include irreducible water saturation and nuclear magnetic resonance. Pore-dimension models show permeability proportional to the square of a pore dimension times porosity raised to a power of (roughly) two and produce curves of constant pore size that transgress the linear data trends on a log(k)-φ plot. The pore dimension is obtained from mercury injection measurements and is interpreted as the pore opening size of some interconnected fraction of the pore system. The linear log(k)-φ data trends cut the curves of constant pore size from the pore-dimension models, which shows that porosity reduction is always accompanied by a reduction in characteristic pore size. The high powers of porosity of the grain-based and surface-area models are required to compensate for the inclusion of the small end of the pore size spectrum.
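
    A worked sketch of the empirical log(k)-φ trend described above: a straight line is fitted to log10(permeability) against porosity for a handful of placeholder core measurements:

```python
# Worked sketch of the linear log(k)-porosity trend: fit a straight line to
# log10(permeability) against porosity. The core values are placeholders.
import numpy as np

porosity = np.array([0.08, 0.12, 0.15, 0.18, 0.22, 0.25])         # fraction
permeability_md = np.array([0.5, 3.0, 12.0, 40.0, 180.0, 600.0])  # millidarcies

slope, intercept = np.polyfit(porosity, np.log10(permeability_md), 1)
print(f"log10(k) = {slope:.1f} * phi + {intercept:.1f}")

# Permeability predicted by the trend at 20% porosity.
print(10 ** (slope * 0.20 + intercept), "mD")
```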

  6. SAHARA: A package of PC computer programs for estimating both log-hyperbolic grain-size parameters and standard moments

    NASA Astrophysics Data System (ADS)

    Christiansen, Christian; Hartmann, Daniel

    This paper documents a package of menu-driven POLYPASCAL87 computer programs for handling grouped-observation data from both sieving (increment data) and settling tube procedures (cumulative data). The package is designed deliberately for use on IBM-compatible personal computers. Two of the programs solve the numerical problem of determining the estimates of the four (main) parameters of the log-hyperbolic distribution and their derivatives. The package also contains a program for determining the mean, sorting, skewness, and kurtosis according to the standard moments. Moreover, the package contains procedures for smoothing and grouping of settling tube data. A graphic part of the package plots the data in a log-log plot together with the estimated log-hyperbolic curve; all estimated parameters accompany the plot. Another graphic option is a plot of the log-hyperbolic shape triangle with the (χ,ζ) position of the sample.
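
    A hedged sketch of the standard-moments part of such a package: the logarithmic method-of-moments mean, sorting, skewness, and kurtosis computed from grouped sieve data (class midpoints in phi units and weight per cent); the sample values are placeholders:

```python
# Hedged sketch of the standard-moments computation: logarithmic method of
# moments (mean, sorting, skewness, kurtosis) from grouped sieve data.
import numpy as np

def moment_statistics(midpoints_phi, weight_percent):
    m = np.asarray(midpoints_phi, dtype=float)
    f = np.asarray(weight_percent, dtype=float)
    f = f / f.sum() * 100.0                          # normalise to 100 per cent
    mean = np.sum(f * m) / 100.0
    sorting = np.sqrt(np.sum(f * (m - mean) ** 2) / 100.0)
    skewness = np.sum(f * (m - mean) ** 3) / (100.0 * sorting ** 3)
    kurtosis = np.sum(f * (m - mean) ** 4) / (100.0 * sorting ** 4)
    return mean, sorting, skewness, kurtosis

print(moment_statistics([-1, 0, 1, 2, 3, 4], [2, 10, 30, 35, 18, 5]))
```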

  7. 78 FR 20915 - Information Collection Being Reviewed by the Federal Communications Commission Under Delegated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ..., focused and limited in scope, and with a clear path to compliance. A waiver request must specify the.... OMB Control Number: 3060-0998. Title: Section 87.109, Station Logs. Form Number: N/A. Type of Review... aeronautical mobile service (IAMS) must maintain a log (written or automatic log) in accordance with the Annex...

  8. Automatic sorting of toxicological information into the IUCLID (International Uniform Chemical Information Database) endpoint-categories making use of the semantic search engine Go3R.

    PubMed

    Sauer, Ursula G; Wächter, Thomas; Hareng, Lars; Wareing, Britta; Langsch, Angelika; Zschunke, Matthias; Alvers, Michael R; Landsiedel, Robert

    2014-06-01

    The knowledge-based search engine Go3R, www.Go3R.org, has been developed to assist scientists from industry and regulatory authorities in collecting comprehensive toxicological information with a special focus on identifying available alternatives to animal testing. The semantic search paradigm of Go3R makes use of expert knowledge on 3Rs methods and regulatory toxicology, laid down in the ontology, a network of concepts, terms, and synonyms, to recognize the contents of documents. Search results are automatically sorted into a dynamic table of contents presented alongside the list of documents retrieved. This table of contents allows the user to quickly filter the set of documents by topics of interest. Documents containing hazard information are automatically assigned to a user interface following the endpoint-specific IUCLID5 categorization scheme required, e.g. for REACH registration dossiers. For this purpose, complex endpoint-specific search queries were compiled and integrated into the search engine (based upon a gold standard of 310 references that had been assigned manually to the different endpoint categories). Go3R sorts 87% of the references concordantly into the respective IUCLID5 categories. Currently, Go3R searches in the 22 million documents available in the PubMed and TOXNET databases. However, it can be customized to search in other databases including in-house databanks. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts

  10. Data for four geologic test holes in the Sacramento Valley, California

    USGS Publications Warehouse

    Berkstresser, C.F.; French, J.J.; Schaal, M.E.

    1985-01-01

    The report provides geological and geophysical data for four of seven test holes drilled as a part of the Central Valley Aquifer Project, which is part of the Regional Aquifer Systems Analysis. The holes were drilled with a rotary well drilling machine to depths of 900 feet in the southwestern part of the Sacramento Valley in Solano and Yolo Counties. Geologic data for each well include lithology, texture, color, character of the contact, sorting, rounding, and cementation, determined from cuttings, cores, and sidewall cores. Fifty cores, 3 feet long, were obtained from each hole, and from eight to fourteen sidewall cores were collected. Geophysical data include a dual-induction log, spherically focused log (SFL), compensated neutron-formation density log, gamma-ray log, and a caliper log. These data are presented in four tables and on four plates. (USGS)

  11. Robust spike sorting of retinal ganglion cells tuned to spot stimuli.

    PubMed

    Ghahari, Alireza; Badea, Tudor C

    2016-08-01

    We propose an automatic spike sorting approach for the data recorded from a microelectrode array during visual stimulation of wild type retinas with tiled spot stimuli. The approach first detects individual spikes per electrode by their signature local minima. With the mixture probability distribution of the local minima estimated afterwards, it applies a minimum-squared-error clustering algorithm to sort the spikes into different clusters. A template waveform for each cluster per electrode is defined, and a number of reliability tests are performed on it and its corresponding spikes. Finally, a divisive hierarchical clustering algorithm is used to deal with the correlated templates per cluster type across all the electrodes. According to the measures of performance of the spike sorting approach, it is robust even in the cases of recordings with low signal-to-noise ratio.
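
    A minimal sketch of the detection step described above, treating spikes as signature local minima that exceed a noise-derived threshold; the threshold rule (a multiple of a median-based noise estimate) and the refractory gap are assumptions:

```python
# Minimal sketch of the detection step: spikes as signature local minima below
# a noise-derived threshold. The threshold rule and refractory gap are assumptions.
import numpy as np
from scipy.signal import find_peaks

def detect_spike_minima(signal, fs=20000, k=4.0):
    x = np.asarray(signal, dtype=float)
    noise_sigma = np.median(np.abs(x)) / 0.6745          # robust noise estimate
    # Local minima become peaks of the negated trace; enforce a ~1 ms gap.
    idx, _ = find_peaks(-x, height=k * noise_sigma, distance=int(0.001 * fs))
    return idx

rng = np.random.default_rng(0)
trace = rng.normal(size=20000)
trace[5000] -= 12.0                                      # inject one large negative spike
print(detect_spike_minima(trace))
```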

  12. VizieR Online Data Catalog: Astron low resolution UV spectra (Boyarchuk+, 1994)

    NASA Astrophysics Data System (ADS)

    Boyarchuk, A. A.

    2017-05-01

    Astron was a Soviet spacecraft launched on 23 March 1983, and it was operational for eight years as the largest ultraviolet space telescope during its lifetime. Astron's payload consisted of an 80 cm ultraviolet telescope Spica and an X-ray spectroscope. We present 159 low resolution spectra of stars obtained during the Astron space mission (Tables 4, 5; hereafter table numbers in Boyarchuk et al. 1994 are given). Table 4 (observational log, logs.dat) contains data on 142 sessions for 90 stars (sorted in ascending order of RA), where SED was obtained by scanning method, and then data on 17 sessions for 15 stars (also sorted in ascending order of RA), where multicolor photometry was done. Kilpio et al. (2016, Baltic Astronomy 25, 23) presented results of the comparison of Astron data to the modern UV stellar data, discussed Astron precision and accuracy, and made some conclusions on potential application areas of these data. Also 34 sessions of observations of 27 stellar systems (galaxies and globular clusters) are presented. Observational log was published in Table 10 and data were published in Table 11, respectively. Also 16 sessions of observations of 12 nebulae (Table 12 for observational log and Table 13 for data themselves) are presented. Background radiation intensity data (Table 14) are presented in Table 15. At last, data on comets are presented in different forms. We draw your attention that observational data for stars, stellar systems, nebulae and comets are expressed in log [erg/s/cm^2/A], while for comets data 10E-13 erg/s/cm^2/A units are used, hydroxyl band photometric data for comets are expressed in log [erg/s/cm^2], and for the background data it is radiation intensity expressed in log [erg/s/cm^2/A/sr]. Scanned (PDF version of) Boyarchuk et al. (1994) book is available at http://www.inasan.ru/~astron/astron.pdf (12 data files).

  13. ADMAP (automatic data manipulation program)

    NASA Technical Reports Server (NTRS)

    Mann, F. I.

    1971-01-01

    Instructions are presented on the use of ADMAP (automatic data manipulation program), an aerospace data manipulation computer program. The program was developed to aid in processing, reducing, plotting, and publishing electric propulsion trajectory data generated by the low thrust optimization program, HILTOP. The program has the option of generating SC4020 electric plots, and therefore requires the SC4020 routines to be available at execution time (even if not used). Several general routines are present, including a cubic spline interpolation routine, electric plotter dash line drawing routine, and single parameter and double parameter sorting routines. Many routines are tailored for the manipulation and plotting of electric propulsion data, including an automatic scale selection routine, an automatic curve labelling routine, and an automatic graph titling routine. Data are accepted from either punched cards or magnetic tape.

  14. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

    Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D-mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
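
    An illustrative sketch, not the DTS or PCXMC code, of the grouping step described above: per-pulse log entries with similar geometry and technique are binned so that only one Monte Carlo run per group is needed, weighted by the group's total mAs; the column names and bin widths are assumptions:

```python
# Illustrative sketch: bin per-pulse log entries by similar geometry and
# technique so only one Monte Carlo run per group is required.
import pandas as pd

# Hypothetical per-pulse entries; the DTS log would supply thousands of these.
log = pd.DataFrame({
    "gantry_angle": [0.2, 0.4, 45.1, 44.8, 45.0],
    "kvp":          [80, 80, 96, 96, 96],
    "field_size":   [20, 20, 15, 15, 15],
    "mas":          [1.2, 1.1, 2.5, 2.4, 2.6],
})

# Round the parameters that define a "similar" exposure into 5-unit bins.
log["gantry_bin"] = (log["gantry_angle"] / 5).round() * 5
log["kvp_bin"] = (log["kvp"] / 5).round() * 5

groups = (log.groupby(["gantry_bin", "kvp_bin", "field_size"])
             .agg(total_mas=("mas", "sum"), pulses=("mas", "size"))
             .reset_index())
print(groups)   # each row would correspond to one PCXMC simulation
```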

  15. To sort or not to sort: the impact of spike-sorting on neural decoding performance.

    PubMed

    Todorova, Sonia; Sadtler, Patrick; Batista, Aaron; Chase, Steven; Ventura, Valérie

    2014-10-01

    Brain-computer interfaces (BCIs) are a promising technology for restoring motor ability to paralyzed patients. Spiking-based BCIs have successfully been used in clinical trials to control multi-degree-of-freedom robotic devices. Current implementations of these devices require a lengthy spike-sorting step, which is an obstacle to moving this technology from the lab to the clinic. A viable alternative is to avoid spike-sorting, treating all threshold crossings of the voltage waveform on an electrode as coming from one putative neuron. It is not known, however, how much decoding information might be lost by ignoring spike identity. We present a full analysis of the effects of spike-sorting schemes on decoding performance. Specifically, we compare how well two common decoders, the optimal linear estimator and the Kalman filter, reconstruct the arm movements of non-human primates performing reaching tasks, when receiving input from various sorting schemes. The schemes we tested included: using threshold crossings without spike-sorting; expert-sorting discarding the noise; expert-sorting, including the noise as if it were another neuron; and automatic spike-sorting using waveform features. We also decoded from a joint statistical model for the waveforms and tuning curves, which does not involve an explicit spike-sorting step. Discarding the threshold crossings that cannot be assigned to neurons degrades decoding: no spikes should be discarded. Decoding based on spike-sorted units outperforms decoding based on electrodes voltage crossings: spike-sorting is useful. The four waveform based spike-sorting methods tested here yield similar decoding efficiencies: a fast and simple method is competitive. Decoding using the joint waveform and tuning model shows promise but is not consistently superior. Our results indicate that simple automated spike-sorting performs as well as the more computationally or manually intensive methods used here. Even basic spike-sorting adds value to the low-threshold waveform-crossing methods often employed in BCI decoding.

  16. To sort or not to sort: the impact of spike-sorting on neural decoding performance

    NASA Astrophysics Data System (ADS)

    Todorova, Sonia; Sadtler, Patrick; Batista, Aaron; Chase, Steven; Ventura, Valérie

    2014-10-01

    Objective. Brain-computer interfaces (BCIs) are a promising technology for restoring motor ability to paralyzed patients. Spiking-based BCIs have successfully been used in clinical trials to control multi-degree-of-freedom robotic devices. Current implementations of these devices require a lengthy spike-sorting step, which is an obstacle to moving this technology from the lab to the clinic. A viable alternative is to avoid spike-sorting, treating all threshold crossings of the voltage waveform on an electrode as coming from one putative neuron. It is not known, however, how much decoding information might be lost by ignoring spike identity. Approach. We present a full analysis of the effects of spike-sorting schemes on decoding performance. Specifically, we compare how well two common decoders, the optimal linear estimator and the Kalman filter, reconstruct the arm movements of non-human primates performing reaching tasks, when receiving input from various sorting schemes. The schemes we tested included: using threshold crossings without spike-sorting; expert-sorting discarding the noise; expert-sorting, including the noise as if it were another neuron; and automatic spike-sorting using waveform features. We also decoded from a joint statistical model for the waveforms and tuning curves, which does not involve an explicit spike-sorting step. Main results. Discarding the threshold crossings that cannot be assigned to neurons degrades decoding: no spikes should be discarded. Decoding based on spike-sorted units outperforms decoding based on electrodes voltage crossings: spike-sorting is useful. The four waveform based spike-sorting methods tested here yield similar decoding efficiencies: a fast and simple method is competitive. Decoding using the joint waveform and tuning model shows promise but is not consistently superior. Significance. Our results indicate that simple automated spike-sorting performs as well as the more computationally or manually intensive methods used here. Even basic spike-sorting adds value to the low-threshold waveform-crossing methods often employed in BCI decoding.

  17. Comparison of spike-sorting algorithms for future hardware implementation.

    PubMed

    Gibson, Sarah; Judy, Jack W; Markovic, Dejan

    2008-01-01

    Applications such as brain-machine interfaces require hardware spike sorting in order to (1) obtain single-unit activity and (2) perform data reduction for wireless transmission of data. Such systems must be low-power, low-area, high-accuracy, automatic, and able to operate in real time. Several detection and feature extraction algorithms for spike sorting are described briefly and evaluated in terms of accuracy versus computational complexity. The nonlinear energy operator method is chosen as the optimal spike detection algorithm, being most robust over noise and relatively simple. The discrete derivatives method [1] is chosen as the optimal feature extraction method, maintaining high accuracy across SNRs with a complexity orders of magnitude less than that of traditional methods such as PCA.
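
    A short sketch of the nonlinear energy operator favoured above, psi[n] = x[n]^2 - x[n-1]*x[n+1], with a simple threshold taken as a multiple of the operator's mean value (the multiplier is an assumption):

```python
# Sketch of the nonlinear energy operator detector, thresholded at a multiple
# of its mean value (the multiplier c is an assumption).
import numpy as np

def neo_detect(x, c=8.0):
    x = np.asarray(x, dtype=float)
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]    # instantaneous "energy"
    threshold = c * psi.mean()
    return np.flatnonzero(psi > threshold)       # candidate spike samples

rng = np.random.default_rng(0)
trace = rng.normal(size=10000)
trace[2000:2003] += [4.0, 9.0, 4.0]              # inject a crude spike
print(neo_detect(trace))
```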

  18. Automatically Log Off Upon Disappearance of Facial Image

    DTIC Science & Technology

    2005-03-01

    log off a PC when the user’s face disappears for an adjustable time interval. Among the fundamental technologies of biometrics, facial recognition is... facial recognition products. In this report, a brief overview of face detection technologies is provided. The particular neural network-based face...ensure that the user logging onto the system is the same person. Among the fundamental technologies of biometrics, facial recognition is the only

  19. Raman-activated cell sorting based on dielectrophoretic single-cell trap and release.

    PubMed

    Zhang, Peiran; Ren, Lihui; Zhang, Xu; Shan, Yufei; Wang, Yun; Ji, Yuetong; Yin, Huabing; Huang, Wei E; Xu, Jian; Ma, Bo

    2015-02-17

    Raman-activated cell sorting (RACS) is a promising single-cell technology that holds several significant advantages, as RACS is label-free, information-rich, and potentially in situ. To date, the ability of the technique to identify single cells in a high-speed flow has been limited by inherent weakness of the spontaneous Raman signal. Here we present an alternative pause-and-sort RACS microfluidic system that combines positive dielectrophoresis (pDEP) for single-cell trap and release with a solenoid-valve-suction-based switch for cell separation. This has allowed the integration of trapping, Raman identification, and automatic separation of individual cells in a high-speed flow. By exerting a periodical pDEP field, single cells were trapped, ordered, and positioned individually to the detection point for Raman measurement. As a proof-of-concept demonstration, a mixture of two cell strains containing carotenoid-producing yeast (9%) and non-carotenoid-producing Saccharomyces cerevisiae (91%) was sorted, which enriched the former to 73% on average and showed a fast Raman-activated cell sorting at the subsecond level.

  20. IoT in Radiology: Using Raspberry Pi to Automatically Log Telephone Calls in the Reading Room.

    PubMed

    Chen, Po-Hao; Cross, Nathan

    2018-05-03

    Factors in the medical imaging work environment such as distractions, ergonomics, distance, temperature, humidity, and lighting conditions generate a paucity of data and are difficult to analyze. The emergence of the Internet of Things (IoT), together with the decreasing cost of single-board computers like the Raspberry Pi, puts creating customized hardware to collect data from the clinical environment within the reach of a clinical imaging informaticist. This article walks the reader through a series of basic projects using a variety of sensors and devices in conjunction with a Pi to gather data, culminating in a complex example designed to automatically detect and log telephone calls.

  1. Adipose Tissue-Derived Pericytes for Cartilage Tissue Engineering.

    PubMed

    Zhang, Jinxin; Du, Chunyan; Guo, Weimin; Li, Pan; Liu, Shuyun; Yuan, Zhiguo; Yang, Jianhua; Sun, Xun; Yin, Heyong; Guo, Quanyi; Zhou, Chenfu

    2017-01-01

    Mesenchymal stem cells (MSCs) represent a promising alternative source for cartilage tissue engineering. However, MSC culture is labor-intensive, so these cells cannot be applied immediately to regenerate cartilage for clinical purposes. Risks during the ex vivo expansion of MSCs, such as infection and immunogenicity, can be a bottleneck in their use in clinical tissue engineering. As a novel stem cell source, pericytes are generally considered to be the origin of MSCs. Pericytes do not have to undergo time-consuming ex vivo expansion because they are uncultured cells. Adipose tissue is another optimal stem cell reservoir. Because adipose tissue is well vascularized, a considerable number of pericytes are located around blood vessels in this accessible and dispensable tissue, and autologous pericytes can be applied immediately for cartilage regeneration. Thus, we suggest that adipose tissue-derived pericytes are promising seed cells for cartilage regeneration. Many studies have been performed to develop isolation methods for the adipose tissue-derived stromal vascular fraction (AT-SVF) using lipoaspiration and sorting pericytes from AT-SVF. These methods are useful for sorting a large number of viable pericytes for clinical therapy after being combined with automatic isolation using an SVF device and automatic magnetic-activated cell sorting. These tools should help to develop one-step surgery for repairing cartilage damage. However, the use of adipose tissue-derived pericytes as a cell source for cartilage tissue engineering has not drawn sufficient attention and preclinical studies are needed to improve cell purity, to increase sorting efficiency, and to assess safety issues of clinical applications. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  2. Design and development of an automatic data acquisition system for a balance study using a smartcard system.

    PubMed

    Ambrozy, C; Kolar, N A; Rattay, F

    2010-01-01

    To log board angle values during balance training, it is necessary to develop a measurement system. This study provides data for a balance study using a smartcard, and data acquisition is performed automatically. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data bus protocol and an E2PROM memory system is used. For reading the smartcard data, a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard read programme is designed with Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved as a log-file for exact documentation. This system makes study development easy and time-saving.

  3. Log Defect Recognition Using CT-images and Neural Net Classifiers

    Treesearch

    Daniel L. Schmoldt; Pei Li; A. Lynn Abbott

    1995-01-01

    Although several approaches have been introduced to automatically identify internal log defects using computed tomography (CT) imagery, most of these have been feasibility efforts and consequently have had several limitations: (1) reports of classification accuracy are largely subjective, not statistical, (2) there has been no attempt to achieve real-time operation,...

  4. Logs Analysis of Adapted Pedagogical Scenarios Generated by a Simulation Serious Game Architecture

    ERIC Educational Resources Information Center

    Callies, Sophie; Gravel, Mathieu; Beaudry, Eric; Basque, Josianne

    2017-01-01

    This paper presents an architecture designed for simulation serious games, which automatically generates game-based scenarios adapted to learner's learning progression. We present three central modules of the architecture: (1) the learner model, (2) the adaptation module and (3) the logs module. The learner model estimates the progression of the…

  5. A Language-Independent Approach to Automatic Text Difficulty Assessment for Second-Language Learners

    DTIC Science & Technology

    2013-08-01

    best-suited for regression. Our baseline uses z-normalized shallow length features and TF-LOG weighted vectors on bag-of-words for Arabic, Dari...length features and TF-LOG weighted vectors on bag-of-words for Arabic, Dari, English and Pashto. We compare Support Vector Machines and the Margin...football, whereas they are much less common in documents about opera). We used TF-LOG weighted word frequencies on bag-of-words for each document
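
    A hedged sketch of a baseline of this kind: log-weighted term-frequency bag-of-words features feeding a support vector regressor that predicts a difficulty score. scikit-learn's sublinear_tf option (1 + log(tf)) stands in for the TF-LOG weighting, and the toy documents and scores are placeholders:

```python
# Hedged sketch: log-weighted term-frequency bag-of-words features and a
# support vector regressor predicting a difficulty score.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR

docs = [
    "the cat sat on the mat",
    "supply and demand determine market prices",
    "quantum chromodynamics describes the strong interaction",
]
difficulty = [1.0, 2.0, 3.0]            # placeholder difficulty levels

model = make_pipeline(
    TfidfVectorizer(sublinear_tf=True, use_idf=False),   # log-weighted term frequency
    SVR(kernel="linear"),
)
model.fit(docs, difficulty)
print(model.predict(["the cat chased the ball"]))
```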

  6. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.

  7. Automated lithology prediction from PGNAA and other geophysical logs.

    PubMed

    Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T

    2006-02-01

    Different methods of lithology prediction from geophysical data have been developed in the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma) and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology. A success rate of 73% for lithology prediction was achieved from PGNAA logging data only. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.

  8. SiC/Si diode trigger circuit provides automatic range switching for log amplifier

    NASA Technical Reports Server (NTRS)

    1967-01-01

    SiC/Si diode pair provides automatic range change to extend the operating range of a logarithmic amplifier-conversion circuit and assures stability at or near the range switch-over point. The diode pair provides hysteresis for a trigger circuit that actuates a relay at the desired range extension point.

  9. Contextual Computing: A Bluetooth based approach for tracking healthcare providers in the emergency room.

    PubMed

    Frisby, Joshua; Smith, Vernon; Traub, Stephen; Patel, Vimla L

    2017-01-01

    Hospital Emergency Departments (EDs) frequently experience crowding. One of the factors that contributes to this crowding is the "door to doctor time", which is the time from a patient's registration to when the patient is first seen by a physician. This is also one of the Meaningful Use (MU) performance measures that emergency departments report to the Center for Medicare and Medicaid Services (CMS). Current documentation methods for this measure are inaccurate due to the imprecision in manual data collection. We describe a method for automatically (in real time) and more accurately documenting the door to physician time. Using sensor-based technology, the distance between the physician and the computer is calculated by using the single board computers installed in patient rooms that log each time a Bluetooth signal is seen from a device that the physicians carry. This distance is compared automatically with the accepted room radius to determine if the physicians are present in the room at the time logged to provide greater precision. The logged times, accurate to the second, were compared with physicians' handwritten times, showing automatic recordings to be more precise. This real time automatic method will free the physician from extra cognitive load of manually recording data. This method for evaluation of performance is generic and can be used in any other setting outside the ED, and for purposes other than measuring physician time. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    PubMed

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex codes. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.

  11. A Comparison of Several Artificial Neural Network Classifiers for CT Images of Hardwood Logs

    Treesearch

    Daniel L. Schmoldt; Jing He; A. Lynn Abbott

    1998-01-01

    Knowledge of internal log defects, obtained by scanning, is critical to efficiency improvements for future hardwood sawmills. Nevertheless, before computed tomography (CT) scanning can be applied in industrial operations, we need to automatically interpret scan information so that it can provide the saw operator with the information necessary to make proper sawing...

  12. Nondestructive Evaluation of Hardwood Logs Using Automated Interpretation of CT Images

    Treesearch

    Daniel L. Schmoldt; Dongping Zhu; Richard W. Conners

    1993-01-01

    Computed tomography (CT) imaging is being used to examine the internal structure of hardwood logs. The following steps are used to automatically interpret CT images: (1) preprocessing to remove unwanted portions of the image, e.g., annual ring structure, (2) image-by-image segmentation to produce relatively homogeneous image areas, (3) volume growing to create volumes...

  13. Reliable Analysis of Single-Unit Recordings from the Human Brain under Noisy Conditions: Tracking Neurons over Hours

    PubMed Central

    Boström, Jan; Elger, Christian E.; Mormann, Florian

    2016-01-01

    Recording extracellularly from neurons in the brains of animals in vivo is among the most established experimental techniques in neuroscience, and has recently become feasible in humans. Many interesting scientific questions can be addressed only when extracellular recordings last several hours, and when individual neurons are tracked throughout the entire recording. Such questions regard, for example, neuronal mechanisms of learning and memory consolidation, and the generation of epileptic seizures. Several difficulties have so far limited the use of extracellular multi-hour recordings in neuroscience: Datasets become huge, and data are necessarily noisy in clinical recording environments. No methods for spike sorting of such recordings have been available. Spike sorting refers to the process of identifying the contributions of several neurons to the signal recorded in one electrode. To overcome these difficulties, we developed Combinato: a complete data-analysis framework for spike sorting in noisy recordings lasting twelve hours or more. Our framework includes software for artifact rejection, automatic spike sorting, manual optimization, and efficient visualization of results. Our completely automatic framework excels at two tasks: It outperforms existing methods when tested on simulated and real data, and it enables researchers to analyze multi-hour recordings. We evaluated our methods on both short and multi-hour simulated datasets. To evaluate the performance of our methods in an actual neuroscientific experiment, we used data from neurosurgical patients, recorded in order to identify visually responsive neurons in the medial temporal lobe. These neurons responded to the semantic content, rather than to visual features, of a given stimulus. To test our methods with multi-hour recordings, we made use of neurons in the human medial temporal lobe that respond selectively to the same stimulus in the evening and next morning. PMID:27930664

  14. A Dual-Range Strain Gage Weighing Transducer Employing Automatic Switching

    Treesearch

    Rodger A. Arola

    1968-01-01

    Describes a dual-range strain gage transducer which has proven to be an excellent weight-sensing device for weighing trees and tree-length logs; discusses basic principles of the design and operation; and shows that a single transducer having two sensitivity ranges with automatic internal switching can sense weight with good repeatability and that one calibration curve...

  15. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainey, M; Rothe, T

    Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis and recalculation of the daily recorded fluence, and hence dose, distribution bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive. A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within +/−0.10 mm: 57% within +/−0.01 mm and 89% within 0.05 mm. The mean leaf position deviation is 0.02 mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, with a mean of 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated-fluence errors to be identified quickly. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.
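
    A minimal sketch of the kind of leaf-position statistics reported above, computed from a hypothetical array of planned versus delivered MLC positions; the actual trajectory log parsing is not reproduced.

```python
import numpy as np

# Hypothetical arrays of planned and actual MLC leaf positions (mm) as they might
# be extracted from a trajectory log; the real log format is not reproduced here.
rng = np.random.default_rng(0)
planned = rng.uniform(-50.0, 50.0, size=(1000, 120))          # snapshots x leaves
actual = planned + rng.normal(0.0, 0.02, size=planned.shape)  # small delivery errors

dev = np.abs(actual - planned)
print("mean deviation   : %.3f mm" % dev.mean())
print("within +/-0.01 mm: %.1f%%" % (100.0 * (dev <= 0.01).mean()))
print("within +/-0.05 mm: %.1f%%" % (100.0 * (dev <= 0.05).mean()))
print("within +/-0.10 mm: %.1f%%" % (100.0 * (dev <= 0.10).mean()))
```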

  16. Semi-Automatic Determination of Citation Relevancy: User Evaluation.

    ERIC Educational Resources Information Center

    Huffman, G. David

    1990-01-01

    Discussion of online bibliographic database searches focuses on a software system, SORT-AID/SABRE, that ranks retrieved citations in terms of relevance. Results of a comprehensive user evaluation of the relevance ranking procedure to determine its effectiveness are presented, and implications for future work are suggested. (10 references) (LRW)

  17. Predicting Correctness of Problem Solving from Low-Level Log Data in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Hord, Casey

    2009-01-01

    This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model, therefore the modeling process doesn't require expert knowledge for identifying…

  18. Progress in analysis of computed tomography (CT) images of hardwood logs for defect detection

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2003-01-01

    This paper addresses the problem of automatically detecting internal defects in logs using computed tomography (CT) images. The overall purpose is to assist in breakdown optimization. Several studies have shown that the commercial value of resulting boards can be increased substantially if defect locations are known in advance, and if this information is used to make...

  19. Labeling Defects in CT Images of Hardwood Logs with Species-Dependent and Species-Independent Classifiers

    Treesearch

    Pei Li; Jing He; A. Lynn Abbott; Daniel L. Schmoldt

    1996-01-01

    This paper analyses computed tomography (CT) images of hardwood logs, with the goal of locating internal defects. The ability to detect and identify defects automatically is a critical component of efficiency improvements for future sawmills and veneer mills. This paper describes an approach in which 1) histogram equalization is used during preprocessing to normalize...

  20. A New Approach to Automated Labeling of Internal Features of Hardwood Logs Using CT Images

    Treesearch

    Daniel L. Schmoldt; Pei Li; A. Lynn Abbott

    1996-01-01

    The feasibility of automatically identifying internal features of hardwood logs using CT imagery has been established previously. Features of primary interest are bark, knots, voids, decay, and clear wood. Our previous approach: filtered original CT images, applied histogram segmentation, grew volumes to extract 3-d regions, and applied a rule base, with Dempster-...

  1. [The actual possibilities of robotic microscopy in analysis automation and laboratory telemedicine].

    PubMed

    Medovyĭ, V S; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Balugian, R Sh

    2012-10-01

    The article discusses the capabilities of the automated microscopy complexes manufactured by Cellavision and MEKOS for performing medical analyses of blood films and other biomaterials. Joint work of the complex and the physician, in a regimen of automatic loading, screening, sampling, and sorting of types with simple morphology followed by visual sorting of the sub-sample with complex morphology, provides a significant increase in method sensitivity, a decrease in workload, and improved working conditions for the physician. The information technologies included, such as virtual slides and laboratory telemedicine, make it possible to develop representative samples of rare types and pathologies, advancing both automation methods and medical research aims.

  2. Volume recovery, grade yield, and properties of lumber from young-growth sitka spruce and western hemlock in southeast Alaska.

    Treesearch

    Glenn A. Christensen; Kent R. Julin; Robert J. Ross; Susan. Willits

    2002-01-01

    Wood volume recovery, lumber grade yield, and mechanical properties of young-growth Sitka spruce (Picea sitchensis (Bong.) Carr.) and western hemlock (Tsuga heterophylla (Raf.) Sarg.) were examined. The sample included trees from commercially thinned and unthinned stands and fluted western hemlock logs obtained from a sort yard....

  3. Lumber stress grades and design properties

    Treesearch

    David E. Kretschmann; David W. Green

    1999-01-01

    Lumber sawn from a log, regardless of species and size, is quite variable in mechanical properties. Pieces may differ in strength by several hundred percent. For simplicity and economy in use, pieces of lumber of similar mechanical properties are placed in categories called stress grades, which are characterized by (a) one or more sorting criteria, (b) a set of...

  4. Automated labeling of log features in CT imagery of multiple hardwood species

    Treesearch

    Daniel L. Schmoldt; Jing He; A. Lynn Abbott

    2000-01-01

    Before noninvasive scanning, e.g., computed tomography (CT), becomes feasible in industrial saw-mill operations, we need a procedure that can automatically interpret scan information in order to provide the saw operator with information necessary to make proper sawing decisions. To this end, we have worked to develop an approach for automatic analysis of CT images of...

  5. High-Throughput Fluorescence-Based Isolation of Live C. elegans Larvae

    PubMed Central

    Fernandez, Anita G.; Bargmann, Bastiaan O. R.; Mis, Emily K.; Edgley, Mark. L.; Birnbaum, Kenneth D.; Piano, Fabio

    2017-01-01

    For the nematode Caenorhabditis elegans, automated selection of animals of specific genotypes from a mixed pool has become essential for genetic interaction or chemical screens. To date, such selection has been accomplished using specialized instruments. However, access to such dedicated equipment is not common. Here we describe live animal fluorescence-activated cell sorting (laFACS), a protocol for automatic selection of live L1 animals using a standard FACS. We show that a FACS can be used for the precise identification of GFP-expressing and non-GFP-expressing sub-populations and can accomplish high-speed sorting of live animals. We have routinely collected 100,000 or more homozygotes from a mixed starting population within two hours and with greater than ninety-nine percent purity. The sorted animals continue to develop normally, making this protocol ideally suited for the isolation of terminal mutants for use in genetic interaction or chemical genetic screens. PMID:22814389

  6. Design and Application of Automatic Falling Device for Different Brands of Goods

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; Ge, Qingkuan; Zuo, Ping; Peng, Tao; Dong, Weifu

    2017-12-01

    The goods-falling device is an important component of an intelligent goods-sorting system; it is responsible for the temporary storage and counting of goods and for placing goods onto the conveyor belt within specified precision requirements. Based on an analysis of the present state of domestic goods-sorting equipment and of actual demand, a vertical goods-falling device was designed and a simulation model of the device was established. Dynamic characteristics such as the angular error of the opening and closing mechanism were analyzed with ADAMS software. The simulation results show that the maximum angular error is 0.016 rad. Testing of the device showed a goods-falling rate of 7031 items/hour and a falling position error within 2 mm, meeting the grasp accuracy requirements of the palletizing robot.

  7. MAIL LOG, program summary and specifications

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The summary and specifications for obtaining the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - store or put away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.

  8. Computer Aided Phenomenography: The Role of Leximancer Computer Software in Phenomenographic Investigation

    ERIC Educational Resources Information Center

    Penn-Edwards, Sorrel

    2010-01-01

    The qualitative research methodology of phenomenography has traditionally required a manual sorting and analysis of interview data. In this paper I explore a potential means of streamlining this procedure by considering a computer aided process not previously reported upon. Two methods of lexicological analysis, manual and automatic, were examined…

  9. An image segmentation method for apple sorting and grading using support vector machine and Otsu's method

    USDA-ARS?s Scientific Manuscript database

    Segmentation is the first step in image analysis to subdivide an image into meaningful regions. The segmentation result directly affects the subsequent image analysis. The objective of the research was to develop an automatic adjustable algorithm for segmentation of color images, using linear suppor...

  10. Extending the Online Public Access Catalog into the Microcomputer Environment.

    ERIC Educational Resources Information Center

    Sutton, Brett

    1990-01-01

    Describes PCBIS, a database program for MS-DOS microcomputers that features a utility for automatically converting online public access catalog search results stored as text files into structured database files that can be searched, sorted, edited, and printed. Topics covered include the general features of the program, record structure, record…

  11. A method for development of a system of identification for Appalachian coal-bearing rocks

    USGS Publications Warehouse

    Ferm, J.C.; Weisenfluh, G.A.; Smith, G.C.

    2002-01-01

    The number of observable properties of sedimentary rocks is large and numerous classifications have been proposed for describing them. Some rock classifications, however, may be disadvantageous in situations such as logging rock core during coal exploration programs, where speed and simplicity are the essence. After experimenting with a number of formats for logging rock core in the Appalachian coal fields, a method of using color photographs accompanied by a rock name and numeric code was selected. In order to generate a representative collection of rocks to be photographed, sample methods were devised to produce a representative collection, and empirically based techniques were devised to identify repeatedly recognizable rock types. A number of cores representing the stratigraphic and geographic range of the region were sampled so that every megascopically recognizable variety was included in the collection; the frequency of samples of any variety reflects the frequency with which it would be encountered during logging. In order to generate repeatedly recognizable rock classes, the samples were sorted to display variation in grain size, mineral composition, color, and sedimentary structures. Class boundaries for each property were selected on the basis of existing, widely accepted limits and the precision with which these limits could be recognized. The process of sorting the core samples demonstrated relationships between rock properties and indicated that similar methods, applied to other groups of rocks, could yield more widely applicable field classifications. ?? 2002 Elsevier Science B.V. All rights reserved.

  12. Classifying features in CT imagery: accuracy for some single- and multiple-species classifiers

    Treesearch

    Daniel L. Schmoldt; Jing He; A. Lynn Abbott

    1998-01-01

    Our current approach to automatically label features in CT images of hardwood logs classifies each pixel of an image individually. These feature classifiers use a back-propagation artificial neural network (ANN) and feature vectors that include a small, local neighborhood of pixels and the distance of the target pixel to the center of the log. Initially, this type of...

  13. A Computer Vision System for Locating and Identifying Internal Log Defects Using CT Imagery

    Treesearch

    Dongping Zhu; Richard W. Conners; Frederick Lamb; Philip A. Araman

    1991-01-01

    A number of researchers have shown the ability of magnetic resonance imaging (MRI) and computer tomography (CT) imaging to detect internal defects in logs. However, if these devices are ever to play a role in the forest products industry, automatic methods for analyzing data from these devices must be developed. This paper reports research aimed at developing a...

  14. Automatic processing influences free recall: converging evidence from the process dissociation procedure and remember-know judgments.

    PubMed

    McCabe, David P; Roediger, Henry L; Karpicke, Jeffrey D

    2011-04-01

    Dual-process theories of retrieval suggest that controlled and automatic processing contribute to memory performance. Free recall tests are often considered pure measures of recollection, assessing only the controlled process. We report two experiments demonstrating that automatic processes also influence free recall. Experiment 1 used inclusion and exclusion tasks to estimate recollection and automaticity in free recall, adopting a new variant of the process dissociation procedure. Dividing attention during study selectively reduced the recollection estimate but did not affect the automatic component. In Experiment 2, we replicated the results of Experiment 1, and subjects additionally reported remember-know-guess judgments during recall in the inclusion condition. In the latter task, dividing attention during study reduced remember judgments for studied items, but know responses were unaffected. Results from both methods indicated that free recall is partly driven by automatic processes. Thus, we conclude that retrieval in free recall tests is not driven solely by conscious recollection (or remembering) but also by automatic influences of the same sort believed to drive priming on implicit memory tests. Sometimes items come to mind without volition in free recall.
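
    For readers unfamiliar with the process dissociation procedure, the sketch below shows the standard inclusion/exclusion estimates of recollection and automatic influence; the proportions are hypothetical, and the new free-recall variant described above may compute its estimates differently.

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Standard process-dissociation estimates:
    recollection R = I - E; automatic influence A = E / (1 - R)."""
    recollection = p_inclusion - p_exclusion
    automatic = p_exclusion / (1.0 - recollection)
    return recollection, automatic

# Hypothetical proportions of studied items produced under the two instructions.
print(process_dissociation(0.55, 0.20))   # -> (0.35, ~0.31)
```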

  15. Chapter 3:Sorting red maple logs for structural quality

    Treesearch

    Xiping Wang

    2005-01-01

    Nondestructive evaluation (NDE) of wood materials has a long history of application in the wood products industry. Visual grading of lumber is perhaps one of the earliest NDE forms. Visual assessment of a piece of lumber requires the grader to estimate a strength ratio on the basis of observed external defects (USDA 1999). The ratio is used to estimate the strength of...

  16. Stress grades and design properties for lumber, round timber, and ties

    Treesearch

    David E. Kretschmann

    2010-01-01

    Round timbers, ties, and lumber sawn from a log, regardless of species and size, are quite variable in mechanical properties. Pieces may differ in strength by several hundred percent. For simplicity and economy in use, pieces of wood of similar mechanical properties are placed in categories called stress grades, which are characterized by (a) one or more sorting...

  17. Premature germination of forest tree seed during natural storage in duff

    Treesearch

    I. T. Haig

    1932-01-01

    For some years forest investigators in the Pacific Northwest have been aware of the considerable quantity of tree seed which accumulates in the duff of heavy virgin timber stands and apparently retains its vitality for a few years in a sort of natural cold-storage condition. The major portion of the luxuriant regeneration which frequently follows logging and forest...

  18. Spike sorting using locality preserving projection with gap statistics and landmark-based spectral clustering.

    PubMed

    Nguyen, Thanh; Khosravi, Abbas; Creighton, Douglas; Nahavandi, Saeid

    2014-12-30

    Understanding neural functions requires knowledge from analysing electrophysiological data. The process of assigning spikes of a multichannel signal into clusters, called spike sorting, is one of the important problems in such analysis. There have been various automated spike sorting techniques with both advantages and disadvantages regarding accuracy and computational costs. Therefore, developing spike sorting methods that are highly accurate and computationally inexpensive is always a challenge in biomedical engineering practice. An automatic unsupervised spike sorting method is proposed in this paper. The method uses features extracted by the locality preserving projection (LPP) algorithm. These features afterwards serve as inputs for the landmark-based spectral clustering (LSC) method. Gap statistics (GS) is employed to evaluate the number of clusters before the LSC can be performed. The proposed LPP-LSC is a highly accurate and computationally inexpensive spike sorting approach. LPP spike features are highly discriminative and thereby boost the performance of clustering methods. Furthermore, the LSC method exhibits its efficiency when integrated with the cluster evaluator GS. The proposed method's accuracy is approximately 13% superior to that of the benchmark combination of wavelet transformation and superparamagnetic clustering (WT-SPC). Additionally, the LPP-LSC computing time is six times less than that of the WT-SPC. LPP-LSC thus demonstrates a win-win spike sorting solution, meeting both accuracy and computational cost criteria. LPP and LSC are linear algorithms that help reduce the computational burden, and thus their combination can be applied to real-time spike analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
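
    A rough sketch of a feature-extraction-plus-spectral-clustering pipeline in the spirit of the method above, with PCA standing in for LPP, plain spectral clustering standing in for the landmark-based variant, and a fixed number of clusters instead of gap statistics; the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import SpectralClustering

# Synthetic spike waveforms: 3 units, 300 spikes each, 32 samples per waveform.
rng = np.random.default_rng(1)
templates = rng.normal(0.0, 1.0, size=(3, 32))
waveforms = np.vstack([t + 0.2 * rng.normal(size=(300, 32)) for t in templates])

# Low-dimensional spike features (PCA here as a stand-in for LPP).
features = PCA(n_components=3).fit_transform(waveforms)

# Spectral clustering on the features (plain sklearn version, not landmark-based);
# the number of clusters is fixed instead of being chosen by gap statistics.
labels = SpectralClustering(n_clusters=3, affinity="nearest_neighbors",
                            random_state=0).fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```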

  19. Satellite freeze forecast system. Operating/troubleshooting manual

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D. (Principal Investigator)

    1983-01-01

    Examples of operational procedures are given to assist users of the satellites freeze forecasting system (SFFS) in logging in on to the computer, executing the programs in the menu, logging off the computer, and setting up the automatic system. Directions are also given for displaying, acquiring, and listing satellite maps; for communicating via terminal and monitor displays; and for what to do when the SFFS doesn't work. Administrative procedures are included.

  20. Automatic Picking of Foraminifera: Design of the Foraminifera Image Recognition and Sorting Tool (FIRST) Prototype and Results of the Image Classification Scheme

    NASA Astrophysics Data System (ADS)

    de Garidel-Thoron, T.; Marchant, R.; Soto, E.; Gally, Y.; Beaufort, L.; Bolton, C. T.; Bouslama, M.; Licari, L.; Mazur, J. C.; Brutti, J. M.; Norsa, F.

    2017-12-01

    Foraminifera tests are the main proxy carriers for paleoceanographic reconstructions. Both geochemical and taxonomical studies require large numbers of tests to achieve statistical relevance. To date, the extraction of foraminifera from the sediment coarse fraction is still done by hand and is thus time-consuming. Moreover, the recognition of ecologically relevant morphotypes requires taxonomic skills that are not easily taught. The automatic recognition and extraction of foraminifera would therefore greatly help paleoceanographers to overcome these issues. Recent advances in automatic image classification using machine learning open the way to automatic extraction of foraminifera. Here we detail progress on the design of an automatic picking machine as part of the FIRST project. The machine handles 30 pre-sieved samples (100-1000µm), separating them into individual particles (including foraminifera) and imaging each in pseudo-3D. The particles are classified, and specimens of interest are sorted either for Individual Foraminifera Analyses (44 per slide) or for classical multiple analyses (8 morphological classes per slide, up to 1000 individuals per hole). The classification is based on machine learning using Convolutional Neural Networks (CNNs), similar to the approach used in the coccolithophorid imaging system SYRACO. To prove its feasibility, we built two training image datasets of modern planktonic foraminifera containing approximately 2000 and 5000 images, corresponding to 15 and 25 morphological classes, respectively. Using a CNN with a residual topology (ResNet) we achieve over 95% correct classification for each dataset. We tested the network on 160,000 images from 45 depths of a sediment core from the Pacific Ocean, for which we have human counts. The current algorithm is able to reproduce the downcore variability in both Globigerinoides ruber and the fragmentation index (r2 = 0.58 and 0.88, respectively). The FIRST prototype yields some promising results for high-resolution paleoceanographic studies and evolutionary studies.

  1. Spiral Transformation for High-Resolution and Efficient Sorting of Optical Vortex Modes.

    PubMed

    Wen, Yuanhui; Chremmos, Ioannis; Chen, Yujie; Zhu, Jiangbo; Zhang, Yanfeng; Yu, Siyuan

    2018-05-11

    Mode sorting is an essential function for optical multiplexing systems that exploit the orthogonality of the orbital angular momentum mode space. The familiar log-polar optical transformation provides a simple yet efficient approach whose resolution is, however, restricted by a considerable overlap between adjacent modes resulting from the limited excursion of the phase along a complete circle around the optical vortex axis. We propose and experimentally verify a new optical transformation that maps spirals (instead of concentric circles) to parallel lines. As the phase excursion along a spiral in the wave front of an optical vortex is theoretically unlimited, this new optical transformation can separate orbital angular momentum modes with superior resolution while maintaining unity efficiency.

  2. Spiral Transformation for High-Resolution and Efficient Sorting of Optical Vortex Modes

    NASA Astrophysics Data System (ADS)

    Wen, Yuanhui; Chremmos, Ioannis; Chen, Yujie; Zhu, Jiangbo; Zhang, Yanfeng; Yu, Siyuan

    2018-05-01

    Mode sorting is an essential function for optical multiplexing systems that exploit the orthogonality of the orbital angular momentum mode space. The familiar log-polar optical transformation provides a simple yet efficient approach whose resolution is, however, restricted by a considerable overlap between adjacent modes resulting from the limited excursion of the phase along a complete circle around the optical vortex axis. We propose and experimentally verify a new optical transformation that maps spirals (instead of concentric circles) to parallel lines. As the phase excursion along a spiral in the wave front of an optical vortex is theoretically unlimited, this new optical transformation can separate orbital angular momentum modes with superior resolution while maintaining unity efficiency.

  3. Neural computation of arithmetic functions

    NASA Technical Reports Server (NTRS)

    Siu, Kai-Yeung; Bruck, Jehoshua

    1990-01-01

    An area of application of neural networks is considered. A neuron is modeled as a linear threshold gate, and the network architecture considered is the layered feedforward network. It is shown how common arithmetic functions such as multiplication and sorting can be efficiently computed in a shallow neural network. Some known results are improved by showing that the product of two n-bit numbers and sorting of n n-bit numbers can be computed by a polynomial-size neural network using only four and five unit delays, respectively. Moreover, the weights of each threshold element in the neural networks require O(log n)-bit (instead of n-bit) accuracy. These results can be extended to more complicated functions such as multiple products, division, rational functions, and approximation of analytic functions.
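
    The sketch below, a minimal illustration not drawn from the paper, shows the linear threshold gate model with a single gate that compares two 4-bit numbers; note that this naive gate uses n-bit weights, whereas the result summarized above shows that O(log n)-bit weights suffice in shallow networks for functions such as multiplication and sorting.

```python
import numpy as np

def threshold_gate(weights, threshold, x):
    """Linear threshold gate: output 1 iff the weighted sum reaches the threshold."""
    return int(np.dot(weights, x) >= threshold)

# Example: a single gate that compares two 4-bit numbers A and B (bits MSB first).
# The weighted sum equals A - B with weights +/- 2^i, so the gate fires iff A >= B.
bits = 4
w = np.array([2 ** i for i in range(bits - 1, -1, -1)]
             + [-(2 ** i) for i in range(bits - 1, -1, -1)])

A = [1, 0, 1, 1]          # 11
B = [1, 0, 0, 1]          # 9
print(threshold_gate(w, 0, np.array(A + B)))   # -> 1, since 11 >= 9
```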

  4. A new approach to spike sorting for multi-neuronal activities recorded with a tetrode--how ICA can be practical.

    PubMed

    Takahashi, Susumu; Anzai, Yuichiro; Sakurai, Yoshio

    2003-07-01

    Multi-neuronal recording with a tetrode is a powerful technique to reveal neuronal interactions in local circuits. However, it is difficult to detect precise spike timings among closely neighboring neurons because the spike waveforms of individual neurons overlap on the electrode when more than two neurons fire simultaneously. In addition, the spike waveforms of single neurons, especially in the presence of complex spikes, are often non-stationary. These problems limit the ability of ordinary spike sorting to sort multi-neuronal activities recorded using tetrodes into their single-neuron components. Though sorting with independent component analysis (ICA) can solve these problems, it has one serious limitation: the number of separated neurons must be less than the number of electrodes. Using a combination of ICA and an efficient ordinary spike sorting technique (k-means clustering), we developed an automatic procedure to solve the spike-overlapping and non-stationarity problems with no limitation on the number of separated neurons. The results of applying the procedure to real multi-neuronal data demonstrated that some outliers, which would be assigned to distinct clusters by ordinary spike-sorting methods, can instead be identified as overlapping spikes, and that there are functional connections between a putative pyramidal neuron and its putative dendrite. These findings suggest that the combination of ICA and k-means clustering can provide insights into the precise nature of functional circuits among neurons, i.e. cell assemblies.
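
    A rough sketch of the ICA-then-k-means idea on synthetic tetrode-like data: ICA unmixes the channels, and k-means assigns detected events to putative units. All parameters are illustrative, and the authors' actual procedure is more elaborate.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

# Synthetic tetrode-like data: three sparse spike sources mixed onto four channels.
rng = np.random.default_rng(2)
n_samples = 20000
sources = (rng.random((n_samples, 3)) < 0.005).astype(float)
recording = sources @ rng.uniform(0.5, 1.5, (3, 4)) + 0.02 * rng.normal(size=(n_samples, 4))

# Step 1: ICA unmixes the channels into independent components, so that
# simultaneously firing (overlapping) neurons end up on different components.
components = FastICA(n_components=3, random_state=0).fit_transform(recording)

# Step 2: detect candidate events wherever any component crosses a threshold,
# then cluster the per-event component amplitudes with k-means to assign each
# event to a putative single unit.
energy = np.abs(components)
threshold = 5 * np.median(energy, axis=0)
event_times = np.flatnonzero((energy > threshold).any(axis=1))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(components[event_times])

print("events detected:", len(event_times))
print("events per putative unit:", np.bincount(labels))
```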

  5. Assessing the potential for log sort yards to improve financial viability of forest restoration treatments

    Treesearch

    Woodam Chung; Tyron J. Venn; Dan Loeffler; Greg Jones; Han-Sup Han; Dave E. Calkin

    2012-01-01

    Forest restoration and fuel reduction treatments have been widely applied in the western United States with the purpose of reducing the size and intensity of wildfires. However, the low value of small-diameter trees produced from such treatments has partly constrained the ability to treat all the areas identified as being in need of treatments. The objective of this...

  6. DataForge: Modular platform for data storage and analysis

    NASA Astrophysics Data System (ADS)

    Nozik, Alexander

    2018-04-01

    DataForge is a framework for automated data acquisition, storage, and analysis that draws on modern applied-programming practice. The aim of DataForge is to automate standard tasks such as parallel data processing, logging, output sorting, and distributed computing. The framework also makes extensive use of declarative programming principles via a metadata concept, which allows a degree of meta-programming and improves the reproducibility of results.

  7. An algorithm for 4D CT image sorting using spatial continuity.

    PubMed

    Li, Chen; Liu, Jie

    2013-01-01

    4D CT, which can locate the position of a moving tumor throughout the entire respiratory cycle and reduce image artifacts effectively, has been widely used in radiation therapy of tumors. Current 4D CT methods require external surrogates of respiratory motion obtained from extra instruments. However, respiratory signals recorded by these external markers may not always accurately represent the internal tumor and organ movements, especially when irregular breathing patterns occur. In this paper we propose a novel automatic 4D CT sorting algorithm that performs without these external surrogates. The sorting algorithm requires collecting the image data with a cine scan protocol. Beginning with the first couch position, images from the adjacent couch position are selected according to spatial continuity. The process is continued until images from all couch positions are sorted and the entire 3D volume is produced. The algorithm was verified with respiratory phantom image data and clinical image data. The primary test results show that the 4D CT images created by our algorithm eliminate motion artifacts effectively and clearly demonstrate the movement of the tumor and organs over the respiratory period.
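
    A minimal sketch of greedy sorting by spatial continuity, assuming normalized cross-correlation as the continuity measure between slices at adjacent couch positions; the authors' actual similarity metric is not specified here, and the data are synthetic.

```python
import numpy as np

def similarity(a, b):
    """Normalized cross-correlation between two slices."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def sort_by_spatial_continuity(cine_stacks):
    """Greedy selection of one image per couch position.

    `cine_stacks` is a list over couch positions; each element is an array of
    shape (n_phases, H, W) holding the oversampled cine images at that position.
    Starting from the first couch position, the image at each subsequent position
    that is most similar to the previously chosen one is selected.
    """
    chosen = [cine_stacks[0][0]]               # start with any image at position 0
    for stack in cine_stacks[1:]:
        scores = [similarity(chosen[-1], img) for img in stack]
        chosen.append(stack[int(np.argmax(scores))])
    return np.stack(chosen)

# Tiny synthetic example: 5 couch positions, 4 cine images each, 16x16 pixels.
rng = np.random.default_rng(3)
cine = [rng.random((4, 16, 16)) for _ in range(5)]
volume = sort_by_spatial_continuity(cine)
print(volume.shape)   # -> (5, 16, 16)
```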

  8. Automatic integration of data from dissimilar sensors

    NASA Astrophysics Data System (ADS)

    Citrin, W. I.; Proue, R. W.; Thomas, J. W.

    The present investigation is concerned with the automatic integration of radar and electronic support measures (ESM) sensor data, and with the development of a method for the automatic integration of identification friend or foe (IFF) and radar sensor data. On the basis of the two projects considered, significant advances have been made in the areas of sensor data integration. It is pointed out that the log likelihood approach to sensor data correlation is appropriate for both similar and dissimilar sensor data. Attention is given to the real-time integration of radar and ESM sensor data, and to a radar/ESM correlation simulation program.

  9. Automated personalized feedback for physical activity and dietary behavior change with mobile phones: a randomized controlled trial on adults.

    PubMed

    Rabbi, Mashfiqui; Pfammatter, Angela; Zhang, Mi; Spring, Bonnie; Choudhury, Tanzeem

    2015-05-14

    A dramatic rise in health-tracking apps for mobile phones has occurred recently. Rich user interfaces make manual logging of users' behaviors easier and more pleasant, and sensors make tracking effortless. To date, however, feedback technologies have been limited to providing overall statistics, attractive visualization of tracked data, or simple tailoring based on age, gender, and overall calorie or activity information. There is a lack of systems that can perform automated translation of behavioral data into specific actionable suggestions that promote a healthier lifestyle without any human involvement. MyBehavior, a mobile phone app, was designed to process tracked physical activity and eating behavior data in order to provide personalized, actionable, low-effort suggestions that are contextualized to the user's environment and previous behavior. This study investigated the technical feasibility of implementing an automated feedback system, the impact of the suggestions on user physical activity and eating behavior, and user perceptions of the automatically generated suggestions. MyBehavior was designed to (1) use a combination of automatic and manual logging to track physical activity (eg, walking, running, gym), user location, and food, (2) automatically analyze activity and food logs to identify frequent and nonfrequent behaviors, and (3) use a standard machine-learning, decision-making algorithm, called multi-armed bandit (MAB), to generate personalized suggestions that ask users to either continue, avoid, or make small changes to existing behaviors to help users reach behavioral goals. We enrolled 17 participants, all motivated to self-monitor and improve their fitness, in a pilot study of MyBehavior. In a randomized two-group trial, investigators randomly assigned participants to receive either MyBehavior's personalized suggestions (n=9) or nonpersonalized suggestions (n=8), created by professionals, from a mobile phone app over 3 weeks. Daily activity level and dietary intake were monitored from logged data. At the end of the study, an in-person survey was conducted that asked users to subjectively rate their intention to follow MyBehavior suggestions. In qualitative daily diary, interview, and survey data, users reported MyBehavior suggestions to be highly actionable and stated that they intended to follow the suggestions. MyBehavior users walked significantly more than the control group over the 3 weeks of the study (P=.05). Although some MyBehavior users chose lower-calorie foods, the between-group difference was not significant (P=.15). In a poststudy survey, users rated MyBehavior's personalized suggestions more positively than the nonpersonalized, generic suggestions created by professionals (P<.001). MyBehavior is a simple-to-use mobile phone app with preliminary evidence of efficacy. To the best of our knowledge, MyBehavior represents the first attempt to create personalized, contextualized, actionable suggestions automatically from self-tracked information (ie, manual food logging and automatic tracking of activity). Lessons learned about the difficulty of manual logging and usability concerns, as well as future directions, are discussed. ClinicalTrials.gov NCT02359981; https://clinicaltrials.gov/ct2/show/NCT02359981 (Archived by WebCite at http://www.webcitation.org/6YCeoN8nv).
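
    A minimal sketch of the multi-armed bandit component on hypothetical suggestion slots, using a simple epsilon-greedy rule; MyBehavior's actual MAB formulation and reward definition are not reproduced, and all names and rates below are illustrative.

```python
import random

class EpsilonGreedyBandit:
    """Minimal epsilon-greedy multi-armed bandit over candidate suggestions."""

    def __init__(self, arms, epsilon=0.1):
        self.arms = list(arms)
        self.epsilon = epsilon
        self.counts = {a: 0 for a in self.arms}
        self.values = {a: 0.0 for a in self.arms}   # running mean reward per arm

    def select(self):
        if random.random() < self.epsilon:
            return random.choice(self.arms)                        # explore
        return max(self.arms, key=lambda a: self.values[a])        # exploit

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Hypothetical suggestion slots; the reward could be 1 if the user followed the
# suggestion that day and 0 otherwise.
random.seed(0)
bandit = EpsilonGreedyBandit(["walk_lunchtime", "swap_snack", "take_stairs"])
true_follow_rate = {"walk_lunchtime": 0.6, "swap_snack": 0.2, "take_stairs": 0.4}
for _ in range(500):
    arm = bandit.select()
    bandit.update(arm, 1.0 if random.random() < true_follow_rate[arm] else 0.0)
print(max(bandit.values, key=bandit.values.get))   # most promising suggestion
```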

  10. Binary-space-partitioned images for resolving image-based visibility.

    PubMed

    Fu, Chi-Wing; Wong, Tien-Tsin; Tong, Wai-Shun; Tang, Chi-Keung; Hanson, Andrew J

    2004-01-01

    We propose a novel 2D representation for 3D visibility sorting, the Binary-Space-Partitioned Image (BSPI), to accelerate real-time image-based rendering. BSPI is an efficient 2D realization of a 3D BSP tree, which is commonly used in computer graphics for time-critical visibility sorting. Since the overall structure of a BSP tree is encoded in a BSPI, traversing a BSPI is comparable to traversing the corresponding BSP tree. BSPI performs visibility sorting efficiently and accurately in the 2D image space by warping the reference image triangle-by-triangle instead of pixel-by-pixel. Multiple BSPIs can be combined to solve "disocclusion," when an occluded portion of the scene becomes visible at a novel viewpoint. Our method is highly automatic, including a tensor voting preprocessing step that generates candidate image partition lines for BSPIs, filters the noisy input data by rejecting outliers, and interpolates missing information. Our system has been applied to a variety of real data, including stereo, motion, and range images.

  11. The design and implementation of an automated system for logging clinical experiences using an anesthesia information management system.

    PubMed

    Simpao, Allan; Heitz, James W; McNulty, Stephen E; Chekemian, Beth; Brenn, B Randall; Epstein, Richard H

    2011-02-01

    Residents in anesthesia training programs throughout the world are required to document their clinical cases to help ensure that they receive adequate training. Current systems involve self-reporting, are subject to delayed updates and misreported data, and do not provide a practicable method of validation. Anesthesia information management systems (AIMS) are being used increasingly in training programs and are a logical source for verifiable documentation. We hypothesized that case logs generated automatically from an AIMS would be sufficiently accurate to replace the current manual process. We based our analysis on the data reporting requirements of the American College of Graduate Medical Education (ACGME). We conducted a systematic review of ACGME requirements and our AIMS record, and made modifications after identifying data element and attribution issues. We studied 2 methods (parsing of free text procedure descriptions and CPT4 procedure code mapping) to automatically determine ACGME case categories and generated AIMS-based case logs and compared these to assignments made by manual inspection of the anesthesia records. We also assessed under- and overreporting of cases entered manually by our residents into the ACGME website. The parsing and mapping methods assigned cases to a majority of the ACGME categories with accuracies of 95% and 97%, respectively, as compared with determinations made by 2 residents and 1 attending who manually reviewed all procedure descriptions. Comparison of AIMS-based case logs with reports from the ACGME Resident Case Log System website showed that >50% of residents either underreported or overreported their total case counts by at least 5%. The AIMS database is a source of contemporaneous documentation of resident experience that can be queried to generate valid, verifiable case logs. The extent of AIMS adoption by academic anesthesia departments should encourage accreditation organizations to support uploading of AIMS-based case log files to improve accuracy and to decrease the clerical burden on anesthesia residents.
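
    A minimal sketch of the procedure-code-mapping idea, assuming an AIMS export of (resident, procedure code) pairs; the codes, category names, and mapping below are illustrative placeholders rather than the actual ACGME categories or the authors' mapping.

```python
from collections import Counter

# Illustrative placeholder mapping from anesthesia procedure codes to case-log
# categories; the real ACGME categories and code assignments are not reproduced.
CODE_TO_CATEGORY = {
    "00100": "head_neck",
    "00560": "cardiac",
    "00840": "intra_abdominal",
    "01967": "obstetric_analgesia",
}

def build_case_log(aims_cases):
    """Count cases per category from (resident, code) records in an AIMS export."""
    logs = {}
    for resident, code in aims_cases:
        category = CODE_TO_CATEGORY.get(code, "uncategorized")
        logs.setdefault(resident, Counter())[category] += 1
    return logs

cases = [("res_a", "00840"), ("res_a", "00560"), ("res_a", "99999"), ("res_b", "01967")]
print(build_case_log(cases))
```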

  12. Effects of Early U.S. Compulsory Schooling Laws on Educational Assortative Mating: The Importance of Context.

    PubMed

    Rauscher, Emily

    2015-08-01

    Modernization theory predicts that rising education should increase assortative mating by education and decrease sorting by race. Recent research suggests that effects of educational expansion depend on contextual factors, such as economic development. Using log-linear and log-multiplicative models of male household heads ages 36 to 75 in the 1940 U.S. census data--the first U.S. census with educational attainment information--I investigate how educational assortative mating changed with one instance of educational expansion: early U.S. compulsory school attendance laws. To improve on existing research and distinguish effects of expansion from changes due to particular years or cohorts, I capitalize on state variation in the timing of these compulsory laws (ranging from 1852 to 1918). Aggregate results suggest that compulsory laws had minimal impact on assortative mating. However, separate analyses by region (and supplemental analyses by race) reveal that assortative mating by education decreased with the laws in the South but increased in the North. Whether due to economic, legal, political, or other differences, results suggest that the implications of educational expansion for marital sorting depend on context. Contemporary implications are discussed in light of President Obama's 2012 suggested extension of compulsory schooling to age 18.

  13. TU-D-209-05: Automatic Calculation of Organ and Effective Dose for CBCT and Interventional Fluoroscopic Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Oines, A

    Purpose: To compare PCXMC and EGSnrc calculated organ and effective radiation doses from cone-beam computed tomography (CBCT) and interventional fluoroscopically-guided procedures using automatic exposure-event grouping. Methods: For CBCT, we used PCXMC20Rotation.exe to automatically calculate the doses and compared the results to those calculated using EGSnrc with the Zubal patient phantom. For interventional procedures, we use the dose tracking system (DTS) which we previously developed to produce a log file of all geometry and exposure parameters for every x-ray pulse during a procedure, and the data in the log file is input into PCXMC and EGSnrc for dose calculation. A MATLAB program reads data from the log files and groups similar exposures to reduce calculation time. The definition files are then automatically generated in the format used by PCXMC and EGSnrc. Processing is done at the end of the procedure after all exposures are completed. Results: For the Toshiba Infinix CBCT LCI-Middle-Abdominal protocol, most organ doses calculated with PCXMC20Rotation closely matched those calculated with EGSnrc. The effective doses were 33.77 mSv with PCXMC20Rotation and 32.46 mSv with EGSnrc. For a simulated interventional cardiac procedure, similar close agreement in organ dose was obtained between the two codes; the effective doses were 12.02 mSv with PCXMC and 11.35 mSv with EGSnrc. The calculations can be completed on a PC without manual intervention in less than 15 minutes with PCXMC and in about 10 hours with EGSnrc, depending on the level of data grouping and accuracy desired. Conclusion: Effective dose and most organ doses in CBCT and interventional radiology calculated by PCXMC closely match those calculated by EGSnrc. Data grouping, which can be done automatically, makes the calculation time with PCXMC on a standard PC acceptable. This capability expands the dose information that can be provided by the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.

  14. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding special records existing in the data base; and (4) ARCHIVE - store or put away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.

  15. Technical data on new engineering products

    NASA Astrophysics Data System (ADS)

    1985-02-01

    New grades of permanently magnetic materials; automatic digital radiolocator; bench winder; analog induction gauge; programmable pulse generator; portable defibrillators; pipe welders; two-component electromagnetic log; sulphur content analyzer; peristaltic pumps; function generators; welding manipulator; and tonsiometer are described.

  16. Production of dioxins and furans for various solid fuels burnt in 25 kW automatic boiler

    NASA Astrophysics Data System (ADS)

    Hopan, František; Horák, Jiří; Krpec, Kamil; Kubesa, Petr; Dej, Milan; Laciok, Vendula

    2016-06-01

    Brown coal, black coal, and maize straw in pellet form were burnt in an automatic boiler. The production of dibenzodioxins and dibenzofurans, recalculated as toxicity equivalents and expressed as emission factors relative to the fuel unit, varied over roughly three orders of magnitude (0.05 up to 78.9 ng/kg) depending on the fuel used. The measured values were compared with the emission factors used for emission inventories in the Czech Republic and Poland and with the emission limit applicable to waste incineration plants. The study demonstrated the influence of the chlorine content of the fuel on the production of dioxins and furans.

  17. 4D CT sorting based on patient internal anatomy

    NASA Astrophysics Data System (ADS)

    Li, Ruijiang; Lewis, John H.; Cerviño, Laura I.; Jiang, Steve B.

    2009-08-01

    Respiratory motion during free-breathing computed tomography (CT) scan may cause significant errors in target definition for tumors in the thorax and upper abdomen. A four-dimensional (4D) CT technique has been widely used for treatment simulation of thoracic and abdominal cancer radiotherapy. The current 4D CT techniques require retrospective sorting of the reconstructed CT slices oversampled at the same couch position. Most sorting methods depend on external surrogates of respiratory motion recorded by extra instruments. However, respiratory signals obtained from these external surrogates may not always accurately represent the internal target motion, especially when irregular breathing patterns occur. We have proposed a new sorting method based on multiple internal anatomical features for multi-slice CT scan acquired in the cine mode. Four features are analyzed in this study, including the air content, lung area, lung density and body area. We use a measure called spatial coherence to select the optimal internal feature at each couch position and to generate the respiratory signals for 4D CT sorting. The proposed method has been evaluated for ten cancer patients (eight with thoracic cancer and two with abdominal cancer). For nine patients, the respiratory signals generated from the combined internal features are well correlated to those from external surrogates recorded by the real-time position management (RPM) system (average correlation: 0.95 ± 0.02), which is better than any individual internal measures at 95% confidence level. For these nine patients, the 4D CT images sorted by the combined internal features are almost identical to those sorted by the RPM signal. For one patient with an irregular breathing pattern, the respiratory signals given by the combined internal features do not correlate well with those from RPM (correlation: 0.68 ± 0.42). In this case, the 4D CT image sorted by our method presents fewer artifacts than that from the RPM signal. Our 4D CT internal sorting method eliminates the need of externally recorded surrogates of respiratory motion. It is an automatic, accurate, robust, cost efficient and yet simple method and therefore can be readily implemented in clinical settings.
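
    A minimal sketch of extracting one of the internal features named above (air content) from cine images to form a respiratory surrogate, using synthetic images and an assumed air threshold; the spatial-coherence measure used to choose among features is not reproduced.

```python
import numpy as np

def air_content(ct_slice, air_hu=-700.0):
    """Internal respiratory feature: number of pixels below an air threshold (HU)."""
    return int(np.sum(ct_slice < air_hu))

# Synthetic cine images at one couch position: the air fraction oscillates with breathing.
rng = np.random.default_rng(7)
trace = []
for k in range(40):
    frac_air = 0.3 + 0.1 * np.sin(2 * np.pi * k / 10.0)            # breathing modulation
    img = np.where(rng.random((64, 64)) < frac_air, -1000.0, 0.0)  # crude lung/tissue image
    trace.append(air_content(img))

trace = np.asarray(trace, dtype=float)
trace = (trace - trace.mean()) / trace.std()     # normalized respiratory surrogate
print(np.round(trace[:10], 2))
```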

  18. SU-E-J-26: A Novel Technique for Markerless Self-Sorted 4D-CBCT Using Patient Motion Modeling: A Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L; Zhang, Y; Harris, W

    2015-06-15

    Purpose: To develop an automatic markerless 4D-CBCT projection sorting technique by using a patient respiratory motion model extracted from the planning 4D-CT images. Methods: Each phase of onboard 4D-CBCT is considered as a deformation of one phase of the prior planning 4D-CT. The deformation field map (DFM) is represented as a linear combination of three major deformation patterns extracted from the planning 4D-CT using principle component analysis (PCA). The coefficients of the PCA deformation patterns are solved by matching the digitally reconstructed radiograph (DRR) of the deformed volume to the onboard projection acquired. The PCA coefficients are solved for each single projection, and are used for phase sorting. Projections at the peaks of the Z direction coefficient are sorted as phase 1 and other projections are assigned into 10 phase bins by dividing phases equally between peaks. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the proposed technique. Three scenarios were simulated, with different tumor motion amplitude (3cm to 2cm), tumor spatial shift (8mm SI), and tumor body motion phase shift (2 phases) from prior to on-board images. Projections were simulated over 180 degree scan-angle for the 4D-XCAT. The percentage of accurately binned projections across the entire dataset was calculated to represent the phase sorting accuracy. Results: With a changed tumor motion amplitude from 3cm to 2cm, markerless phase sorting accuracy was 100%. With a tumor phase shift of 2 phases w.r.t. body motion, the phase sorting accuracy was 100%. With a tumor spatial shift of 8mm in SI direction, phase sorting accuracy was 86.1%. Conclusion: The XCAT phantom simulation results demonstrated that it is feasible to use prior knowledge and motion modeling technique to achieve markerless 4D-CBCT phase sorting. National Institutes of Health Grant No. R01-CA184173 Varian Medical System.
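
    A minimal sketch of the final phase-binning step, assuming the per-projection coefficient of the Z-direction (SI) deformation component has already been solved; the trace below is synthetic, and the DRR-matching optimization itself is not reproduced.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic per-projection coefficient of the SI (Z) deformation component,
# standing in for the values solved by DRR matching for each projection.
n_proj = 680
t = np.arange(n_proj)
z_coeff = np.sin(2 * np.pi * t / 85.0) + 0.05 * np.random.default_rng(4).normal(size=n_proj)

# Projections at the coefficient peaks are phase 1; the remaining projections are
# divided equally into 10 phase bins between consecutive peaks.
peaks, _ = find_peaks(z_coeff, distance=40)
phase = np.zeros(n_proj, dtype=int)
for start, end in zip(peaks[:-1], peaks[1:]):
    idx = np.arange(start, end)
    phase[idx] = (10 * (idx - start) // (end - start)) + 1   # bins 1..10
print("peaks at projections:", peaks[:5], "...")
print("projections per phase bin:", np.bincount(phase)[1:])
```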

  19. Statistical physics in foreign exchange currency and stock markets

    NASA Astrophysics Data System (ADS)

    Ausloos, M.

    2000-09-01

    Problems in economy and finance have attracted the interest of statistical physicists all over the world. Fundamental problems pertain to the existence or not of long-, medium- or/and short-range power-law correlations in various economic systems, to the presence of financial cycles and on economic considerations, including economic policy. A method like the detrended fluctuation analysis is recalled emphasizing its value in sorting out correlation ranges, thereby leading to predictability at short horizon. The ( m, k)-Zipf method is presented for sorting out short-range correlations in the sign and amplitude of the fluctuations. A well-known financial analysis technique, the so-called moving average, is shown to raise questions to physicists about fractional Brownian motion properties. Among spectacular results, the possibility of crash predictions has been demonstrated through the log-periodicity of financial index oscillations.
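
    As a concrete illustration of the correlation-range analysis mentioned above, a minimal detrended fluctuation analysis (DFA) sketch follows; the series is synthetic white noise, for which the expected scaling exponent is about 0.5.

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis: slope of log F(s) versus log s."""
    y = np.cumsum(x - np.mean(x))                # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # Remove a linear trend from each segment, then take the RMS residual.
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

# White noise has a DFA exponent near 0.5 (no long-range correlations).
rng = np.random.default_rng(5)
scales = np.array([16, 32, 64, 128, 256])
print(round(dfa_exponent(rng.normal(size=10000), scales), 2))
```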

  20. Accurately tracking single-cell movement trajectories in microfluidic cell sorting devices.

    PubMed

    Jeong, Jenny; Frohberg, Nicholas J; Zhou, Enlu; Sulchek, Todd; Qiu, Peng

    2018-01-01

    Microfluidics are routinely used to study cellular properties, including the efficient quantification of single-cell biomechanics and label-free cell sorting based on the biomechanical properties, such as elasticity, viscosity, stiffness, and adhesion. Both quantification and sorting applications require optimal design of the microfluidic devices and mathematical modeling of the interactions between cells, fluid, and the channel of the device. As a first step toward building such a mathematical model, we collected video recordings of cells moving through a ridged microfluidic channel designed to compress and redirect cells according to cell biomechanics. We developed an efficient algorithm that automatically and accurately tracked the cell trajectories in the recordings. We tested the algorithm on recordings of cells with different stiffness, and showed the correlation between cell stiffness and the tracked trajectories. Moreover, the tracking algorithm successfully picked up subtle differences of cell motion when passing through consecutive ridges. The algorithm for accurately tracking cell trajectories paves the way for future efforts of modeling the flow, forces, and dynamics of cell properties in microfluidics applications.

  1. Automated processing of shoeprint images based on the Fourier transform for use in forensic science.

    PubMed

    de Chazal, Philip; Flynn, John; Reilly, Richard B

    2005-03-01

    The development of a system for automatically sorting a database of shoeprint images based on the outsole pattern in response to a reference shoeprint image is presented. The database images are sorted so that those from the same pattern group as the reference shoeprint are likely to be at the start of the list. A database of 476 complete shoeprint images belonging to 140 pattern groups was established with each group containing two or more examples. A panel of human observers performed the grouping of the images into pattern categories. Tests of the system using the database showed that the first-ranked database image belongs to the same pattern category as the reference image 65 percent of the time and that a correct match appears within the first 5 percent of the sorted images 87 percent of the time. The system has translational and rotational invariance so that the spatial positioning of the reference shoeprint images does not have to correspond with the spatial positioning of the shoeprint images of the database. The performance of the system for matching partial-prints was also determined.
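
    A minimal sketch of the translation-invariance idea behind such sorting, using the 2-D Fourier magnitude spectrum as a descriptor on synthetic images; rotation invariance and the authors' actual matching pipeline are not reproduced.

```python
import numpy as np

def magnitude_descriptor(img):
    """Translation-invariant descriptor: normalized 2-D Fourier magnitude spectrum."""
    mag = np.abs(np.fft.fft2(img))
    mag[0, 0] = 0.0                      # drop the DC term
    return mag / (np.linalg.norm(mag) + 1e-12)

def rank_database(reference, database):
    """Sort database indices by descriptor correlation with the reference print."""
    ref = magnitude_descriptor(reference)
    scores = [float(np.sum(ref * magnitude_descriptor(img))) for img in database]
    return np.argsort(scores)[::-1]

# Synthetic test: the database contains a shifted copy of the reference pattern.
rng = np.random.default_rng(6)
pattern = rng.random((64, 64))
shifted = np.roll(pattern, (7, -5), axis=(0, 1))   # same outsole, different placement
database = [rng.random((64, 64)), shifted, rng.random((64, 64))]
print(rank_database(pattern, database))   # the shifted copy should be ranked first
```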

  2. Fast algorithms for transforming back and forth between a signed permutation and its equivalent simple permutation.

    PubMed

    Gog, Simon; Bader, Martin

    2008-10-01

    The problem of sorting signed permutations by reversals is a well-studied problem in computational biology. The first polynomial time algorithm was presented by Hannenhalli and Pevzner in 1995. The algorithm was improved several times, and nowadays the most efficient algorithm has a subquadratic running time. Simple permutations played an important role in the development of these algorithms. Although the latest result of Tannier et al. does not require simple permutations, the preliminary version of their algorithm as well as the first polynomial time algorithm of Hannenhalli and Pevzner use the structure of simple permutations. More precisely, the latter algorithms require a precomputation that transforms a permutation into an equivalent simple permutation. To the best of our knowledge, all published algorithms for this transformation have at least a quadratic running time. For further investigations on genome rearrangement problems, the existence of a fast algorithm for the transformation could be crucial. Another important task is the back transformation, i.e. if we have a sorting on the simple permutation, transform it into a sorting on the original permutation. Again, the naive approach results in an algorithm with quadratic running time. In this paper, we present a linear time algorithm for transforming a permutation into an equivalent simple permutation, and an O(n log n) algorithm for the back transformation of the sorting sequence.
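
    For readers unfamiliar with the basic operation, the sketch below shows a single reversal on a signed permutation (reverse a segment and flip its signs) and a short sorting sequence; the linear-time simple-permutation transformation described in the abstract is not reproduced.

```python
def reversal(perm, i, j):
    """Apply a reversal to positions i..j (0-indexed, inclusive) of a signed
    permutation: the segment is reversed and every element in it changes sign."""
    return perm[:i] + [-x for x in reversed(perm[i:j + 1])] + perm[j + 1:]

# Sorting the signed permutation (+3, -1, +2) by reversals:
p = [3, -1, 2]
p = reversal(p, 0, 1)   # -> [1, -3, 2]
p = reversal(p, 1, 2)   # -> [1, -2, 3]
p = reversal(p, 1, 1)   # -> [1, 2, 3]  identity reached after three reversals
print(p)
```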

  3. The USGS ``Did You Feel It?'' Internet-based Macroseismic Intensity Maps: Lessons Learned from a Decade of Online Data Collection (Invited)

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Quitoriano, V. R.; Hopper, M.; Mathias, S.; Dewey, J. W.

    2010-12-01

    Over the past decade, the U.S. Geological Survey’s “Did You Feel It?” (DYFI) system has automatically collected shaking and damage reports from Internet users immediately following earthquakes. This 10-yr stint of citizen-based science preceded the recently in vogue notion of "crowdsourcing" by nearly a decade. DYFI is a rapid and vast source of macroseismic data, providing quantitative and qualitative information about shaking intensities for earthquakes in the US and around the globe. Statistics attest to the abundance and rapid availability of these Internet-based macroseismic data: Over 1.8 million entries have been logged over the decade, and there are 30 events each with over 10,000 responses (230 events have over 1,000 entries). The greatest number of responses to date for an earthquake is over 78,000 for the April 2010, M7.2 Baja California, Mexico, event. Questionnaire response rates have reached 62,000 per hour (1,000 per min!), obviously requiring substantial web resource allocation and capacity. Outside the US, DYFI has gathered over 189,000 entries in 9,500 cities covering 140 countries since its global inception in late 2004. The rapid intensity data are automatically used in the Global ShakeMap (GSM) system, providing intensity constraints near population centers and in places without instrumental coverage (most of the world), and allowing for bias correction to the empirical prediction equations employed. ShakeMap has also been recently refined to automatically use macroseismic input data in their native form, and treat their uncertainties rigorously in concert with ground-motion data. Recent DYFI system improvements include a graphical user interface that allows seismic analysts to perform common functions, including map triggering and resizing, as well as sorting, searching, geocoding, and flagging entries. New web-based geolocation and geocoding services are being incorporated into DYFI for improving the accuracy of the users’ locations. A database containing the entire DYFI archive facilitates research by streamlining the selection, organization and export of data. For example, recent quantitative analyses of uncertainties of DYFI data provide confidence in their use: Averaging ten or more responses at a given location results in uncertainties of less than 0.2 intensity units. Systems comparable or complementary to DYFI now operate in several countries, and collaborative efforts to uniformly collect and exchange data in near real time are being further explored. From our experience with DYFI, essential components of an Internet-based citizen science portal include i) easy-to-use forms, ii) instant feedback so that users may see their contributions (validating their experience), iii) open space for first-person accounts (catharsis) and discussion of effects not covered in the questionnaire, and iv) routinely addressing user comments and questions. In addition, online user-friendly tools now include common searches, statistics, sorting of responses, time-entry histories, comparisons of data with empirical intensity estimates, and an easily-downloadable data format for researchers. A number of these functions were originally recommended by users, again emphasizing the need to attend to user feedback.

  4. Offline Arabic handwriting recognition: a survey.

    PubMed

    Lorigo, Liana M; Govindaraju, Venu

    2006-05-01

    The automatic recognition of text on scanned images has enabled many applications such as searching for words in large volumes of documents, automatic sorting of postal mail, and convenient editing of previously printed documents. The domain of handwriting in the Arabic script presents unique technical challenges and has been addressed more recently than other domains. Many different methods have been proposed and applied to various types of images. This paper provides a comprehensive review of these methods. It is the first survey to focus on Arabic handwriting recognition and the first Arabic character recognition survey to provide recognition rates and descriptions of test data for the approaches discussed. It includes background on the field, discussion of the methods, and future research directions.

  5. The experiences of undergraduate nursing students with bots in Second LifeRTM

    NASA Astrophysics Data System (ADS)

    Rose, Lesele H.

    As technology continues to transform education from the status quo of traditional lecture-style instruction to an interactive, engaging learning experience, students' experiences within the learning environment continue to change as well. This dissertation addressed the need for continuing research in advancing the implementation of technology in higher education. The purpose of this phenomenological study was to discover more about the experiences of undergraduate nursing students using standardized geriatric evaluation tools when interacting with scripted geriatric patient bots in a simulated instructional intake setting. Data was collected through a Demographics questionnaire, an Experiential questionnaire, and a Reflection questionnaire. Triangulation of data collection occurred through an automatically created log of the interactions with the two bots, and by an automatically recorded log of the participants' movements while in the simulated geriatric intake interview. The data analysis consisted of an iterative review of the questionnaires and the participants' logs in an effort to identify common themes, recurring comments, and issues which would benefit from further exploration. Findings revealed that the interactions with the bots were perceived as a valuable experience for the participants from the perspective of interacting with the Geriatric Evaluation Tools in the role of an intake nurse. Further research is indicated to explore instructional interactions with bots in effectively mastering the use of established Geriatric Evaluation Tools.

  6. Target recognition of log-polar ladar range images using moment invariants

    NASA Astrophysics Data System (ADS)

    Xia, Wenze; Han, Shaokun; Cao, Jie; Yu, Haoyong

    2017-01-01

    The ladar range image has received considerable attention in the automatic target recognition field. However, previous research does not cover target recognition using log-polar ladar range images. Therefore, we construct a target recognition system based on log-polar ladar range images in this paper. In this system, combined moment invariants and a backpropagation neural network are selected as the shape descriptor and shape classifier, respectively. In order to fully analyze the effect of the log-polar sampling pattern on the recognition result, several comparative experiments based on simulated and real range images are carried out. Eventually, several important conclusions are drawn: (i) if combined moments are computed directly from log-polar range images, the translation, rotation and scaling invariance properties of the combined moments become invalid; (ii) when the object is located in the center of the field of view, the recognition rate of log-polar range images is less sensitive to changes in the field of view; (iii) as the object position moves from the center to the edge of the field of view, the recognition performance of log-polar range images declines dramatically; (iv) log-polar range images have better noise robustness than Cartesian range images. Finally, we suggest that, in real applications, it is better to divide the field of view into a recognition area and a searching area.
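
    As a rough illustration of the sampling pattern discussed above, the sketch below resamples a Cartesian range image onto a log-polar grid using nearest-neighbour lookup; the grid sizes and the exact pattern used in the paper are assumptions.

    ```python
    # Sketch: resample a Cartesian range image onto a log-polar grid (nearest neighbour),
    # mapping rho = log(r), theta = atan2(dy, dx) around the image centre.
    import numpy as np

    def to_log_polar(image, n_rho=64, n_theta=64):
        h, w = image.shape
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        r_max = np.hypot(cy, cx)
        rho = np.linspace(0.0, np.log(r_max), n_rho)
        theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
        rr, tt = np.meshgrid(np.exp(rho), theta, indexing="ij")    # radius and angle grids
        ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
        xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
        return image[ys, xs]                                       # (n_rho, n_theta) log-polar image
    ```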

  7. Image classification at low light levels

    NASA Astrophysics Data System (ADS)

    Wernick, Miles N.; Morris, G. Michael

    1986-12-01

    An imaging photon-counting detector is used to achieve automatic sorting of two image classes. The classification decision is formed on the basis of the cross correlation between a photon-limited input image and a reference function stored in computer memory. Expressions for the statistical parameters of the low-light-level correlation signal are given and are verified experimentally. To obtain a correlation-based system for two-class sorting, it is necessary to construct a reference function that produces useful information for class discrimination. An expression for such a reference function is derived using maximum-likelihood decision theory. Theoretically predicted results are used to compare the performance of the maximum-likelihood reference function with that of Fukunaga-Koontz basis vectors and average filters. For each method, good class discrimination is found to result in milliseconds from a sparse sampling of the input image.

  8. A novel automated spike sorting algorithm with adaptable feature extraction.

    PubMed

    Bestel, Robert; Daus, Andreas W; Thielemann, Christiane

    2012-10-15

    To study the electrophysiological properties of neuronal networks, in vitro studies based on microelectrode arrays have become a viable tool for analysis. Although in constant progress, a challenging task still remains in this area: the development of an efficient spike sorting algorithm that allows an accurate signal analysis at the single-cell level. Most sorting algorithms currently available only extract a specific feature type, such as the principal components or Wavelet coefficients of the measured spike signals, in order to separate different spike shapes generated by different neurons. However, due to the great variety in the obtained spike shapes, the derivation of an optimal feature set is still a very complex issue that current algorithms struggle with. To address this problem, we propose a novel algorithm that (i) extracts a variety of geometric, Wavelet and principal component-based features and (ii) automatically derives a feature subset most suitable for sorting an individual set of spike signals. The new approach evaluates the probability distribution of the obtained spike features and consequently determines the candidates most suitable for the actual spike sorting. These candidates can be formed into an individually adjusted set of spike features, allowing a separation of the various shapes present in the obtained neuronal signal by a subsequent expectation maximisation clustering algorithm. Test results with simulated data files and data obtained from chick embryonic neurons cultured on microelectrode arrays showed an excellent classification result, indicating the superior performance of the described algorithm approach. Copyright © 2012 Elsevier B.V. All rights reserved.
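
    As a simplified illustration of the overall pipeline (features followed by EM clustering), the sketch below uses plain PCA features and scikit-learn's Gaussian mixture model; the paper's adaptive feature-subset selection step is not reproduced.

    ```python
    # Sketch of a generic spike-sorting pipeline: PCA features followed by a
    # Gaussian-mixture (EM) clustering step. The adaptive feature selection described
    # in the paper is replaced here by plain PCA for brevity.
    from sklearn.decomposition import PCA
    from sklearn.mixture import GaussianMixture

    def sort_spikes(waveforms, n_units=3, n_features=3):
        """waveforms: (n_spikes, n_samples) array of aligned spike snippets.
        Returns an integer cluster label per spike."""
        feats = PCA(n_components=n_features).fit_transform(waveforms)
        gmm = GaussianMixture(n_components=n_units, covariance_type="full", random_state=0)
        return gmm.fit_predict(feats)
    ```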

  9. VLSI Design, Parallel Computation and Distributed Computing

    DTIC Science & Technology

    1991-09-30

    [Extraction residue from this report's cover page and reference list; recoverable content: the project personnel include Daniel Kleitman, Tom Leighton, David Shmoys, Michael Sipser and Eva Tardos, and the report describes work by Leighton and Plaxton on the construction of a simple c log n-depth circuit (where c < 7.5) that sorts a random permutation with very high probability, presented at an August 1992 conference in Vancouver, British Columbia (to appear). A further reference fragment cites I. Gopal, M. Kaplan and S. Kutten, "Distributed Control for...".]

  10. Effects of sediment-associated extractable metals, degree of sediment grain sorting, and dissolved organic carbon upon Cryptosporidium parvum removal and transport within riverbank filtration sediments, Sonoma County, California.

    PubMed

    Metge, David W; Harvey, Ronald W; Aiken, George R; Anders, Robert; Lincoln, George; Jasperse, Jay; Hill, Mary C

    2011-07-01

    Oocysts of the protozoan pathogen Cryptosporidium parvum are of particular concern for riverbank filtration (RBF) operations because of their persistence, ubiquity, and resistance to chlorine disinfection. At the Russian River RBF site (Sonoma County, CA), transport of C. parvum oocysts and oocyst-sized (3 μm) carboxylate-modified microspheres through poorly sorted (sorting indices, σ(1), up to 3.0) and geochemically heterogeneous sediments collected between 2 and 25 m below land surface (bls) were assessed. Removal was highly sensitive to variations in both the quantity of extractable metals (mainly Fe and Al) and degree of grain sorting. In flow-through columns, there was a log-linear relationship (r(2) = 0.82 at p < 0.002) between collision efficiency (α, the probability that colloidal collisions with grain surfaces would result in attachment) and extractable metals, and a linear relationship (r(2) = 0.99 at p < 0.002) between α and σ(1). Collectively, variability in extractable metals and grain sorting accounted for ∼83% of the variability in α (at p < 0.0002) along the depth profiles. Amendments of 2.2 mg L(-1) of Russian River dissolved organic carbon (DOC) reduced α for oocysts by 4-5 fold. The highly reactive hydrophobic organic acid (HPOA) fraction was particularly effective in re-entraining sediment-attached microspheres. However, the transport-enhancing effects of the riverine DOC did not appear to penetrate very deeply into the underlying sediments, judging from high α values (∼1.0) observed for oocysts being advected through unamended sediments collected at ∼2 m bls. This study suggests that in evaluating the efficacy of RBF operations to remove oocysts, it may be necessary to consider not only the geochemical nature and size distribution of the sediment grains, but also the degrees of sediment sorting and the concentration, reactivity, and penetration of the source water DOC.

  11. Automatic recognition of light source from color negative films using sorting classification techniques

    NASA Astrophysics Data System (ADS)

    Sanger, Demas S.; Haneishi, Hideaki; Miyake, Yoichi

    1995-08-01

    This paper proposed a simple and automatic method for recognizing the light sources from various color negative film brands by means of digital image processing. First, we stretched the image obtained from a negative based on the standardized scaling factors, then extracted the dominant color component among the red, green, and blue components of the stretched image. The dominant color component became the discriminator for the recognition. The experimental results verified that any one of the three techniques could recognize the light source from negatives of any film brand and of all brands, with greater than 93.2% and 96.6% correct recognition, respectively. This method is significant for the automation of color quality control in color reproduction from color negative film in mass processing and printing machines.
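
    A minimal sketch of the dominant-colour-component idea is given below; the per-channel contrast stretch and decision rule shown here are simplifications, and the standardized scaling factors used in the paper are not reproduced.

    ```python
    # Sketch of the dominant-colour-component idea: stretch each channel of the scanned
    # negative image, then report which of R, G, B dominates on average.
    import numpy as np

    def dominant_channel(rgb):
        """rgb: (H, W, 3) array. Returns 'R', 'G' or 'B'."""
        stretched = np.empty(rgb.shape, dtype=float)
        for c in range(3):
            ch = rgb[..., c].astype(float)
            lo, hi = ch.min(), ch.max()
            stretched[..., c] = (ch - lo) / (hi - lo + 1e-12)   # per-channel contrast stretch
        means = stretched.reshape(-1, 3).mean(axis=0)
        return "RGB"[int(np.argmax(means))]
    ```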

  12. Array tomography: characterizing FAC-sorted populations of zebrafish immune cells by their 3D ultrastructure.

    PubMed

    Wacker, Irene; Chockley, Peter; Bartels, Carolin; Spomer, Waldemar; Hofmann, Andreas; Gengenbach, Ulrich; Singh, Sachin; Thaler, Marlene; Grabher, Clemens; Schröder, Rasmus R

    2015-08-01

    For 3D reconstructions of whole immune cells from zebrafish, isolated from adult animals by FAC-sorting we employed array tomography on hundreds of serial sections deposited on silicon wafers. Image stacks were either recorded manually or automatically with the newly released ZEISS Atlas 5 Array Tomography platform on a Zeiss FEGSEM. To characterize different populations of immune cells, organelle inventories were created by segmenting individual cells. In addition, arrays were used for quantification of cell populations with respect to the various cell types they contained. The detection of immunological synapses in cocultures of cell populations from thymus or WKM with cancer cells helped to identify the cytotoxic nature of these cells. Our results demonstrate the practicality and benefit of AT for high-throughput ultrastructural imaging of substantial volumes. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  13. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    NASA Astrophysics Data System (ADS)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation as well as the sub-system the fault was attributed to. Manually identifying faults using maintenance logs can be effective, but is also highly time consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine’s sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. This is then checked against maintenance logs for accuracy and the labelled data are fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
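
    A very small illustration of one way such stoppages might be flagged from 10-minute SCADA records is sketched below; the column names, thresholds and the alarm-based categorisation are assumptions, not the paper's actual rules.

    ```python
    # Sketch of one way to flag candidate stoppages from 10-minute SCADA data:
    # periods where wind speed exceeds cut-in but power output is (near) zero.
    # Column names and thresholds here are assumptions; the alarm-based
    # categorisation and classifier training steps are omitted.
    import pandas as pd

    def flag_stoppages(scada: pd.DataFrame, cut_in=3.5, power_eps=1.0):
        """scada: DataFrame indexed by timestamp with 'wind_speed' (m/s) and 'power' (kW).
        Returns a boolean Series marking candidate stoppage intervals."""
        return (scada["wind_speed"] > cut_in) & (scada["power"] < power_eps)
    ```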

  14. Lithology and base of the surficial aquifer system, Palm Beach County, Florida

    USGS Publications Warehouse

    Miller, Wesley L.

    1987-01-01

    The surficial aquifer system is a major source of freshwater in Palm Beach County. In 1982, public supply withdrawals from the aquifer system totaled 33,543 million gallons, 77.5% of total public supply withdrawals. To evaluate the aquifer system and its geologic framework, a cooperative study with Palm Beach County was begun in 1982 by the U.S. Geological Survey. The surficial aquifer system in Palm Beach County is composed primarily of sand, sandstone, shell, silt, calcareous clay (marl), and limestone deposited during the Pleistocene and Pliocene epochs. In the western two-thirds of Palm Beach County, sediments in the aquifer system are poorly consolidated sand, shell, and sandy limestone. Owing to interspersed calcareous clays and silt and very poorly sorted materials, permeabilities in this zone of the aquifer system are relatively low. Two other zones of the aquifer system are found in the eastern one-third of the county where the sediments are appreciably more permeable than in the west due to better sorting and less silt and clay content. The location of more detailed lithologic logs for wells in these sections, along with data from nearby wells, allowed enhanced interpretation and depiction of the lithology which had previously been generalized. The most permeable zone of the aquifer system in this area is characterized by highly developed secondary porosity where infiltrating rainwater and solution by groundwater have removed calcitic-cementing materials from the sediments to produce interconnected cavities. Increased permeability in the aquifer system is generally coincident with the eastern boundary of the overlying organic soils and Lake Flirt Marl. Lithologic logs of wells in Palm Beach County indicate that sediments forming the aquifer system were deposited directly on the erosional surface of the Hawthorn Formation in some areas. In other locations in the county, lithologic logs indicate that the base of the aquifer system was formed by fluvial deposits containing erosional materials from the Tamiami and Hawthorn Formations and Caloosahatchee Marl. (Lantz-PTT)

  15. Development, installation, and testing services for an automatic, point type thermal sensor, fire protection system on a mining dozer. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lease, W.D.

    1976-08-01

    Lease AFEX, Inc., modified its standard design of an automatic fire protection system used in the past on logging equipment, and conducted a long-term, in-mine test of the system on a Fiat-Allis HD-41B dozer at the Lemmons and Company coal mine, Boonville, Ind. The modification of the standard AFEX system involved improving the actuation device. The AFEX system is called a point-type thermal sensor, automatic fire protection system. The in-mine test took place in late 1975 and early 1976. The system was then tested by simulating a fire on the dozer. The system operated successfully after the 4 months of in-mine endurance testing. (Color illustrations reproduced in black and white.)

  16. High-throughput determination of octanol/water partition coefficients using a shake-flask method and novel two-phase solvent system.

    PubMed

    Morikawa, Go; Suzuka, Chihiro; Shoji, Atsushi; Shibusawa, Yoichi; Yanagida, Akio

    2016-01-05

    A high-throughput method for determining the octanol/water partition coefficient (P(o/w)) of a large variety of compounds exhibiting a wide range in hydrophobicity was established. The method combines a simple shake-flask method with a novel two-phase solvent system comprising an acetonitrile-phosphate buffer (0.1 M, pH 7.4)-1-octanol (25:25:4, v/v/v; AN system). The AN system partition coefficients (K(AN)) of 51 standard compounds for which log P(o/w) (at pH 7.4; log D) values had been reported were determined by single two-phase partitioning in test tubes, followed by measurement of the solute concentration in both phases using an automatic flow injection-ultraviolet detection system. The log K(AN) values were closely related to reported log D values, and the relationship could be expressed by the following linear regression equation: log D = 2.8630 log K(AN) - 0.1497 (n = 51). The relationship reveals that log D values (+8 to -8) for a large variety of highly hydrophobic and/or hydrophilic compounds can be estimated indirectly from the narrow range of log K(AN) values (+3 to -3) determined using the present method. Furthermore, log K(AN) values for highly polar compounds for which no log D values have been reported, such as amino acids, peptides, proteins, nucleosides, and nucleotides, can be estimated using the present method. The wide-ranging log D values (+5.9 to -7.5) of these molecules were estimated for the first time from their log K(AN) values and the above regression equation. Copyright © 2015 Elsevier B.V. All rights reserved.
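
    Applying the reported regression to a measured AN-system partition coefficient is straightforward; the value used below is purely illustrative.

    ```python
    # Applying the reported regression log D = 2.8630 * log K_AN - 0.1497 to a
    # measured AN-system partition coefficient (illustrative value only).
    import math

    def estimate_log_d(k_an):
        return 2.8630 * math.log10(k_an) - 0.1497

    print(round(estimate_log_d(10.0), 3))   # K_AN = 10 -> log K_AN = 1 -> log D = 2.713
    ```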

  17. Traumatic asphyxia--fatal accident in an automatic revolving door.

    PubMed

    Cortis, J; Falk, J; Rothschild, M A

    2015-09-01

    Due to continuing modernisation, the number of automatic doors in routine use, including powered revolving doors, has increased in recent years. Automatic revolving doors are found mostly in department stores, airports, railway stations and hospitals. Although safety arrangements and guidelines concerning the installation of automatic doors are in existence, their disregard in conjunction with obsolete or incorrect installation can lead to fatal accidents. In this report, a 19-month-old boy is described whose right arm was caught between the elements of an automatic revolving door. As a direct result of rescue attempts, the child's body was drawn further into the narrow gap between elements of the door. To get the boy's body out of the 4-cm-wide gap between the fixed outer wall of the revolving door and the revolving inner, back-up batteries had to be disconnected so as to stop the electrical motor powering the door. Cardiopulmonary resuscitation was begun immediately after the rescue but was unsuccessful; the child was declared dead at the hospital he was taken to. The cause of death was a combination of compression-related skull and brain injury together with thoracic compression. This case shows an outstanding example of the preventive aspect as a special task of forensic medicine. Additionally, it serves as a warning for the correct installation and use of automatic revolving doors. Even so, small children should not use these doors on their own, but only with an alert companion, so as to prevent further fatal accidents of this sort.

  18. High frequency mesozooplankton monitoring: Can imaging systems and automated sample analysis help us describe and interpret changes in zooplankton community composition and size structure — An example from a coastal site

    NASA Astrophysics Data System (ADS)

    Romagnan, Jean Baptiste; Aldamman, Lama; Gasparini, Stéphane; Nival, Paul; Aubert, Anaïs; Jamet, Jean Louis; Stemmann, Lars

    2016-10-01

    The present work aims to show that high throughput imaging systems can be useful to estimate mesozooplankton community size and taxonomic descriptors that can be the basis for consistent large-scale monitoring of plankton communities. Such monitoring is required by the European Marine Strategy Framework Directive (MSFD) in order to ensure the Good Environmental Status (GES) of European coastal and offshore marine ecosystems. Time- and cost-effective automatic techniques are of high interest in this context. An imaging-based protocol has been applied to a high frequency time series (every second day on average between April 2003 and April 2004) of zooplankton obtained in a coastal site of the NW Mediterranean Sea, Villefranche Bay. One hundred and eighty-four mesozooplankton net-collected samples were analysed with a Zooscan and an associated semi-automatic classification technique. The constitution of a learning set designed to maximize copepod identification with more than 10,000 objects enabled the automatic sorting of copepods with an accuracy of 91% (true positives) and a contamination of 14% (false positives). Twenty-seven samples were then chosen from the total copepod time series for detailed visual sorting of copepods after automatic identification. This method enabled the description of the dynamics of two well-known copepod species, Centropages typicus and Temora stylifera, and 7 other taxonomically broader copepod groups, in terms of size, biovolume and abundance-size distributions (size spectra). Also, total copepod size spectra underwent significant changes during the sampling period. These changes could be partially related to changes in the copepod assemblage taxonomic composition and size distributions. This study shows that the use of high throughput imaging systems is of great interest to extract relevant coarse (i.e. total abundance, size structure) and detailed (i.e. selected species dynamics) descriptors of zooplankton dynamics. Innovative zooplankton analyses are therefore proposed and open the way for further development of zooplankton community indicators of change.

  19. In-Line Sorting of Harumanis Mango Based on External Quality Using Visible Imaging

    PubMed Central

    Ibrahim, Mohd Firdaus; Ahmad Sa’ad, Fathinul Syahir; Zakaria, Ammar; Md Shakaff, Ali Yeon

    2016-01-01

    The conventional method of grading Harumanis mango is time-consuming, costly and affected by human bias. In this research, an in-line system was developed to classify Harumanis mango using computer vision. The system was able to identify the irregularity of mango shape and its estimated mass. A group of images of mangoes of different size and shape was used as the database set. Some important features such as length, height, centroid and perimeter were extracted from each image. A Fourier descriptor and size-shape parameters were used to describe the mango shape, while the disk method was used to estimate the mass of the mango. Four features were selected by stepwise discriminant analysis, which was effective in sorting regular and misshapen mangoes. The volume from the water displacement method was compared with the volume estimated by image processing using a paired t-test and the Bland-Altman method. The result between both measurements was not significantly different (P > 0.05). The average correct classification for shape was 98% for a training set composed of 180 mangoes. The data was validated with another testing set consisting of 140 mangoes, which had a success rate of 92%. The same set was used for evaluating the performance of mass estimation. The average success rate of the classification for grading based on mass was 94%. The results indicate that the in-line sorting system using machine vision has great potential for automatic fruit sorting according to shape and mass. PMID:27801799
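
    The disk method mentioned above can be sketched as summing the volumes of thin disks taken row by row from a binary side-view silhouette; the sketch below assumes approximate rotational symmetry and uses illustrative calibration constants.

    ```python
    # Sketch of the disk method for volume estimation from a binary side-view silhouette:
    # each pixel row is treated as a disk whose diameter is the row's silhouette width.
    # Assumes approximate rotational symmetry about the vertical axis; the calibration
    # constants (mm per pixel, density) are illustrative assumptions.
    import numpy as np

    def disk_volume(mask, mm_per_px=0.5):
        """mask: (H, W) boolean silhouette. Returns estimated volume in cubic millimetres."""
        widths = mask.sum(axis=1) * mm_per_px          # silhouette width of each row, in mm
        radii = widths / 2.0
        slice_thickness = mm_per_px
        return float(np.pi * np.sum(radii ** 2) * slice_thickness)

    def estimated_mass(mask, density_g_per_cm3=1.0, mm_per_px=0.5):
        return disk_volume(mask, mm_per_px) / 1000.0 * density_g_per_cm3   # grams
    ```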

  20. In-Line Sorting of Harumanis Mango Based on External Quality Using Visible Imaging.

    PubMed

    Ibrahim, Mohd Firdaus; Ahmad Sa'ad, Fathinul Syahir; Zakaria, Ammar; Md Shakaff, Ali Yeon

    2016-10-27

    The conventional method of grading Harumanis mango is time-consuming, costly and affected by human bias. In this research, an in-line system was developed to classify Harumanis mango using computer vision. The system was able to identify the irregularity of mango shape and its estimated mass. A group of images of mangoes of different size and shape was used as the database set. Some important features such as length, height, centroid and perimeter were extracted from each image. A Fourier descriptor and size-shape parameters were used to describe the mango shape, while the disk method was used to estimate the mass of the mango. Four features were selected by stepwise discriminant analysis, which was effective in sorting regular and misshapen mangoes. The volume from the water displacement method was compared with the volume estimated by image processing using a paired t-test and the Bland-Altman method. The result between both measurements was not significantly different (P > 0.05). The average correct classification for shape was 98% for a training set composed of 180 mangoes. The data was validated with another testing set consisting of 140 mangoes, which had a success rate of 92%. The same set was used for evaluating the performance of mass estimation. The average success rate of the classification for grading based on mass was 94%. The results indicate that the in-line sorting system using machine vision has great potential for automatic fruit sorting according to shape and mass.

  1. Real World Experience With Ion Implant Fault Detection at Freescale Semiconductor

    NASA Astrophysics Data System (ADS)

    Sing, David C.; Breeden, Terry; Fakhreddine, Hassan; Gladwin, Steven; Locke, Jason; McHugh, Jim; Rendon, Michael

    2006-11-01

    The Freescale automatic fault detection and classification (FDC) system has logged data from over 3.5 million implants in the past two years. The Freescale FDC system is a low cost system which collects summary implant statistics at the conclusion of each implant run. The data is collected by either downloading implant data log files from the implant tool workstation, or by exporting summary implant statistics through the tool's automation interface. Compared to the traditional FDC systems which gather trace data from sensors on the tool as the implant proceeds, the Freescale FDC system cannot prevent scrap when a fault initially occurs, since the data is collected after the implant concludes. However, the system can prevent catastrophic scrap events due to faults which are not detected for days or weeks, leading to the loss of hundreds or thousands of wafers. At the Freescale ATMC facility, the practical applications of the FD system fall into two categories: PM trigger rules which monitor tool signals such as ion gauges and charge control signals, and scrap prevention rules which are designed to detect specific failure modes that have been correlated to yield loss and scrap. PM trigger rules are designed to detect shifts in tool signals which indicate normal aging of tool systems. For example, charging parameters gradually shift as flood gun assemblies age, and when charge control rules start to fail a flood gun PM is performed. Scrap prevention rules are deployed to detect events such as particle bursts and excessive beam noise, events which have been correlated to yield loss. The FDC system does have tool log-down capability, and scrap prevention rules often use this capability to automatically log the tool into a maintenance state while simultaneously paging the sustaining technician for data review and disposition of the affected product.

  2. Feasibility of Epidemiologic Research on Nonauditory Health Effects of Residential Aircraft Noise Exposure. Volume 3. Summary of Literature on Cardiovascular Effects on Noise Exposure

    DTIC Science & Technology

    1989-12-01

    [Extraction residue from Table 3-23, "Summary of Epidemiologic Studies (continued)", of this report: fragmented entries describing occupational noise-exposure studies (authors including Cigala, F.; Ricco, M.; de Vries, F. F.) in which epinephrine, dopamine and cortisol were measured, exposure was assessed with calibrated personal dosimeters, blood pressure was taken with semi-automatic devices, and subjects with a history of hypertension were excluded; the table layout itself is not recoverable.]

  3. Controlling suspended samplers by programmable calculator and interface circuitry

    Treesearch

    Rand E. Eads; Mark R. Boolootian

    1985-01-01

    A programmable calculator connected to an interface circuit can control automatic samplers and record streamflow data. The circuit converts a voltage representing water stage to a digital signal. The sampling program logs streamflow data when there is a predefined deviation from a linear trend in the water elevation. The calculator estimates suspended sediment...

  4. Controlling suspended sediment samplers by programmable calculator and interface circuitry

    Treesearch

    Rand E. Eads; Mark R. Boolootian

    1985-01-01

    A programmable calculator connected to an interface circuit can control automatic samplers and record streamflow data. The circuit converts a voltage representing water stage to a digital signal. The sampling program logs streamflow data when there is a predefined deviation from a linear trend in the water elevation. The calculator estimates suspended sediment...

  5. Investigating Student Choices in Performing Higher-Level Comprehension Tasks Using TED

    ERIC Educational Resources Information Center

    Bianchi, Francesca; Marenzi, Ivana

    2016-01-01

    The current paper describes a first experiment in the use of TED talks and open tagging exercises to train higher-level comprehension skills, and of automatic logging of students' actions to investigate their choices while performing analytical tasks. The experiment took advantage of an interactive learning platform--LearnWeb--that…

  6. iLOG: A Framework for Automatic Annotation of Learning Objects with Empirical Usage Metadata

    ERIC Educational Resources Information Center

    Miller, L. D.; Soh, Leen-Kiat; Samal, Ashok; Nugent, Gwen

    2012-01-01

    Learning objects (LOs) are digital or non-digital entities used for learning, education or training commonly stored in repositories searchable by their associated metadata. Unfortunately, based on the current standards, such metadata is often missing or incorrectly entered making search difficult or impossible. In this paper, we investigate…

  7. CAMUS: Automatically Mapping Cyber Assets to Mission and Users (PREPRINT)

    DTIC Science & Technology

    2009-10-01

    which machines regularly use a particular mail server. Armed with these basic data sources – LDAP, NetFlow traffic and user logs – fuselets were created... NetFlow traffic used in the demonstration has over ten thousand unique IP Addresses and is over one gigabyte in size. A number of high performance

  8. Heterogeneous Multi-Robot Multi-Sensor Platform for Intruder Detection

    DTIC Science & Technology

    2009-09-15

    [Report fragments:] ...propagation model, with variance τ_i: s_i ~ N(b0_i + b1_i · log D_i, τ_i). The initial parameters (b0_i, b1_i, τ_i) of the model are unknown, and the training... [results indicate] that the advantage of the MOO-learned mode would become more significant over time compared with the other mode. [Figure axis residue removed.] Reference fragment: "...nondominated sorting genetic algorithm for multi-objective optimization: NSGA-II," in Parallel Problem Solving from Nature (PPSN VI), M. Schoenauer

  9. Microfluidic, marker-free isolation of circulating tumor cells from blood samples

    PubMed Central

    Karabacak, Nezihi Murat; Spuhler, Philipp S; Fachin, Fabio; Lim, Eugene J; Pai, Vincent; Ozkumur, Emre; Martel, Joseph M; Kojic, Nikola; Smith, Kyle; Chen, Pin-i; Yang, Jennifer; Hwang, Henry; Morgan, Bailey; Trautwein, Julie; Barber, Thomas A; Stott, Shannon L; Maheswaran, Shyamala; Kapur, Ravi; Haber, Daniel A; Toner, Mehmet

    2014-01-01

    The ability to isolate and analyze rare circulating tumor cells (CTCs) has the potential to further our understanding of cancer metastasis and enhance the care of cancer patients. In this protocol, we describe the procedure for isolating rare CTCs from blood samples by using tumor antigen–independent microfluidic CTC-iChip technology. The CTC-iChip uses deterministic lateral displacement, inertial focusing and magnetophoresis to sort up to 107 cells/s. By using two-stage magnetophoresis and depletion antibodies against leukocytes, we achieve 3.8-log depletion of white blood cells and a 97% yield of rare cells with a sample processing rate of 8 ml of whole blood/h. The CTC-iChip is compatible with standard cytopathological and RNA-based characterization methods. This protocol describes device production, assembly, blood sample preparation, system setup and the CTC isolation process. Sorting 8 ml of blood sample requires 2 h including setup time, and chip production requires 2–5 d. PMID:24577360

  10. Integration of QR codes into an anesthesia information management system for resident case log management.

    PubMed

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

    Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction amongst residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
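
    Purely as an illustration of the mechanism, not of the system's actual data format, a small case-summary payload can be encoded with the Python qrcode package; the field names below are assumptions.

    ```python
    # Illustrative only: encoding a small case-summary payload as a QR code with the
    # Python "qrcode" package. The fields and format here are assumptions, not the
    # actual syllabus data structure used by the anesthesia information system.
    import json
    import qrcode

    case = {"case_id": "12345", "procedure": "appendectomy", "asa": 2, "airway": "LMA"}
    img = qrcode.make(json.dumps(case))     # returns a PIL image of the QR code
    img.save("case_12345_qr.png")
    ```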

  11. EARS : Repositioning data management near data acquisition.

    NASA Astrophysics Data System (ADS)

    Sinquin, Jean-Marc; Sorribas, Jordi; Diviacco, Paolo; Vandenberghe, Thomas; Munoz, Raquel; Garcia, Oscar

    2016-04-01

    The EU FP7 Projects Eurofleets and Eurofleets2 are a European-wide alliance of marine research centers that aim to share their research vessels, to improve information sharing on planned, current and completed cruises, on details of ocean-going research vessels and specialized equipment, and to durably improve the cost-effectiveness of cruises. Within this context, logging information on how, when and where anything happens on board the vessel is crucial for data users at a later stage. This forms a primordial step in the process of data quality control, as it could assist in the understanding of anomalies and unexpected trends recorded in the acquired data sets. In this way completeness of the metadata is improved, as it is recorded accurately at the origin of the measurement. The collection of this crucial information has been done in very different ways, using different procedures, formats and pieces of software in the context of the European Research Fleet. At the time the Eurofleets project started, every institution and country had adopted different strategies and approaches, which complicated the task of users who need to log general-purpose information and events on board whenever they access a different platform, losing the opportunity to produce this valuable metadata on board. Among the many goals the Eurofleets project has, a very important task is the development of an "event log software" called EARS (Eurofleets Automatic Reporting System) that enables scientists and operators to record what happens during a survey. EARS will allow users to fill, in a standardized way, the gap existing at the moment in metadata description that only very seldom links data with its history. Events generated automatically by acquisition instruments will also be handled, enhancing the granularity and precision of the event annotation. The adoption of a common procedure to log survey events and a common terminology to describe them is crucial to providing a friendly and successful on-board metadata creation procedure for the whole European Fleet. The possibility of automatically reporting metadata and general-purpose data will simplify the work of scientists and data managers with regard to data transmission. Improved accuracy and completeness of metadata are expected when events are recorded at acquisition time. This will also enhance multiple usages of the data, as it allows verification of the different requirements existing in different disciplines.

  12. Applications of color machine vision in the agricultural and food industries

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Ludas, Laszlo I.; Morgan, Mark T.; Krutz, Gary W.; Precetti, Cyrille J.

    1999-01-01

    Color is an important factor in the agricultural and food industries. Agricultural or prepared food products are often graded by producers and consumers using color parameters. Color is used to estimate maturity and sort produce for defects, but also to perform genetic screening or make an aesthetic judgement. The task of sorting produce following a color scale is very complex and requires special illumination and training. Also, this task cannot be performed for long durations without fatigue and loss of accuracy. This paper describes a machine vision system designed to perform color classification in real time. Applications for sorting a variety of agricultural products are included, e.g. seeds, meat, baked goods, plants and wood. First, the theory of color classification of agricultural and biological materials is introduced. Then, some tools for classifier development are presented. Finally, the implementation of the algorithm on real-time image processing hardware and example applications for industry are described. The paper also presents an image analysis algorithm and a prototype machine vision system developed for industry. This system automatically locates the surface of some plants using a digital camera and predicts information such as size, potential value and type of the plant. The algorithm developed is feasible for real-time identification in an industrial environment.
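
    As a toy illustration of real-time colour classification, the sketch below assigns a pixel to the nearest of a few assumed HSV class prototypes; the industrial systems described above use calibrated illumination and more elaborate classifiers.

    ```python
    # Sketch of a simple colour classifier: assign a sample to the nearest class-mean
    # colour in HSV space. The class prototypes below are illustrative assumptions, and
    # hue circularity is ignored for brevity.
    import colorsys

    CLASS_MEANS_HSV = {            # illustrative prototypes (hue, sat, value in [0, 1])
        "ripe":   (0.08, 0.85, 0.80),
        "unripe": (0.30, 0.70, 0.60),
        "defect": (0.05, 0.20, 0.25),
    }

    def classify_rgb(r, g, b):
        """r, g, b in [0, 255]. Returns the label of the nearest HSV class mean."""
        hsv = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        def dist(proto):
            return sum((x - y) ** 2 for x, y in zip(hsv, proto))
        return min(CLASS_MEANS_HSV, key=lambda label: dist(CLASS_MEANS_HSV[label]))

    print(classify_rgb(210, 120, 40))   # a warm orange pixel -> matches the "ripe" prototype
    ```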

  13. Automatic peak selection by a Benjamini-Hochberg-based algorithm.

    PubMed

    Abbas, Ahmed; Kong, Xin-Bing; Liu, Zhi; Jing, Bing-Yi; Gao, Xin

    2013-01-01

    A common issue in bioinformatics is that computational methods often generate a large number of predictions sorted according to certain confidence scores. A key problem is then determining how many predictions must be selected to include most of the true predictions while maintaining reasonably high precision. In nuclear magnetic resonance (NMR)-based protein structure determination, for instance, computational peak picking methods are becoming more and more common, although expert-knowledge remains the method of choice to determine how many peaks among thousands of candidate peaks should be taken into consideration to capture the true peaks. Here, we propose a Benjamini-Hochberg (B-H)-based approach that automatically selects the number of peaks. We formulate the peak selection problem as a multiple testing problem. Given a candidate peak list sorted by either volumes or intensities, we first convert the peaks into p-values and then apply the B-H-based algorithm to automatically select the number of peaks. The proposed approach is tested on the state-of-the-art peak picking methods, including WaVPeak [1] and PICKY [2]. Compared with the traditional fixed number-based approach, our approach returns significantly more true peaks. For instance, by combining WaVPeak or PICKY with the proposed method, the missing peak rates are on average reduced by 20% and 26%, respectively, in a benchmark set of 32 spectra extracted from eight proteins. The consensus of the B-H-selected peaks from both WaVPeak and PICKY achieves 88% recall and 83% precision, which significantly outperforms each individual method and the consensus method without using the B-H algorithm. The proposed method can be used as a standard procedure for any peak picking method and straightforwardly applied to some other prediction selection problems in bioinformatics. The source code, documentation and example data of the proposed method is available at http://sfb.kaust.edu.sa/pages/software.aspx.
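
    The core Benjamini-Hochberg step is compact enough to sketch directly; the fragment below assumes the peak scores have already been converted to p-values, which is the part of the pipeline not shown here.

    ```python
    # Minimal Benjamini-Hochberg step: given p-values (assumed already derived from peak
    # volumes or intensities), return how many of the smallest p-values to accept at
    # false-discovery rate q. The peak-to-p-value conversion is not shown.
    import numpy as np

    def bh_select(p_values, q=0.05):
        p = np.sort(np.asarray(p_values, dtype=float))
        m = len(p)
        thresholds = q * (np.arange(1, m + 1) / m)                # BH critical values i*q/m
        passing = np.nonzero(p <= thresholds)[0]
        return 0 if passing.size == 0 else int(passing[-1]) + 1   # number of peaks to keep

    print(bh_select([0.001, 0.008, 0.020, 0.040, 0.30, 0.70], q=0.05))   # -> 3
    ```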

  14. Automatic Peak Selection by a Benjamini-Hochberg-Based Algorithm

    PubMed Central

    Abbas, Ahmed; Kong, Xin-Bing; Liu, Zhi; Jing, Bing-Yi; Gao, Xin

    2013-01-01

    A common issue in bioinformatics is that computational methods often generate a large number of predictions sorted according to certain confidence scores. A key problem is then determining how many predictions must be selected to include most of the true predictions while maintaining reasonably high precision. In nuclear magnetic resonance (NMR)-based protein structure determination, for instance, computational peak picking methods are becoming more and more common, although expert-knowledge remains the method of choice to determine how many peaks among thousands of candidate peaks should be taken into consideration to capture the true peaks. Here, we propose a Benjamini-Hochberg (B-H)-based approach that automatically selects the number of peaks. We formulate the peak selection problem as a multiple testing problem. Given a candidate peak list sorted by either volumes or intensities, we first convert the peaks into p-values and then apply the B-H-based algorithm to automatically select the number of peaks. The proposed approach is tested on the state-of-the-art peak picking methods, including WaVPeak [1] and PICKY [2]. Compared with the traditional fixed number-based approach, our approach returns significantly more true peaks. For instance, by combining WaVPeak or PICKY with the proposed method, the missing peak rates are on average reduced by 20% and 26%, respectively, in a benchmark set of 32 spectra extracted from eight proteins. The consensus of the B-H-selected peaks from both WaVPeak and PICKY achieves 88% recall and 83% precision, which significantly outperforms each individual method and the consensus method without using the B-H algorithm. The proposed method can be used as a standard procedure for any peak picking method and straightforwardly applied to some other prediction selection problems in bioinformatics. The source code, documentation and example data of the proposed method is available at http://sfb.kaust.edu.sa/pages/software.aspx. PMID:23308147

  15. What Do Context Aware Electronic Alerts from Virtual Learning Environments Tell Us about User Time & Location?

    ERIC Educational Resources Information Center

    Crane, Laura; Benachour, Phillip

    2013-01-01

    The paper describes the analysis of user location and time stamp information automatically logged when students receive and interact with electronic updates from the University's virtual learning environment. The electronic updates are sent to students' mobile devices using RSS feeds. The mobile reception of such information can be received in…

  16. Remote Sensing Analysis of Forest Disturbances

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P. (Inventor)

    2015-01-01

    The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.

  17. Remote sensing analysis of forest disturbances

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P. (Inventor)

    2012-01-01

    The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.

  18. Query Classification and Study of University Students' Search Trends

    ERIC Educational Resources Information Center

    Maabreh, Majdi A.; Al-Kabi, Mohammed N.; Alsmadi, Izzat M.

    2012-01-01

    Purpose: This study is an attempt to develop an automatic identification method for Arabic web queries and divide them into several query types using data mining. In addition, it seeks to evaluate the impact of the academic environment on using the internet. Design/methodology/approach: The web log files were collected from one of the higher…

  19. Measuring circuit

    DOEpatents

    Sun, Shan C.; Chaprnka, Anthony G.

    1977-01-11

    An automatic gain control circuit functions to adjust the magnitude of an input signal supplied to a measuring circuit to a level within the dynamic range of the measuring circuit while a log-ratio circuit adjusts the magnitude of the output signal from the measuring circuit to the level of the input signal and optimizes the signal-to-noise ratio performance of the measuring circuit.

  20. Sensor-Free or Sensor-Full: A Comparison of Data Modalities in Multi-Channel Affect Detection

    ERIC Educational Resources Information Center

    Paquette, Luc; Rowe, Jonathan; Baker, Ryan; Mott, Bradford; Lester, James; DeFalco, Jeanine; Brawner, Keith; Sottilare, Robert; Georgoulas, Vasiliki

    2016-01-01

    Computational models that automatically detect learners' affective states are powerful tools for investigating the interplay of affect and learning. Over the past decade, affect detectors--which recognize learners' affective states at run-time using behavior logs and sensor data--have advanced substantially across a range of K-12 and postsecondary…

  1. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.
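
    A much-simplified sketch of hash-table-based checking of PLC log data is given below: transitions seen in nominal logs are stored in a set, and unseen transitions in new logs are flagged. PLAT's nominal model and indexing scheme are considerably richer than this.

    ```python
    # Sketch of hash-table-based anomaly flagging over PLC log records. This is a toy
    # stand-in for PLAT's nominal model, not the tool's actual scheme.
    def build_nominal_model(nominal_log):
        """nominal_log: iterable of (signal_name, value) tuples in time order."""
        seen = set()
        prev = {}
        for signal, value in nominal_log:
            if signal in prev:
                seen.add((signal, prev[signal], value))   # hash-indexed signal transition
            prev[signal] = value
        return seen

    def find_anomalies(model, new_log):
        """Return (index, signal, old_value, new_value) for transitions never seen nominally."""
        prev, anomalies = {}, []
        for i, (signal, value) in enumerate(new_log):
            if signal in prev and (signal, prev[signal], value) not in model:
                anomalies.append((i, signal, prev[signal], value))
            prev[signal] = value
        return anomalies
    ```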

  2. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882

  3. Addition of tracers into the polypropylene in view of automatic sorting of plastic wastes using X-ray fluorescence spectrometry.

    PubMed

    Bezati, F; Froelich, D; Massardier, V; Maris, E

    2010-04-01

    This study focused on the detection of rare earth oxides, used as tracers for the identification of polymer materials, using XRF (X-ray fluorescence) spectrometry. The tests were carried out in a test system device which allows the collection of static measurements of the samples' spectrum through the use of energy dispersive X-ray fluorescence technology. A sorting process based on tracers added into the polymer matrix is proposed in order to increase the sorting selectivity of polypropylene during end-of-life recycling. Tracers consist of systems formed by one or by several substances dispersed into a material, to add a selective property to it, with the aim of improving the efficiency of sorting and high speed identification. Several samples containing rare earth oxides (Y(2)O(3), CeO(2), Nd(2)O(3), Gd(2)O(3), Dy(2)O(3), Er(2)O(3) and Yb(2)O(3)) in different concentrations were prepared in order to analyse some of the parameters which can influence the detection, such as the concentration of tracers, the acquisition time and the possible overlapping among the tracers. This work shows that by using the XRF test system device, it was possible to detect 5 of the 7 tracers tested for a 1 min exposure time and at a concentration level of 1000 ppm. These two parameters will play an important role in the development of an industrial device, which indicates that further work needs to be conducted in order to reduce them. Copyright 2009 Elsevier Ltd. All rights reserved.

  4. GreenNet: A Global Ground-Based Network of Instruments Measuring Greenhouse Gases in the Atmosphere

    NASA Astrophysics Data System (ADS)

    Floyd, M.; Grunberg, M.; Wilson, E. L.

    2017-12-01

    Climate change is the most important crisis of our lifetime. For policy makers to take action to combat the effects of climate change, they will need definitive proof that it is occurring globally. We have developed a low-cost ground instrument - a portable miniaturized laser heterodyne radiometer (mini-LHR) - capable of measuring concentrations of two of the most potent anthropogenic greenhouse gases, CO2 and methane, in columns in the atmosphere. The instruments work by combining sunlight that has undergone absorption by gases with light from a laser. This combined light is detected by a photoreceiver and a radio frequency beat signal is produced. From this beat signal, concentrations of these gases throughout the atmospheric column can be determined. A network of mini-LHR instruments in locations around the world will give us the data necessary to significantly reduce uncertainty in greenhouse gas sinks and sources contributing to climate change. Each instrument takes one reading per minute while the sun is up. With a goal to establish up to 500 instrument sites, the estimated total data per day will likely exceed 1GB. Every piece of data must be sorted as it comes in to determine whether it is a good or bad reading. The goal of the citizen science project is to collaborate with citizen scientists enrolled with Zooniverse.org to cycle through our data and help sort it, while also learning about the mini-LHR, greenhouse gases and climate change. This data will be used to construct an algorithm to automatically sort data that relies on statistical analyses of the previously sorted data.

  5. Computations on the massively parallel processor at the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Strong, James P.

    1991-01-01

    Described are four significant algorithms implemented on the massively parallel processor (MPP) at the Goddard Space Flight Center. Two are in the area of image analysis. Of the other two, one is a mathematical simulation experiment and the other deals with the efficient transfer of data between distantly separated processors in the MPP array. The first algorithm presented is the automatic determination of elevations from stereo pairs. The second algorithm solves mathematical logistic equations capable of producing both ordered and chaotic (or random) solutions. This work can potentially lead to the simulation of artificial life processes. The third algorithm is the automatic segmentation of images into reasonable regions based on some similarity criterion, while the fourth is an implementation of a bitonic sort of data which significantly overcomes the nearest neighbor interconnection constraints on the MPP for transferring data between distant processors.
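
    As an illustration of the sorting step mentioned above, the following is a plain sequential sketch of bitonic sort; on the MPP the compare-exchange operations at each stage would run in parallel across processors, which is what makes the network attractive despite the nearest-neighbor interconnection constraints.

```python
# Sequential bitonic sort sketch (input length must be a power of two).
def bitonic_sort(a, ascending=True):
    n = len(a)
    if n <= 1:
        return a
    first = bitonic_sort(a[:n // 2], True)        # build a bitonic sequence:
    second = bitonic_sort(a[n // 2:], False)      # ascending half followed by descending half
    return _bitonic_merge(first + second, ascending)

def _bitonic_merge(a, ascending):
    n = len(a)
    if n <= 1:
        return a
    half = n // 2
    for i in range(half):                          # compare-exchange with stride n/2
        if (a[i] > a[i + half]) == ascending:
            a[i], a[i + half] = a[i + half], a[i]
    return _bitonic_merge(a[:half], ascending) + _bitonic_merge(a[half:], ascending)

print(bitonic_sort([7, 3, 6, 1, 8, 2, 5, 4]))  # [1, 2, 3, 4, 5, 6, 7, 8]
```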

  6. Terminal area automatic navigation, guidance and control research using the Microwave Landing System (MLS). Part 5: Design and development of a Digital Integrated Automatic Landing System (DIALS) for steep final approach using modern control techniques

    NASA Technical Reports Server (NTRS)

    Halyo, N.

    1983-01-01

    The design and development of a 3-D Digital Integrated Automatic Landing System (DIALS) for the Terminal Configured Vehicle (TCV) Research Aircraft, a B-737-100, is described. The system was designed using sampled-data Linear Quadratic Gaussian (LQG) methods, resulting in a direct digital design with a modern control structure consisting of a Kalman filter followed by a control gain matrix, all operating at 10 Hz. DIALS uses Microwave Landing System (MLS) position, body-mounted accelerometers, and on-board sensors usually available on commercial aircraft, but does not use inertial platforms. The phases of the final approach considered are the localizer and glideslope capture, which may be performed simultaneously; localizer and steep glideslope track or hold; crab/decrab; and flare to touchdown. DIALS captures, tracks and flares from steep glideslopes ranging from 2.5 deg to 5.5 deg, selected prior to glideslope capture. DIALS is the first modern-control-design automatic landing system to be successfully flight tested. The results of an initial nonlinear simulation are presented here.

  7. Usage of air jigging for multi-component separation of construction and demolition waste.

    PubMed

    Ambrós, Weslei Monteiro; Sampaio, Carlos Hoffmann; Cazacliu, Bogdan Grigore; Miltzarek, Gerson Luis; Miranda, Leonardo R

    2017-02-01

    The use of air jigging for performing multi-component separation in the treatment of mixed construction and demolition waste was studied. Sorting tests were carried out with mixtures of equal bulk volume of concrete and brick to which fixed quantities of unwanted materials - gypsum, wood and paper - were added. Experimental results demonstrated the possibility of using air jigging to carry out both the removal of low-density contaminants and the concentration of concrete in a single process step. With respect to the removal of contaminants alone, the overall performance of the jigging process is comparable with that of commercial air classifiers and automatic sorting systems. Also, the initial content of contaminants does not appear to have a significant effect on the extent of separation. These results are of particular importance for recycling plant operations, as they represent an alternative for optimizing the use of air jigs. Further investigation is needed to evaluate the practical feasibility of the method. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, B; Ahmed, M; Siebers, J

    2016-06-15

    Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of a MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images and log files were analyzed for the treatment in question, the prior day's treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  9. Environmental corrections of a dual-induction logging while drilling tool in vertical wells

    NASA Astrophysics Data System (ADS)

    Kang, Zhengming; Ke, Shizhen; Jiang, Ming; Yin, Chengfang; Li, Anzong; Li, Junjian

    2018-04-01

    With the development of Logging While Drilling (LWD) technology, dual-induction LWD logging is widely applied not only in deviated and horizontal wells but also in vertical wells. Accordingly, it is necessary to simulate the response of LWD tools in vertical wells for logging interpretation. In this paper, the investigation characteristics, the effects of the tool structure, the skin effect and the drilling environment of a dual-induction LWD tool are simulated by the three-dimensional (3D) finite element method (FEM). In order to closely approximate the actual situation, the real structure of the tool is taken into account. The results demonstrate that the background contribution of the tool structure can be eliminated; after deducting this background, the simulated values agree quantitatively with the analytical solution in homogeneous formations. The effect of measurement frequency can be effectively eliminated with a skin-effect correction chart. In addition, the measurement environment - borehole size, mud resistivity, shoulder beds, layer thickness and invasion - affects the measured resistivity. To eliminate these effects, borehole correction charts, shoulder-bed correction charts and tornado charts are computed based on the real tool structure. Using these correction charts, well logging data can be corrected automatically by a suitable interpolation method, which is convenient and fast. Verified against actual logging data in vertical wells, the method can recover the true resistivity of the formation.
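
    A minimal sketch of the chart-based correction step described above, assuming a multiplicative borehole-correction chart tabulated over borehole diameter and mud resistivity; the grid values are placeholders, not the paper's computed charts.

```python
# Sketch: correct an apparent resistivity reading by bilinear interpolation of a
# pre-computed borehole-correction chart (placeholder values, illustrative units).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

borehole_diam = np.array([6.0, 8.0, 10.0, 12.0])      # inches (chart axis 1)
mud_resistivity = np.array([0.05, 0.1, 0.5, 1.0])     # ohm-m  (chart axis 2)
correction = np.array([                                # multiplicative correction factor
    [1.02, 1.01, 1.00, 1.00],
    [1.05, 1.03, 1.01, 1.00],
    [1.10, 1.06, 1.02, 1.01],
    [1.16, 1.10, 1.04, 1.02],
])
chart = RegularGridInterpolator((borehole_diam, mud_resistivity), correction)

def correct_resistivity(apparent_rt, diam_in, rm_ohmm):
    """Apply the borehole-correction chart to one apparent resistivity reading."""
    factor = chart([[diam_in, rm_ohmm]])[0]
    return apparent_rt * factor

print(correct_resistivity(20.0, 9.0, 0.2))
```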

  10. Fish connectivity mapping intermediate data files and outputs

    EPA Pesticide Factsheets

    RLWrankedLists.tar.gz: These lists linked to various chemical treatment conditions serve as the target collection of Cmap. Probes of the entire microarray are sorted based on their log fold changes over control conditions. RLWsignatures2015.tar.gz: These signatures linked to various chemical treatment conditions serve as queries in Cmap. This dataset is associated with the following publication: Wang, R., A. Biales, N. Garcia-Reyero, E. Perkins, D. Villeneuve, G. Ankley, and D. Bencic. Fish Connectivity Mapping: Linking Chemical Stressors by Their MOA-Driven Transcriptomic Profiles. BMC Genomics. BioMed Central Ltd, London, UK, 17(84): 1-20, (2016).
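
    A minimal sketch of how a ranked probe list of this kind could be produced, assuming a probes-by-samples expression matrix; the column names and the +1 pseudocount are illustrative assumptions, not the dataset's actual schema or processing.

```python
# Sketch: rank microarray probes by their log2 fold change of treatment over control.
import numpy as np
import pandas as pd

def ranked_probe_list(expr: pd.DataFrame, treated_cols, control_cols):
    """expr: probes x samples expression matrix (already normalised)."""
    log_fc = np.log2(expr[treated_cols].mean(axis=1) + 1) - \
             np.log2(expr[control_cols].mean(axis=1) + 1)
    return log_fc.sort_values(ascending=False)   # most up-regulated probes first

# Hypothetical example:
expr = pd.DataFrame({"t1": [10, 200, 5], "t2": [12, 180, 6],
                     "c1": [11, 20, 50], "c2": [9, 25, 40]},
                    index=["probe_a", "probe_b", "probe_c"])
print(ranked_probe_list(expr, ["t1", "t2"], ["c1", "c2"]))
```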

  11. Neural network hardware and software solutions for sorting of waste plastics for recycling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanton, S.L.; Alam, M.K.; Hebner, G.A.

    1992-12-31

    While plastic recycling efforts have expanded during the past several years, the cost of recovering plastics is still a major impediment for recyclers. Several factors contribute to the prohibitive cost of recycled resins, including the present low marketability of products made with mixed recycled materials, and costs of collecting, sorting and reprocessing plastic materials. A method for automatic sorting of post-consumer plastics into pure polymer streams is needed to overcome the inaccuracies and low product throughput of the currently used method of hand sorting of waste plastics for recycling. The Society of Plastics has designated seven categories as recyclable: Polyethylene terephthalate (PET); High Density Polyethylene (HDPE); Polyvinyl Chloride (PVC); Low Density Polyethylene (LDPE); Polypropylene (PP); Polystyrene (PS); and Other (mixtures, layered items, etc.). With these categories in mind, a system for sorting of waste plastics using near-infrared reflectance spectra and a backpropagation neural network classifier has been developed. A solution has been demonstrated in the laboratory using a high resolution, and relatively slow instrument. A faster instrument is being developed at this time. Neural network hardware options have been evaluated for use in a real-time industrial system. In the lab, a Fourier transform Near Infrared (FT-NIR) scanning spectrometer was used to gather reflectance data from various locations on samples of actual waste plastics. Neural networks were trained off-line with this data using the NeuralWorks Professional II Plus software package on a SparcStation 2. One of the successfully trained networks was used to compare the neural accelerator hardware options available. The results of running this "worst case" network on the neural network hardware will be presented. The AT&T ANNA chip and the Intel 80170NX chip development system were used to determine the ease of implementation and accuracies for this network.

  12. Neural network hardware and software solutions for sorting of waste plastics for recycling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanton, S.L.; Alam, M.K.; Hebner, G.A.

    1992-01-01

    While plastic recycling efforts have expanded during the past several years, the cost of recovering plastics is still a major impediment for recyclers. Several factors contribute to the prohibitive cost of recycled resins, including the present low marketability of products made with mixed recycled materials, and costs of collecting, sorting and reprocessing plastic materials. A method for automatic sorting of post-consumer plastics into pure polymer streams is needed to overcome the inaccuracies and low product throughput of the currently used method of hand sorting of waste plastics for recycling. The Society of Plastics has designated seven categories as recyclable: Polyethylene terephthalate (PET); High Density Polyethylene (HDPE); Polyvinyl Chloride (PVC); Low Density Polyethylene (LDPE); Polypropylene (PP); Polystyrene (PS); and Other (mixtures, layered items, etc.). With these categories in mind, a system for sorting of waste plastics using near-infrared reflectance spectra and a backpropagation neural network classifier has been developed. A solution has been demonstrated in the laboratory using a high resolution, and relatively slow instrument. A faster instrument is being developed at this time. Neural network hardware options have been evaluated for use in a real-time industrial system. In the lab, a Fourier transform Near Infrared (FT-NIR) scanning spectrometer was used to gather reflectance data from various locations on samples of actual waste plastics. Neural networks were trained off-line with this data using the NeuralWorks Professional II Plus software package on a SparcStation 2. One of the successfully trained networks was used to compare the neural accelerator hardware options available. The results of running this "worst case" network on the neural network hardware will be presented. The AT&T ANNA chip and the Intel 80170NX chip development system were used to determine the ease of implementation and accuracies for this network.

  13. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., a GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and periodically try to reconnect a broken TCP pipe; a typical use is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a mySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.

  14. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    NASA Astrophysics Data System (ADS)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products getting into the output process stream. Therefore, continuous sampling control of product completeness, based on the use of statistical methods, remains an important element in managing the quality of assembly of multi-element mass products on automatic rotor lines. A feature of continuous sampling control of multi-element product completeness in the assembly process is its breaking (destructive) sorting, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of multi-element product completeness when assembling on automatic rotor lines requires sampling plans that ensure a minimum size of control samples. Comparison of the values of the limit of the average output defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that lower limit values for the average output defect level can be provided using ACSP-1. Also, the average sample size when using the ACSP-1 plan is less than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, involving the use of the proposed plans and methods for continuous selective control, makes it possible to automate sampling control procedures and ensure the required level of quality of assembled products while minimizing sample size.

  15. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, X; Li, S; Zheng, D

    Purpose: Linac commissioning is a time consuming and labor intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as 'one-click' using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the spreadsheet data and to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes when using the manual approach. The automation avoided the necessity of redundant Linac status checks between fields as in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data logging, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for a 6MV photon beam with a 60 degree wedge. Conclusion: Automated output factor measurements can save time by 40% when compared with the conventional manual approach. This work laid the groundwork for further automation of Linac commissioning.
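
    A minimal sketch of the analysis step, assuming the electrometer log is a cumulative-charge trace sampled every 0.5 s with pauses between fields; the actual Matlab program and log format are not described in the abstract, so the threshold and segmentation logic below are assumptions.

```python
# Sketch: split a logged cumulative-charge trace into per-field readings and
# normalise each field's charge to a reference field to obtain output factors.
import numpy as np

def per_field_charge(cumulative, beam_on_threshold=1e-3):
    """Return the integrated charge of each field found in a cumulative-charge trace."""
    increments = np.diff(cumulative)
    beam_on = increments > beam_on_threshold          # charge accumulating => beam on
    charges, current = [], 0.0
    for on, dq in zip(beam_on, increments):
        if on:
            current += dq
        elif current > 0:                             # beam just switched off: close the field
            charges.append(current)
            current = 0.0
    if current > 0:
        charges.append(current)
    return np.array(charges)

def output_factors(charges, reference_index):
    """Normalise each field's charge to the reference field (e.g. the 10x10 cm2 field)."""
    return charges / charges[reference_index]
```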

  16. Design of an optimum computer vision-based automatic abalone (Haliotis discus hannai) grading algorithm.

    PubMed

    Lee, Donggil; Lee, Kyounghoon; Kim, Seonghun; Yang, Yongsu

    2015-04-01

    An automatic abalone grading algorithm that estimates abalone weights on the basis of computer vision using 2D images is developed and tested. The algorithm overcomes the problems experienced by conventional abalone grading methods that utilize manual sorting and mechanical automatic grading. To design an optimal algorithm, a regression formula and R(2) value were investigated by performing a regression analysis of each of total length, body width, thickness, view area, and actual volume against abalone weight. The R(2) value between the actual volume and abalone weight was 0.999, showing a very high correlation. As a result, to easily estimate the actual volumes of abalones from computer vision, the volumes were calculated under the assumption that abalone shapes are half-oblate ellipsoids, and a regression formula was derived to estimate the volumes of abalones through linear regression analysis between the calculated and actual volumes. The final automatic abalone grading algorithm is designed using the abalone volume estimation regression formula derived from the test results and the regression formula between actual volumes and abalone weights. For abalones weighing from 16.51 to 128.01 g, cross-validation of the algorithm's performance indicates root mean square and worst-case prediction errors of 2.8 g and ±8 g, respectively. © 2015 Institute of Food Technologists®
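
    A minimal sketch of the two-stage estimate described above (half-oblate ellipsoid volume from the measured dimensions, then a linear regression from volume to weight); the regression coefficients below are placeholders, not the fitted values from the paper.

```python
# Sketch: estimate abalone weight from 2D measurements via a half-oblate ellipsoid volume.
import math

def half_oblate_ellipsoid_volume(total_length, body_width, thickness):
    """Volume of a half oblate ellipsoid whose axes are the three measured dimensions (cm)."""
    a, b, c = total_length / 2.0, body_width / 2.0, thickness   # semi-axes
    return 0.5 * (4.0 / 3.0) * math.pi * a * b * c              # half of a full ellipsoid

def estimate_weight(volume, slope=1.05, intercept=0.0):
    """Linear regression from estimated volume (cm^3) to weight (g); coefficients assumed."""
    return slope * volume + intercept

vol = half_oblate_ellipsoid_volume(total_length=8.0, body_width=5.5, thickness=2.0)
print(round(estimate_weight(vol), 1))
```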

  17. A Procedure for Extending Input Selection Algorithms to Low Quality Data in Modelling Problems with Application to the Automatic Grading of Uploaded Assignments

    PubMed Central

    Otero, José; Palacios, Ana; Suárez, Rosario; Junco, Luis

    2014-01-01

    When selecting relevant inputs in modelling problems with low quality data, the ranking of the most informative inputs is also uncertain. In this paper, this issue is addressed through a new procedure that allows different crisp feature selection algorithms to be extended to vague data. The partial knowledge about the ordinal of each feature is modelled by means of a possibility distribution, and a ranking is then applied to sort these distributions. It will be shown that this technique makes the most use of the available information in some vague datasets. The approach is demonstrated in a real-world application. In the context of massive online computer science courses, methods are sought for automatically providing the student with a grade through code metrics. Feature selection methods are used to find the metrics involved in the most meaningful predictions. In this study, 800 source code files, collected and revised by the authors in classroom Computer Science lectures taught between 2013 and 2014, are analyzed with the proposed technique, and the most relevant metrics for the automatic grading task are discussed. PMID:25114967

  18. Automatic Tools for Enhancing the Collaborative Experience in Large Projects

    NASA Astrophysics Data System (ADS)

    Bourilkov, D.; Rodriquez, J. L.

    2014-06-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of the data analysis gains in importance. A key feature of collaboration in large scale projects is keeping a log of what is being done and how - for private use, reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Even better if the log is automatically created on the fly while the scientist or software developer is working in a habitual way, without the need for extra efforts. This saves time and enables a team to do more with the same resources. The CODESH - COllaborative DEvelopment SHell - and CAVES - Collaborative Analysis Versioning Environment System projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach for any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for analysis of Petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.

  19. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D; Li, X; Li, H

    2014-06-15

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method to use 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report by including all required information for MR-IGRT and physics weekly review - the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g. MLC positions) were independently verified and deemed accurate and trustable. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection, and expedite physician daily IGRT review and physicist weekly chart review.
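
    A minimal sketch of the fluence-comparison idea, assuming per-segment MLC apertures weighted by beam-on time on a coarse 1D grid per leaf pair; the actual fluence model and error metrics of the commissioned program are not given in the abstract.

```python
# Sketch: accumulate a composite primary fluence map from MLC apertures weighted by
# beam-on time, for plan and delivery alike, then take statistics on the difference.
import numpy as np

def aperture_mask(left_pos, right_pos, grid_x):
    """One leaf pair: 1 where the field is open between the left and right leaf tips."""
    return ((grid_x >= left_pos) & (grid_x <= right_pos)).astype(float)

def composite_fluence(segments, grid_x):
    """segments: list of (beam_on_weight, left_positions, right_positions) per segment."""
    fluence = None
    for weight, lefts, rights in segments:
        rows = np.vstack([aperture_mask(l, r, grid_x) for l, r in zip(lefts, rights)])
        fluence = weight * rows if fluence is None else fluence + weight * rows
    return fluence

def error_stats(planned, delivered):
    diff = delivered - planned
    return {"max_abs": float(np.max(np.abs(diff))),
            "rms": float(np.sqrt(np.mean(diff ** 2)))}

grid_x = np.linspace(-10, 10, 201)                     # cm along the leaf-travel direction
planned = composite_fluence([(1.0, [-5, -4], [5, 4])], grid_x)
delivered = composite_fluence([(1.0, [-5, -4], [5.1, 4])], grid_x)
print(error_stats(planned, delivered))
```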

  20. Automatic Author Profiling of Online Chat Logs

    DTIC Science & Technology

    2007-03-01

    (Abstract text not available; only table-of-contents fragments were extracted, listing experiments on binary age classification of chat-log authors with a prior - all test data and extracted test data contrasting teens with users in their 20s, 30s, 40s, and 50s.)

  1. Characterization of gas hydrate reservoirs by integration of core and log data in the Ulleung Basin, East Sea

    USGS Publications Warehouse

    Bahk, J.-J.; Kim, G.-Y.; Chun, J.-H.; Kim, J.-H.; Lee, J.Y.; Ryu, B.-J.; Lee, J.-H.; Son, B.-K.; Collett, Timothy S.

    2013-01-01

    Examinations of core and well-log data from the Second Ulleung Basin Gas Hydrate Drilling Expedition (UBGH2) drill sites suggest that Sites UBGH2-2_2 and UBGH2-6 have relatively good gas hydrate reservoir quality in terms of individual and total cumulative thicknesses of gas-hydrate-bearing sand (HYBS) beds. In both of the sites, core sediments are generally dominated by hemipelagic muds which are intercalated with turbidite sands. The turbidite sands are usually thin-to-medium bedded and mainly consist of well sorted coarse silt to fine sand. Anomalies in infrared core temperatures and porewater chlorinity data and pressure core measurements indicate that “gas hydrate occurrence zones” (GHOZ) are present at about 68–155 mbsf at Site UBGH2-2_2 and 110–155 mbsf at Site UBGH2-6. In both GHOZ, gas hydrates are preferentially associated with many of the turbidite sands as “pore-filling” type hydrates. The HYBS identified in the cores from Site UBGH2-6 are medium-to-thick bedded, particularly in the lower part of the GHOZ, and coincide well with significant high excursions in all of the resistivity, density, and velocity logs. Gas-hydrate saturations in the HYBS range from 12% to 79% with an average of 52% based on pore-water chlorinity. In contrast, the HYBS from Site UBGH2-2_2 are usually thin-bedded and show poor correlations with both the resistivity and velocity logs owing to volume averaging effects of the logging tools on the thin HYBS beds. Gas-hydrate saturations in the HYBS range from 15% to 65% with an average of 37% based on pore-water chlorinity. In both of the sites, large fluctuations in biogenic opal contents have significant effects on the sediment physical properties, resulting in limited usage of gamma ray and density logs in discriminating sand reservoirs.

  2. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions as well as perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing-routine was compared against a standard record and verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4-arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001mm and 0.066±0.002mm for leaf bank A and B respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course including MLC leaf positions and table positions at time of image acquisition and during treatment.
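
    A minimal sketch of the comparison stage only, assuming the binary trajectory log has already been parsed into arrays of expected and actual leaf positions per control point; the record fields and units are assumptions rather than the log's actual binary layout.

```python
# Sketch: sort per-field log records chronologically and compute the maximum
# RMS leaf-position error between expected and actual MLC positions.
import numpy as np

def sort_logs_by_timestamp(log_records):
    """Order per-field log records chronologically before matching to the DICOM-RT plan."""
    return sorted(log_records, key=lambda rec: rec["timestamp"])

def max_rms_leaf_error(control_points):
    """control_points: iterable of (expected, actual) leaf-position arrays (mm), one pair
    per control point. Returns the maximum over leaves of the RMS error over control points."""
    expected = np.vstack([cp[0] for cp in control_points])   # shape: (n_points, n_leaves)
    actual = np.vstack([cp[1] for cp in control_points])
    rms_per_leaf = np.sqrt(np.mean((actual - expected) ** 2, axis=0))
    return float(rms_per_leaf.max())
```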

  3. Expert systems for automated correlation and interpretation of wireline logs

    USGS Publications Warehouse

    Olea, R.A.

    1994-01-01

    CORRELATOR is an interactive computer program for lithostratigraphic correlation of wireline logs that is able to store correlations in a database with a consistency, accuracy, speed, and resolution that are difficult to obtain manually. The automatic determination of correlations is based on the maximization of a weighted correlation coefficient using two wireline logs per well. CORRELATOR has an expert system to scan and flag incongruous correlations in the database. The user has the option to accept or disregard the advice offered by the system. The expert system represents knowledge through production rules. The inference system is goal-driven and uses backward chaining to scan through the rules. Work in progress is used to illustrate the potential that a second expert system with a similar architecture for interpreting dip diagrams could have to identify episodes, such as those of interest in sequence stratigraphy and fault detection, and annotate them in the stratigraphic column. Several examples illustrate the presentation. © 1994 International Association for Mathematical Geology.
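
    A minimal sketch of maximizing a weighted correlation over candidate depth shifts; the statistic below (a weighted average of the Pearson correlations of the two logs) is an illustration only, not necessarily CORRELATOR's exact weighted correlation coefficient.

```python
# Sketch: slide a window over a target well's two logs and pick the depth shift that
# maximises a weighted combination of the two logs' correlations with a reference interval.
import numpy as np

def weighted_correlation(seg_a, seg_b, weights=(0.5, 0.5)):
    """seg_a, seg_b: (n_samples, 2) arrays holding two wireline logs over the same interval."""
    r1 = np.corrcoef(seg_a[:, 0], seg_b[:, 0])[0, 1]
    r2 = np.corrcoef(seg_a[:, 1], seg_b[:, 1])[0, 1]
    return weights[0] * r1 + weights[1] * r2

def best_shift(ref_interval, target_logs, window, max_shift):
    """ref_interval: (window, 2) reference segment; target_logs: (n, 2) logs of the other well.
    Returns the shift (in samples) with the highest weighted correlation, plus all scores."""
    scores = {s: weighted_correlation(ref_interval, target_logs[s:s + window])
              for s in range(max_shift + 1)}
    return max(scores, key=scores.get), scores
```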

  4. Framework for Automation of Hazard Log Management on Large Critical Projects

    NASA Astrophysics Data System (ADS)

    Vinerbi, Lorenzo; Babu, Arun P.

    2016-08-01

    A hazard log is a database of all risk management activities in a project. Maintaining its correctness and consistency on large safety/mission critical projects involving multiple vendors, suppliers, and partners is critical and challenging. IBM DOORS is one of the popular tools used for hazard management in space applications. However, not all stakeholders are familiar with it, and it is not always feasible to expect all stakeholders to provide correct and consistent hazard data. The current work describes a process and tools to simplify hazard data collection on large projects, and demonstrates how the data collected from all stakeholders are merged to form the hazard log while ensuring data consistency and correctness. The data provided by all parties are collected using a template containing scripts. The scripts check for mistakes based on the internal standards of the company in charge of hazard management. The collected data are then merged in DOORS, which also contains scripts to check and import data to form the hazard log. The proposed tool has been applied to a mission critical project and has been found to save time and reduce the number of mistakes made while creating the hazard log. The use of automatic checks paves the way for correct tracking of risk and hazard analysis activities on large critical projects.

  5. Study on Impact Acoustic—Visual Sensor-Based Sorting of ELV Plastic Materials

    PubMed Central

    Huang, Jiu; Tian, Chuyuan; Ren, Jingwei; Bian, Zhengfu

    2017-01-01

    This paper concentrates on a study of a novel multi-sensor aided method that uses acoustic and visual sensors for detection, recognition and separation of End-of-Life vehicles’ (ELVs) plastic materials, in order to optimize the recycling rate of automotive shredder residues (ASRs). Sensor-based sorting technologies have been utilized for material recycling for the last two decades. One of the remaining problems stems from black and dark dyed plastics, which are very difficult to recognize using visual sensors. In this paper a new multi-sensor technology for black plastic recognition and sorting by using impact resonant acoustic emissions (AEs) and laser triangulation scanning was introduced. A pilot sorting system consisting of a 3-dimensional visual sensor and an acoustic sensor was also established; two kinds of commonly used vehicle plastics, polypropylene (PP) and acrylonitrile-butadiene-styrene (ABS), and two kinds of modified vehicle plastics, polypropylene/ethylene-propylene-diene-monomer (PP-EPDM) and acrylonitrile-butadiene-styrene/polycarbonate (ABS-PC), were tested. In this study the geometrical features of the tested plastic scraps were measured by the visual sensor, and their corresponding impact acoustic emission (AE) signals were acquired by the acoustic sensor. The signal processing and feature extraction of visual data as well as acoustic signals were realized by virtual instruments. Impact acoustic features were recognized by using FFT based power spectral density analysis. The results show that the characteristics of the tested PP and ABS plastics were totally different, but similar to their respective modified materials. The scrap material recognition rate, i.e., the theoretical sorting efficiency, could reach about 50% between PP and PP-EPDM and about 75% between ABS and ABS-PC for scrap diameters ranging from 14 mm to 23 mm; with abnormal impacts excluded, the actual separation rates were 39.2% for PP and 41.4% for PP/EPDM scraps, as well as 62.4% for ABS and 70.8% for ABS/PC scraps. Within the diameter range of 8-13 mm, only 25% of PP and 27% of PP/EPDM scraps, as well as 43% of ABS and 47% of ABS/PC scraps, were finally separated. This research proposes a new approach for sensor-aided automatic recognition and sorting of black plastic materials; it is an effective method for ASR reduction and recycling. PMID:28594341

  6. Study on Impact Acoustic-Visual Sensor-Based Sorting of ELV Plastic Materials.

    PubMed

    Huang, Jiu; Tian, Chuyuan; Ren, Jingwei; Bian, Zhengfu

    2017-06-08

    This paper concentrates on a study of a novel multi-sensor aided method that uses acoustic and visual sensors for detection, recognition and separation of End-of-Life vehicles' (ELVs) plastic materials, in order to optimize the recycling rate of automotive shredder residues (ASRs). Sensor-based sorting technologies have been utilized for material recycling for the last two decades. One of the remaining problems stems from black and dark dyed plastics, which are very difficult to recognize using visual sensors. In this paper a new multi-sensor technology for black plastic recognition and sorting by using impact resonant acoustic emissions (AEs) and laser triangulation scanning was introduced. A pilot sorting system consisting of a 3-dimensional visual sensor and an acoustic sensor was also established; two kinds of commonly used vehicle plastics, polypropylene (PP) and acrylonitrile-butadiene-styrene (ABS), and two kinds of modified vehicle plastics, polypropylene/ethylene-propylene-diene-monomer (PP-EPDM) and acrylonitrile-butadiene-styrene/polycarbonate (ABS-PC), were tested. In this study the geometrical features of the tested plastic scraps were measured by the visual sensor, and their corresponding impact acoustic emission (AE) signals were acquired by the acoustic sensor. The signal processing and feature extraction of visual data as well as acoustic signals were realized by virtual instruments. Impact acoustic features were recognized by using FFT based power spectral density analysis. The results show that the characteristics of the tested PP and ABS plastics were totally different, but similar to their respective modified materials. The scrap material recognition rate, i.e., the theoretical sorting efficiency, could reach about 50% between PP and PP-EPDM and about 75% between ABS and ABS-PC for scrap diameters ranging from 14 mm to 23 mm; with abnormal impacts excluded, the actual separation rates were 39.2% for PP and 41.4% for PP/EPDM scraps, as well as 62.4% for ABS and 70.8% for ABS/PC scraps. Within the diameter range of 8-13 mm, only 25% of PP and 27% of PP/EPDM scraps, as well as 43% of ABS and 47% of ABS/PC scraps, were finally separated. This research proposes a new approach for sensor-aided automatic recognition and sorting of black plastic materials; it is an effective method for ASR reduction and recycling.
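
    A minimal sketch of the FFT-based power-spectral-density feature step used to recognize impact acoustic emissions; the sampling rate, band edges, and the random stand-in signal are assumptions rather than the study's configuration.

```python
# Sketch: Welch power-spectral-density band features for one impact acoustic emission.
import numpy as np
from scipy.signal import welch

def psd_band_features(impact_signal, fs, bands=((0, 2e3), (2e3, 6e3), (6e3, 12e3))):
    """Return the mean PSD in a few frequency bands; feed the vector to any classifier."""
    freqs, psd = welch(impact_signal, fs=fs, nperseg=1024)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands])

# Hypothetical usage with a stand-in for a recorded impact:
fs = 48_000
signal = np.random.randn(4096)
print(psd_band_features(signal, fs))
```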

  7. The Tic-Tac-Toe Theory of Gravity

    NASA Astrophysics Data System (ADS)

    Greenberger, Daniel M.

    2012-01-01

    The Tic-Tac-Toe theory is a qualitative, phenomenological theory that automatically explains many of the features of the universe that we see, such as dark matter and dark energy. In that sense it is a Copernican theory that gives an alternate approach, which immediately and intuitively explains phenomena, independently of any detailed dynamics, for which the explanations in accepted standard theories are usually somewhat ad hoc. The basic concept is to take the possibility of negative masses seriously, and to generalize this to counter the unconvincing treatment of negative masses by the equivalence principle. Surprisingly, this automatically solves all sorts of other problems at the same time. Since the theory comes without a detailed dynamics, one can hardly expect people to embrace it with open arms, but it is presented to help convince students that there are many plausible new ideas available to them, and that they should not let themselves be intimidated into believing that we are close to understanding nature.

  8. Automatic threshold optimization in nonlinear energy operator based spike detection.

    PubMed

    Malik, Muhammad H; Saeed, Maryam; Kamboh, Awais M

    2016-08-01

    In neural spike sorting systems, the performance of the spike detector has to be maximized because it affects the performance of all subsequent blocks. The nonlinear energy operator (NEO) is a popular spike detector due to its detection accuracy and its hardware friendly architecture. However, it involves a thresholding stage, whose value is usually approximated and is thus not optimal. This approximation deteriorates the performance in real-time systems where signal-to-noise ratio (SNR) estimation is a challenge, especially at lower SNRs. In this paper, we propose an automatic and robust threshold calculation method using an empirical gradient technique. The method is tested on two different datasets. The results show that our optimized threshold improves the detection accuracy for both high-SNR and low-SNR signals. Boxplots are presented that provide a statistical analysis of the improvements in accuracy; for instance, the 75th percentile was at 98.7% and 93.5% for the optimized NEO threshold and the traditional NEO threshold, respectively.
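
    A minimal sketch of NEO-based detection with a mean-scaled threshold; the paper's contribution, choosing the scaling automatically via an empirical gradient technique, is not reproduced here, so the constant C below is a conventional assumption.

```python
# Sketch: nonlinear energy operator (NEO) spike detection with a mean-scaled threshold.
import numpy as np

def neo(x):
    """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_spikes(x, C=8.0, refractory=30):
    """Flag samples where NEO exceeds C times its mean, enforcing a refractory gap (samples)."""
    psi = neo(np.asarray(x, dtype=float))
    threshold = C * psi.mean()
    candidates = np.flatnonzero(psi > threshold)
    spikes, last = [], -refractory
    for idx in candidates:                     # simple refractory period between detections
        if idx - last >= refractory:
            spikes.append(idx)
            last = idx
    return spikes
```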

  9. Sorting Olive Batches for the Milling Process Using Image Processing

    PubMed Central

    Puerto, Daniel Aguilera; Martínez Gila, Diego Manuel; Gámez García, Javier; Gómez Ortega, Juan

    2015-01-01

    The quality of virgin olive oil obtained in the milling process is directly bound to the characteristics of the olives. Hence, the correct classification of the different incoming olive batches is crucial to reach the maximum quality of the oil. The aim of this work is to provide an automatic inspection system, based on computer vision, and to classify automatically different batches of olives entering the milling process. The classification is based on the differentiation between ground and tree olives. For this purpose, three different species have been studied (Picudo, Picual and Hojiblanco). The samples have been obtained by picking the olives directly from the tree or from the ground. The feature vector of the samples has been obtained on the basis of the olive image histograms. Moreover, different image preprocessing has been employed, and two classification techniques have been used: these are discriminant analysis and neural networks. The proposed methodology has been validated successfully, obtaining good classification results. PMID:26147729

  10. PipeOnline 2.0: automated EST processing and functional data sorting.

    PubMed

    Ayoubi, Patricia; Jin, Xiaojing; Leite, Saul; Liu, Xianghui; Martajaja, Jeson; Abduraham, Abdurashid; Wan, Qiaolan; Yan, Wei; Misawa, Eduardo; Prade, Rolf A

    2002-11-01

    Expressed sequence tags (ESTs) are generated and deposited in the public domain, as redundant, unannotated, single-pass reactions, with virtually no biological content. PipeOnline automatically analyses and transforms large collections of raw DNA-sequence data from chromatograms or FASTA files by calling the quality of bases, screening and removing vector sequences, assembling and rewriting consensus sequences of redundant input files into a unigene EST data set and finally through translation, amino acid sequence similarity searches, annotation of public databases and functional data. PipeOnline generates an annotated database, retaining the processed unigene sequence, clone/file history, alignments with similar sequences, and proposed functional classification, if available. Functional annotation is automatic and based on a novel method that relies on homology of amino acid sequence multiplicity within GenBank records. Records are examined through a function ordered browser or keyword queries with automated export of results. PipeOnline offers customization for individual projects (MyPipeOnline), automated updating and alert service. PipeOnline is available at http://stress-genomics.org.

  11. Automated phenotype pattern recognition of zebrafish for high-throughput screening.

    PubMed

    Schutera, Mark; Dickmeis, Thomas; Mione, Marina; Peravali, Ravindra; Marcato, Daniel; Reischl, Markus; Mikut, Ralf; Pylatiuk, Christian

    2016-07-03

    Over the last years, the zebrafish (Danio rerio) has become a key model organism in genetic and chemical screenings. A growing number of experiments and an expanding interest in zebrafish research make it increasingly essential to automate the distribution of embryos and larvae into standard microtiter plates or other sample holders for screening, often according to phenotypical features. Until now, such sorting processes have been carried out by manual handling of the larvae and manual feature detection. Here, a prototype platform for image acquisition together with classification software is presented. Zebrafish embryos and larvae and their features, such as pigmentation, are detected automatically from the image. Zebrafish of 4 different phenotypes can be classified through pattern recognition at 72 h post fertilization (hpf), allowing the software to classify an embryo into 2 distinct phenotypic classes: wild-type versus variant. The zebrafish phenotypes are classified with an accuracy of 79-99% without any user interaction. A description of the prototype platform and of the algorithms for image processing and pattern recognition is presented.

  12. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes such as average, standard deviation, minimum, maximum, frequency, as well as average transfer time. These statistics are given on a per cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify through a simple log description language how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
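
    A minimal sketch of the clustering step, assuming file-size and transfer-time features extracted from the mass storage logs; k-means with a fixed cluster count stands in for the tool's configurable clustering algorithms and tightness measures.

```python
# Sketch: cluster log-derived file-transfer records and report per-cluster file statistics.
import numpy as np
from sklearn.cluster import KMeans

def characterize_workload(file_sizes_bytes, transfer_times_s, n_clusters=4):
    """Cluster transfers on log-scaled size and transfer time; return per-cluster statistics."""
    features = np.column_stack([np.log10(file_sizes_bytes), np.log10(transfer_times_s)])
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(features)
    stats = []
    for k in range(n_clusters):
        sizes = np.asarray(file_sizes_bytes)[labels == k]
        stats.append({"cluster": k, "count": int(sizes.size),
                      "mean_size": float(sizes.mean()),
                      "min_size": float(sizes.min()), "max_size": float(sizes.max())})
    return stats
```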

  13. Precision measurements from very-large scale aerial digital imagery.

    PubMed

    Booth, D Terrance; Cox, Samuel E; Berryman, Robert D

    2006-01-01

    Managers need measurements: resource managers need the lengths and widths of a variety of items, including animals, logs, streams, plant canopies, man-made objects, riparian habitat, vegetation patches, and other features important in resource monitoring and land inspection. These types of measurements can now be easily and accurately obtained from very large scale aerial (VLSA) imagery having spatial resolutions as fine as 1 millimeter per pixel by using the three new software programs described here. VLSA images have small fields of view and are used for intermittent sampling across extensive landscapes. Pixel coverage among images is influenced by small changes in airplane altitude above ground level (AGL) and orientation relative to the ground, as well as by changes in topography. These factors affect the object-to-camera distance used for image-resolution calculations. 'ImageMeasurement' offers a user-friendly interface for accounting for pixel-coverage variation among images by utilizing a database. 'LaserLOG' records and displays airplane altitude AGL measured with a high-frequency laser rangefinder, along with vertical velocity. 'Merge' sorts through the large amounts of data generated by LaserLOG and matches precise airplane altitudes with camera trigger times for input to the ImageMeasurement database. We discuss application of these tools, including error estimates. We found that measurements from aerial images (collection resolution: 5-26 mm/pixel as projected on the ground) made using ImageMeasurement, LaserLOG, and Merge were accurate to centimeters, with errors of less than 10%. We recommend these software packages as a means for expanding the utility of aerial image data.

  14. Understand your Algorithm: Drill Down to Sample Visualizations in Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Mapes, B. E.; Ho, Y.; Cheedela, S. K.; McWhirter, J.

    2017-12-01

    Statistics are the currency of climate dynamics, but the space of all possible algorithms is fathomless - especially for 4-dimensional weather-resolving data that many "impact" variables depend on. Algorithms are designed on data samples, but how do you know if they measure what you expect when turned loose on Big Data? We will introduce the year-1 prototype of a 3-year scientist-led, NSF-supported, Unidata-quality software stack called DRILSDOWN (https://brianmapes.github.io/EarthCube-DRILSDOWN/) for automatically extracting, integrating, and visualizing multivariate 4D data samples. Based on a customizable "IDV bundle" of data sources, fields and displays supplied by the user, the system will teleport its space-time coordinates to fetch Cases of Interest (edge cases, typical cases, etc.) from large aggregated repositories. These standard displays can serve as backdrops to overlay with your value-added fields (such as derived quantities stored on a user's local disk). Fields can be readily pulled out of the visualization object for further processing in Python. The hope is that algorithms successfully tested in this visualization space will then be lifted out and added to automatic processing toolchains, lending confidence in the next round of processing, to seek the next Cases of Interest, in light of a user's statistical measures of "Interest". To log the scientific work done in this vein, the visualizations are wrapped in IPython-based Jupyter notebooks for rich, human-readable documentation (indeed, quasi-publication with formatted text, LaTeX math, etc.). Such notebooks are readable and executable, with digital replicability and provenance built in. The entire digital object of a case study can be stored in a repository, where libraries of these Case Study Notebooks can be examined in a browser. Model data (the session topic) are of course especially convenient for this system, but observations of all sorts can also be brought in, overlain, and differenced or otherwise co-processed. The system is available in various tiers, from minimal-install GUI visualizations only, to GUI+Notebook system, to the full system with the repository software. We seek interested users, initially in a "beta tester" mode with the goodwill to offer reports and requests to help drive improvements in project years 2 and 3.

  15. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  16. Pattern mining of user interaction logs for a post-deployment usability evaluation of a radiology PACS client.

    PubMed

    Jorritsma, Wiard; Cnossen, Fokie; Dierckx, Rudi A; Oudkerk, Matthijs; van Ooijen, Peter M A

    2016-01-01

    To perform a post-deployment usability evaluation of a radiology Picture Archiving and Communication System (PACS) client based on pattern mining of user interaction log data, and to assess the usefulness of this approach compared to a field study. All user actions performed on the PACS client were logged for four months. A data mining technique called closed sequential pattern mining was used to automatically extract frequently occurring interaction patterns from the log data. These patterns were used to identify usability issues with the PACS. The results of this evaluation were compared to the results of a field study based usability evaluation of the same PACS client. The interaction patterns revealed four usability issues: (1) the display protocols do not function properly, (2) the line measurement tool stays active until another tool is selected, rather than being deactivated after one use, (3) the PACS's built-in 3D functionality does not allow users to effectively perform certain 3D-related tasks, (4) users underuse the PACS's customization possibilities. All usability issues identified based on the log data were also found in the field study, which identified 48 issues in total. Post-deployment usability evaluation based on pattern mining of user interaction log data provides useful insights into the way users interact with the radiology PACS client. However, it reveals few usability issues compared to a field study and should therefore not be used as the sole method of usability evaluation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
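
    A minimal sketch in the spirit of the log-mining step: counting frequent contiguous action sequences (n-grams) per session. This is a simplification of closed sequential pattern mining, which additionally allows gaps and keeps only closed patterns; the action names are hypothetical.

```python
# Sketch: count frequent contiguous interaction sequences in logged user sessions.
from collections import Counter

def frequent_action_sequences(sessions, length=3, min_support=5):
    """sessions: list of lists of action names, one list per user session."""
    counts = Counter()
    for actions in sessions:
        for i in range(len(actions) - length + 1):
            counts[tuple(actions[i:i + length])] += 1
    return [(seq, n) for seq, n in counts.most_common() if n >= min_support]

# Hypothetical PACS-like sessions:
sessions = [["open_study", "select_series", "measure_line", "measure_line", "close_study"]] * 6
print(frequent_action_sequences(sessions, length=2, min_support=5))
```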

  17. Advanced Video Activity Analytics (AVAA): Human Factors Evaluation

    DTIC Science & Technology

    2015-05-01

    video, and 3) creating and saving annotations (Fig. 11). (The logging program was updated after the pilot to also capture search clicks.) Playing and... visual search task and the auditory task together and thus automatically focused on the visual task. Alternatively, the operator may have intentionally...affect performance on the primary task; however, in the current test there was no apparent effect on the operator’s performance in the visual search task

  18. Sensory Information Processing and Symbolic Computation

    DTIC Science & Technology

    1973-12-31

    plague all image deblurring methods when working with high signal to noise ratios, is that of a ringing or ghost image phenomenon which surrounds high...Figure 11 The Impulse Response of an All-Pass Random Phase Filter 24 Figure 12 (a) Unsmoothed Log Spectra of the Sentence "The pipe began to...of automatic deblurring of images, linear predictive coding of speech and the refinement and application of mathematical models of human vision and

  19. Image based automatic water meter reader

    NASA Astrophysics Data System (ADS)

    Jawas, N.; Indrianto

    2018-01-01

    A water meter is a tool used to measure water consumption. It works by utilizing water flow and shows the result on a mechanical digit counter. In everyday use, an operator manually checks the digit counter periodically and logs the number shown by the water meter to track consumption. This manual operation is time consuming and prone to human error. Therefore, in this paper we propose an automatic water meter digit reader that works from a digital image. The digit sequence is detected by utilizing contour information of the water meter front panel. Then an OCR method is used to recognize each digit character. The digit sequence detection is an important part of the overall process and determines the success of the whole system. The results are promising, especially in sequence detection.
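
    A rough sketch of this kind of pipeline, using OpenCV contours to locate candidate digit boxes and an off-the-shelf OCR engine (pytesseract) as a stand-in for the paper's unspecified OCR method; the Otsu thresholding choice and the minimum-area filter are assumptions, not the published parameters.

      import cv2
      import pytesseract  # stand-in OCR engine; the paper's OCR method is unspecified

      def read_meter_digits(image_path, min_area=200):
          """Locate candidate digit boxes on the meter's front panel via contours,
          then OCR each crop left to right."""
          gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
          _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
          contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > min_area]
          boxes.sort(key=lambda b: b[0])  # left-to-right reading order
          digits = []
          for x, y, w, h in boxes:
              crop = gray[y:y + h, x:x + w]
              text = pytesseract.image_to_string(
                  crop, config="--psm 10 -c tessedit_char_whitelist=0123456789").strip()
              if text:
                  digits.append(text)
          return "".join(digits)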

  20. How PEMEX engineered a deep well completion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antimo A., J.C.

    1971-09-01

    Completion and testing of Petroleos Mexicanos' W. Reynosa Well No. 1 in the NE. Frontier District south of the Texas border required engineering innovation to combat the 375°F temperatures and pressures near 18,000 psi. Drilled to nearly 18,000 ft, the well was completed and tested below 17,000 ft. Completion plans were designed to determine the economic importance of the reservoir and to provide information and experience in planning future completions to 20,000 ft and deeper. Interval selection was based in part on data acquired during drilling, including lithology, geologic age, rock characteristics, and sensitivity to damage caused by drilling fluids. A set of logs was obtained and evaluated by computer in correlation with mud-log and pressure data. The logs also were correlated with logs from other wells in the area. Pressure gradients in the Reynosa field indicated the possibility of pressures on the order of 10,000 psi, and required the use of specially designed valves rated to 20,000 psi, in conjunction with a casinghead that would permit drilling to the projected depth. A choke manifold consisted of an interchangeable, manually operated positive choke and a set of automatic adjustable chokes. Well conditioning, cementing, perforating, and well plugging are described.

  1. Comparative Study Of Artificial Intelligence Techniques As Applied To The Location Of Address Blocks On Mail Pieces

    NASA Astrophysics Data System (ADS)

    Koljonen, Juha T.; Glickman, Frederick R.

    1989-03-01

    Rule-based reasoning when applied to locating destination addresses on mail pieces can enhance system performance and accuracy. One of the critical steps in the automatic reading and sorting of mail by machine is in locating the block of text that is the destination address on a mail piece. This is complicated by the variation of global structure on mail piece faces, e.g., return and destination addresses can be anywhere on the mail piece, in any orientation and of any size. Compounding the problem is the addition of extraneous text and graphics such as advertising.

  2. Distributed databases for materials study of thermo-kinetic properties

    NASA Astrophysics Data System (ADS)

    Toher, Cormac

    2015-03-01

    High-throughput computational materials science provides researchers with the opportunity to rapidly generate large databases of materials properties. To rapidly add thermal properties to the AFLOWLIB consortium and Materials Project repositories, we have implemented an automated quasi-harmonic Debye model, the Automatic GIBBS Library (AGL). This enables us to screen thousands of materials for thermal conductivity, bulk modulus, thermal expansion and related properties. The search and sort functions of the online database can then be used to identify suitable materials for more in-depth study using more precise computational or experimental techniques. AFLOW-AGL source code is public domain and will soon be released within the GNU-GPL license.

  3. Rapid estimation of aquifer salinity structure from oil and gas geophysical logs

    NASA Astrophysics Data System (ADS)

    Shimabukuro, D.; Stephens, M.; Ducart, A.; Skinner, S. M.

    2016-12-01

    We describe a workflow for creating aquifer salinity maps using Archie's equation for areas that have geophysical data from oil and gas wells. We apply this method in California, where geophysical logs are available in raster format from the Division of Oil, Gas, and Geothermal Resources (DOGGR) online archive. This method should be applicable to any region where geophysical logs are readily available. Much of the work is controlled by computer code, allowing salinity estimates for new areas to be rapidly generated. For a region of interest, the DOGGR online database is scraped for wells that were logged with multi-tool suites, such as the Platform Express or Triple Combination Logging Tools. Then, well construction metadata, such as measured depth, spud date, and well orientation, is attached. The resultant local database allows a weighted criteria selection of wells that are most likely to have the shallow resistivity, deep resistivity, and density porosity measurements necessary to calculate salinity over the longest depth interval. The algorithm can be adjusted for geophysical log availability for older well fields and density of sampling. Once priority wells are identified, a student researcher team uses Neuralog software to digitize the raster geophysical logs. Total dissolved solids (TDS) concentration is then calculated in clean, wet sand intervals using the resistivity-porosity method, a modified form of Archie's equation. These sand intervals are automatically selected using a combination of spontaneous potential and the difference in shallow resistivity and deep resistivity measurements. Gamma ray logs are not used because arkosic sands common in California make it difficult to distinguish sand and shale. Computer calculation allows easy adjustment of Archie's parameters. The result is a semi-continuous TDS profile for the wells of interest. These profiles are combined and contoured using standard 3D visualization software to yield preliminary salinity maps for the region of interest. We present results for select well fields in the Southern San Joaquin Valley, California.
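
    A worked sketch of the resistivity-porosity step described above: for a clean, wet sand (Sw = 1), Archie's equation gives Rw = Rt * phi^m / a, and the formation-water resistivity is then converted to an approximate NaCl-equivalent TDS. The Archie constants, the Arp's-style temperature correction, and the Bateman-Konen-style Rw-to-salinity conversion used here are generic textbook approximations, not the calibration used in the study.

      import math

      def tds_from_logs(rt_ohmm, phi, a=1.0, m=2.0, temp_f=75.0):
          """Estimate TDS (ppm NaCl-equivalent) in a clean, wet sand from deep
          resistivity and porosity.

          Archie (Sw = 1):  Rw = Rt * phi**m / a
          a, m, the temperature handling, and the Rw-to-salinity conversion are
          assumptions to be calibrated locally."""
          rw = rt_ohmm * phi ** m / a
          # convert Rw to an equivalent value at 75 F (Arp's formula)
          rw75 = rw * (temp_f + 6.77) / (75.0 + 6.77)
          # approximate NaCl salinity from Rw at 75 F (Bateman-Konen form)
          return 10 ** ((3.562 - math.log10(rw75 - 0.0123)) / 0.955)

      # Example: deep resistivity 5 ohm-m, density porosity 0.30 -> roughly 13,000 ppm
      print(round(tds_from_logs(5.0, 0.30)))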

  4. Classification of EEG abnormalities in partial epilepsy with simultaneous EEG-fMRI recordings.

    PubMed

    Pedreira, C; Vaudano, A E; Thornton, R C; Chaudhary, U J; Vulliemoz, S; Laufs, H; Rodionov, R; Carmichael, D W; Lhatoo, S D; Guye, M; Quian Quiroga, R; Lemieux, L

    2014-10-01

    Scalp EEG recordings and the classification of interictal epileptiform discharges (IED) in patients with epilepsy provide valuable information about the epileptogenic network, particularly by defining the boundaries of the "irritative zone" (IZ), and hence are helpful during pre-surgical evaluation of patients with severe refractory epilepsies. The current detection and classification of epileptiform signals essentially rely on expert observers. This is a very time-consuming procedure, which also leads to inter-observer variability. Here, we propose a novel approach to automatically classify epileptic activity and show how this method provides critical and reliable information related to IZ localization beyond that provided by previous approaches. We applied Wave_clus, an automatic spike sorting algorithm, to the classification of IED visually identified from pre-surgical simultaneous electroencephalogram-functional magnetic resonance imaging (EEG-fMRI) recordings in 8 patients with refractory partial epilepsy who were candidates for surgery. For each patient, two fMRI analyses were performed: one based on the visual classification and one based on the algorithmic sorting. This novel approach successfully identified a total of 29 IED classes (compared to 26 for visual identification). The general concordance between methods was good, providing a full match of EEG patterns in 2 cases, additional EEG information in 2 other cases and, in general, covering EEG patterns of the same areas as expert classification in 7 of the 8 cases. Most notably, evaluation of the method with EEG-fMRI data analysis showed hemodynamic maps related to the majority of IED classes, representing improved performance compared with the visual IED classification-based analysis (72% versus 50%). Furthermore, the IED-related BOLD changes revealed by using the algorithm were localized within the presumed IZ for a larger number of IED classes (9) and in a greater number of patients than with the expert classification (7 and 5, respectively). In only one case did the new algorithm result in fewer classes and activation areas. We propose that the use of automated spike sorting algorithms to classify IED provides an efficient tool for mapping IED-related fMRI changes and increases the clinical value of EEG-fMRI for the pre-surgical assessment of patients with severe epilepsy. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. A Novel Method to Increase LinLog CMOS Sensors’ Performance in High Dynamic Range Scenarios

    PubMed Central

    Martínez-Sánchez, Antonio; Fernández, Carlos; Navarro, Pedro J.; Iborra, Andrés

    2011-01-01

    Images from high dynamic range (HDR) scenes must be obtained with minimum loss of information. For this purpose it is necessary to take full advantage of the quantification levels provided by the CCD/CMOS image sensor. LinLog CMOS sensors satisfy the above demand by offering an adjustable response curve that combines linear and logarithmic responses. This paper presents a novel method to quickly adjust the parameters that control the response curve of a LinLog CMOS image sensor. We propose to use an Adaptive Proportional-Integral-Derivative controller to adjust the exposure time of the sensor, together with control algorithms based on the saturation level and the entropy of the images. With this method the sensor’s maximum dynamic range (120 dB) can be used to acquire good quality images from HDR scenes with fast, automatic adaptation to scene conditions. Adaptation to a new scene is rapid, with a sensor response adjustment of less than eight frames when working in real time video mode. At least 67% of the scene entropy can be retained with this method. PMID:22164083

  6. A novel method to increase LinLog CMOS sensors' performance in high dynamic range scenarios.

    PubMed

    Martínez-Sánchez, Antonio; Fernández, Carlos; Navarro, Pedro J; Iborra, Andrés

    2011-01-01

    Images from high dynamic range (HDR) scenes must be obtained with minimum loss of information. For this purpose it is necessary to take full advantage of the quantification levels provided by the CCD/CMOS image sensor. LinLog CMOS sensors satisfy the above demand by offering an adjustable response curve that combines linear and logarithmic responses. This paper presents a novel method to quickly adjust the parameters that control the response curve of a LinLog CMOS image sensor. We propose to use an Adaptive Proportional-Integral-Derivative controller to adjust the exposure time of the sensor, together with control algorithms based on the saturation level and the entropy of the images. With this method the sensor's maximum dynamic range (120 dB) can be used to acquire good quality images from HDR scenes with fast, automatic adaptation to scene conditions. Adaptation to a new scene is rapid, with a sensor response adjustment of less than eight frames when working in real time video mode. At least 67% of the scene entropy can be retained with this method.
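
    The exposure-control loop described in the two records above can be pictured with the following minimal sketch: a discrete PID controller nudges the exposure time toward a target image entropy computed from the frame histogram. The gains, the entropy set point, and the exposure bounds are illustrative placeholders, not the tuning or the saturation-level logic reported in the paper.

      import numpy as np

      class ExposurePID:
          """Minimal PID loop steering exposure time toward an entropy set point;
          all numeric values are illustrative, not the published tuning."""
          def __init__(self, kp=40.0, ki=5.0, kd=10.0, target_entropy=6.5):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.target = target_entropy
              self.integral = 0.0
              self.prev_error = 0.0

          @staticmethod
          def entropy(frame):
              # Shannon entropy of an 8-bit frame's intensity histogram, in bits
              hist, _ = np.histogram(frame, bins=256, range=(0, 255), density=True)
              hist = hist[hist > 0]
              return float(-(hist * np.log2(hist)).sum())

          def update(self, frame, exposure_us):
              error = self.target - self.entropy(frame)
              self.integral += error
              derivative = error - self.prev_error
              self.prev_error = error
              exposure_us += self.kp * error + self.ki * self.integral + self.kd * derivative
              return float(np.clip(exposure_us, 10.0, 20000.0))  # assumed exposure bounds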

  7. Automatic rock detection for in situ spectroscopy applications on Mars

    NASA Astrophysics Data System (ADS)

    Mahapatra, Pooja; Foing, Bernard H.

    A novel algorithm for rock detection has been developed for effectively utilising Mars rovers, and enabling autonomous selection of target rocks that require close-contact spectroscopic measurements. The algorithm demarcates small rocks in terrain images as seen by cameras on a Mars rover during traverse. This information may be used by the rover for selection of geologically relevant sample rocks, and (in conjunction with a rangefinder) to pick up target samples using a robotic arm for automatic in situ determination of rock composition and mineralogy using, for example, a Raman spectrometer. Determining rock samples within the region that are of specific interest without physically approaching them significantly reduces time, power and risk. Input images in colour are converted to greyscale for intensity analysis. Bilateral filtering is used for texture removal while preserving rock boundaries. Unsharp masking is used for contrast enhancement. Sharp contrasts in intensities are detected using Canny edge detection, with thresholds that are calculated from the image obtained after contrast-limited adaptive histogram equalisation of the unsharp masked image. Scale-space representations are then generated by convolving this image with a Gaussian kernel. A scale-invariant blob detector (Laplacian of the Gaussian, LoG) detects blobs independently of their sizes, and therefore requires a multi-scale approach with automatic scale selection. The scale-space blob detector consists of convolution of the Canny edge-detected image with a scale-normalised LoG at several scales, and finding the maxima of squared LoG response in scale-space. After the extraction of local intensity extrema, the intensity profiles along rays going out of the local extremum are investigated. An ellipse is fitted to the region determined by significant changes in the intensity profiles. The fitted ellipses are overlaid on the original Mars terrain image for a visual estimation of the rock detection accuracy, and the number of ellipses is counted. Since geometry and illumination have the least effect on small rocks, the proposed algorithm is effective in detecting small rocks (or bigger rocks at larger distances from the camera) that consist of a small fraction of image pixels. Acknowledgements: The first author would like to express her gratitude to the European Space Agency (ESA/ESTEC) and the International Lunar Exploration Working Group (ILEWG) for their support of this work.
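
    The processing chain above maps fairly directly onto standard image-processing primitives. The sketch below (OpenCV and scikit-image) follows the same sequence: bilateral filtering, unsharp masking, CLAHE-derived Canny thresholds, and multi-scale LoG blob detection. All filter parameters are placeholders, and the final step of fitting ellipses to intensity profiles around each extremum is omitted, so this is only an approximation of the published algorithm.

      import cv2
      import numpy as np
      from skimage.feature import blob_log

      def detect_rock_candidates(image_bgr):
          """Return (x, y, radius) blob candidates following the pipeline above;
          parameters are illustrative only."""
          gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
          smooth = cv2.bilateralFilter(gray, d=9, sigmaColor=75, sigmaSpace=75)   # texture removal
          blurred = cv2.GaussianBlur(smooth, (0, 0), sigmaX=3)
          sharp = cv2.addWeighted(smooth, 1.5, blurred, -0.5, 0)                  # unsharp masking
          clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(sharp)
          lo, hi = np.percentile(clahe, 50), np.percentile(clahe, 90)             # Canny thresholds
          edges = cv2.Canny(sharp, lo, hi)
          # multi-scale LoG blob detection on the edge map
          blobs = blob_log(edges.astype(float) / 255.0, min_sigma=2, max_sigma=12,
                           num_sigma=8, threshold=0.1)
          return [(int(x), int(y), float(s * np.sqrt(2))) for y, x, s in blobs]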

  8. Using phrases and document metadata to improve topic modeling of clinical reports.

    PubMed

    Speier, William; Ong, Michael K; Arnold, Corey W

    2016-06-01

    Probabilistic topic models provide an unsupervised method for analyzing unstructured text and have the potential to be integrated into clinical automatic summarization systems. Clinical documents are accompanied by metadata in a patient's medical history and frequently contain multiword concepts that can be valuable for accurately interpreting the included text. While existing methods have attempted to address these problems individually, we present a unified model for free-text clinical documents that integrates contextual patient- and document-level data, and discovers multi-word concepts. In the proposed model, phrases are represented by chained n-grams and a Dirichlet hyper-parameter is weighted by both document-level and patient-level context. This method and three other Latent Dirichlet allocation models were fit to a large collection of clinical reports. Examples of resulting topics demonstrate the results of the new model, and the quality of the representations is evaluated using empirical log likelihood. The proposed model was able to create informative prior probabilities based on patient and document information, and captured phrases that represented various clinical concepts. The representation using the proposed model had a significantly higher empirical log likelihood than the compared methods. Integrating document metadata and capturing phrases in clinical text greatly improves the topic representation of clinical documents. The resulting clinically informative topics may effectively serve as the basis for an automatic summarization system for clinical reports. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Adaptive Spot Detection With Optimal Scale Selection in Fluorescence Microscopy Images.

    PubMed

    Basset, Antoine; Boulanger, Jérôme; Salamero, Jean; Bouthemy, Patrick; Kervrann, Charles

    2015-11-01

    Accurately detecting subcellular particles in fluorescence microscopy is of primary interest for further quantitative analysis such as counting, tracking, or classification. Our primary goal is to segment vesicles likely to share nearly the same size in fluorescence microscopy images. Our method, termed adaptive thresholding of Laplacian of Gaussian (LoG) images with autoselected scale (ATLAS), automatically selects the optimal scale corresponding to the most frequent spot size in the image. Four criteria are proposed and compared to determine the optimal scale in a scale-space framework. Then, the segmentation stage amounts to thresholding the LoG of the intensity image. In contrast to other methods, the threshold is locally adapted given a probability of false alarm (PFA) specified by the user for the whole set of images to be processed. The local threshold is automatically derived from the PFA value and local image statistics estimated in a window whose size is not a critical parameter. We also propose a new data set for benchmarking, consisting of six collections of one hundred images each, which exploits backgrounds extracted from real microscopy images. We have carried out an extensive comparative evaluation on several data sets with ground-truth, which demonstrates that ATLAS outperforms existing methods. ATLAS does not need any fine parameter tuning and requires very low computation time. Convincing results are also reported on real total internal reflection fluorescence microscopy images.
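
    A compact sketch of the thresholding idea: compute the (sign-flipped) LoG response at a fixed scale and keep pixels exceeding a locally adapted threshold derived from a user-specified false-alarm probability. Assuming Gaussian background statistics in a sliding window is a simplification of the statistic actually used in ATLAS, and the automatic scale selection stage is omitted here (sigma is fixed).

      import numpy as np
      from scipy import ndimage
      from scipy.stats import norm

      def detect_spots(image, sigma=2.0, pfa=1e-3, win=31):
          """Boolean spot mask from a PFA-driven, locally adapted LoG threshold.
          sigma, win, and the Gaussian background assumption are illustrative."""
          log_resp = -ndimage.gaussian_laplace(image.astype(float), sigma)  # bright spots -> positive
          local_mean = ndimage.uniform_filter(log_resp, size=win)
          local_sq = ndimage.uniform_filter(log_resp ** 2, size=win)
          local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 1e-12))
          z = norm.isf(pfa)  # e.g. pfa = 1e-3 gives z of about 3.09
          return log_resp > local_mean + z * local_std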

  10. Automatic gain control in the echolocation system of dolphins

    NASA Astrophysics Data System (ADS)

    Au, Whitlow W. L.; Benoit-Bird, Kelly J.

    2003-06-01

    In bats and technological sonars, the gain of the receiver is progressively increased with time after the transmission of a signal to compensate for acoustic propagation loss. The current understanding of dolphin echolocation indicates that automatic gain control is not a part of their sonar system. In order to test this understanding, we have performed field measurements of free-ranging echolocating dolphins. Here we show that dolphins do possess an automatic gain control mechanism, but that it is implemented in the transmission phase rather than the receiving phase of a sonar cycle. We find that the amplitude of the dolphins' echolocation signals is highly range dependent; this amplitude increases with increasing target range, R, in a 20log(R) fashion to compensate for propagation loss. If the echolocation target is a fish school with many sound scatterers, the echoes from the school will remain nearly constant with range as the dolphin closes in on it. This characteristic has the same effect as time-varying gain in bats and technological sonar when considered from a sonar system perspective.
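
    In equation form, the observed behaviour amounts to raising the outgoing click level by 20 log10(R) decibels relative to a reference range, which offsets one-way spherical spreading loss on the outbound path. The snippet below simply evaluates that relation; the 195 dB base level and the example ranges are illustrative numbers, not measurements from the study.

      import math

      def source_level_db(base_level_db, target_range_m, reference_range_m=1.0):
          """Outgoing click level that offsets one-way spreading loss:
          SL = base + 20*log10(R / R_ref). Values are illustrative."""
          return base_level_db + 20.0 * math.log10(target_range_m / reference_range_m)

      for r in (10, 40, 160):
          print(r, "m ->", round(source_level_db(195.0, r), 1), "dB re 1 uPa")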

  11. Automated recognition of stratigraphic marker shales from geophysical logs in iron ore deposits

    NASA Astrophysics Data System (ADS)

    Silversides, Katherine; Melkumyan, Arman; Wyman, Derek; Hatherly, Peter

    2015-04-01

    The mining of stratiform ore deposits requires a means of determining the location of stratigraphic boundaries. A variety of geophysical logs may provide the required data but, in the case of banded iron formation hosted iron ore deposits in the Hamersley Ranges of Western Australia, only one geophysical log type (natural gamma) is collected for this purpose. The information from these logs is currently processed by slow manual interpretation. In this paper we present an alternative method of automatically identifying recurring stratigraphic markers in natural gamma logs from multiple drill holes. Our approach is demonstrated using natural gamma geophysical logs that contain features corresponding to the presence of stratigraphically important marker shales. The host stratigraphic sequence is highly consistent throughout the Hamersley and the marker shales can therefore be used to identify the stratigraphic location of the banded iron formation (BIF) or BIF hosted ore. The marker shales are identified using Gaussian Processes (GP) trained by either manual or active learning methods and the results are compared to the existing geological interpretation. The manual method involves the user selecting the signatures for improving the library, whereas the active learning method uses the measure of uncertainty provided by the GP to select specific examples for the user to consider for addition. The results demonstrate that both GP methods can identify a feature, but the active learning approach has several benefits over the manual method. These benefits include greater accuracy in the identified signatures, faster library building, and an objective approach for selecting signatures that includes the full range of signatures across a deposit in the library. When using the active learning method, it was found that the current manual interpretation could be replaced in 78.4% of the holes with an accuracy of 95.7%.

  12. Automatic imitation effects are influenced by experience of synchronous action in children.

    PubMed

    O'Sullivan, Eoin P; Bijvoet-van den Berg, Simone; Caldwell, Christine A

    2018-07-01

    By their fourth year of life, children are expert imitators, but it is unclear how this ability develops. One approach suggests that certain types of experience might forge associations between the sensory and motor representations of an action that may facilitate imitation at a later time. Sensorimotor experience of this sort may occur when an infant's action is imitated by a caregiver or when socially synchronous action occurs. This learning approach, therefore, predicts that the strength of sensory-motor associations should depend on the frequency and quality of previous experience. Here, we tested this prediction by examining automatic imitation, that is, the tendency of an action stimulus to facilitate the performance of that action and interfere with the performance of an incompatible action. We required children (aged between 3 years 8 months and 7 years 11 months) to respond to actions performed by an experimenter (e.g., two hands clapping) with both compatible actions (i.e., two hands clapping) and incompatible actions (i.e., two hands waving) at different stages in the experimental procedure. As predicted by a learning account, actions thought to be performed in synchrony (i.e., clapping/waving) produced stronger automatic imitation effects when compared with actions where previous sensorimotor experience is likely to be more limited (e.g., pointing/hand closing). Furthermore, these automatic imitation effects were not found to vary with age, with both compatible and incompatible responses quickening with age. These findings suggest a role for sensorimotor experience in the development of imitative ability. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. 3D models mapping optimization through an integrated parameterization approach: cases studies from Ravenna

    NASA Astrophysics Data System (ADS)

    Cipriani, L.; Fantini, F.; Bertacchi, S.

    2014-06-01

    Image-based modelling tools based on SfM algorithms have gained great popularity since several software houses provided applications able to achieve 3D textured models easily and automatically. The aim of this paper is to point out the importance of controlling the model parameterization process, considering that automatic solutions included in these modelling tools can produce poor results in terms of texture utilization. In order to achieve a better quality of textured models from image-based modelling applications, this research presents a series of practical strategies aimed at providing a better balance between geometric resolution of models from passive sensors and their corresponding (u,v) map reference systems. This aspect is essential for the achievement of a high-quality 3D representation, since "apparent colour" is a fundamental aspect in the field of Cultural Heritage documentation. Complex meshes without native parameterization have to be "flattened" or "unwrapped" in the (u,v) parameter space, with the main objective to be mapped with a single image. This result can be obtained using two different strategies: the former automatic and fast, the latter manual and time-consuming. Reverse modelling applications provide automatic solutions based on splitting the models by means of different algorithms that produce a sort of "atlas" of the original model in the parameter space, which in many instances is not adequate and negatively affects the overall quality of the representation. Using in synergy different solutions, ranging from semantic aware modelling techniques to quad-dominant meshes achieved using retopology tools, it is possible to obtain a complete control of the parameterization process.

  14. ELSA: An integrated, semi-automated nebular abundance package

    NASA Astrophysics Data System (ADS)

    Johnson, Matthew D.; Levitt, Jesse S.; Henry, Richard B. C.; Kwitter, Karen B.

    We present ELSA, a new modular software package, written in C, to analyze and manage spectroscopic data from emission-line objects. In addition to calculating plasma diagnostics and abundances from nebular emission lines, the software provides a number of convenient features including the ability to ingest logs produced by IRAF's splot task, to semi-automatically merge spectra in different wavelength ranges, and to automatically generate various data tables in machine-readable or LaTeX format. ELSA features a highly sophisticated interstellar reddening correction scheme that takes into account temperature and density effects as well as He II contamination of the hydrogen Balmer lines. Abundance calculations are performed using a 5-level atom approximation with recent atomic data, based on R. Henry's ABUN program. Downloading and detailed documentation for all aspects of ELSA are available at the following URL:

  15. WinTICS-24 --- A Telescope Control Interface for MS Windows

    NASA Astrophysics Data System (ADS)

    Hawkins, R. Lee

    1995-12-01

    WinTICS-24 is a telescope control system interface and observing assistant written in Visual Basic for MS Windows. It provides the ability to control a telescope and up to 3 other instruments via the serial ports on an IBM-PC compatible computer, all from one consistent user interface. In addition to telescope control, WinTICS contains an observing logbook, trouble log (which can automatically email its entries to a responsible person), lunar phase display, object database (which allows the observer to type in the name of an object and automatically slew to it), a time of minimum calculator for eclipsing binary stars, and an interface to the Guide CD-ROM for bringing up finder charts of the current telescope coordinates. Currently WinTICS supports control of DFM telescopes, but is easily adaptable to other telescopes and instrumentation.

  16. An algorithm for intelligent sorting of CT-related dose parameters.

    PubMed

    Cook, Tessa S; Zimmerman, Stefan L; Steingall, Scott R; Boonn, William W; Kim, Woojin

    2012-02-01

    Imaging centers nationwide are seeking innovative means to record and monitor computed tomography (CT)-related radiation dose in light of multiple instances of patient overexposure to medical radiation. As a solution, we have developed RADIANCE, an automated pipeline for extraction, archival, and reporting of CT-related dose parameters. Estimation of whole-body effective dose from CT dose length product (DLP)--an indirect estimate of radiation dose--requires anatomy-specific conversion factors that cannot be applied to total DLP, but instead necessitate individual anatomy-based DLPs. A challenge exists because the total DLP reported on a dose sheet often includes multiple separate examinations (e.g., chest CT followed by abdominopelvic CT). Furthermore, the individual reported series DLPs may not be clearly or consistently labeled. For example, "arterial" could refer to the arterial phase of the triple liver CT or the arterial phase of a CT angiogram. To address this problem, we have designed an intelligent algorithm to parse dose sheets for multi-series CT examinations and correctly separate the total DLP into its anatomic components. The algorithm uses information from the departmental PACS to determine how many distinct CT examinations were concurrently performed. Then, it matches the number of distinct accession numbers to the series that were acquired and anatomically matches individual series DLPs to their appropriate CT examinations. This algorithm allows for more accurate dose analytics, but there remain instances where automatic sorting is not feasible. To ultimately improve radiology patient care, we must standardize series names and exam names to unequivocally sort exams by anatomy and correctly estimate whole-body effective dose.

  17. An algorithm for intelligent sorting of CT-related dose parameters

    NASA Astrophysics Data System (ADS)

    Cook, Tessa S.; Zimmerman, Stefan L.; Steingal, Scott; Boonn, William W.; Kim, Woojin

    2011-03-01

    Imaging centers nationwide are seeking innovative means to record and monitor CT-related radiation dose in light of multiple instances of patient over-exposure to medical radiation. As a solution, we have developed RADIANCE, an automated pipeline for extraction, archival and reporting of CT-related dose parameters. Estimation of whole-body effective dose from CT dose-length product (DLP)-an indirect estimate of radiation dose-requires anatomy-specific conversion factors that cannot be applied to total DLP, but instead necessitate individual anatomy-based DLPs. A challenge exists because the total DLP reported on a dose sheet often includes multiple separate examinations (e.g., chest CT followed by abdominopelvic CT). Furthermore, the individual reported series DLPs may not be clearly or consistently labeled. For example, Arterial could refer to the arterial phase of the triple liver CT or the arterial phase of a CT angiogram. To address this problem, we have designed an intelligent algorithm to parse dose sheets for multi-series CT examinations and correctly separate the total DLP into its anatomic components. The algorithm uses information from the departmental PACS to determine how many distinct CT examinations were concurrently performed. Then, it matches the number of distinct accession numbers to the series that were acquired, and anatomically matches individual series DLPs to their appropriate CT examinations. This algorithm allows for more accurate dose analytics, but there remain instances where automatic sorting is not feasible. To ultimately improve radiology patient care, we must standardize series names and exam names to unequivocally sort exams by anatomy and correctly estimate whole-body effective dose.
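
    The anatomy-sorting step described in the two records above can be pictured with a small sketch: each series DLP is assigned to an anatomic region by keyword matching on the series name, regional DLPs are summed, and region-specific k-factors convert the totals into an effective-dose estimate. The keyword lists and k-factor values below are illustrative placeholders rather than the RADIANCE lookup tables, and a real parser would also consult the PACS for accession numbers as the abstract describes.

      # Hypothetical anatomy keywords and k-factors (mSv per mGy*cm); values illustrative
      K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}
      KEYWORDS = {
          "head": ("head", "brain"),
          "chest": ("chest", "thorax", "lung"),
          "abdomen_pelvis": ("abd", "pelvis", "liver"),
      }

      def effective_dose(series_dlps):
          """series_dlps: list of (series_name, dlp_mGycm). Assign each series DLP
          to an anatomic region by keyword, then sum region DLP * k-factor."""
          totals = {}
          for name, dlp in series_dlps:
              label = next((region for region, words in KEYWORDS.items()
                            if any(w in name.lower() for w in words)), None)
              if label is None:
                  raise ValueError(f"cannot sort series {name!r} by anatomy")
              totals[label] = totals.get(label, 0.0) + dlp
          return sum(K_FACTORS[region] * dlp for region, dlp in totals.items())

      # Example: a chest CT followed by an abdominopelvic CT on one dose sheet
      print(effective_dose([("Chest w/o contrast", 400.0), ("Abd/Pelvis arterial", 600.0)]))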

  18. Linearly Supporting Feature Extraction for Automated Estimation of Stellar Atmospheric Parameters

    NASA Astrophysics Data System (ADS)

    Li, Xiangru; Lu, Yu; Comte, Georges; Luo, Ali; Zhao, Yongheng; Wang, Yongjun

    2015-05-01

    We describe a scheme to extract linearly supporting (LSU) features from stellar spectra to automatically estimate the atmospheric parameters Teff, log g, and [Fe/H]. “Linearly supporting” means that the atmospheric parameters can be accurately estimated from the extracted features through a linear model. The successive steps of the process are as follows: first, decompose the spectrum using a wavelet packet (WP) and represent it by the derived decomposition coefficients; second, detect representative spectral features from the decomposition coefficients using the proposed method Least Absolute Shrinkage and Selection Operator (LARSbs); third, estimate the atmospheric parameters Teff, log g, and [Fe/H] from the detected features using a linear regression method. One prominent characteristic of this scheme is its ability to evaluate quantitatively the contribution of each detected feature to the atmospheric parameter estimate and also to trace back the physical significance of that feature. This work also shows that the usefulness of a component depends on both the wavelength and frequency. The proposed scheme has been evaluated on both real spectra from the Sloan Digital Sky Survey (SDSS)/SEGUE and synthetic spectra calculated from Kurucz's NEWODF models. On real spectra, we extracted 23 features to estimate Teff, 62 features for log g, and 68 features for [Fe/H]. Test consistencies between our estimates and those provided by the Spectroscopic Parameter Pipeline of SDSS show that the mean absolute errors (MAEs) are 0.0062 dex for log Teff (83 K for Teff), 0.2345 dex for log g, and 0.1564 dex for [Fe/H]. For the synthetic spectra, the MAE test accuracies are 0.0022 dex for log Teff (32 K for Teff), 0.0337 dex for log g, and 0.0268 dex for [Fe/H].
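
    The select-then-regress structure of the scheme can be sketched with scikit-learn: a least-angle-regression pass picks a fixed number of coefficients, and an ordinary linear model is refit on the selected columns. The wavelet-packet decomposition, the bootstrap stabilisation implied by LARSbs, and the real SDSS spectra are all omitted; the synthetic data and the choice of 10 features below are purely illustrative.

      import numpy as np
      from sklearn.linear_model import Lars, LinearRegression

      def fit_lsu_estimator(coeffs, target, n_features=23):
          """Select n_features coefficient columns with LARS, then fit a plain
          linear model on the selected columns (simplified stand-in for LARSbs)."""
          selector = Lars(n_nonzero_coefs=n_features).fit(coeffs, target)
          selected = np.flatnonzero(selector.coef_)
          model = LinearRegression().fit(coeffs[:, selected], target)
          return selected, model

      # Synthetic stand-in data: 200 "spectra" x 500 decomposition coefficients
      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 500))
      y = X[:, :5] @ np.array([3.0, -2.0, 1.5, 0.5, -1.0]) + rng.normal(scale=0.1, size=200)
      cols, model = fit_lsu_estimator(X, y, n_features=10)
      print(sorted(cols.tolist()))  # should recover (mostly) the first five columns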

  19. A system for multichannel recording and automatic reading of information. [for onboard cosmic ray counter

    NASA Technical Reports Server (NTRS)

    Bogomolov, E. A.; Yevstafev, Y. Y.; Karakadko, V. K.; Lubyanaya, N. D.; Romanov, V. A.; Totubalina, M. G.; Yamshchikov, M. A.

    1975-01-01

    A system for the recording and processing of telescope data is considered for measurements of EW asymmetry. The information is recorded by 45 channels on a continuously moving 35-mm film. The dead time of the recorder is about 0.1 sec. A sorting electronic circuit is used to reduce the errors when the statistical time distribution of the pulses is recorded. The recorded information is read out by means of photoresistors. The phototransmitter signals are fed either to the mechanical recorder unit for preliminary processing, or to a logical circuit which controls the operation of the punching device. The punched tape is processed by an electronic computer.

  20. Automated detection and classification of dice

    NASA Astrophysics Data System (ADS)

    Correia, Bento A. B.; Silva, Jeronimo A.; Carvalho, Fernando D.; Guilherme, Rui; Rodrigues, Fernando C.; de Silva Ferreira, Antonio M.

    1995-03-01

    This paper describes a typical machine vision system in an unusual application, the automated visual inspection of a Casino's playing tables. The SORTE computer vision system was developed at INETI under a contract with the Portuguese Gaming Inspection Authorities IGJ. It aims to automate the tasks of detection and classification of the dice's scores on the playing tables of the game `Banca Francesa' (which means French Banking) in Casinos. The system is based on the on-line analysis of the images captured by a monochrome CCD camera placed over the playing tables, in order to extract relevant information concerning the score indicated by the dice. Image processing algorithms for real time automatic throwing detection and dice classification were developed and implemented.

  1. Method and apparatus for analyzing error conditions in a massively parallel computer system by identifying anomalous nodes within a communicator set

    DOEpatents

    Gooding, Thomas Michael [Rochester, MN]

    2011-04-19

    An analytical mechanism for a massively parallel computer system automatically analyzes data retrieved from the system, and identifies nodes which exhibit anomalous behavior in comparison to their immediate neighbors. Preferably, anomalous behavior is determined by comparing call-return stack tracebacks for each node, grouping like nodes together, and identifying neighboring nodes which do not themselves belong to the group. A node, not itself in the group, having a large number of neighbors in the group, is a likely locality of error. The analyzer preferably presents this information to the user by sorting the neighbors according to number of adjoining members of the group.
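
    A toy version of the neighbor-comparison logic in this patent abstract: group nodes by identical call-return traceback, treat the largest group as "normal", and rank the remaining nodes by how many of their immediate neighbors fall inside that group. The data layout and the tiny four-node example are invented for illustration only.

      from collections import Counter

      def suspect_nodes(tracebacks, neighbors):
          """tracebacks: {node_id: tuple_of_frames}; neighbors: {node_id: [node_ids]}.
          Returns nodes outside the majority traceback group, most suspicious first
          (i.e. with the most neighbors inside the majority group)."""
          groups = Counter(tracebacks.values())
          majority_tb, _ = groups.most_common(1)[0]
          majority = {n for n, tb in tracebacks.items() if tb == majority_tb}
          outsiders = [(sum(1 for m in neighbors[n] if m in majority), n)
                       for n in tracebacks if n not in majority]
          return [n for score, n in sorted(outsiders, reverse=True)]

      tb = {0: ("waitall",), 1: ("waitall",), 2: ("compute", "segv"), 3: ("waitall",)}
      nbrs = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
      print(suspect_nodes(tb, nbrs))  # -> [2]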

  2. Automatic creation of object hierarchies for ray tracing

    NASA Technical Reports Server (NTRS)

    Goldsmith, Jeffrey; Salmon, John

    1987-01-01

    Various methods for evaluating generated trees are proposed. The use of the hierarchical extent method of Rubin and Whitted (1980) to find the objects that will be hit by a ray is examined. This method employs tree searching; the construction of a tree of bounding volumes in order to determine the number of objects that will be hit by a ray is discussed. A tree generation algorithm, which uses a heuristic tree search strategy, is described. The effects of shuffling and sorting on the input data are investigated. The cost of inserting an object into the hierarchy during the construction of a tree algorithm is estimated. The steps involved in estimating the number of intersection calculations are presented.

  3. Capricorn-A Web-Based Automatic Case Log and Volume Analytics for Diagnostic Radiology Residents.

    PubMed

    Chen, Po-Hao; Chen, Yin Jie; Cook, Tessa S

    2015-10-01

    On-service clinical learning is a mainstay of radiology education. However, an accurate and timely case log is difficult to keep, especially in the absence of software tools tailored to resident education. Furthermore, volume-related feedback from the residency program sometimes occurs months after a rotation ends, limiting the opportunity for meaningful intervention. We surveyed the residents of a single academic institution to evaluate the current state of and the existing need for tracking interpretation volume. Using the results of the survey, we created an open-source automated case log software. Finally, we evaluated the effect of the software tool on the residency in a 1-month, postimplementation survey. Before implementation of the system, 89% of respondents stated that volume is an important component of training, but 71% stated that volume data was inconvenient to obtain. Although the residency program provides semiannual reviews, 90% preferred reviewing interpretation volumes at least once monthly. After implementation, 95% of the respondents stated that the software is convenient to access, 75% found it useful, and 88% stated they would use the software at least once a month. The included analytics module, which benchmarks the user using historical aggregate average volumes, is the most often used feature of the software. Server log demonstrates that, on average, residents use the system approximately twice a week. An automated case log software system may fulfill a previously unmet need in diagnostic radiology training, making accurate and timely review of volume-related performance analytics a convenient process. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  4. Correlating Petrophysical Well Logs Using Fractal-based Analysis to Identify Changes in the Signal Complexity Across Neutron, Density, Dipole Sonic, and Gamma Ray Tool Types

    NASA Astrophysics Data System (ADS)

    Matthews, L.; Gurrola, H.

    2015-12-01

    Typical petrophysical well log correlation is accomplished by manual pattern recognition, leading to subjective correlations. The change in character in a well log is dependent upon the change in the response of the tool to lithology. The petrophysical interpreter looks for a change in one log type that would correspond to the way a different tool responds to the same lithology. To develop an objective way to pick changes in well log characteristics, we adapt a method of first arrival picking used in seismic data to analyze changes in the character of well logs. We chose to use the fractal method developed by Boschetti et al. [1] (1996). This method worked better than we expected, and we found similar changes in the fractal dimension across very different tool types (sonic vs density vs gamma ray). We reason that the fractal response of the log is not dependent on the physics of the tool response but rather on the change in the complexity of the log data. When a formation changes physical character in time or space, the recorded magnitude in the tool data changes complexity at the same time, even if the original tool response is very different. The relative complexity of the data, regardless of the tool used, is dependent upon the complexity of the medium relative to the tool measurement. The relative complexity of the recorded magnitude data changes as a tool transitions from one character type to another. The character we are measuring is the roughness or complexity of the petrophysical curve. Our method provides a way to directly compare different log types based on a quantitative change in signal complexity. For example, using changes in data complexity allows us to correlate gamma ray suites with sonic logs within a well and then across to an adjacent well with similar signatures. Our method allows reliable and automatic correlations to be made in data sets beyond the reasonable cognitive limits of geoscientists, in both speed and consistent pattern recognition. [1] Boschetti, F., Dentith, M. D., and List, R. D. (1996). A fractal-based algorithm for detecting first arrivals on seismic traces. Geophysics, 61(4), 1095-1102.
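
    The idea of tracking signal complexity can be sketched as a sliding-window fractal-dimension profile along a digitized log curve. The estimator below is Higuchi's, used here only as a stand-in for the Boschetti et al. [1] first-arrival estimator that the authors adapted; the window length, step, and kmax are arbitrary illustrative values, and the synthetic curve simply changes roughness halfway along.

      import numpy as np

      def higuchi_fd(x, kmax=8):
          """Higuchi fractal-dimension estimate of a 1-D series."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          lengths = []
          for k in range(1, kmax + 1):
              lk = []
              for m in range(k):
                  idx = np.arange(m, n, k)
                  if len(idx) < 2:
                      continue
                  lm = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k * k)
                  lk.append(lm)
              lengths.append(np.mean(lk))
          k = np.arange(1, kmax + 1)
          slope, _ = np.polyfit(np.log(1.0 / k), np.log(lengths), 1)
          return slope

      def complexity_profile(log_curve, win=51, step=10):
          """Sliding-window fractal dimension; boundaries show up as jumps."""
          centers, fd = [], []
          for start in range(0, len(log_curve) - win, step):
              centers.append(start + win // 2)
              fd.append(higuchi_fd(log_curve[start:start + win]))
          return np.array(centers), np.array(fd)

      rng = np.random.default_rng(1)
      smooth_zone = np.sin(np.linspace(0, 30, 1000))
      noisy_zone = smooth_zone + 0.5 * rng.normal(size=1000)
      curve = np.concatenate([smooth_zone, noisy_zone])
      centers, fd = complexity_profile(curve)
      print(round(fd.min(), 2), round(fd.max(), 2))  # low FD in the smooth zone, high in the noisy zone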

  5. Hazard Management with DOORS: Rail Infrastructure Projects

    NASA Astrophysics Data System (ADS)

    Hughes, Dave; Saeed, Amer

    LOI is a major rail infrastructure project that will contribute to a modernised transport system in time for the 2012 Olympic Games. A review of the procedures and tool infrastructure was conducted in early 2006, coinciding with a planned move to main works. A hazard log support tool was needed to provide an automatic audit trail, version control, and support for collaborative working. A DOORS-based Hazard Log (DHL) was selected as the Tool Strategy. A systematic approach was followed for the development of DHL; after a series of tests and acceptance gateways, DHL was handed over to the project in autumn 2006. The first few months were used for operational trials, and the Hazard Management Procedure was modified to a hybrid approach that used the strengths of DHL and Excel. The user experience in the deployment of DHL is summarised and directions for future improvement identified.

  6. Internal respiratory surrogate in multislice 4D CT using a combination of Fourier transform and anatomical features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hui, Cheukkai; Suh, Yelin; Robertson, Daniel

    Purpose: The purpose of this study was to develop a novel algorithm to create a robust internal respiratory signal (IRS) for retrospective sorting of four-dimensional (4D) computed tomography (CT) images. Methods: The proposed algorithm combines information from the Fourier transform of the CT images and from internal anatomical features to form the IRS. The algorithm first extracts potential respiratory signals from low-frequency components in the Fourier space and selected anatomical features in the image space. A clustering algorithm then constructs groups of potential respiratory signals with similar temporal oscillation patterns. The clustered group with the largest number of similar signals is chosen to form the final IRS. To evaluate the performance of the proposed algorithm, the IRS was computed and compared with the external respiratory signal from the real-time position management (RPM) system on 80 patients. Results: In 72 (90%) of the 4D CT data sets tested, the IRS computed by the authors’ proposed algorithm matched with the RPM signal based on their normalized cross correlation. For these data sets with matching respiratory signals, the average difference between the end inspiration times (Δt_ins) in the IRS and RPM signal was 0.11 s, and only 2.1% of Δt_ins were more than 0.5 s apart. In the eight (10%) 4D CT data sets in which the IRS and the RPM signal did not match, the average Δt_ins was 0.73 s in the nonmatching couch positions, and 35.4% of them had a Δt_ins greater than 0.5 s. At couch positions in which IRS did not match the RPM signal, a correlation-based metric indicated poorer matching of neighboring couch positions in the RPM-sorted images. This implied that, when IRS did not match the RPM signal, the images sorted using the IRS showed fewer artifacts than the clinical images sorted using the RPM signal. Conclusions: The authors’ proposed algorithm can generate robust IRSs that can be used for retrospective sorting of 4D CT data. The algorithm is completely automatic and requires very little processing time. The algorithm is cost efficient and can be easily adopted for everyday clinical use.

  7. Fine-scale habitat use by orang-utans in a disturbed peat swamp forest, central Kalimantan, and implications for conservation management.

    PubMed

    Morrogh-Bernard, Helen C; Husson, Simon J; Harsanto, Fransiskus A; Chivers, David J

    2014-01-01

    This study was conducted to see how orang-utans (Pongo pygmaeus wurmbii) were coping with fine-scale habitat disturbance in a selectively logged peat swamp forest in Central Kalimantan, Borneo. Seven habitat classes were defined, and orang-utans were found to use all of these, but were selective in their preference for certain classes over others. Overall, the tall forest classes (≥20 m) were preferred. They were preferred for feeding, irrespective of canopy connectivity, whereas classes with a connected canopy (canopy cover ≥75%), irrespective of canopy height, were preferred for resting and nesting, suggesting that tall trees are preferred for feeding and connected canopy for security and protection. The smaller forest classes (≤10 m high) were least preferred and were used mainly for travelling from patch to patch. Thus, selective logging is demonstrated here to be compatible with orang-utan survival as long as large food trees and patches of primary forest remain. Logged forest, therefore, should not automatically be designated as 'degraded'. These findings have important implications for forest management, forest classification and the designation of protected areas for orang-utan conservation.

  8. Colour image compression by grey to colour conversion

    NASA Astrophysics Data System (ADS)

    Drew, Mark S.; Finlayson, Graham D.; Jindal, Abhilash

    2011-03-01

    Instead of de-correlating image luminance from chrominance, some use has been made of the correlation between the luminance component of an image and its chromatic components, or the correlation between colour components, for colour image compression. In one approach, the Green colour channel was taken as a base, and the other colour channels or their DCT subbands were approximated as polynomial functions of the base inside image windows. This paper points out that we can do better if we introduce an addressing scheme into the image description such that similar colours are grouped together spatially. With a Luminance component base, we test several colour spaces and rearrangement schemes, including segmentation, and settle on a log-geometric-mean colour space. Along with PSNR versus bits-per-pixel, we found that spatially-keyed s-CIELAB colour error better identifies problem regions. Instead of segmentation, we found that rearranging on sorted chromatic components has almost equal performance and better compression. Here, we sort on each of the chromatic components and separately encode windows of each. The result consists of the original greyscale plane plus the polynomial coefficients of windows of rearranged chromatic values, which are then quantized. The simplicity of the method produces a fast and simple scheme for colour image and video compression, with excellent results.

  9. Application of automatic gain control for radiometer diagnostic in SST-1 tokamak.

    PubMed

    Makwana, Foram R; Siju, Varsha; Edappala, Praveenlal; Pathak, S K

    2017-12-01

    This paper describes the characterisation of a negative feedback type of automatic gain control (AGC) circuit that will be an integral part of the heterodyne radiometer system operating over a frequency range of 75-86 GHz on the SST-1 tokamak. The developed AGC circuit is a combination of a variable gain amplifier and a log amplifier which provides both gain and attenuation, typically up to 15 dB and 45 dB, respectively, at a fixed set point voltage; it has been explored for the first time in a tokamak radiometry application. The other important characteristics are that it exhibits a very fast response time of 390 ns, allowing the fast dynamics of electron cyclotron emission to be followed, and can operate over a very wide input RF power dynamic range of around 60 dB, which ensures that the signal level stays within the dynamic range of the detection system.

  10. A fully-automatic fast segmentation of the sub-basal layer nerves in corneal images.

    PubMed

    Guimarães, Pedro; Wigdahl, Jeff; Poletti, Enea; Ruggeri, Alfredo

    2014-01-01

    Corneal nerve changes have been linked to damage caused by surgical interventions or prolonged contact lens wear. Furthermore, nerve tortuosity has been shown to correlate with the severity of diabetic neuropathy. For these reasons there has been an increasing interest in the analysis of these structures. In this work we propose a novel, robust, and fast fully automatic algorithm capable of tracing the sub-basal plexus nerves from human corneal confocal images. We resort to log-Gabor filters and support vector machines to trace the corneal nerves. The proposed algorithm traced most of the corneal nerves correctly (sensitivity of 0.88 ± 0.06 and false discovery rate of 0.08 ± 0.06). The displayed performance is comparable to a human grader. We believe that the achieved processing time (0.661 ± 0.07 s) and tracing quality are major advantages for the daily clinical practice.

  11. The Seven-Segment Data Logger

    NASA Astrophysics Data System (ADS)

    Bates, Alan

    2015-12-01

    Instruments or digital meters with data values visible on a seven-segment display can easily be found in the physics lab. Examples include multimeters, sound level meters, Geiger-Müller counters and electromagnetic field meters, where the display is used to show numerical data. Such instruments, without the ability to connect to computers or data loggers, can measure and display data at a particular instant in time. The user should be present to read the display and to record the data. Unlike these digital meters, the sensor-data logger system has the advantage of automatically measuring and recording data at selectable sample rates over a desired sample time. The process of adding data logging features to a digital meter with a seven-segment display can be achieved with Seven Segment Optical Character Recognition (SSOCR) software. One might ask, why not just purchase a field meter with data logging features? They are relatively inexpensive, reliable, available online, and can be delivered within a few days. But then there is the challenge of making your own instrument, the excitement of implementing a design, the pleasure of experiencing an entire process from concept to product, and the satisfaction of avoiding costs by taking advantage of available technology. This experiment makes use of an electromagnetic field meter with a seven-segment liquid crystal display to measure background electromagnetic field intensity. Images of the meter display are automatically captured with a camera and analyzed using SSOCR to produce a text file containing meter display values.
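
    The decoding half of SSOCR reduces to a lookup from segment on/off states to characters, as in the sketch below. Deciding whether each segment is lit (for example by sampling pixel intensity at seven probe points inside a digit's bounding box) is the image-processing half and is assumed to happen upstream; the segment ordering chosen here is an arbitrary convention, not that of any particular SSOCR implementation.

      # Segment order: (top, top-right, bottom-right, bottom, bottom-left, top-left, middle)
      SEGMENT_MAP = {
          (1, 1, 1, 1, 1, 1, 0): "0", (0, 1, 1, 0, 0, 0, 0): "1",
          (1, 1, 0, 1, 1, 0, 1): "2", (1, 1, 1, 1, 0, 0, 1): "3",
          (0, 1, 1, 0, 0, 1, 1): "4", (1, 0, 1, 1, 0, 1, 1): "5",
          (1, 0, 1, 1, 1, 1, 1): "6", (1, 1, 1, 0, 0, 0, 0): "7",
          (1, 1, 1, 1, 1, 1, 1): "8", (1, 1, 1, 1, 0, 1, 1): "9",
      }

      def decode_digit(on_flags):
          """Map the on/off state of the seven segments to a character;
          returns '?' for patterns that match no digit."""
          return SEGMENT_MAP.get(tuple(int(bool(f)) for f in on_flags), "?")

      print(decode_digit([1, 1, 0, 1, 1, 0, 1]))  # -> "2"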

  12. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the Restful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a Restful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on-demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.

  13. Automatic lithofacies segmentation from well-logs data. A comparative study between the Self-Organizing Map (SOM) and Walsh transform

    NASA Astrophysics Data System (ADS)

    Aliouane, Leila; Ouadfeul, Sid-Ali; Rabhi, Abdessalem; Rouina, Fouzi; Benaissa, Zahia; Boudella, Amar

    2013-04-01

    The main goal of this work is to realize a comparison between two lithofacies segmentation techniques for reservoir intervals. The first one is based on Kohonen's Self-Organizing Map (SOM) neural network. The second technique is based on the Walsh transform decomposition. Application to real well-log data from two boreholes located in the Algerian Sahara shows that the Self-Organizing Map is able to provide more lithological detail than the lithofacies model obtained by the Walsh decomposition. Keywords: Comparison, Lithofacies, SOM, Walsh. References: 1) Aliouane, L., Ouadfeul, S., Boudella, A., 2011, Fractal analysis based on the continuous wavelet transform and lithofacies classification from well-logs data using the self-organizing map neural network, Arabian Journal of Geosciences, doi: 10.1007/s12517-011-0459-4. 2) Aliouane, L., Ouadfeul, S., Djarfour, N., Boudella, A., 2012, Petrophysical Parameters Estimation from Well-Logs Data Using Multilayer Perceptron and Radial Basis Function Neural Networks, Lecture Notes in Computer Science, Volume 7667, pp 730-736, doi: 10.1007/978-3-642-34500-5_86. 3) Ouadfeul, S. and Aliouane, L., 2011, Multifractal analysis revisited by the continuous wavelet transform applied in lithofacies segmentation from well-logs data, International Journal of Applied Physics and Mathematics, Vol. 01, No. 01. 4) Ouadfeul, S., Aliouane, L., 2012, Lithofacies Classification Using the Multilayer Perceptron and the Self-organizing Neural Networks, Lecture Notes in Computer Science, Volume 7667, pp 737-744, doi: 10.1007/978-3-642-34500-5_87. 5) Weisstein, Eric W. "Fast Walsh Transform." From MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/FastWalshTransform.html

  14. A hybrid framework for reservoir characterization using fuzzy ranking and an artificial neural network

    NASA Astrophysics Data System (ADS)

    Wang, Baijie; Wang, Xin; Chen, Zhangxin

    2013-08-01

    Reservoir characterization refers to the process of quantitatively assigning reservoir properties using all available field data. Artificial neural networks (ANN) have recently been introduced to solve reservoir characterization problems dealing with the complex underlying relationships inherent in well log data. Despite the utility of ANNs, the current limitation is that most existing applications simply focus on directly implementing existing ANN models instead of improving/customizing them to fit the specific reservoir characterization tasks at hand. In this paper, we propose a novel intelligent framework that integrates fuzzy ranking (FR) and multilayer perceptron (MLP) neural networks for reservoir characterization. FR can automatically identify a minimum subset of well log data as neural inputs, and the MLP is trained to learn the complex correlations from the selected well log data to a target reservoir property. FR guarantees the selection of the optimal subset of representative data from the overall well log data set for the characterization of a specific reservoir property, which implicitly improves the modeling and prediction accuracy of the MLP. In addition, a growing number of industrial agencies are implementing geographic information systems (GIS) in field data management; we have therefore designed GFAR (GIS-based FR ANN Reservoir characterization solution), a system that integrates the proposed framework into a GIS to provide an efficient characterization solution. Three separate petroleum wells from southwestern Alberta, Canada, were used in the presented case study of reservoir porosity characterization. Our experiments demonstrate that our method can generate reliable results.
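
    A minimal sketch of the two-stage idea, feature ranking followed by an MLP, is given below. Mutual information is used here as a simple stand-in for the paper's fuzzy-ranking step, and the feature names and data handling are assumptions made for illustration.

      # Rank candidate well logs by relevance to the target property, keep the
      # top-k, then train an MLP on that subset (mutual information stands in
      # for fuzzy ranking; not the authors' exact method).
      import numpy as np
      from sklearn.feature_selection import mutual_info_regression
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      def select_and_train(X, y, feature_names, k=4):
          scores = mutual_info_regression(X, y, random_state=0)
          top = np.argsort(scores)[::-1][:k]              # indices of the k most informative logs
          model = make_pipeline(StandardScaler(),
                                MLPRegressor(hidden_layer_sizes=(16, 8),
                                             max_iter=2000, random_state=0))
          model.fit(X[:, top], y)
          return model, [feature_names[i] for i in top]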

  15. Improving Website Hyperlink Structure Using Server Logs

    PubMed Central

    Paranjape, Ashwin; West, Robert; Zia, Leila; Leskovec, Jure

    2016-01-01

    Good websites should be easy to navigate via hyperlinks, yet maintaining a high-quality link structure is difficult. Identifying pairs of pages that should be linked may be hard for human editors, especially if the site is large and changes frequently. Further, given a set of useful link candidates, the task of incorporating them into the site can be expensive, since it typically involves humans editing pages. In the light of these challenges, it is desirable to develop data-driven methods for automating the link placement task. Here we develop an approach for automatically finding useful hyperlinks to add to a website. We show that passively collected server logs, beyond telling us which existing links are useful, also contain implicit signals indicating which nonexistent links would be useful if they were to be introduced. We leverage these signals to model the future usefulness of yet nonexistent links. Based on our model, we define the problem of link placement under budget constraints and propose an efficient algorithm for solving it. We demonstrate the effectiveness of our approach by evaluating it on Wikipedia, a large website for which we have access to both server logs (used for finding useful new links) and the complete revision history (containing a ground truth of new links). As our method is based exclusively on standard server logs, it may also be applied to any other website, as we show with the example of the biomedical research site Simtk. PMID:28345077
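
    The following toy sketch illustrates one simple way to select links under a budget: rank candidate links by an estimated usefulness score (e.g., predicted clickthrough) and add them greedily, capping the number of new links per source page. It is a simplified stand-in for the paper's budgeted link-placement algorithm, not a reproduction of it.

      # Greedy budgeted link placement (illustrative simplification).
      from collections import defaultdict

      def place_links(candidates, budget, per_page=3):
          """candidates: list of (source_page, target_page, estimated_value)."""
          chosen, used = [], defaultdict(int)
          for src, dst, value in sorted(candidates, key=lambda c: c[2], reverse=True):
              if len(chosen) >= budget:
                  break
              if used[src] < per_page:        # avoid crowding any single source page
                  chosen.append((src, dst, value))
                  used[src] += 1
          return chosen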

  16. Automatic cell cloning assay for determining the clonogenic capacity of cancer and cancer stem-like cells.

    PubMed

    Fedr, Radek; Pernicová, Zuzana; Slabáková, Eva; Straková, Nicol; Bouchal, Jan; Grepl, Michal; Kozubík, Alois; Souček, Karel

    2013-05-01

    The clonogenic assay is a well-established in vitro method for testing the survival and proliferative capability of cells. It can be used to determine the cytotoxic effects of various treatments including chemotherapeutics and ionizing radiation. However, this approach can also characterize cells with different phenotypes and biological properties, such as stem cells or cancer stem cells. In this study, we implemented a faster and more precise method for assessing the cloning efficiency of cancer stem-like cells that were characterized and separated using a high-speed cell sorter. Cell plating onto a microplate using an automatic cell deposition unit was performed in a single-cell or dilution rank mode by the fluorescence-activated cell sorting method. We tested the new automatic cell-cloning assay (ACCA) on selected cancer cell lines and compared it with the manual approach. The obtained results were also compared with the results of the limiting dilution assay for different cell lines. We applied the ACCA to analyze the cloning capacity of different subpopulations of prostate and colon cancer cells based on the expression of the characteristic markers of stem (CD44 and CD133) and cancer stem cells (TROP-2, CD49f, and CD44). Our results revealed that the novel ACCA is a straightforward approach for determining the clonogenic capacity of cancer stem-like cells identified in both cell lines and patient samples. Copyright © 2013 International Society for Advancement of Cytometry.

  17. Wavelet versus detrended fluctuation analysis of multifractal structures

    NASA Astrophysics Data System (ADS)

    Oświȩcimka, Paweł; Kwapień, Jarosław; Drożdż, Stanisław

    2006-07-01

    We perform a comparative study of the applicability of the multifractal detrended fluctuation analysis (MFDFA) and the wavelet transform modulus maxima (WTMM) method in properly detecting the monofractal or multifractal character of data. We quantify the performance of both methods by using different sorts of artificial signals generated according to a few well-known exactly soluble mathematical models: monofractal fractional Brownian motion, bifractal Lévy flights, and different sorts of multifractal binomial cascades. Our results show that in the majority of situations in which one does not know a priori the fractal properties of a process, choosing MFDFA should be recommended. In particular, WTMM gives biased outcomes for the fractional Brownian motion with different values of the Hurst exponent, indicating spurious multifractality. In some cases WTMM can also give different results if one applies different wavelets. We do not exclude using WTMM in real data analysis, but it turns out that while one may apply MFDFA in a more automatic fashion, WTMM must be applied with care. In the second part of our work, we perform an analogous analysis on empirical data coming from the American and German stock markets. For these data, both methods detect rich multifractality in terms of a broad f(α), but MFDFA suggests that this multifractality is poorer than in the case of WTMM.
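
    For readers who want to try the MFDFA side of this comparison, the sketch below implements a compact, forward-window-only version: integrate the signal, detrend it in non-overlapping windows of size s, form the q-th order fluctuation function F_q(s), and read the generalized Hurst exponent h(q) from the slope of log F_q(s) versus log s. The scales, moments, and detrending order are illustrative choices.

      # Compact MFDFA sketch (forward windows only, polynomial detrending).
      import numpy as np

      def mfdfa(x, scales, qs, order=1):
          profile = np.cumsum(x - np.mean(x))
          fq = np.zeros((len(qs), len(scales)))
          for j, s in enumerate(scales):
              n_seg = len(profile) // s
              t = np.arange(s)
              f2 = np.empty(n_seg)
              for v in range(n_seg):
                  seg = profile[v * s:(v + 1) * s]
                  coef = np.polyfit(t, seg, order)            # local polynomial trend
                  f2[v] = np.mean((seg - np.polyval(coef, t)) ** 2)
              for i, q in enumerate(qs):
                  if q == 0:
                      fq[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
                  else:
                      fq[i, j] = np.mean(f2 ** (q / 2)) ** (1 / q)
          # slope of log F_q(s) vs log s for each q; monofractal data give a
          # roughly constant h(q), multifractal data a q-dependent h(q)
          return {q: np.polyfit(np.log(scales), np.log(fq[i]), 1)[0]
                  for i, q in enumerate(qs)}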

  18. Image Classification of Ribbed Smoked Sheet using Learning Vector Quantization

    NASA Astrophysics Data System (ADS)

    Rahmat, R. F.; Pulungan, A. F.; Faza, S.; Budiarto, R.

    2017-01-01

    Natural rubber is an important export commodity in Indonesia, which can be a major contributor to national economic development. One type of rubber prepared for export is Ribbed Smoked Sheet (RSS). The quantity of RSS exports depends on the quality of RSS. RSS rubber quality grades are specified in SNI 06-001-1987 and the International Standards of Quality and Packing for Natural Rubber Grades (The Green Book). The determination of RSS quality is also known as the sorting process. In rubber factories, the sorting process is still done manually by inspecting the level of air bubbles on the surface of the rubber sheet with the naked eye, so the results are subjective and inconsistent. Therefore, a method is required to classify RSS rubber automatically and precisely. We propose image processing techniques for pre-processing, a zoning method for feature extraction, and the Learning Vector Quantization (LVQ) method for classifying RSS rubber into two grades, namely RSS1 and RSS3. We used 120 RSS images as the training dataset and 60 RSS images as the testing dataset. The results show that our proposed method achieves 89% accuracy, with the best performance reached at the fifteenth training epoch.
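
    A minimal LVQ1 sketch is given below: prototypes are initialized from labelled samples and then pulled toward correctly matched samples and pushed away from mismatches. The feature vectors (e.g., zoning features per sheet image), prototype counts, and learning schedule are illustrative assumptions rather than the authors' exact setup.

      # Minimal LVQ1 for a two-grade classification task (illustrative).
      import numpy as np

      def train_lvq(X, y, prototypes_per_class=2, lr=0.05, epochs=30, seed=0):
          rng = np.random.default_rng(seed)
          protos, labels = [], []
          for c in np.unique(y):
              # assumes each class has at least `prototypes_per_class` samples
              idx = rng.choice(np.where(y == c)[0], prototypes_per_class, replace=False)
              protos.append(X[idx].copy())
              labels.extend([c] * prototypes_per_class)
          protos, labels = np.vstack(protos), np.array(labels)
          for _ in range(epochs):
              for i in rng.permutation(len(X)):
                  w = np.argmin(((protos - X[i]) ** 2).sum(axis=1))   # nearest prototype
                  sign = 1.0 if labels[w] == y[i] else -1.0           # attract if correct, repel otherwise
                  protos[w] += sign * lr * (X[i] - protos[w])
          return protos, labels

      def predict_lvq(X, protos, labels):
          return labels[np.argmin(((X[:, None] - protos[None]) ** 2).sum(-1), axis=1)]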

  19. A graphics package for meteorological data, version 1.5

    NASA Technical Reports Server (NTRS)

    Moorthi, Shrinivas; Suarez, Max; Phillips, Bill; Schemm, Jae-Kyung; Schubert, Siegfried

    1989-01-01

    A plotting package has been developed to simplify the task of plotting meteorological data. The calling sequences and examples of high level yet flexible routines which allow contouring, vectors and shading of cylindrical, polar, orthographic and Mollweide (egg) projections are given. Routines are also included for contouring pressure-latitude and pressure-longitude fields with linear or log scales in pressure (interpolation to fixed grid interval is done automatically). Also included is a fairly general line plotting routine. The present version (1.5) produces plots on WMS laser printers and uses graphics primitives from WOLFPLOT.

  20. The development of a high-throughput measurement method of octanol/water distribution coefficient based on hollow fiber membrane solvent microextraction technique.

    PubMed

    Bao, James J; Liu, Xiaojing; Zhang, Yong; Li, Youxin

    2014-09-15

    This paper describes the development of a novel high-throughput hollow fiber membrane solvent microextraction technique for the simultaneous measurement of the octanol/water distribution coefficient (logD) for organic compounds such as drugs. The method is based on a designed system, which consists of a 96-well plate modified with 96 hollow fiber membrane tubes and a matching lid with 96 center holes and 96 side holes distributed over 96 grid positions. Each center hole holds a glued-in hollow fiber membrane tube, sealed at one end, which separates the aqueous phase from the octanol phase. A needle, such as a microsyringe or an automatic sampler, can be inserted directly into the membrane tube to deposit octanol as the acceptor phase or to withdraw the mixture of octanol and drug. Each side hole is filled with the aqueous phase and allows solvent serving as the donor phase to be freely added or removed from the outside of the hollow fiber membranes. The logD can be calculated by measuring the drug concentration in each phase after extraction equilibrium is reached. After a comprehensive comparison, a polytetrafluoroethylene hollow fiber with a thickness of 210 μm, an extraction time of 300 min, a temperature of 25 °C, and atmospheric pressure without stirring were selected for the high-throughput measurement. The correlation coefficient of the linear fit of the logD values of five drugs determined by our system to reference values is 0.9954, showing good accuracy. The -8.9% intra-day and -4.4% inter-day deviations of logD for metronidazole indicate good precision. In addition, the logD values of eight drugs were simultaneously and successfully measured, which indicates that the 96-well high-throughput logD measurement method is accurate, precise, reliable, and useful for high-throughput screening. Copyright © 2014 Elsevier B.V. All rights reserved.
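
    Once the equilibrium drug concentrations in the octanol acceptor phase and the aqueous donor phase have been measured, logD follows directly from their ratio, as in the toy calculation below; the concentration values are made-up numbers for illustration only.

      # logD = log10(C_octanol / C_water) at equilibrium (illustrative values).
      import math

      def log_d(conc_octanol: float, conc_water: float) -> float:
          return math.log10(conc_octanol / conc_water)

      print(round(log_d(0.42, 0.013), 2))  # ~1.51 for these made-up concentrations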

  1. Automated nodule location and size estimation using a multi-scale Laplacian of Gaussian filtering approach.

    PubMed

    Jirapatnakul, Artit C; Fotin, Sergei V; Reeves, Anthony P; Biancardi, Alberto M; Yankelevitz, David F; Henschke, Claudia I

    2009-01-01

    Estimation of nodule location and size is an important pre-processing step in some nodule segmentation algorithms to determine the size and location of the region of interest. Ideally, such estimation methods will consistently find the same nodule location regardless of where the seed point (provided either manually or by a nodule detection algorithm) is placed relative to the "true" center of the nodule, and the size should be a reasonable estimate of the true nodule size. We developed a method that estimates nodule location and size using multi-scale Laplacian of Gaussian (LoG) filtering. Nodule candidates near a given seed point are found by searching for blob-like regions with high filter response. The candidates are then pruned according to filter response and location, and the remaining candidates are sorted by size and the largest candidate is selected. This method was compared to a previously published template-based method. The methods were evaluated on the basis of stability of the estimated nodule location to changes in the initial seed point and how well the size estimates agreed with volumes determined by a semi-automated nodule segmentation method. The LoG method exhibited better stability to changes in the seed point, with 93% of nodules having the same estimated location even when the seed point was altered, compared to only 52% of nodules for the template-based method. Both methods also showed good agreement with sizes determined by a nodule segmentation method, with an average relative size difference of 5% and -5% for the LoG and template-based methods, respectively.
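
    The sketch below shows the core of a multi-scale LoG search around a seed point using SciPy: compute the scale-normalized response -σ²·LoG over a range of sigmas within a region of interest and keep the strongest blob response, from which a radius estimate follows (r ≈ σ√3 for a 3-D Gaussian blob). The sub-volume size, seed handling, and sigma range are illustrative; this is not the authors' full candidate-pruning pipeline.

      # Multi-scale LoG blob search around a seed point (illustrative).
      import numpy as np
      from scipy.ndimage import gaussian_laplace

      def log_blob_near_seed(volume, seed, sigmas, radius=10):
          z, y, x = seed
          sl = tuple(slice(max(c - radius, 0), c + radius + 1) for c in (z, y, x))
          roi = volume[sl].astype(float)
          best = None
          for sigma in sigmas:
              # scale-normalized response; bright blobs give strongly negative LoG,
              # so negate it to make blob centers the maxima
              response = -(sigma ** 2) * gaussian_laplace(roi, sigma)
              idx = np.unravel_index(np.argmax(response), response.shape)
              if best is None or response[idx] > best[0]:
                  best = (response[idx], sigma, idx)
          strength, sigma, idx = best
          est_radius = sigma * np.sqrt(3)   # radius estimate at the peak LoG scale
          return strength, est_radius, idx

      # Example usage: log_blob_near_seed(ct_volume, (40, 120, 98), np.linspace(1.5, 6.0, 10))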

  2. Statistical Significance of Periodicity and Log-Periodicity with Heavy-Tailed Correlated Noise

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    We estimate the probability that random noise, of several plausible standard distributions, creates a false alarm that a periodicity (or log-periodicity) is found in a time series. The solution of this problem is already known for independent Gaussian distributed noise. We investigate more general situations with non-Gaussian correlated noises and present synthetic tests on the detectability and statistical significance of periodic components. A periodic component of a time series is usually detected by some sort of Fourier analysis. Here, we use the Lomb periodogram analysis, which is suitable and outperforms Fourier transforms for unevenly sampled time series. We examine the false-alarm probability of the largest spectral peak of the Lomb periodogram in the presence of power-law distributed noises, of short-range and of long-range fractional-Gaussian noises. Increasing heavy-tailness (respectively correlations describing persistence) tends to decrease (respectively increase) the false-alarm probability of finding a large spurious Lomb peak. Increasing anti-persistence tends to decrease the false-alarm probability. We also study the interplay between heavy-tailness and long-range correlations. In order to fully determine if a Lomb peak signals a genuine rather than a spurious periodicity, one should in principle characterize the Lomb peak height, its width and its relations to other peaks in the complete spectrum. As a step towards this full characterization, we construct the joint-distribution of the frequency position (relative to other peaks) and of the height of the highest peak of the power spectrum. We also provide the distributions of the ratio of the highest Lomb peak to the second highest one. Using the insight obtained by the present statistical study, we re-examine previously reported claims of ``log-periodicity'' and find that the credibility for log-periodicity in 2D-freely decaying turbulence is weakened while it is strengthened for fracture, for the ion-signature prior to the Kobe earthquake and for financial markets.
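
    As a small practical companion to this record, the snippet below locates the highest Lomb periodogram peak of an unevenly sampled series with SciPy; the synthetic data and frequency grid are illustrative. Estimating the peak's false-alarm probability under heavy-tailed or correlated noise, the actual subject of the study, would then typically be done by Monte Carlo simulation over surrogate noise series.

      # Highest Lomb periodogram peak of an unevenly sampled series (illustrative).
      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0, 100, 300))              # uneven sampling times
      y = np.sin(2 * np.pi * 0.2 * t) + rng.standard_normal(300)
      y -= y.mean()

      freqs = np.linspace(0.01, 1.0, 2000)               # cycles per unit time
      power = lombscargle(t, y, 2 * np.pi * freqs)       # lombscargle expects angular frequencies
      peak = freqs[np.argmax(power)]
      print(f"highest Lomb peak at {peak:.3f} cycles/unit")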

  3. The Luminosity Function of Star Clusters in 20 Star-Forming Galaxies Based on Hubble Legacy Archive Photometry

    NASA Astrophysics Data System (ADS)

    Bowers, Ariel; Whitmore, B. C.; Chandar, R.; Larsen, S. S.

    2014-01-01

    Luminosity functions have been determined for star cluster populations in 20 nearby (4-30 Mpc) star-forming galaxies based on ACS source lists generated by the Hubble Legacy Archive (http://hla.stsci.edu). These cluster catalogs provide one of the largest sets of uniform, automatically-generated cluster candidates available in the literature at present. Comparisons are made with other recently generated cluster catalogs, demonstrating that the HLA-generated catalogs are of similar quality, but in general do not go as deep. A typical cluster luminosity function can be approximated by a power law, dN/dL ∝ L^α, with an average value for α of -2.37 and rms scatter = 0.18. A comparison of fitting results based on methods which use binned and unbinned data shows good agreement, although there may be a systematic tendency for the unbinned (maximum-likelihood) method to give slightly more negative values of α for galaxies with steeper luminosity functions. Our uniform database results in a small scatter (0.5 magnitude) in the correlation between the magnitude of the brightest cluster (M_brightest) and the log of the number of clusters brighter than M_I = -9 (log N). We also examine the magnitude of the brightest cluster vs. log SFR for a sample including LIRGS and ULIRGS.

  4. FAMA: An automatic code for stellar parameter and abundance determination

    NASA Astrophysics Data System (ADS)

    Magrini, Laura; Randich, Sofia; Friel, Eileen; Spina, Lorenzo; Jacobson, Heather; Cantat-Gaudin, Tristan; Donati, Paolo; Baglioni, Roberto; Maiorca, Enrico; Bragaglia, Angela; Sordo, Rosanna; Vallenari, Antonella

    2013-10-01

    Context. The large amount of spectra obtained during the epoch of extensive spectroscopic surveys of Galactic stars needs the development of automatic procedures to derive their atmospheric parameters and individual element abundances. Aims: Starting from the widely-used code MOOG by C. Sneden, we have developed a new procedure to determine atmospheric parameters and abundances in a fully automatic way. The code FAMA (Fast Automatic MOOG Analysis) is presented describing its approach to derive atmospheric stellar parameters and element abundances. The code, freely distributed, is written in Perl and can be used on different platforms. Methods: The aim of FAMA is to render the computation of the atmospheric parameters and abundances of a large number of stars using measurements of equivalent widths (EWs) as automatic and as independent of any subjective approach as possible. It is based on the simultaneous search for three equilibria: excitation equilibrium, ionization balance, and the relationship between log n(Fe i) and the reduced EWs. FAMA also evaluates the statistical errors on individual element abundances and errors due to the uncertainties in the stellar parameters. The convergence criteria are not fixed "a priori" but are based on the quality of the spectra. Results: In this paper we present tests performed on the solar spectrum EWs that assess the method's dependency on the initial parameters and we analyze a sample of stars observed in Galactic open and globular clusters. The current version of FAMA is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/558/A38

  5. Stream-based Hebbian eigenfilter for real-time neuronal spike discrimination

    PubMed Central

    2012-01-01

    Background: Principal component analysis (PCA) has been widely employed for automatic neuronal spike sorting. Calculating principal components (PCs) is computationally expensive, and requires complex numerical operations and large memory resources. Substantial hardware resources are therefore needed for hardware implementations of PCA. The general Hebbian algorithm (GHA) has been proposed for calculating PCs of neuronal spikes in our previous work, which eliminates the need for computationally expensive covariance analysis and eigenvalue decomposition in conventional PCA algorithms. However, large memory resources are still inherently required for storing a large volume of aligned spikes for training PCs. Such a large memory consumes substantial hardware resources and contributes significant power dissipation, making GHA difficult to implement in portable or implantable multi-channel recording micro-systems. Method: In this paper, we present a new algorithm for PCA-based spike sorting based on GHA, namely the stream-based Hebbian eigenfilter, which eliminates the inherent memory requirements of GHA while keeping the accuracy of spike sorting by utilizing the pseudo-stationarity of neuronal spikes. Because of the reduction of large hardware storage requirements, the proposed algorithm can lead to ultra-low hardware resources and power consumption of hardware implementations, which is critical for future multi-channel micro-systems. Both clinical and synthetic neural recording data sets were employed for evaluating the accuracy of the stream-based Hebbian eigenfilter. The performance of spike sorting using the stream-based eigenfilter and the computational complexity of the eigenfilter were rigorously evaluated and compared with conventional PCA algorithms. Field-programmable gate arrays (FPGAs) were employed to implement the proposed algorithm, evaluate the hardware implementations and demonstrate the reduction in both power consumption and hardware memories achieved by the streaming computing approach. Results and discussion: Results demonstrate that the stream-based eigenfilter can achieve the same accuracy and is 10 times more computationally efficient when compared with conventional PCA algorithms. Hardware evaluations show that 90.3% logic resources, 95.1% power consumption and 86.8% computing latency can be reduced by the stream-based eigenfilter when compared with PCA hardware. By utilizing the streaming method, 92% memory resources and 67% power consumption can be saved when compared with the direct implementation of GHA. Conclusion: The stream-based Hebbian eigenfilter presents a novel approach to enable real-time spike sorting with reduced computational complexity and hardware costs. This new design can be further utilized for multi-channel neuro-physiological experiments or chronic implants. PMID:22490725
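
    The heart of this approach is the generalized Hebbian (Sanger) update, which learns the leading principal components one spike at a time and therefore needs no stored buffer of aligned spikes. A minimal sketch of that update is shown below; the dimensions and learning rate are illustrative.

      # Streaming generalized Hebbian algorithm (Sanger's rule) update.
      import numpy as np

      def gha_update(W, x, lr=1e-3):
          """One streaming update. W: (n_components, spike_length), x: (spike_length,)."""
          y = W @ x                                   # project the incoming spike
          # Sanger's rule: W += lr * (y x^T - lower_triangular(y y^T) W)
          W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
          return W

      # Usage: initialize W with small random values, call gha_update(W, spike) for
      # every detected/aligned spike; rows of W converge toward the top principal
      # components, and W @ spike gives the feature vector used for clustering.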

  6. Analyzing Medical Image Search Behavior: Semantics and Prediction of Query Results.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Kahn, Charles E; Müller, Henning

    2015-10-01

    Log files of information retrieval systems that record user behavior have been used to improve the outcomes of retrieval systems, understand user behavior, and predict events. In this article, a log file of the ARRS GoldMiner search engine containing 222,005 consecutive queries is analyzed. Time stamps are available for each query, as well as masked IP addresses, which makes it possible to identify queries from the same person. This article describes the ways in which physicians (or Internet searchers interested in medical images) search and proposes potential improvements by suggesting query modifications. For example, many queries contain only a few terms and therefore are not specific; others contain spelling mistakes or non-medical terms that likely lead to poor or empty results. One of the goals of this report is to predict the number of results a query will have, since such a model allows search engines to automatically propose query modifications in order to avoid result lists that are empty or too large. This prediction is made based on characteristics of the query terms themselves. Prediction of empty results has an accuracy above 88%, and thus can be used to automatically modify the query to avoid empty result sets for a user. The semantic analysis and data on reformulations done by users in the past can aid the development of better search systems, particularly to improve results for novice users. Therefore, this paper gives important insights into how people search and how to use this knowledge to improve the performance of specialized medical search engines.

  7. Biosonar adjustments to target range of echolocating bottlenose dolphins (Tursiops sp.) in the wild.

    PubMed

    Jensen, F H; Bejder, L; Wahlberg, M; Madsen, P T

    2009-04-01

    Toothed whales use echolocation to locate and track prey. Most knowledge of toothed whale echolocation stems from studies on trained animals, and little is known about how toothed whales regulate and use their biosonar systems in the wild. Recent research suggests that an automatic gain control mechanism in delphinid biosonars adjusts the biosonar output to the one-way transmission loss to the target, possibly a consequence of pneumatic restrictions in how fast the sound generator can be actuated and still maintain high outputs. This study examines the relationships between target range (R), click intervals, and source levels of wild bottlenose dolphins (Tursiops sp.) by recording regular (non-buzz) echolocation clicks with a linear hydrophone array. Dolphins clicked faster with decreasing distance to the array, reflecting a decreasing delay between the outgoing echolocation click and the returning array echo. However, for interclick intervals longer than 30-40 ms, source levels were not limited by the repetition rate. Thus, pneumatic constraints in the sound-production apparatus cannot account for source level adjustments to range as a possible automatic gain control mechanism for target ranges longer than a few body lengths of the dolphin. Source level estimates drop with reducing range between the echolocating dolphins and the target as a function of 17 log(R). This may indicate either (1) an active form of time-varying gain in the biosonar independent of click intervals or (2) a bias in array recordings towards a 20 log(R) relationship for apparent source levels introduced by a threshold on received click levels included in the analysis.

  8. Identifying the Machine Translation Error Types with the Greatest Impact on Post-editing Effort

    PubMed Central

    Daems, Joke; Vandepitte, Sonia; Hartsuiker, Robert J.; Macken, Lieve

    2017-01-01

    Translation Environment Tools make translators’ work easier by providing them with term lists, translation memories and machine translation output. Ideally, such tools automatically predict whether it is more effortful to post-edit than to translate from scratch, and determine whether or not to provide translators with machine translation output. Current machine translation quality estimation systems heavily rely on automatic metrics, even though they do not accurately capture actual post-editing effort. In addition, these systems do not take translator experience into account, even though novices’ translation processes are different from those of professional translators. In this paper, we report on the impact of machine translation errors on various types of post-editing effort indicators, for professional translators as well as student translators. We compare the impact of MT quality on a product effort indicator (HTER) with that on various process effort indicators. The translation and post-editing process of student translators and professional translators was logged with a combination of keystroke logging and eye-tracking, and the MT output was analyzed with a fine-grained translation quality assessment approach. We find that most post-editing effort indicators (product as well as process) are influenced by machine translation quality, but that different error types affect different post-editing effort indicators, confirming that a more fine-grained MT quality analysis is needed to correctly estimate actual post-editing effort. Coherence, meaning shifts, and structural issues are shown to be good indicators of post-editing effort. The additional impact of experience on these interactions between MT quality and post-editing effort is smaller than expected. PMID:28824482

  9. Identifying the Machine Translation Error Types with the Greatest Impact on Post-editing Effort.

    PubMed

    Daems, Joke; Vandepitte, Sonia; Hartsuiker, Robert J; Macken, Lieve

    2017-01-01

    Translation Environment Tools make translators' work easier by providing them with term lists, translation memories and machine translation output. Ideally, such tools automatically predict whether it is more effortful to post-edit than to translate from scratch, and determine whether or not to provide translators with machine translation output. Current machine translation quality estimation systems heavily rely on automatic metrics, even though they do not accurately capture actual post-editing effort. In addition, these systems do not take translator experience into account, even though novices' translation processes are different from those of professional translators. In this paper, we report on the impact of machine translation errors on various types of post-editing effort indicators, for professional translators as well as student translators. We compare the impact of MT quality on a product effort indicator (HTER) with that on various process effort indicators. The translation and post-editing process of student translators and professional translators was logged with a combination of keystroke logging and eye-tracking, and the MT output was analyzed with a fine-grained translation quality assessment approach. We find that most post-editing effort indicators (product as well as process) are influenced by machine translation quality, but that different error types affect different post-editing effort indicators, confirming that a more fine-grained MT quality analysis is needed to correctly estimate actual post-editing effort. Coherence, meaning shifts, and structural issues are shown to be good indicators of post-editing effort. The additional impact of experience on these interactions between MT quality and post-editing effort is smaller than expected.

  10. An unsupervised method for summarizing egocentric sport videos

    NASA Astrophysics Data System (ADS)

    Habibi Aghdam, Hamed; Jahani Heravi, Elnaz; Puig, Domenec

    2015-12-01

    People are increasingly interested in recording their sport activities using head-worn or hand-held cameras. This type of video, called egocentric sport video, has different motion and appearance patterns compared with life-logging videos. While a life-logging video can be defined in terms of well-defined human-object interactions, it is not trivial to describe egocentric sport videos using well-defined activities. For this reason, summarizing egocentric sport videos based on human-object interaction might fail to produce meaningful results. In this paper, we propose an unsupervised method for summarizing egocentric videos by identifying the key-frames of the video. Our method utilizes both appearance and motion information, and it automatically finds the number of key-frames. Our blind user study on the new dataset collected from YouTube shows that in 93.5% of cases, the users choose the proposed method as their first video summary choice. In addition, our method is within the top 2 choices of the users in 99% of studies.

  11. Automatic method for evaluating the activity of sourdough strains based on gas pressure measurements.

    PubMed

    Wick, M; Vanhoutte, J J; Adhemard, A; Turini, G; Lebeault, J M

    2001-04-01

    A new method is proposed for the evaluation of the activity of sourdough strains, based on gas pressure measurements in closed air-tight reactors. Gas pressure and pH were monitored on-line during the cultivation of commercial yeasts and heterofermentative lactic acid bacteria on a semi-synthetic medium with glucose as the major carbon source. Relative gas pressure evolution was compared both to glucose consumption and to acidification and growth. It became obvious that gas pressure evolution is related to glucose consumption kinetics. For each strain, a correlation was made between maximum gas pressure variation and amount of glucose consumed. The mass balance of CO2 in both liquid and gas phase demonstrated that around 90% of CO2 was recovered. Concerning biomass production, a linear relationship was found between log colony-forming units/ml and log pressure for both yeasts and bacteria during the exponential phase; and for yeasts, relative gas pressure evolution also followed optical density variation.

  12. Random forest models to predict aqueous solubility.

    PubMed

    Palmer, David S; O'Boyle, Noel M; Glen, Robert C; Mitchell, John B O

    2007-01-01

    Random Forest regression (RF), Partial-Least-Squares (PLS) regression, Support Vector Machines (SVM), and Artificial Neural Networks (ANN) were used to develop QSPR models for the prediction of aqueous solubility, based on experimental data for 988 organic molecules. The Random Forest regression model predicted aqueous solubility more accurately than those created by PLS, SVM, and ANN and offered methods for automatic descriptor selection, an assessment of descriptor importance, and an in-parallel measure of predictive ability, all of which serve to recommend its use. The prediction of log molar solubility for an external test set of 330 molecules that are solid at 25 degrees C gave an r^2 = 0.89 and RMSE = 0.69 log S units. For a standard data set selected from the literature, the model performed well with respect to other documented methods. Finally, the diversity of the training and test sets is compared to the chemical space occupied by molecules in the MDL drug data report, on the basis of molecular descriptors selected by the regression analysis.
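
    A generic scikit-learn sketch of such a random-forest log S model is shown below; the descriptor matrix X, target vector y, and descriptor names are assumed to be available, and the split and hyperparameters are illustrative rather than those of the paper.

      # Random-forest QSPR sketch for log molar solubility (illustrative).
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import r2_score, mean_squared_error

      def fit_solubility_rf(X, y, descriptor_names):
          X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
          rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
          rf.fit(X_tr, y_tr)
          pred = rf.predict(X_te)
          rmse = mean_squared_error(y_te, pred) ** 0.5
          # feature_importances_ provides the automatic descriptor-importance ranking
          top = np.argsort(rf.feature_importances_)[::-1][:10]
          return rf, r2_score(y_te, pred), rmse, [descriptor_names[i] for i in top]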

  13. Analyzing coastal environments by means of functional data analysis

    NASA Astrophysics Data System (ADS)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful for describing variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each density function by its mean, sorting, skewness, and kurtosis. The second applied a centered log-ratio (clr) transformation to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
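
    The clr-plus-PCA route mentioned above can be sketched in a few lines: close each grain-size distribution, apply the centred log-ratio transform (with a small pseudo-count for empty bins), reduce with PCA, and cluster the scores. The pseudo-count, component count, and clustering choice below are generic illustrative assumptions, not the authors' exact workflow.

      # clr transform followed by PCA and clustering of the scores (illustrative).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      def clr(compositions, eps=1e-6):
          x = np.asarray(compositions, float) + eps     # pseudo-count for zero bins
          x = x / x.sum(axis=1, keepdims=True)          # closure to constant sum
          logx = np.log(x)
          return logx - logx.mean(axis=1, keepdims=True)

      def pca_cluster(compositions, n_components=3, n_clusters=4):
          scores = PCA(n_components=n_components).fit_transform(clr(compositions))
          labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(scores)
          return scores, labels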

  14. Aircraft- and ground-based assessment of the CCN-AOD relationship and implications on model analysis of ACI and underlying aerosol processes

    NASA Astrophysics Data System (ADS)

    Shinozuka, Y.; Clarke, A. D.; Nenes, A.; Lathem, T. L.; Redemann, J.; Jefferson, A.; Wood, R.

    2014-12-01

    Contrary to common assumptions in satellite-based modeling of aerosol-cloud interactions, ∂logCCN/∂logAOD is less than unity, i.e., the number concentration of cloud condensation nuclei (CCN) less than doubles as aerosol optical depth (AOD) doubles. This can be explained by omnipresent aerosol processes. Condensation, coagulation and cloud processing, for example, generally make particles scatter more light while hardly increasing their number. This paper reports on the relationship in local air masses between CCN concentration, aerosol size distribution and light extinction observed from aircraft and the ground at diverse locations. The CCN-to-local-extinction relationship, when averaged over ~1 km distance and sorted by the wavelength dependence of extinction, varies approximately by a factor of 2, reflecting the variability in aerosol intensive properties. This, together with retrieval uncertainties and the variability in aerosol spatio-temporal distribution and hygroscopic growth, challenges satellite-based CCN estimates. However, the large differences in estimated CCN may correspond to a considerably lower uncertainty in cloud drop number concentration (CDNC), given the sublinear response of CDNC to CCN. Overall, our findings from airborne and ground-based observations call for model-based reexamination of aerosol-cloud interactions and underlying aerosol processes.

  15. NK Cells with KIR2DS2 Immunogenotype Have a Functional Activation Advantage To Efficiently Kill Glioblastoma and Prolong Animal Survival

    PubMed Central

    Gras Navarro, Andrea; Kmiecik, Justyna; Leiss, Lina; Zelkowski, Mateusz; Engelsen, Agnete; Bruserud, Øystein; Zimmer, Jacques; Enger, Per Øyvind

    2014-01-01

    Glioblastomas (GBMs) are lethal brain cancers that are resistant to current therapies. We investigated the cytotoxicity of human allogeneic NK cells against patient-derived GBM in vitro and in vivo, as well as mechanisms mediating their efficacy. We demonstrate that KIR2DS2 immunogenotype NK cells were more potent killers, notwithstanding the absence of inhibitory killer Ig–like receptor (KIR)-HLA ligand mismatch. FACS-sorted and enriched KIR2DS2+ NK cell subpopulations retained significantly high levels of CD69 and CD16 when in contact with GBM cells at a 1:1 ratio and highly expressed CD107a and secreted more soluble CD137 and granzyme A. In contrast, KIR2DS2− immunogenotype donor NK cells were less cytotoxic against GBM and K562, and, similar to FACS-sorted or gated KIR2DS2− NK cells, significantly diminished CD16, CD107a, granzyme A, and CD69 when in contact with GBM cells. Furthermore, NK cell–mediated GBM killing in vitro depended upon the expression of ligands for the activating receptor NKG2D and was partially abrogated by Ab blockade. Treatment of GBM xenografts in NOD/SCID mice with NK cells from a KIR2DS2+ donor lacking inhibitory KIR-HLA ligand mismatch significantly prolonged the median survival to 163 d compared with vehicle controls (log-rank test, p = 0.0001), in contrast to 117.5 d (log-rank test, p = 0.0005) for NK cells with several inhibitory KIR-HLA ligand mismatches but lacking KIR2DS2 genotype. Significantly more CD56+CD16+ NK cells from a KIR2DS2+ donor survived in nontumor-bearing brains 3 wk after infusion compared with KIR2DS2− NK cells, independent of their proliferative capacity. In conclusion, KIR2DS2 identifies potent alloreactive NK cells against GBM that are mediated by commensurate, but dominant, activating signals. PMID:25381437

  16. PACS quality control and automatic problem notifier

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. System failures range from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition through network routing and display to archiving. Whenever possible, the system components perform self-checks and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs are kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected on other equipment used in the diagnostic process.

  17. Worldwide Research, Worldwide Participation: Web-Based Test Logger

    NASA Technical Reports Server (NTRS)

    Clark, David A.

    1998-01-01

    Thanks to the World Wide Web, a new paradigm has been born. ESCORT (steady-state data system) facilities can now be configured to use a Web-based test logger, enabling worldwide participation in tests. NASA Lewis Research Center's new Web-based test logger for ESCORT automatically writes selected test and facility parameters to a browser and allows researchers to insert comments. All data can be viewed in real time via Internet connections, so anyone with a Web browser and the correct URL (uniform resource locator, or Web address) can interactively participate. As the test proceeds and ESCORT data are taken, Web browsers connected to the logger are updated automatically. The use of this logger has demonstrated several benefits. First, researchers are free from manual data entry and are able to focus more on the tests. Second, research logs can be printed in report format immediately after (or during) a test. And finally, all test information is readily available to an international public.

  18. Can Wireless Technology Enable New Diabetes Management Tools?

    PubMed Central

    Hedtke, Paul A.

    2008-01-01

    Mobile computing and communications technology embodied in the modern cell phone device can be employed to improve the lives of diabetes patients by giving them better tools for self-management. Several companies are working on the development of diabetes management tools that leverage the ubiquitous cell phone to bring self-management tools to the hand of the diabetes patient. Integration of blood glucose monitoring (BGM) technology with the cell phone platform adds a level of convenience for the person with diabetes, but, more importantly, allows BGM data to be automatically captured, logged, and processed in near real time in order to provide the diabetes patient with assistance in managing their blood glucose levels. Other automatic measurements can estimate physical activity, and information regarding medication events and food intake can be captured and analyzed in order to provide the diabetes patient with continual assistance in managing their therapy and behaviors in order to improve glycemic control. The path to realization of such solutions is not, however, without obstacles. PMID:19885187

  19. Can wireless technology enable new diabetes management tools?

    PubMed

    Hedtke, Paul A

    2008-01-01

    Mobile computing and communications technology embodied in the modern cell phone device can be employed to improve the lives of diabetes patients by giving them better tools for self-management. Several companies are working on the development of diabetes management tools that leverage the ubiquitous cell phone to bring self-management tools to the hand of the diabetes patient. Integration of blood glucose monitoring (BGM) technology with the cell phone platform adds a level of convenience for the person with diabetes, but, more importantly, allows BGM data to be automatically captured, logged, and processed in near real time in order to provide the diabetes patient with assistance in managing their blood glucose levels. Other automatic measurements can estimate physical activity, and information regarding medication events and food intake can be captured and analyzed in order to provide the diabetes patient with continual assistance in managing their therapy and behaviors in order to improve glycemic control. The path to realization of such solutions is not, however, without obstacles.

  20. Control and materials characterization System for 6T Superconducting Cryogen Free Magnet Facility at IUAC, New Delhi

    NASA Astrophysics Data System (ADS)

    Dutt, R. N.; Meena, D. K.; Kar, S.; Soni, V.; Nadaf, A.; Das, A.; Singh, F.; Datta, T. S.

    2017-02-01

    A system for carrying out automatic experimental measurements of various electrical transport characteristics and their relation to magnetic fields, for samples mounted on the sample holder of a Variable Temperature Insert (VTI) of the Cryogen Free Superconducting Magnet System (CFMS), has been developed. The control and characterization system is capable of real-time monitoring, online plotting, and history logging of cryogenic temperatures with the Silicon (Si) Diode and Zirconium Oxy-Nitride sensors installed inside the magnet facility. Electrical transport property measurements have been automated with the implementation of current-reversal resistance measurements and automatic temperature set-point ramping, with the parameters of interest available in real time as well as for later analysis. The Graphical User Interface (GUI)-based system is user-friendly and facilitates operations. Ingenious electronics for reading the Zirconium Oxy-Nitride temperature sensors have been used. The price-to-performance ratio has been optimized by using in-house-developed measurement techniques combined with specialized commercial cryogenic measurement/control equipment.

  1. Automatic co-registration of 3D multi-sensor point clouds

    NASA Astrophysics Data System (ADS)

    Persad, Ravi Ancil; Armenakis, Costas

    2017-08-01

    We propose an approach for the automatic coarse alignment of 3D point clouds which have been acquired from various platforms. The method is based on 2D keypoint matching performed on height map images of the point clouds. Initially, a multi-scale wavelet keypoint detector is applied, followed by adaptive non-maxima suppression. A scale, rotation and translation-invariant descriptor is then computed for all keypoints. The descriptor is built using the log-polar mapping of Gabor filter derivatives in combination with the so-called Rapid Transform. In the final step, source and target height map keypoint correspondences are determined using a bi-directional nearest neighbour similarity check, together with a threshold-free modified-RANSAC. Experiments with urban and non-urban scenes are presented and results show scale errors ranging from 0.01 to 0.03, 3D rotation errors in the order of 0.2° to 0.3° and 3D translation errors from 0.09 m to 1.1 m.

  2. MxCuBE: a synchrotron beamline control environment customized for macromolecular crystallography experiments

    PubMed Central

    Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren; Bowler, Matthew W.; Brockhauser, Sandor; Flot, David; Gordon, Elspeth J.; Hall, David R.; Lavault, Bernard; McCarthy, Andrew A.; McCarthy, Joanne; Mitchell, Edward; Monaco, Stéphanie; Mueller-Dieckmann, Christoph; Nurizzo, Didier; Ravelli, Raimond B. G.; Thibault, Xavier; Walsh, Martin A.; Leonard, Gordon A.; McSweeney, Sean M.

    2010-01-01

    The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1. PMID:20724792

  3. Separation and identification of the silt-sized heavy-mineral fraction in sediments

    USGS Publications Warehouse

    Commeau, Judith A.; Poppe, Lawrence J.; Commeau, R.F.

    1992-01-01

    The separation of silt-sized minerals by specific gravity is made possible by using a nontoxic, heavy liquid medium of sodium polytungstate and water. Once separated, the silt-sized heavy-mineral fraction is prepared for analysis with a scanning electron microscope equipped with an automatic image analyzer and energy-dispersive spectrometer. Particles within each sample are sized and sorted according to their chemistry, and the data are tabulated in histograms and tables. Where possible, the user can define the chemical categories to simulate distinct mineral groups. Polymorphs and minerals that have overlapping compositions are combined into a group and differentiated by X-ray diffraction. Hundreds of particles can be rapidly sized and classified by chemistry. The technique can be employed on sediments from any environment.

  4. The role of self-regulatory skills and automaticity on the effectiveness of a brief weight loss habit-based intervention: secondary analysis of the 10 top tips randomised trial.

    PubMed

    Kliemann, Nathalie; Vickerstaff, Victoria; Croker, Helen; Johnson, Fiona; Nazareth, Irwin; Beeken, Rebecca J

    2017-09-05

    Habit-based interventions are designed to promote the automaticity of healthy behaviours and may also enhance self-regulatory skills during the habit-formation process. A recent trial of habit-based advice for weight loss (10 Top Tips; 10TT) found that patients allocated to 10TT lost significantly more weight over 3 months than those allocated to usual care, and reported greater increases in automaticity for the target behaviours. The current study aimed to test the hypotheses that (i) 10TT increased self-regulatory skills more than usual care, and (ii) self-regulatory skill and automaticity changes mediated the effect of 10TT on weight loss. 537 obese patients from 14 primary care practices in the UK were randomized to receive 10TT or usual care. Patients in the 10TT group received a leaflet containing tips for weight loss and healthy habit formation, a self-monitoring log book and a wallet-sized shopping guide on how to read food labels. Patients were weighed and completed validated questionnaires for self-regulation and automaticity at baseline and 3-month follow-up. Within-group and between-group effects were explored using paired t-tests and ANCOVA, respectively. Mediation was assessed using bootstrapping to estimate indirect effects and the Sobel test. Over 3 months, patients who were given 10TT reported greater increases in self-regulatory skills (mean difference: 0.08; 95% CI 0.01 to 0.15) than those who received usual care. Changes in self-regulatory skills and automaticity over 3 months mediated the effect of the intervention on weight loss (β = 0.52, 95% bias-corrected CI 0.17 to 0.91). As hypothesised, 10TT enhanced self-regulatory skills, and changes in self-regulatory skills and automaticity mediated the effect of the intervention on weight loss. This supports the proposition that self-regulatory training and habit formation are important features of weight loss interventions. This study was prospectively registered with the International Standard Randomised Controlled Trials registry (ISRCTN16347068) on 26 September 2011.

  5. Monitoring the performance of the Southern African Large Telescope

    NASA Astrophysics Data System (ADS)

    Hettlage, Christian; Coetzee, Chris; Väisänen, Petri; Romero Colmenero, Encarni; Crawford, Steven M.; Kotze, Paul; Rabe, Paul; Hulme, Stephen; Brink, Janus; Maartens, Deneys; Browne, Keith; Strydom, Ockert; De Bruyn, David

    2016-07-01

    The efficient operation of a telescope requires awareness of its performance on a daily and long-term basis. This paper outlines the Fault Tracker, WebSAMMI and the Dashboard used by the Southern African Large Telescope (SALT) to achieve this aim. Faults are mostly logged automatically, but the Fault Tracker allows users to add and edit faults. The SALT Astronomer and SALT Operator record weather conditions and telescope usage with WebSAMMI. Various efficiency metrics are shown for different time periods on the Dashboard. A kiosk mode for displaying on a public screen is included. Possible applications for other telescopes are discussed.

  6. College residential sleep environment.

    PubMed

    Sexton-Radek, Kathy; Hartley, Andrew

    2013-12-01

    College students regularly report increased sleep disturbances as well as concomitant reductions in performance (e.g., academic grades) upon entering college. Sleep hygiene refers to healthy sleep practices that are commonly used as first interventions in sleep disturbances. One widely used practice of this sort involves arranging the sleep environment to minimize disturbances from excessive noise and light at bedtime. Communal sleep situations such as those in college residence halls do not easily support this intervention. Following several focus groups, a questionnaire was designed to gather self-reported information on sleep disturbances in a college population. The present study used The Young Adult Sleep Environment Inventory (YASEI) and sleep logs to investigate the sleep environment of college students living in residential halls. A summary of responses indicated that noise and light are significant sleep disturbances in these environments. Recommendations are presented related to these findings.

  7. The effect of the Baton Rouge fault on flow in the Abita aquifer of southeastern Louisiana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rapp, T.R.

    1993-03-01

    The ground-water resources of southern Tangipahoa Parish and adjacent areas were studied to determine their potential for development as an alternative to the Mississippi River as a water supply source for Jefferson Parish, Louisiana. The study area, in southeastern Louisiana, is underlain by eight major aquifers and is crossed by a fault zone, referred to as the Baton Rouge fault. The fault restricts the flow of water in the aquifers of intermediate depth. Data from a test well drilling program and geophysical logs of a nearby oil well indicated that a significant freshwater aquifer that provides water to a nearby municipality was actually the Abita aquifer and not the Covington aquifer, as was previously thought. The Abita aquifer, a shallower aquifer with a lower hydraulic conductivity, had been displaced to a position equivalent to that of the Covington aquifer by the Baton Rouge fault. An additional final test well drilled south of the fault penetrated the leading edge of a wedge-shaped saltwater interface. Analysis of lithologic and geophysical logs indicated that the Abita aquifer has a well-sorted, clean sand at the base of the aquifer and substantial amounts of clay in the top two-thirds of the aquifer. Geophysical logs of oil test wells south of the fault zone indicated that the sand thickens substantially to the south. The thicker sand south of a public supply well that pumps water from the Abita aquifer and the higher hydraulic conductivity of the lower part of the aquifer where the saline water was detected indicate that a much larger percentage of recharge to the public supply well may come from the south than was originally thought.

  8. Interpretation of a compositional time series

    NASA Astrophysics Data System (ADS)

    Tolosana-Delgado, R.; van den Boogaart, K. G.

    2012-04-01

    Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), then linear operations are afflicted by several problems. For instance, it has been long recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The statistical analysis of compositional data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in its definition. The principle of working in coordinates allows any sort of multivariate analysis to be applied to a log-ratio-transformed composition, as long as this transformation is invertible. This principle applies fully to time series analysis. We will discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows the results to be expressed as D(D - 1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA. In this data set, the proportion of annual precipitation falling in winter, spring, summer and autumn is considered a 4-component time series. Three invertible log-ratios are defined for calculations, balancing rainfall in autumn vs. winter, in summer vs. spring, and in autumn-winter vs. spring-summer. Results suggest a 2-year correlation range, and certain oscillatory behaviour in the last balance, which does not occur in the other two.
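
    The three balances described above can be computed with the standard isometric log-ratio form sqrt(rs/(r+s))·ln(gm(group 1)/gm(group 2)), as sketched below for a 4-part seasonal composition; the input proportions are illustrative, and the resulting coordinates are what a conventional multivariate time series model would then be fitted to.

      # Balance (ilr-style) coordinates for a 4-part seasonal composition.
      import numpy as np

      def gm(a):                       # geometric mean along the last axis
          return np.exp(np.mean(np.log(a), axis=-1))

      def seasonal_balances(x):
          """x: array (..., 4) with columns [winter, spring, summer, autumn], each row summing to 1."""
          winter, spring, summer, autumn = (x[..., i] for i in range(4))
          b1 = np.sqrt(1 / 2) * np.log(autumn / winter)                     # autumn vs. winter
          b2 = np.sqrt(1 / 2) * np.log(summer / spring)                     # summer vs. spring
          b3 = np.log(gm(np.stack([autumn, winter], -1)) /
                      gm(np.stack([spring, summer], -1)))                   # autumn-winter vs. spring-summer
          return np.stack([b1, b2, b3], axis=-1)

      # Example with made-up proportions for two years:
      # seasonal_balances(np.array([[0.35, 0.25, 0.15, 0.25], [0.30, 0.28, 0.20, 0.22]]))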

  9. SU-G-JeP1-08: Dual Modality Verification for Respiratory Gating Using New Real-Time Tumor Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Shibuya, K

    Purpose: A respiratory gating system combining the TrueBeam and a new real-time tumor-tracking radiotherapy system (RTRT) was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, the fiducial marker that was implanted near the tumor was tracked and used as the internal surrogate for respiratory gating. The purpose of this study was to develop a verification technique for respiratory gating with the new RTRT using cine electronic portal imaging device (EPID) images from the TrueBeam and log files from the RTRT. Methods: A patient who underwent respiratory gated SBRT of the lung using the RTRT was enrolled in this study. For this patient, the log files of the three-dimensional coordinates of the fiducial marker used as an internal surrogate were acquired using the RTRT. Simultaneously, cine EPID images were acquired during respiratory gated radiotherapy. The data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated using the log files (E_log). The fiducial marker used as an internal surrogate in the cine EPID images was automatically extracted by in-house software based on a template-matching algorithm. The differences between the marker positions in the cine EPID images and the digitally reconstructed radiograph were calculated (E_EPID). Results: Marker detection on the EPID using the in-house software was influenced by low image contrast. For one field during the course of SBRT, the respiratory gating using the RTRT showed mean ± S.D. of the 95th percentile E_EPID of 1.3 ± 0.3 mm and 1.1 ± 0.5 mm, and of E_log of 1.5 ± 0.2 mm and 1.1 ± 0.2 mm in the LR and SI directions, respectively. Conclusion: We have developed a verification method for respiratory gating combining the TrueBeam and the new real-time tumor-tracking radiotherapy system using EPID images and log files.

  10. Server-side Log Data Analytics for I/O Workload Characterization and Coordination on Large Shared Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y.; Gunasekaran, Raghul; Ma, Xiaosong

    2016-01-01

    Inter-application I/O contention and performance interference have been recognized as severe problems. In this work, we demonstrate, through measurement from Titan (the world's No. 3 supercomputer), that high I/O variance co-exists with the fact that individual storage units remain under-utilized for the majority of the time. This motivates us to propose AID, a system that performs automatic application I/O characterization and I/O-aware job scheduling. AID analyzes existing I/O traffic and batch job history logs, without any prior knowledge of applications or user/developer involvement. It identifies the small set of I/O-intensive candidates among all applications running on a supercomputer and subsequently mines their I/O patterns, using more detailed per-I/O-node traffic logs. Based on such auto-extracted information, AID provides online I/O-aware scheduling recommendations to steer I/O-intensive applications away from heavy ongoing I/O activities. We evaluate AID on Titan, using both real applications (with extracted I/O patterns validated by contacting users) and our own pseudo-applications. Our results confirm that AID is able to (1) identify I/O-intensive applications and their detailed I/O characteristics, and (2) significantly reduce these applications' I/O performance degradation/variance by jointly evaluating outstanding applications' I/O patterns and the real-time system I/O load.
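
    As a rough illustration of the log-mining step described above (not the actual AID implementation), the sketch below aggregates per-application I/O volume from a server-side traffic log joined with a batch job history and shortlists the heavy hitters. The CSV layout, column names, and the threshold are all hypothetical.

```python
import csv
from collections import defaultdict

def io_intensive_apps(traffic_log, job_log, threshold_gb=1000.0):
    """Return applications whose aggregate I/O traffic exceeds threshold_gb."""
    # Sum bytes moved per job id from the (hypothetical) server-side traffic log.
    bytes_per_job = defaultdict(int)
    with open(traffic_log) as f:
        for row in csv.DictReader(f):
            bytes_per_job[row["job_id"]] += int(row["bytes"])

    # Map jobs to application names via the (hypothetical) batch job history.
    app_of_job = {}
    with open(job_log) as f:
        for row in csv.DictReader(f):
            app_of_job[row["job_id"]] = row["app_name"]

    gb_per_app = defaultdict(float)
    for job, nbytes in bytes_per_job.items():
        gb_per_app[app_of_job.get(job, "unknown")] += nbytes / 1e9

    # Keep only the small set of applications above the threshold; these are
    # the candidates for detailed per-I/O-node pattern mining.
    return {app: gb for app, gb in gb_per_app.items() if gb >= threshold_gb}
```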

  11. QSARpy: A new flexible algorithm to generate QSAR models based on dissimilarities. The log Kow case study.

    PubMed

    Ferrari, Thomas; Lombardo, Anna; Benfenati, Emilio

    2018-05-14

    Several methods exist to develop QSAR models automatically. Some are based on indices of the presence of atoms, others on the most similar compounds, others on molecular descriptors. Here we introduce QSARpy v1.0, a new QSAR modeling tool based on a different approach: dissimilarity. This tool fragments the molecules of the training set to extract fragments that can be associated with a difference in the property/activity value, called modulators. If the target molecule shares part of its structure with a molecule of the training set and the differences can be explained with one or more modulators, the property/activity value of the training molecule is adjusted using the value associated with the modulator(s). The tool is tested here on the n-octanol/water partition coefficient (Kow, usually expressed in logarithmic units as log Kow). It is a key parameter in risk assessment since it is a measure of hydrophobicity. Its widespread use makes these estimation methods very useful to reduce testing costs. Using QSARpy v1.0, we obtained a new model to predict log Kow with accurate performance (RMSE 0.43 and R² 0.94 for the external test set), comparing favorably with other programs. QSARpy is freely available on request. Copyright © 2018 Elsevier B.V. All rights reserved.
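
    The prediction rule sketched in the abstract can be illustrated with a toy example: the log Kow of the closest training compound is shifted by the value attached to each modulator that explains the structural difference. The fragment labels and shift values below are invented and are not taken from QSARpy.

```python
# Hypothetical modulators: fragment difference -> shift in log Kow attributed to it.
modulators = {
    "+CH2": 0.52,
    "+OH": -1.12,
    "+Cl": 0.71,
}

def predict_log_kow(train_value, differences):
    """train_value: log Kow of the matched training molecule.
    differences: modulator labels explaining the structural difference
    between the target and the training molecule."""
    return train_value + sum(modulators[d] for d in differences)

# Target differs from the training compound by one extra CH2 and one OH group:
print(predict_log_kow(2.13, ["+CH2", "+OH"]))   # -> 1.53
```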

  12. Factors influencing the growth of Salmonella during sprouting of naturally contaminated alfalfa seeds.

    PubMed

    Fu, Tong-Jen; Reineke, Karl F; Chirtel, Stuart; VanPelt, Olif M

    2008-05-01

    In this study, the factors that affect Salmonella growth during sprouting of naturally contaminated alfalfa seeds associated with two previous outbreaks of salmonellosis were examined. A minidrum sprouter equipped with automatic irrigation and rotation systems was built to allow sprouting to be conducted under conditions similar to those used commercially. The growth of Salmonella during sprouting in the minidrum was compared with that observed in sprouts grown in glass jars under conditions commonly used at home. The level of Salmonella increased by as much as 4 log units after 48 h of sprouting in jars but remained constant during the entire sprouting period in the minidrum. The effect of temperature and irrigation frequency on Salmonella growth was examined. Increasing the sprouting temperature from 20 to 30 degrees C increased the Salmonella counts by as much as 2 log units on sprouts grown both in the minidrum and in the glass jars. Decreasing the irrigation frequency from every 20 min to every 2 h during sprouting in the minidrum or from every 4 h to every 24 h during sprouting in the glass jars resulted in an approximately 2-log increase in Salmonella counts. The levels of total aerobic mesophilic bacteria, coliforms, and Salmonella in spent irrigation water closely reflected those found in sprouts, confirming that monitoring of spent irrigation water is a good way to monitor pathogen levels during sprouting.

  13. Development of a novel cell sorting method that samples population diversity in flow cytometry.

    PubMed

    Osborne, Geoffrey W; Andersen, Stacey B; Battye, Francis L

    2015-11-01

    Flow cytometry based electrostatic cell sorting is an important tool in the separation of cell populations. Existing instruments can sort single cells into multi-well collection plates, and keep track of cell of origin and sorted well location. However, currently, single sorted cell results reflect the population distribution and fail to capture the population diversity. Software was designed that implements a novel sorting approach, "Slice and Dice Sorting," that links a graphical representation of a multi-well plate to logic that ensures that single cells are sampled and sorted from all areas defined by the sort region/s. Therefore the diversity of the total population is captured, and the more frequently occurring or rarer cell types are all sampled. The sorting approach was tested computationally, and using functional cell based assays. Computationally we demonstrate that conventional single cell sorting can sample as little as 50% of the population diversity, dependent on the population distribution, and that Slice and Dice sorting samples much more of the variety present within a cell population. We then show by sorting single cells into wells using the Slice and Dice sorting method that there are cells sorted using this method that would be either rarely sorted, or not sorted at all, using conventional single cell sorting approaches. The present study demonstrates a novel single cell sorting method that samples much more of the population diversity than current methods. It has implications in clonal selection, stem cell sorting, single cell sequencing and any areas where population heterogeneity is of importance. © 2015 International Society for Advancement of Cytometry.
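
    The "Slice and Dice" idea can be illustrated with a small sketch (not the authors' software): instead of drawing sort candidates at random from the whole gate, the gate is partitioned into a grid and one event is sampled from every occupied cell, so sparse corners of the gated population are represented alongside the dense mode.

```python
import numpy as np

def slice_and_dice_indices(x, y, gate, n_bins=8, rng=None):
    """Pick sort candidates that cover the diversity of a 2-D gate.

    x, y   : measured parameters for each event (e.g. two fluorescence channels)
    gate   : boolean mask marking events inside the sort region
    n_bins : the gate is sliced into an n_bins x n_bins grid and one event is
             sampled from every occupied cell.
    """
    rng = rng or np.random.default_rng()
    idx = np.flatnonzero(gate)
    xg, yg = x[idx], y[idx]
    # Bin edges span the gated events only; digitize assigns each event a cell.
    xb = np.digitize(xg, np.linspace(xg.min(), xg.max(), n_bins + 1)[1:-1])
    yb = np.digitize(yg, np.linspace(yg.min(), yg.max(), n_bins + 1)[1:-1])
    chosen = []
    for cell in set(zip(xb.tolist(), yb.tolist())):
        members = idx[(xb == cell[0]) & (yb == cell[1])]
        chosen.append(rng.choice(members))
    return np.array(chosen)
```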

  14. A task analysis of the shift from teacher instructions to self-instructions in performing an in-common task.

    PubMed

    Grote, I; Rosales, J; Baer, D M

    1996-11-01

    Three preschool children repeatedly did four kinds of sorts with a deck of stimulus cards: a difficult, untaught target sort and three other sorts considered analytic of self-instructing the target performance. The untaught target sort was to find in a deck of cards those matching what two sample cards had in common. Most preschool children must be taught to mediate this problem. The three other kinds of sorts taught skills involved in the target performance or its mediation. As correct self-instructive talk emerged in the target sorts, it was confirmed. The untaught target sorts were interspersed infrequently among the three alternating directly taught skill sorts, to see if accurate target sorts, and accurate self-instructive talk about the target sorts, would emerge as the three skill sorts were mastered. As all the sorts progressed, increasing accuracy was seen first in the skill sorts and then in the untaught target sorts. All three subjects showed subsequent generalization to new target sorts involving other stimulus sets. Correct spontaneous self-instructions about the target sorts increased from near zero at the beginning of the experiment to consistency at its end. Thus the three skill sorts appeared sufficient for the emergence of a self-instructed solution to the previously insoluble target performance.

  15. Parallel sort with a ranged, partitioned key-value store in a high performance computing environment

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron; Poole, Stephen W.

    2016-01-26

    Improved sorting techniques are provided that perform a parallel sort using a ranged, partitioned key-value store in a high performance computing (HPC) environment. A plurality of input data files comprising unsorted key-value data in a partitioned key-value store are sorted. The partitioned key-value store comprises a range server for each of a plurality of ranges. Each input data file has an associated reader thread. Each reader thread reads the unsorted key-value data in the corresponding input data file and performs a local sort of the unsorted key-value data to generate sorted key-value data. A plurality of sorted, ranged subsets of each of the sorted key-value data are generated based on the plurality of ranges. Each sorted, ranged subset corresponds to a given one of the ranges and is provided to one of the range servers corresponding to the range of the sorted, ranged subset. Each range server sorts the received sorted, ranged subsets and provides a sorted range. A plurality of the sorted ranges are concatenated to obtain a globally sorted result.
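
    A minimal single-process sketch of the workflow claimed in the patent is given below, with threads standing in for reader processes and plain lists standing in for the range servers; the HPC/MPI plumbing and the key-value store itself are omitted.

```python
import bisect
from concurrent.futures import ThreadPoolExecutor

def local_sort(records):                       # one "reader" per input file
    return sorted(records, key=lambda kv: kv[0])

def split_by_range(sorted_records, boundaries):
    """Split locally sorted records into one already-sorted subset per key range."""
    buckets = [[] for _ in range(len(boundaries) + 1)]
    for kv in sorted_records:
        buckets[bisect.bisect_right(boundaries, kv[0])].append(kv)
    return buckets

def parallel_sort(files, boundaries):
    with ThreadPoolExecutor() as pool:
        locally_sorted = list(pool.map(local_sort, files))
    # Each "range server" r collects the r-th subset from every reader and sorts it.
    per_range = [[] for _ in range(len(boundaries) + 1)]
    for s in locally_sorted:
        for r, subset in enumerate(split_by_range(s, boundaries)):
            per_range[r].extend(subset)
    ranges = [sorted(sub, key=lambda kv: kv[0]) for sub in per_range]
    # Concatenating the sorted ranges yields the globally sorted result.
    return [kv for sub in ranges for kv in sub]

files = [[(3, "c"), (1, "a")], [(7, "g"), (2, "b")], [(5, "e"), (9, "i")]]
print(parallel_sort(files, boundaries=[4, 8]))
```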

  16. Automatic detection of anomalies in screening mammograms

    PubMed Central

    2013-01-01

    Background Diagnostic performance in breast screening programs may be influenced by the prior probability of disease. Since breast cancer incidence is roughly half a percent in the general population there is a large probability that the screening exam will be normal. That factor may contribute to false negatives. Screening programs typically exhibit about 83% sensitivity and 91% specificity. This investigation was undertaken to determine if a system could be developed to pre-sort screening-images into normal and suspicious bins based on their likelihood to contain disease. Wavelets were investigated as a method to parse the image data, potentially removing confounding information. The development of a classification system based on features extracted from wavelet transformed mammograms is reported. Methods In the multi-step procedure images were processed using 2D discrete wavelet transforms to create a set of maps at different size scales. Next, statistical features were computed from each map, and a subset of these features was the input for a concerted-effort set of naïve Bayesian classifiers. The classifier network was constructed to calculate the probability that the parent mammography image contained an abnormality. The abnormalities were not identified, nor were they regionalized. The algorithm was tested on two publicly available databases: the Digital Database for Screening Mammography (DDSM) and the Mammographic Images Analysis Society’s database (MIAS). These databases contain radiologist-verified images and feature common abnormalities including: spiculations, masses, geometric deformations and fibroid tissues. Results The classifier-network designs tested achieved sensitivities and specificities sufficient to be potentially useful in a clinical setting. This first series of tests identified networks with 100% sensitivity and up to 79% specificity for abnormalities. This performance significantly exceeds the mean sensitivity reported in literature for the unaided human expert. Conclusions Classifiers based on wavelet-derived features proved to be highly sensitive to a range of pathologies, as a result Type II errors were nearly eliminated. Pre-sorting the images changed the prior probability in the sorted database from 37% to 74%. PMID:24330643
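
    A compact sketch of the kind of pipeline the abstract describes is shown below, under assumptions: a 2-D discrete wavelet transform (here PyWavelets) yields detail maps at several scales, simple statistics of each map become features, and a Gaussian naive Bayes classifier scores the mammogram as normal or suspicious. The wavelet, feature set, and classifier configuration are illustrative, not the published ones.

```python
import numpy as np
import pywt
from sklearn.naive_bayes import GaussianNB

def wavelet_features(image, wavelet="db2", levels=3):
    """Statistics of the detail maps from a multi-level 2-D wavelet transform."""
    feats = []
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    for detail in coeffs[1:]:                 # (cH, cV, cD) maps at each scale
        for band in detail:
            feats += [band.mean(), band.std(), np.abs(band).max()]
    return np.array(feats)

def train(images, labels):
    # labels: 0 = normal, 1 = contains an abnormality
    X = np.vstack([wavelet_features(im) for im in images])
    return GaussianNB().fit(X, labels)

def suspicious_probability(model, image):
    # Probability that the whole image belongs to the "suspicious" bin.
    return model.predict_proba(wavelet_features(image)[None, :])[0, 1]
```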

  17. Bisphenol A in Solid Waste Materials, Leachate Water, and Air Particles from Norwegian Waste-Handling Facilities: Presence and Partitioning Behavior.

    PubMed

    Morin, Nicolas; Arp, Hans Peter H; Hale, Sarah E

    2015-07-07

    The plastic additive bisphenol A (BPA) is commonly found in landfill leachate at levels exceeding acute toxicity benchmarks. To gain insight into the mechanisms controlling BPA emissions from waste and waste-handling facilities, a comprehensive field and laboratory campaign was conducted to quantify BPA in solid waste materials (glass, combustibles, vehicle fluff, waste electric and electronic equipment (WEEE), plastics, fly ash, bottom ash, and digestate), leachate water, and atmospheric dust from Norwegian sorting, incineration, and landfill facilities. Solid waste concentrations varied from below 0.002 mg/kg (fly ash) to 188 ± 125 mg/kg (plastics). A novel passive sampling method was developed to, for the first time, establish a set of waste-water partition coefficients, KD,waste, for BPA, and to quantify differences between total and freely dissolved concentrations in waste-facility leachate. Log-normalized KD,waste (L/kg) values were similar for all solid waste materials (from 2.4 to 3.1), excluding glass and metals, indicating BPA is readily leachable. Leachate concentrations were similar for landfills and WEEE/vehicle sorting facilities (from 0.7 to 200 μg/L) and dominated by the freely dissolved fraction, not bound to (plastic) colloids (agreeing with measured KD,waste values). Dust concentrations ranged from 2.3 to 50.7 mg/kg dust. Incineration appears to be an effective way to reduce BPA concentrations in solid waste, dust, and leachate.
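
    The waste-water partition coefficient reported above can be written as the ratio of the BPA concentration in the solid waste to the freely dissolved concentration measured by the passive sampler; the short sketch below uses invented example numbers that happen to fall inside the reported log KD,waste range of 2.4 to 3.1.

```python
import math

def log_kd_waste(c_waste_mg_per_kg, c_free_mg_per_L):
    """log10 of KD,waste (L/kg) = concentration in waste / freely dissolved concentration."""
    return math.log10(c_waste_mg_per_kg / c_free_mg_per_L)

# Hypothetical example: 50 mg/kg in combustible waste vs. 0.08 mg/L freely dissolved.
print(round(log_kd_waste(50.0, 0.08), 2))   # -> 2.8
```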

  18. Grain size statistics and depositional pattern of the Ecca Group sandstones, Karoo Supergroup in the Eastern Cape Province, South Africa

    NASA Astrophysics Data System (ADS)

    Baiyegunhi, Christopher; Liu, Kuiwu; Gwavava, Oswald

    2017-11-01

    Grain size analysis is a vital sedimentological tool used to unravel the hydrodynamic conditions, mode of transportation and deposition of detrital sediments. In this study, detailed grain-size analysis was carried out on thirty-five sandstone samples from the Ecca Group in the Eastern Cape Province of South Africa. Grain-size statistical parameters, bivariate analysis, linear discriminant functions, Passega diagrams and log-probability curves were used to reveal the depositional processes, sedimentation mechanisms, hydrodynamic energy conditions and to discriminate different depositional environments. The grain-size parameters show that most of the sandstones are very fine to fine grained, moderately well sorted, mostly near-symmetrical and mesokurtic in nature. The abundance of very fine to fine grained sandstones indicates the dominance of a low energy environment. The bivariate plots show that the samples are mostly grouped, except for the Prince Albert samples, which show a scattered trend due either to a mixture of two modes in equal proportion in bimodal sediments or to good sorting in unimodal sediments. The linear discriminant function analysis is dominantly indicative of turbidity current deposits under shallow marine environments for samples from the Prince Albert, Collingham and Ripon Formations, while the samples from the Fort Brown Formation are lacustrine or deltaic deposits. The C-M plots indicate that the sediments were deposited mainly by suspension and saltation, and graded suspension. Visher diagrams show that saltation is the major process of transportation, followed by suspension.
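
    The grain-size statistical parameters quoted above (mean, sorting, skewness, kurtosis) are conventionally computed from percentiles of the cumulative curve in phi units; the sketch below uses the Folk and Ward graphic formulas as one common formulation, with the caveat that the authors' exact procedure is not stated in the abstract.

```python
import numpy as np

def folk_ward(phi, cum_percent):
    """Graphic grain-size statistics from a cumulative curve.

    phi         : grain size in phi units (increasing)
    cum_percent : cumulative weight percent at each phi value (increasing)
    """
    p = lambda q: np.interp(q, cum_percent, phi)   # percentile read off the curve
    p5, p16, p25, p50, p75, p84, p95 = map(p, (5, 16, 25, 50, 75, 84, 95))
    mean     = (p16 + p50 + p84) / 3
    sorting  = (p84 - p16) / 4 + (p95 - p5) / 6.6          # inclusive graphic std. dev.
    skewness = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
                + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    kurtosis = (p95 - p5) / (2.44 * (p75 - p25))
    return mean, sorting, skewness, kurtosis
```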

  19. Automatic generation of Web mining environments

    NASA Astrophysics Data System (ADS)

    Cibelli, Maurizio; Costagliola, Gennaro

    1999-02-01

    The main problem related to the retrieval of information from the world wide web is the enormous number of unstructured documents and resources, i.e., the difficulty of locating and tracking appropriate sources. This paper presents a web mining environment (WME), which is capable of finding, extracting and structuring information related to a particular domain from web documents, using general purpose indices. The WME architecture includes a web engine filter (WEF), to sort and reduce the answer set returned by a web engine, a data source pre-processor (DSP), which processes html layout cues in order to collect and qualify page segments, and a heuristic-based information extraction system (HIES), to finally retrieve the required data. Furthermore, we present a web mining environment generator, WMEG, that allows naive users to generate a WME specific to a given domain by providing a set of specifications.

  20. Effects of a metronome on the filled pauses of fluent speakers.

    PubMed

    Christenfeld, N

    1996-12-01

    Filled pauses (the "ums" and "uhs" that litter spontaneous speech) seem to be a product of the speaker paying deliberate attention to the normally automatic act of talking. This is the same sort of explanation that has been offered for stuttering. In this paper we explore whether a manipulation that has long been known to decrease stuttering, synchronizing speech to the beats of a metronome, will then also decrease filled pauses. Two experiments indicate that a metronome has a dramatic effect on the production of filled pauses. This effect is not due to any simplification or slowing of the speech and supports the view that a metronome causes speakers to attend more to how they are talking and less to what they are saying. It also lends support to the connection between stutters and filled pauses.

  1. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset

    PubMed Central

    Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R.

    2016-01-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo “paired-recordings” such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. PMID:27306671

  2. LANDMARK-BASED SPEECH RECOGNITION: REPORT OF THE 2004 JOHNS HOPKINS SUMMER WORKSHOP.

    PubMed

    Hasegawa-Johnson, Mark; Baker, James; Borys, Sarah; Chen, Ken; Coogan, Emily; Greenberg, Steven; Juneja, Amit; Kirchhoff, Katrin; Livescu, Karen; Mohan, Srividya; Muller, Jennifer; Sonmez, Kemal; Wang, Tianyu

    2005-01-01

    Three research prototype speech recognition systems are described, all of which use recently developed methods from artificial intelligence (specifically support vector machines, dynamic Bayesian networks, and maximum entropy classification) in order to implement, in the form of an automatic speech recognizer, current theories of human speech perception and phonology (specifically landmark-based speech perception, nonlinear phonology, and articulatory phonology). All three systems begin with a high-dimensional multiframe acoustic-to-distinctive feature transformation, implemented using support vector machines trained to detect and classify acoustic phonetic landmarks. Distinctive feature probabilities estimated by the support vector machines are then integrated using one of three pronunciation models: a dynamic programming algorithm that assumes canonical pronunciation of each word, a dynamic Bayesian network implementation of articulatory phonology, or a discriminative pronunciation model trained using the methods of maximum entropy classification. Log probability scores computed by these models are then combined, using log-linear combination, with other word scores available in the lattice output of a first-pass recognizer, and the resulting combination score is used to compute a second-pass speech recognition output.

  3. The Wettzell System Monitoring Concept and First Realizations

    NASA Technical Reports Server (NTRS)

    Ettl, Martin; Neidhardt, Alexander; Muehlbauer, Matthias; Ploetz, Christian; Beaudoin, Christopher

    2010-01-01

    Automated monitoring of operational system parameters for the geodetic space techniques is becoming more important in order to improve the geodetic data and to ensure the safety and stability of automatic and remote-controlled observations. Therefore, the Wettzell group has developed the system monitoring software, SysMon, which is based on a reliable, remotely-controllable hardware/software realization. A multi-layered data logging system based on a fanless, robust industrial PC with an internal database system is used to collect data from several external, serial, bus, or PCI-based sensors. The internal communication is realized with Remote Procedure Calls (RPC) and uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. Each data monitoring stream can be configured individually via configuration files to define the logging rates or analog-digital-conversion parameters. First realizations are currently installed at the new laser ranging system at Wettzell to address safety issues and at the VLBI station O'Higgins as a meteorological data logger. The system monitoring concept should be realized for the Wettzell radio telescope in the near future.

  4. Localizing people in crosswalks with a moving handheld camera: proof of concept

    NASA Astrophysics Data System (ADS)

    Lalonde, Marc; Chapdelaine, Claude; Foucher, Samuel

    2015-02-01

    Although people or object tracking in uncontrolled environments has been acknowledged in the literature, the accurate localization of a subject with respect to a reference ground plane remains a major issue. This study describes an early prototype for the tracking and localization of pedestrians with a handheld camera. One application envisioned here is to analyze the trajectories of blind people going across long crosswalks when following different audio signals as a guide. This kind of study is generally conducted manually with an observer following a subject and logging his/her current position at regular time intervals with respect to a white grid painted on the ground. This study aims at automating the manual logging activity: with a marker attached to the subject's foot, a video of the crossing is recorded by a person following the subject, and a semi-automatic tool analyzes the video and estimates the trajectory of the marker with respect to the painted markings. Challenges include robustness to variations to lighting conditions (shadows, etc.), occlusions, and changes in camera viewpoint. Results are promising when compared to GNSS measurements.

  5. A flexible system for vital signs monitoring in hospital general care wards based on the integration of UNIX-based workstations, standard networks and portable vital signs monitors.

    PubMed

    Welch, J P; Sims, N; Ford-Carlton, P; Moon, J B; West, K; Honore, G; Colquitt, N

    1991-01-01

    The article describes a study conducted on general surgical and thoracic surgical floors of a 1000-bed hospital to assess the impact of a new network for portable patient care devices. This network was developed to address the needs of hospital patients who need constant, multi-parameter, vital signs surveillance, but do not require intensive nursing care. Bedside wall jacks were linked to UNIX-based workstations using standard digital network hardware, creating a flexible system (for general care floors of the hospital) that allowed the number of monitored locations to increase and decrease as patient census and acuity levels varied. It also allowed the general care floors to provide immediate, centralized vital signs monitoring for patients who unexpectedly became unstable, and permitted portable monitors to travel with patients as they were transferred between hospital departments. A disk-based log within the workstation automatically collected performance data, including patient demographics, monitor alarms, and network status for analysis. The log has allowed the developers to evaluate the use and performance of the system.

  6. Enhanced performance for differential detection in coherent Brillouin optical time-domain analysis sensors

    NASA Astrophysics Data System (ADS)

    Shao, Liyang; Zhang, Yunpeng; Li, Zonglei; Zhang, Zhiyong; Zou, Xihua; Luo, Bin; Pan, Wei; Yan, Lianshan

    2016-11-01

    Logarithmic detectors (LogDs) have been used in coherent Brillouin optical time-domain analysis (BOTDA) sensors to reduce the effect of phase fluctuation, demodulation complexities, and measurement time. However, because of the inherent properties of LogDs, a DC component at the level of hundreds of millivolts that prohibits high-gain signal amplification (SA) could be generated, resulting in unacceptable data acquisition (DAQ) inaccuracies and decoding errors in the process of prototype integration. By generating a reference light at a level similar to the probe light, differential detection can be applied to remove the DC component automatically using a differential amplifier before the DAQ process. Therefore, high-gain SA can be employed to reduce quantization errors. The signal-to-noise ratio of the weak Brillouin gain signal is improved from ~11.5 to ~21.8 dB. A BOTDA prototype is implemented based on the proposed scheme. The experimental results show that the measurement accuracy of the Brillouin frequency shift (BFS) is improved from ±1.9 to ±0.8 MHz at the end of a 40-km sensing fiber.

  7. Comparison of path visualizations and cognitive measures relative to travel technique in a virtual environment.

    PubMed

    Zanbaka, Catherine A; Lok, Benjamin C; Babu, Sabarish V; Ulinski, Amy C; Hodges, Larry F

    2005-01-01

    We describe a between-subjects experiment that compared four different methods of travel and their effect on cognition and paths taken in an immersive virtual environment (IVE). Participants answered a set of questions based on Crook's condensation of Bloom's taxonomy that assessed their cognition of the IVE with respect to knowledge, understanding and application, and higher mental processes. Participants also drew a sketch map of the IVE and the objects within it. The users' sense of presence was measured using the Steed-Usoh-Slater Presence Questionnaire. The participants' position and head orientation were automatically logged during their exposure to the virtual environment. These logs were later used to create visualizations of the paths taken. Path analysis, such as exploring the overlaid path visualizations and dwell data information, revealed further differences among the travel techniques. Our results suggest that, for applications where problem solving and evaluation of information is important or where opportunity to train is minimal, then having a large tracked space so that the participant can walk around the virtual environment provides benefits over common virtual travel techniques.

  8. Application of a Mathematical Model to Describe the Effects of Chlorpyrifos on Caenorhabditis elegans Development

    PubMed Central

    Boyd, Windy A.; Smith, Marjolein V.; Kissling, Grace E.; Rice, Julie R.; Snyder, Daniel W.; Portier, Christopher J.; Freedman, Jonathan H.

    2009-01-01

    Background The nematode Caenorhabditis elegans is being assessed as an alternative model organism as part of an interagency effort to develop better means to test potentially toxic substances. As part of this effort, assays that use the COPAS Biosort flow sorting technology to record optical measurements (time of flight (TOF) and extinction (EXT)) of individual nematodes under various chemical exposure conditions are being developed. A mathematical model has been created that uses Biosort data to quantitatively and qualitatively describe C. elegans growth, and link changes in growth rates to biological events. Chlorpyrifos, an organophosphate pesticide known to cause developmental delays and malformations in mammals, was used as a model toxicant to test the applicability of the growth model for in vivo toxicological testing. Methodology/Principal Findings L1 larval nematodes were exposed to a range of sub-lethal chlorpyrifos concentrations (0–75 µM) and measured every 12 h. In the absence of toxicant, C. elegans matured from L1s to gravid adults by 60 h. A mathematical model was used to estimate nematode size distributions at various times. Mathematical modeling of the distributions allowed the number of measured nematodes and log(EXT) and log(TOF) growth rates to be estimated. The model revealed three distinct growth phases. The points at which estimated growth rates changed (change points) were constant across the ten chlorpyrifos concentrations. Concentration response curves with respect to several model-estimated quantities (numbers of measured nematodes, mean log(TOF) and log(EXT), growth rates, and time to reach change points) showed a significant decrease in C. elegans growth with increasing chlorpyrifos concentration. Conclusions Effects of chlorpyrifos on C. elegans growth and development were mathematically modeled. Statistical tests confirmed a significant concentration effect on several model endpoints. This confirmed that chlorpyrifos affects C. elegans development in a concentration dependent manner. The most noticeable effect on growth occurred during early larval stages: L2 and L3. This study supports the utility of the C. elegans growth assay and mathematical modeling in determining the effects of potentially toxic substances in an alternative model organism using high-throughput technologies. PMID:19753116

  9. Mammarenaviruses deleted from their Z gene are replicative and produce an infectious progeny in BHK-21 cells.

    PubMed

    Zaza, Amélie D; Herbreteau, Cécile H; Peyrefitte, Christophe N; Emonet, Sébastien F

    2018-05-01

    Mammarenaviruses bud out of infected cells via the recruitment of the endosomal sorting complex required for transport through late domain motifs localized into their Z protein. Here, we demonstrated that mammarenaviruses lacking this protein can be rescued and are replicative, despite a 3-log reduction in virion production, in BHK-21 cells, but not in five other cell lines. Mutations of putative late domain motifs identified into the viral nucleoprotein resulted in the almost complete abolition of infectious virion production by Z-deleted mammarenaviruses. This result strongly suggested that the nucleoprotein may compensate for the deletion of Z. These observations were primarily obtained using the Lymphocytic choriomeningitis virus, and further confirmed using the Old World Lassa and New World Machupo viruses, responsible of human hemorrhagic fevers. Z-deleted viruses should prove very useful tools to investigate the biology of Mammarenaviruses. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  11. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions and exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
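
    The two USGS reports above describe Fortran routines; purely as an illustrative modern cross-reference (not part of the original library), most of the listed distributions and special functions have SciPy equivalents:

```python
from scipy import stats, special

print(stats.pearson3(skew=0.5).ppf(0.99))   # Pearson Type III quantile
print(stats.weibull_min(c=1.5).cdf(2.0))    # Weibull CDF
print(stats.kstwobign.sf(1.36))             # Kolmogorov-Smirnov D tail probability
print(special.i0(2.0))                      # Bessel function I0
print(special.gammaln(10.0))                # log-gamma function
print(special.expi(1.0))                    # exponential integral
```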

  12. Development of a microprocessor controller for stand-alone photovoltaic power systems

    NASA Technical Reports Server (NTRS)

    Millner, A. R.; Kaufman, D. L.

    1984-01-01

    A controller for stand-alone photovoltaic systems has been developed using a low power CMOS microprocessor. It performs battery state of charge estimation, array control, load management, instrumentation, automatic testing, and communications functions. Array control options are sequential subarray switching and maximum power control. A calculator keypad and LCD display provides manual control, fault diagnosis and digital multimeter functions. An RS-232 port provides data logging or remote control capability. A prototype 5 kW unit has been built and tested successfully. The controller is expected to be useful in village photovoltaic power systems, large solar water pumping installations, and other battery management applications.

  13. Denni Algorithm: An Enhancement of the SMS (Scan, Move and Sort) Algorithm

    NASA Astrophysics Data System (ADS)

    Aprilsyah Lubis, Denni; Salim Sitompul, Opim; Marwan; Tulus; Andri Budiman, M.

    2017-12-01

    Sorting has been a profound area for algorithmic researchers, and many resources are invested in the search for more efficient sorting algorithms. For this purpose many existing sorting algorithms were observed in terms of the efficiency of their algorithmic complexity. Efficient sorting is important to optimize the use of other algorithms that require sorted lists to work correctly. Sorting has been considered a fundamental problem in the study of algorithms for many reasons: the need to sort information is inherent in many applications, algorithms often use sorting as a key subroutine, many essential techniques of algorithm design are represented in the body of sorting algorithms, and many engineering issues come to the fore when implementing sorting algorithms. Many algorithms are very well known for sorting unordered lists, and one of the well-known algorithms that makes the process of sorting more economical and efficient is the SMS (Scan, Move and Sort) algorithm, an enhancement of Quicksort invented by Rami Mansi in 2010. This paper presents a new sorting algorithm called the Denni algorithm. The Denni algorithm is considered an enhancement of the SMS algorithm in the average and worst cases. The Denni algorithm is compared with the SMS algorithm and the results were promising.

  14. Rmax: A systematic approach to evaluate instrument sort performance using center stream catch

    PubMed Central

    Riddell, Andrew; Gardner, Rui; Perez-Gonzalez, Alexis; Lopes, Telma; Martinez, Lola

    2015-01-01

    Sorting performance can be evaluated with regard to Purity, Yield and/or Recovery of the sorted fraction. Purity is a check on the quality of the sample and the sort decisions made by the instrument. Recovery and Yield definitions vary with some authors regarding both as how efficient the instrument is at sorting the target particles from the original sample, others distinguishing Recovery from Yield, where the former is used to describe the accuracy of the instrument’s sort count. Yield and Recovery are often neglected, mostly due to difficulties in their measurement. Purity of the sort product is often cited alone but is not sufficient to evaluate sorting performance. All of these three performance metrics require re-sampling of the sorted fraction. But, unlike Purity, calculating Yield and/or Recovery calls for the absolute counting of particles in the sorted fraction, which may not be feasible, particularly when dealing with rare populations and precious samples. In addition, the counting process itself involves large errors. Here we describe a new metric for evaluating instrument sort Recovery, defined as the number of particles sorted relative to the number of original particles to be sorted. This calculation requires only measuring the ratios of target and non-target populations in the original pre-sort sample and in the waste stream or center stream catch (CSC), avoiding re-sampling the sorted fraction and absolute counting. We called this new metric Rmax, since it corresponds to the maximum expected Recovery for a particular set of instrument parameters. Rmax is ideal to evaluate and troubleshoot the optimum drop-charge delay of the sorter, or any instrument related failures that will affect sort performance. It can be used as a daily quality control check but can be particularly useful to assess instrument performance before single-cell sorting experiments. Because we do not perturb the sort fraction we can calculate Rmax during the sort process, being especially valuable to check instrument performance during rare population sorts. PMID:25747337
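
    One way to make the Rmax idea concrete, under the simplifying assumption that non-target events serve as an internal reference and end up in the center stream catch, is to compare the target:non-target ratio before sorting with the same ratio in the CSC; the sketch below is a simplified reading of the published metric, not its exact formula.

```python
def rmax(target_frac_presort, target_frac_csc):
    """Estimate the maximum expected recovery from population ratios only.

    target_frac_presort : fraction of target events in the pre-sort sample
    target_frac_csc     : fraction of target events in the center stream catch
    """
    r_pre = target_frac_presort / (1 - target_frac_presort)  # target:non-target before sorting
    r_csc = target_frac_csc / (1 - target_frac_csc)          # target:non-target in the CSC
    return 1 - r_csc / r_pre

# Hypothetical run: 20% target cells before sorting, 2% left in the CSC.
print(round(rmax(0.20, 0.02), 3))   # ~0.918, i.e. ~92% of targets were sorted
```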

  15. Biomedical literature classification using encyclopedic knowledge: a Wikipedia-based bag-of-concepts approach.

    PubMed

    Mouriño García, Marcos Antonio; Pérez Rodríguez, Roberto; Anido Rifón, Luis E

    2015-01-01

    Automatic classification of text documents into a set of categories has many applications. Among those applications, the automatic classification of biomedical literature stands out as an important application for automatic document classification strategies. Biomedical staff and researchers have to deal with a lot of literature in their daily activities, so a system that allows access to documents of interest in a simple and effective way would be useful; thus, it is necessary that these documents are sorted based on some criteria, that is to say, they have to be classified. Documents to classify are usually represented following the bag-of-words (BoW) paradigm. Features are words in the text (thus suffering from synonymy and polysemy) and their weights are just based on their frequency of occurrence. This paper presents an empirical study of the efficiency of a classifier that leverages encyclopedic background knowledge (concretely, Wikipedia) in order to create bag-of-concepts (BoC) representations of documents, understanding a concept as a "unit of meaning", and thus tackling synonymy and polysemy. Besides, the weighting of concepts is based on their semantic relevance in the text. For the evaluation of the proposal, empirical experiments have been conducted with one of the commonly used corpora for evaluating classification and retrieval of biomedical information, OHSUMED, and also with a purpose-built corpus of MEDLINE biomedical abstracts, UVigoMED. Results obtained show that the Wikipedia-based bag-of-concepts representation outperforms the classical bag-of-words representation by up to 157% in the single-label classification problem and up to 100% in the multi-label problem for the OHSUMED corpus, and by up to 122% in the single-label classification problem and up to 155% in the multi-label problem for the UVigoMED corpus.
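
    The bag-of-words versus bag-of-concepts contrast can be illustrated with a toy example; the concept dictionary below is a made-up stand-in for the Wikipedia-based mapping, and the real system additionally weights concepts by semantic relevance rather than raw counts.

```python
from collections import Counter

# Hypothetical synonym -> concept ("unit of meaning") mapping.
concept_of = {
    "myocardial infarction": "HeartAttack",
    "heart attack": "HeartAttack",
    "cardiac arrest": "CardiacArrest",
}

def bag_of_words(text):
    return Counter(text.lower().split())

def bag_of_concepts(text):
    text = text.lower()
    bag = Counter()
    for phrase, concept in concept_of.items():
        n = text.count(phrase)
        if n:
            bag[concept] += n
    return bag

doc = "Heart attack risk factors. Myocardial infarction outcomes."
print(bag_of_words(doc))     # synonyms counted as unrelated surface words
print(bag_of_concepts(doc))  # Counter({'HeartAttack': 2}) - synonymy resolved
```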

  16. The Decision to Engage Cognitive Control Is Driven by Expected Reward-Value: Neural and Behavioral Evidence

    PubMed Central

    Dixon, Matthew L.; Christoff, Kalina

    2012-01-01

    Cognitive control is a fundamental skill reflecting the active use of task-rules to guide behavior and suppress inappropriate automatic responses. Prior work has traditionally used paradigms in which subjects are told when to engage cognitive control. Thus, surprisingly little is known about the factors that influence individuals' initial decision of whether or not to act in a reflective, rule-based manner. To examine this, we took three classic cognitive control tasks (Stroop, Wisconsin Card Sorting Task, Go/No-Go task) and created novel ‘free-choice’ versions in which human subjects were free to select an automatic, pre-potent action, or an action requiring rule-based cognitive control, and earned varying amounts of money based on their choices. Our findings demonstrated that subjects' decision to engage cognitive control was driven by an explicit representation of monetary rewards expected to be obtained from rule-use. Subjects rarely engaged cognitive control when the expected outcome was of equal or lesser value as compared to the value of the automatic response, but frequently engaged cognitive control when it was expected to yield a larger monetary outcome. Additionally, we exploited fMRI-adaptation to show that the lateral prefrontal cortex (LPFC) represents associations between rules and expected reward outcomes. Together, these findings suggest that individuals are more likely to act in a reflective, rule-based manner when they expect that it will result in a desired outcome. Thus, choosing to exert cognitive control is not simply a matter of reason and willpower, but rather, conforms to standard mechanisms of value-based decision making. Finally, in contrast to current models of LPFC function, our results suggest that the LPFC plays a direct role in representing motivational incentives. PMID:23284730

  17. An Interactive Graphical User Interface for Maritime Security Services

    NASA Astrophysics Data System (ADS)

    Reize, T.; Müller, R.; Kiefl, R.

    2013-10-01

    In order to analyse optical satellite images for maritime security issues in Near-Real-Time (NRT), an interactive graphical user interface (GUI) based on NASA World Wind was developed and is presented in this article. Targets or activities can be detected, measured and classified with this tool simply and quickly. The service uses optical satellite images, currently taken from 6 sensors: Worldview-1 and Worldview-2, Ikonos, Quickbird, GeoEye-1 and EROS-B. The GUI can also handle SAR images, air-borne images or UAV images. Software configurations are provided in a job-order file and thus all preparation tasks, such as image installation, are performed fully automatically. The imagery can be overlaid with vessels derived by an automatic detection processor. These potential vessel layers can be zoomed in on by a single click and sorted with an adapted method. Further object properties, such as vessel type or confidence level of identification, can be added by the operator manually. The heading angle can be refined by dragging the vessel's head or switching it by 180° with a single click. Further vessels or other relevant objects can be added. The object's length, width, heading and position are calculated automatically from three clicks on the top, bottom and an arbitrary point on one of the object's longer sides. In the case of an Activity Detection, the detected objects can be grouped into areas of interest (AOIs) and classified according to the ordered activities. All relevant information is finally written to an exchange file, after quality control and necessary correction procedures are performed. If required, image thumbnails can be cut around objects or around whole areas of interest and saved as separate, geo-referenced images.

  18. Biomedical literature classification using encyclopedic knowledge: a Wikipedia-based bag-of-concepts approach

    PubMed Central

    Pérez Rodríguez, Roberto; Anido Rifón, Luis E.

    2015-01-01

    Automatic classification of text documents into a set of categories has many applications. Among those applications, the automatic classification of biomedical literature stands out as an important application for automatic document classification strategies. Biomedical staff and researchers have to deal with a lot of literature in their daily activities, so a system that allows access to documents of interest in a simple and effective way would be useful; thus, it is necessary that these documents are sorted based on some criteria—that is to say, they have to be classified. Documents to classify are usually represented following the bag-of-words (BoW) paradigm. Features are words in the text—thus suffering from synonymy and polysemy—and their weights are just based on their frequency of occurrence. This paper presents an empirical study of the efficiency of a classifier that leverages encyclopedic background knowledge—concretely Wikipedia—in order to create bag-of-concepts (BoC) representations of documents, understanding a concept as a “unit of meaning”, and thus tackling synonymy and polysemy. Besides, the weighting of concepts is based on their semantic relevance in the text. For the evaluation of the proposal, empirical experiments have been conducted with one of the commonly used corpora for evaluating classification and retrieval of biomedical information, OHSUMED, and also with a purpose-built corpus of MEDLINE biomedical abstracts, UVigoMED. Results obtained show that the Wikipedia-based bag-of-concepts representation outperforms the classical bag-of-words representation by up to 157% in the single-label classification problem and up to 100% in the multi-label problem for the OHSUMED corpus, and by up to 122% in the single-label classification problem and up to 155% in the multi-label problem for the UVigoMED corpus. PMID:26468436

  19. BiobankConnect: software to rapidly connect data elements for pooled analysis across biobanks using ontological and lexical indexing.

    PubMed

    Pang, Chao; Hendriksen, Dennis; Dijkstra, Martijn; van der Velde, K Joeri; Kuiper, Joel; Hillege, Hans L; Swertz, Morris A

    2015-01-01

    Pooling data across biobanks is necessary to increase statistical power, reveal more subtle associations, and synergize the value of data sources. However, searching for desired data elements among the thousands of available elements and harmonizing differences in terminology, data collection, and structure, is arduous and time consuming. To speed up biobank data pooling we developed BiobankConnect, a system to semi-automatically match desired data elements to available elements by: (1) annotating the desired elements with ontology terms using BioPortal; (2) automatically expanding the query for these elements with synonyms and subclass information using OntoCAT; (3) automatically searching available elements for these expanded terms using Lucene lexical matching; and (4) shortlisting relevant matches sorted by matching score. We evaluated BiobankConnect using human curated matches from EU-BioSHaRE, searching for 32 desired data elements in 7461 available elements from six biobanks. We found 0.75 precision at rank 1 and 0.74 recall at rank 10 compared to a manually curated set of relevant matches. In addition, best matches chosen by BioSHaRE experts ranked first in 63.0% and in the top 10 in 98.4% of cases, indicating that our system has the potential to significantly reduce manual matching work. BiobankConnect provides an easy user interface to significantly speed up the biobank harmonization process. It may also prove useful for other forms of biomedical data integration. All the software can be downloaded as a MOLGENIS open source app from http://www.github.com/molgenis, with a demo available at http://www.biobankconnect.org. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.
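
    A toy stand-in for the matching pipeline (not the BiobankConnect code, which relies on BioPortal, OntoCAT, and Lucene) is sketched below: the desired element is expanded with synonyms, each expanded term is scored lexically against the available elements, and matches are shortlisted by score.

```python
from difflib import SequenceMatcher

def shortlist(desired_synonyms, available_elements, top=10):
    """Rank available data elements by their best lexical similarity to any synonym."""
    scored = []
    for elem in available_elements:
        best = max(SequenceMatcher(None, syn.lower(), elem.lower()).ratio()
                   for syn in desired_synonyms)
        scored.append((best, elem))
    return sorted(scored, reverse=True)[:top]

# Hand-made synonym expansion standing in for the ontology-based query expansion.
synonyms = ["systolic blood pressure", "SBP", "arterial pressure systolic"]
available = ["bp_sys_mmHg", "Systolic BP (mmHg)", "heart_rate", "diastolic_bp"]
print(shortlist(synonyms, available, top=3))
```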

  20. BiobankConnect: software to rapidly connect data elements for pooled analysis across biobanks using ontological and lexical indexing

    PubMed Central

    Pang, Chao; Hendriksen, Dennis; Dijkstra, Martijn; van der Velde, K Joeri; Kuiper, Joel; Hillege, Hans L; Swertz, Morris A

    2015-01-01

    Objective Pooling data across biobanks is necessary to increase statistical power, reveal more subtle associations, and synergize the value of data sources. However, searching for desired data elements among the thousands of available elements and harmonizing differences in terminology, data collection, and structure, is arduous and time consuming. Materials and methods To speed up biobank data pooling we developed BiobankConnect, a system to semi-automatically match desired data elements to available elements by: (1) annotating the desired elements with ontology terms using BioPortal; (2) automatically expanding the query for these elements with synonyms and subclass information using OntoCAT; (3) automatically searching available elements for these expanded terms using Lucene lexical matching; and (4) shortlisting relevant matches sorted by matching score. Results We evaluated BiobankConnect using human curated matches from EU-BioSHaRE, searching for 32 desired data elements in 7461 available elements from six biobanks. We found 0.75 precision at rank 1 and 0.74 recall at rank 10 compared to a manually curated set of relevant matches. In addition, best matches chosen by BioSHaRE experts ranked first in 63.0% and in the top 10 in 98.4% of cases, indicating that our system has the potential to significantly reduce manual matching work. Conclusions BiobankConnect provides an easy user interface to significantly speed up the biobank harmonization process. It may also prove useful for other forms of biomedical data integration. All the software can be downloaded as a MOLGENIS open source app from http://www.github.com/molgenis, with a demo available at http://www.biobankconnect.org. PMID:25361575

  1. Microbiological quality and somatic cell count in bulk milk of dromedary camels (Camelus dromedarius): descriptive statistics, correlations, and factors of variation.

    PubMed

    Nagy, P; Faye, B; Marko, O; Thomas, S; Wernery, U; Juhasz, J

    2013-09-01

    The objectives of the present study were to monitor the microbiological quality and somatic cell count (SCC) of bulk tank milk at the world's first large-scale camel dairy farm for a 2-yr period, to compare the results of 2 methods for the enumeration of SCC, to evaluate correlation among milk quality indicators, and to determine the effect of specific factors (year, season, stage of lactation, and level of production) on milk quality indicators. The study was conducted from January 2008 to January 2010. Total viable count (TVC), coliform count (CC), California Mastitis Test (CMT) score, and SCC were determined from daily bulk milk samples. Somatic cell count was measured by using a direct microscopic method and with an automatic cell counter. In addition, production parameters [total daily milk production (TDM, kg), number of milking camels (NMC), average milk per camel (AMC, kg)] and stage of lactation (average postpartum days, PPD) were recorded for each test day. A strong correlation (r=0.33) was found between the 2 methods for SCC enumeration; however, values derived using the microscopic method were higher. The geometric means of SCC and TVC were 394×10³ cells/mL and 5,157 cfu/mL during the observation period, respectively. Somatic cell count was >500×10³ cells/mL on 14.6% (106/725) and TVC was >10×10³ cfu/mL on 4.0% (30/742) of the test days. Both milk quality indicators had a distinct seasonal pattern. For log SCC, the mean was lowest in summer and highest in autumn. The seasonal pattern of log TVC was slightly different, with the lowest values being recorded during the spring. The monthly mean TVC pattern showed a clear difference between years. Coliform count was <10 cfu/mL in most of the samples (709/742, 95.6%). A positive correlation was found between log SCC and log TVC (r=0.32), between log SCC and CMT score (r=0.26), and between log TVC and CC in yr 1 (r=0.30). All production parameters and stage of lactation showed strong seasonal variation. Log SCC was negatively correlated with TDM (r=-0.35), AMC (r=-0.37), and NMC (r=-0.15) and positively correlated with PPD (r=0.40). Log TVC had a negative correlation with AMC (r=-0.40) but a positive correlation with NMC (r=0.32), TDM (r=0.16), and PPD (r=0.45). The linear mixed model with stepwise variable selection showed that the main sources of log SCC variation were PPD, TDM, PPD × season, and season. For log TVC, the same factors and year contributed to the variation. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  2. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac to include jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
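
    The log-file check described above amounts to comparing each recorded component position with its expected value and flagging deviations that exceed a per-parameter tolerance. The sketch below is a hedged illustration only: the CSV layout and column names are hypothetical, and real Varian trajectory logs are binary files that require a dedicated reader.

```python
import csv

# Hypothetical per-parameter tolerances, in millimetres.
TOLERANCE_MM = {"mlc_leaf": 1.0, "jaw": 1.0, "gantry": 0.5}

def flag_deviations(log_csv):
    """Return (timestamp, component, deviation) for entries exceeding tolerance."""
    flags = []
    with open(log_csv) as f:
        for row in csv.DictReader(f):
            dev = abs(float(row["expected_mm"]) - float(row["actual_mm"]))
            if dev > TOLERANCE_MM[row["component"]]:
                flags.append((row["timestamp"], row["component"], dev))
    return flags
```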

  3. Performance of a sequencing-batch membrane bioreactor (SMBR) with an automatic control strategy treating high-strength swine wastewater.

    PubMed

    Sui, Qianwen; Jiang, Chao; Yu, Dawei; Chen, Meixue; Zhang, Junya; Wang, Yawei; Wei, Yuansong

    2018-01-15

    Because of its high strength in organic matter, nutrients, and pathogens, swine wastewater is a major source of pollution to the rural environment and surface water. A sequencing-batch membrane bioreactor (SMBR) system with an automatic control strategy was developed for high-strength swine wastewater treatment. Short-cut nitrification and denitrification (SND) was achieved at a nitrite accumulation rate of 83.6%, with removal rates of COD, NH4+-N, and TN of 95%, 99%, and 93%, respectively, at a reduced HRT of 6.0 d and a TN loading rate of 0.02 kg N/(kg VSS d). With effective membrane separation, the reductions of total bacteria (TB) and putative pathogens were 2.77 logs and 1%, respectively. The shift in the microbial community corresponded well to the controlling parameters. During the SND process, ammonia-oxidizing bacteria (AOB) (Nitrosomonas, Nitrosospira) were enriched 52-fold and nitrite-oxidizing bacteria (NOB) (Nitrospira) were reduced 2-fold. The denitrifiers (Thauera) were strongly enriched and their diversity was enhanced. Copyright © 2017. Published by Elsevier B.V.

  4. Autonomous mental development with selective attention, object perception, and knowledge representation

    NASA Astrophysics Data System (ADS)

    Ban, Sang-Woo; Lee, Minho

    2008-04-01

    Knowledge-based clustering and autonomous mental development remain high-priority research topics, in which neural network learning techniques are used to achieve optimal performance. In this paper, we present a new framework that can automatically generate a relevance map from sensory data, represent knowledge regarding objects, and infer new knowledge about novel objects. The proposed model is based on an understanding of the visual 'what' pathway in the brain. A stereo saliency map model selectively identifies salient object areas by additionally considering a local symmetry feature. The incremental object perception model builds clusters for the construction of an ontology map in the color and form domains in order to perceive an arbitrary object; it is implemented by the growing fuzzy topology adaptive resonance theory (GFTART) network. Log-polar transformed color and form features of a selected object are used as inputs to the GFTART network. The clustered information is relevant for describing specific objects, and the proposed model can automatically infer an unknown object by using the learned information. Experimental results with real data demonstrate the validity of this approach.

  5. Impact of basal inferolateral scar burden determined by automatic analysis of 99mTc-MIBI myocardial perfusion SPECT on the long-term prognosis of cardiac resynchronization therapy.

    PubMed

    Morishima, Itsuro; Okumura, Kenji; Tsuboi, Hideyuki; Morita, Yasuhiro; Takagi, Kensuke; Yoshida, Ruka; Nagai, Hiroaki; Tomomatsu, Toshiro; Ikai, Yoshihiro; Terada, Kazushi; Sone, Takahito; Murohara, Toyoaki

    2017-04-01

    Left-ventricular (LV) scarring may be associated with a poor response to cardiac resynchronization therapy (CRT). The automatic analysis of myocardial perfusion single-photon emission computed tomography (MP-SPECT) may provide objective quantification of LV scarring. We investigated the impact of LV scarring determined by an automatic analysis of MP-SPECT on short-term LV volume response as well as long-term outcome. We studied consecutive 51 patients who were eligible to undergo 99mTc-MIBI MP-SPECT both at baseline and 6 months after CRT (ischaemic cardiomyopathies 31%). Quantitative perfusion SPECT was used to evaluate the defect extent (an index of global scarring) and the LV 17-segment regional uptake ratio (an inverse index of regional scar burden). The primary outcome was the composite of overall mortality or first hospitalization for worsening heart failure. A high global scar burden and a low mid/basal inferolateral regional uptake ratio were associated with volume non-responders to CRT at 6 months. The basal inferolateral regional uptake ratio remained as a predictor of volume non-response after adjusting for the type of cardiomyopathy. During a median follow-up of 36.1 months, the outcome occurred in 28 patients. The patients with a low basal inferolateral regional uptake ratio with a cutoff value of 57% showed poor prognosis (log-rank P= 0.006). The scarring determined by automatic analysis of MP-SPECT images may predict a poor response to CRT regardless of the pathogenesis of cardiomyopathy. The basal inferolateral scar burden in particular may have an adverse impact on long-term prognosis. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2016. For permissions please email: journals.permissions@oup.com.

  6. Elimination of water pathogens with solar radiation using an automated sequential batch CPC reactor.

    PubMed

    Polo-López, M I; Fernández-Ibáñez, P; Ubomba-Jaswa, E; Navntoft, C; García-Fernández, I; Dunlop, P S M; Schmid, M; Byrne, J A; McGuigan, K G

    2011-11-30

    Solar disinfection (SODIS) of water is a well-known, effective treatment process which is practiced at household level in many developing countries. However, this process is limited by the small volume treated and there is no indication of treatment efficacy for the user. Low cost glass tube reactors, together with compound parabolic collector (CPC) technology, have been shown to significantly increase the efficiency of solar disinfection. However, these reactors still require user input to control each batch SODIS process and there is no feedback that the process is complete. Automatic operation of the batch SODIS process, controlled by UVA-radiation sensors, can provide information on the status of the process, can ensure the required UVA dose to achieve complete disinfection is received and reduces user work-load through automatic sequential batch processing. In this work, an enhanced CPC photo-reactor with a concentration factor of 1.89 was developed. The apparatus was automated to achieve exposure to a pre-determined UVA dose. Treated water was automatically dispensed into a reservoir tank. The reactor was tested using Escherichia coli as a model pathogen in natural well water. A 6-log inactivation of E. coli was achieved following exposure to the minimum uninterrupted lethal UVA dose. The enhanced reactor decreased the exposure time required to achieve the lethal UVA dose, in comparison to a CPC system with a concentration factor of 1.0. Doubling the lethal UVA dose prevented the need for a period of post-exposure dark inactivation and reduced the overall treatment time. Using this reactor, SODIS can be automatically carried out at an affordable cost, with reduced exposure time and minimal user input. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. Parallel integer sorting with medium and fine-scale parallelism

    NASA Technical Reports Server (NTRS)

    Dagum, Leonardo

    1993-01-01

    Two new parallel integer sorting algorithms, queue-sort and barrel-sort, are presented and analyzed in detail. These algorithms do not have optimal parallel complexity, yet they show very good performance in practice. Queue-sort is designed for fine-scale parallel architectures which allow the queueing of multiple messages to the same destination. Barrel-sort is designed for medium-scale parallel architectures with a high message-passing overhead. Performance results from the implementation of queue-sort on a Connection Machine CM-2 and barrel-sort on a 128-processor iPSC/860 are given. The two implementations are found to be comparable in performance but not as good as a fully vectorized bucket sort on the Cray YMP.
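
    As a rough serial illustration of the barrel-sort idea (keys are first distributed into contiguous key ranges, or "barrels", nominally one per processor, and each barrel is then sorted locally), here is a Python sketch. It only mimics the data-distribution step; the message-passing details of the CM-2 and iPSC/860 implementations are not represented.

    ```python
    def barrel_sort(keys, n_barrels, key_min, key_max):
        """Serial sketch of barrel sort: distribute integer keys into
        contiguous ranges ("barrels"), sort each barrel, and concatenate."""
        width = (key_max - key_min + 1) / n_barrels
        barrels = [[] for _ in range(n_barrels)]
        for k in keys:
            idx = min(int((k - key_min) / width), n_barrels - 1)
            barrels[idx].append(k)
        out = []
        for b in barrels:
            out.extend(sorted(b))   # stands in for the per-processor local sort
        return out

    print(barrel_sort([17, 3, 250, 42, 99, 3, 180], n_barrels=4, key_min=0, key_max=255))
    ```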

  8. Sort computation

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    1988-01-01

    Sorting has long been used to organize data in preparation for further computation, but sort computation allows some types of computation to be performed during the sort. Sort aggregation and sort distribution are the two basic forms of sort computation. Sort aggregation generates an accumulative or aggregate result for each group of records and places this result in one of the records. An aggregate operation can be any operation that is both associative and commutative, i.e., any operation whose result does not depend on the order of the operands or the order in which the operations are performed. Sort distribution copies the value from a field of a specific record in a group into that field in every record of that group.
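
    To make the two forms of sort computation concrete, the sketch below groups records by a key, performs a sort aggregation (the group sum, an associative and commutative operation, is written into one record of the group), and then performs a sort distribution (a field from a designated record is copied to every record in its group). This is only an illustration of the definitions above, not the massively parallel formulation.

    ```python
    from collections import defaultdict

    records = [
        {"group": "a", "value": 3, "label": None},
        {"group": "a", "value": 5, "label": "seed"},
        {"group": "b", "value": 2, "label": "root"},
        {"group": "b", "value": 7, "label": None},
    ]

    # Group records by key (the "sort" step).
    groups = defaultdict(list)
    for r in sorted(records, key=lambda r: r["group"]):
        groups[r["group"]].append(r)

    # Sort aggregation: an associative, commutative operation (here, sum)
    # is accumulated per group and stored in one record of the group.
    for recs in groups.values():
        recs[0]["group_sum"] = sum(r["value"] for r in recs)

    # Sort distribution: the 'label' of the designated record (here, the one
    # that already has a label) is copied into every record of the group.
    for recs in groups.values():
        source = next(r["label"] for r in recs if r["label"] is not None)
        for r in recs:
            r["label"] = source

    print(records)
    ```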

  9. Impact of food processing and detoxification treatments on mycotoxin contamination.

    PubMed

    Karlovsky, Petr; Suman, Michele; Berthiller, Franz; De Meester, Johan; Eisenbrand, Gerhard; Perrin, Irène; Oswald, Isabelle P; Speijers, Gerrit; Chiodini, Alessandro; Recker, Tobias; Dussort, Pierre

    2016-11-01

    Mycotoxins are fungal metabolites commonly occurring in food, which pose a health risk to the consumer. Maximum levels for major mycotoxins allowed in food have been established worldwide. Good agricultural practices, plant disease management, and adequate storage conditions limit mycotoxin levels in the food chain yet do not eliminate mycotoxins completely. Food processing can further reduce mycotoxin levels by physical removal and decontamination by chemical or enzymatic transformation of mycotoxins into less toxic products. Physical removal of mycotoxins is very efficient: manual sorting of grains, nuts, and fruits by farmers as well as automatic sorting by the industry significantly lowers the mean mycotoxin content. Further processing such as milling, steeping, and extrusion can also reduce mycotoxin content. Mycotoxins can be detoxified chemically by reacting with food components and technical aids; these reactions are facilitated by high temperature and alkaline or acidic conditions. Detoxification of mycotoxins can also be achieved enzymatically. Some enzymes able to transform mycotoxins naturally occur in food commodities or are produced during fermentation but more efficient detoxification can be achieved by deliberate introduction of purified enzymes. We recommend integrating evaluation of processing technologies for their impact on mycotoxins into risk management. Processing steps proven to mitigate mycotoxin contamination should be used whenever necessary. Development of detoxification technologies for high-risk commodities should be a priority for research. While physical techniques currently offer the most efficient post-harvest reduction of mycotoxin content in food, biotechnology possesses the largest potential for future developments.

  10. Derivation of sorting programs

    NASA Technical Reports Server (NTRS)

    Varghese, Joseph; Loganantharaj, Rasiah

    1990-01-01

    Program synthesis for critical applications has become a viable alternative to program verification. Nested resolution and its extension are used to synthesize a set of sorting programs from their first-order logic specifications. A set of sorting programs, such as naive sort, merge sort, and insertion sort, was successfully synthesized starting from the same set of specifications.

  11. Safe sorting of GFP-transduced live cells for subsequent culture using a modified FACS vantage.

    PubMed

    Sørensen, T U; Gram, G J; Nielsen, S D; Hansen, J E

    1999-12-01

    A stream-in-air cell sorter enables rapid sorting to a high purity, but it is not well suited for sorting of infectious material due to the risk of airborne spread to the surroundings. A FACS Vantage cell sorter was modified for safe use with potentially HIV infected cells. Safety tests with bacteriophages were performed to evaluate the potential spread of biologically active material during cell sorting. Cells transduced with a retroviral vector carrying the gene for GFP were sorted on the basis of their GFP fluorescence, and GFP expression was followed during subsequent culture. The bacteriophage sorting showed that the biologically active material was confined to the sorting chamber. A failure mode simulating a nozzle blockage resulted in detectable droplets inside the sorting chamber, but no droplets could be detected when an additional air suction from the sorting chamber had been put on. The GFP transduced cells were sorted to 99% purity. Cells not expressing GFP at the time of sorting did not turn on the gene during subsequent culture. Un-sorted cells and cells sorted to be positive for GFP showed a decrease in the fraction of GFP positive cells during culture. Sorting of live infected cells can be performed safely and with no deleterious effects on vector expression using the modified FACS Vantage instrument. Copyright 1999 Wiley-Liss, Inc.

  12. Learning Relational Policies from Electronic Health Record Access Logs

    PubMed Central

    Malin, Bradley; Nyemba, Steve; Paulett, John

    2011-01-01

    Modern healthcare organizations (HCOs) are composed of complex dynamic teams to ensure clinical operations are executed in a quick and competent manner. At the same time, the fluid nature of such environments hinders administrators' efforts to define access control policies that appropriately balance patient privacy and healthcare functions. Manual efforts to define these policies are labor-intensive and error-prone, often resulting in systems that endow certain care providers with overly broad access to patients' medical records while restricting other providers from legitimate and timely use. In this work, we propose an alternative method to generate these policies by automatically mining usage patterns from electronic health record (EHR) systems. EHR systems are increasingly being integrated into clinical environments and our approach is designed to be generalizable across HCOs, thus assisting in the design and evaluation of local access control policies. Our technique, which is grounded in data mining and social network analysis theory, extracts a statistical model of the organization from the access logs of its EHRs. In doing so, our approach enables the review of predefined policies, as well as the discovery of unknown behaviors. We evaluate our approach with five months of access logs from the Vanderbilt University Medical Center and confirm the existence of stable social structures and intuitive business operations. Additionally, we demonstrate that there is significant turnover in the interactions between users in the HCO and that policies learned at the department level afford greater stability over time. PMID:21277996
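
    One ingredient of the approach described above, deriving relationships between users from the patients whose records they access, can be illustrated with a short sketch: users who access the same patient's record are linked, with edge weights counting shared patients. The log fields and toy entries below are hypothetical.

    ```python
    from collections import defaultdict
    from itertools import combinations

    # Hypothetical access-log entries: (user_id, patient_id)
    access_log = [
        ("nurse_1", "pt_A"), ("doc_1", "pt_A"), ("doc_1", "pt_B"),
        ("nurse_2", "pt_B"), ("nurse_1", "pt_B"), ("doc_2", "pt_C"),
    ]

    # Group users by the patients whose records they touched.
    users_by_patient = defaultdict(set)
    for user, patient in access_log:
        users_by_patient[patient].add(user)

    # Weighted co-access network: edge weight = number of shared patients.
    edge_weights = defaultdict(int)
    for users in users_by_patient.values():
        for u, v in combinations(sorted(users), 2):
            edge_weights[(u, v)] += 1

    for (u, v), w in sorted(edge_weights.items()):
        print(f"{u} -- {v}: {w} shared patient(s)")
    ```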

  13. Design and realization of sort manipulator of crystal-angle sort machine

    NASA Astrophysics Data System (ADS)

    Wang, Ming-shun; Chen, Shu-ping; Guan, Shou-ping; Zhang, Yao-wei

    2005-12-01

    A current trend in automation technology is to replace manpower with manipulators in workplaces where dangerous, harmful, heavy, or repetitive work is involved. The sort manipulator is installed in a crystal-angle sort machine to take the place of manpower and is engaged in unloading and sorting work. It is the outcome of combining mechanism design, electric transmission, pneumatic elements, and micro-controller control. The step motor makes the sort manipulator operate precisely, and the pneumatic elements make it more dexterous. The micro-controller's software bestows some simple artificial intelligence on the sort manipulator, so that it can precisely repeat its unloading and sorting work. The combination of the manipulator's zero position and step-motor counting control eliminates accumulating error during long-term operation. The sort manipulator's design has been proven correct and reliable in engineering practice.

  14. A Task Analysis of the Shift from Teacher Instructions to Self-Instructions in Performing an In-Common Task.

    ERIC Educational Resources Information Center

    Grote, Irene; And Others

    1996-01-01

    Three preschoolers performed four sorts with stimulus cards--an untaught target sort and three directly taught alternating sorts considered to self-instruct the target performance. Accuracy increased first in the skill sorts and then in the untaught target sorts. All subjects generalized to new target sorts. Correct spontaneous self-instructions…

  15. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset.

    PubMed

    Neto, Joana P; Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R

    2016-08-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo "paired-recordings" such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. Copyright © 2016 the American Physiological Society.

  16. Graph Structured Program Evolution: Evolution of Loop Structures

    NASA Astrophysics Data System (ADS)

    Shirakawa, Shinichi; Nagao, Tomoharu

    Recently, numerous automatic programming techniques have been developed and applied in various fields. A typical example is genetic programming (GP), and various extensions and representations of GP have been proposed thus far. Complex programs and hand-written programs, however, may contain several loops and handle multiple data types. In this chapter, we propose a new method called Graph Structured Program Evolution (GRAPE). The representation of GRAPE is a graph structure; therefore, it can represent branches and loops using this structure. Each program is constructed as an arbitrary directed graph of nodes and a data set. The GRAPE program handles multiple data types using the data set for each type, and the genotype of GRAPE takes the form of a linear string of integers. We apply GRAPE to three test problems (factorial, exponentiation, and list sorting) and demonstrate that the optimum solution in each problem is obtained by the GRAPE system.

  17. Dendritic cell recognition using template matching based on one-dimensional (1D) Fourier descriptors (FD)

    NASA Astrophysics Data System (ADS)

    Muhd Suberi, Anis Azwani; Wan Zakaria, Wan Nurshazwani; Tomari, Razali; Lau, Mei Xia

    2016-07-01

    Identification of dendritic cells (DC), particularly in the cancer microenvironment, is of special interest because harnessing the immune system to fight tumors is a novel treatment under investigation. However, the staining procedure used in sorting DC can affect their viability. In this paper, a computer-aided system is proposed for automatic classification of DC in peripheral blood mononuclear cell (PBMC) images. Initially, the images undergo several preprocessing steps to remove uneven illumination and artifacts around the cells. In segmentation, morphological operators and Canny edge detection are applied to isolate the cell shapes and extract their contours. Following that, information from the contours is extracted using Fourier descriptors derived from one-dimensional (1D) shape signatures. Finally, cells are classified as DC by template matching (TM) between established template images and target images. The results show that the proposed scheme is reliable and effective for recognizing DC.
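
    The shape-signature step can be sketched as follows: the contour is reduced to a one-dimensional centroid-distance signature, its Fourier coefficients are computed, and the magnitudes are normalized so that the descriptor is invariant to translation, scale, rotation, and starting point. The normalization and number of coefficients here are illustrative choices, not necessarily those used in the paper.

    ```python
    import numpy as np

    def fourier_descriptors(contour_xy, n_coeffs=16):
        """Compute 1D Fourier descriptors from an (N, 2) array of contour points
        using the centroid-distance shape signature."""
        pts = np.asarray(contour_xy, dtype=float)
        centroid = pts.mean(axis=0)
        signature = np.linalg.norm(pts - centroid, axis=1)   # 1D shape signature
        spectrum = np.fft.fft(signature)
        mags = np.abs(spectrum)
        # Dropping the DC term and dividing by it gives scale invariance;
        # using magnitudes discards phase (rotation / start-point invariance).
        return mags[1:n_coeffs + 1] / mags[0]

    # Toy example: descriptors of a circle-like contour (all near zero, as expected).
    theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
    circle = np.c_[np.cos(theta), np.sin(theta)]
    print(np.round(fourier_descriptors(circle, 8), 4))
    ```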

  18. Computerized training management system

    DOEpatents

    Rice, H.B.; McNair, R.C.; White, K.; Maugeri, T.

    1998-08-04

    A Computerized Training Management System (CTMS) is disclosed for providing a procedurally defined process that is employed to develop accreditable performance-based training programs for job classifications that are sensitive to documented regulations and technical information. CTMS is a database that links information needed to maintain a five-phase approach to training (analysis, design, development, implementation, and evaluation) independent of training program design. CTMS is designed using R-Base™, an SQL-compliant software platform. Information is logically entered and linked in CTMS. Each task is linked directly to a performance objective, which, in turn, is linked directly to a learning objective; then, each enabling objective is linked to its respective test items. In addition, tasks, performance objectives, enabling objectives, and test items are linked to their associated reference documents. CTMS keeps all information up to date since it automatically sorts, files, and links all data; CTMS includes key word and reference document searches. 18 figs.

  19. Computerized training management system

    DOEpatents

    Rice, Harold B.; McNair, Robert C.; White, Kenneth; Maugeri, Terry

    1998-08-04

    A Computerized Training Management System (CTMS) for providing a procedurally defined process that is employed to develop accreditable performance-based training programs for job classifications that are sensitive to documented regulations and technical information. CTMS is a database that links information needed to maintain a five-phase approach to training (analysis, design, development, implementation, and evaluation) independent of training program design. CTMS is designed using R-Base®, an SQL-compliant software platform. Information is logically entered and linked in CTMS. Each task is linked directly to a performance objective, which, in turn, is linked directly to a learning objective; then, each enabling objective is linked to its respective test items. In addition, tasks, performance objectives, enabling objectives, and test items are linked to their associated reference documents. CTMS keeps all information up to date since it automatically sorts, files, and links all data; CTMS includes key word and reference document searches.

  20. Mediation analysis allowing for exposure-mediator interactions and causal interpretation: theoretical assumptions and implementation with SAS and SPSS macros

    PubMed Central

    Valeri, Linda; VanderWeele, Tyler J.

    2012-01-01

    Mediation analysis is a useful and widely employed approach to studies in the field of psychology and in the social and biomedical sciences. The contributions of this paper are several-fold. First, we seek to bring the developments in mediation analysis for nonlinear models within the counterfactual framework to the psychology audience in an accessible format and compare the sorts of inferences about mediation that are possible in the presence of exposure-mediator interaction when using a counterfactual versus the standard statistical approach. Second, the work by VanderWeele and Vansteelandt (2009, 2010) is extended here to allow for dichotomous mediators and count outcomes. Third, we provide SAS and SPSS macros to implement all of these mediation analysis techniques automatically and we compare the types of inferences about mediation that are allowed by a variety of software macros. PMID:23379553

  1. Mediation analysis allowing for exposure-mediator interactions and causal interpretation: theoretical assumptions and implementation with SAS and SPSS macros.

    PubMed

    Valeri, Linda; Vanderweele, Tyler J

    2013-06-01

    Mediation analysis is a useful and widely employed approach to studies in the field of psychology and in the social and biomedical sciences. The contributions of this article are several-fold. First we seek to bring the developments in mediation analysis for nonlinear models within the counterfactual framework to the psychology audience in an accessible format and compare the sorts of inferences about mediation that are possible in the presence of exposure-mediator interaction when using a counterfactual versus the standard statistical approach. Second, the work by VanderWeele and Vansteelandt (2009, 2010) is extended here to allow for dichotomous mediators and count outcomes. Third, we provide SAS and SPSS macros to implement all of these mediation analysis techniques automatically, and we compare the types of inferences about mediation that are allowed by a variety of software macros. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
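
    For the simplest case these macros cover (a continuous mediator and continuous outcome with an exposure-mediator interaction, fit by linear regression), the counterfactual natural direct and indirect effects have standard closed forms. The sketch below evaluates those formulas from already-estimated coefficients; it is a minimal illustration, not the SAS/SPSS macros themselves, and the coefficient values are placeholders.

    ```python
    def natural_effects(theta1, theta2, theta3, beta0, beta1, beta2_c,
                        a=1.0, a_star=0.0):
        """Natural direct/indirect effects for a continuous mediator and outcome.

        Outcome model:  E[Y|a,m,c] = theta0 + theta1*a + theta2*m + theta3*a*m + ...
        Mediator model: E[M|a,c]   = beta0 + beta1*a + beta2'c
        beta2_c is the covariate contribution beta2'c evaluated at a fixed c.
        """
        nde = (theta1 + theta3 * (beta0 + beta1 * a_star + beta2_c)) * (a - a_star)
        nie = (theta2 * beta1 + theta3 * beta1 * a) * (a - a_star)
        return nde, nie, nde + nie  # total effect decomposes as NDE + NIE

    # Placeholder coefficients purely for illustration.
    nde, nie, te = natural_effects(theta1=0.5, theta2=0.8, theta3=0.2,
                                   beta0=1.0, beta1=0.6, beta2_c=0.0)
    print(f"NDE={nde:.3f}, NIE={nie:.3f}, total={te:.3f}")
    ```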

  2. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
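
    A toy version of the "point the analyst at what drives failures" idea: given Monte Carlo dispersions (input variables per run) and a flag for a particular failure type, rank the inputs by how strongly they separate failed from successful runs. The ranking metric below (absolute difference of standardized group means) and the synthetic data are illustrative stand-ins, not TRAM's actual method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_runs = 2000
    # Hypothetical dispersed inputs: mass error, thrust error, sensor bias.
    inputs = {name: rng.normal(size=n_runs)
              for name in ("mass_err", "thrust_err", "sensor_bias")}
    # Hypothetical failure flag, driven mostly by thrust error.
    failed = (inputs["thrust_err"] + 0.2 * rng.normal(size=n_runs)) > 1.5

    def driving_variables(inputs, failed):
        """Rank input variables by separation between failed and passed runs."""
        scores = {}
        for name, x in inputs.items():
            sep = abs(x[failed].mean() - x[~failed].mean()) / x.std()
            scores[name] = sep
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    for name, score in driving_variables(inputs, failed):
        print(f"{name}: separation {score:.2f}")
    ```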

  3. A tool for developing an automatic insect identification system based on wing outlines

    PubMed Central

    Yang, He-Ping; Ma, Chun-Sen; Wen, Hui; Zhan, Qing-Bin; Wang, Xin-Li

    2015-01-01

    For some insect groups, wing outline is an important character for species identification. We have constructed a program as the integral part of an automated system to identify insects based on wing outlines (DAIIS). This program includes two main functions: (1) outline digitization and Elliptic Fourier transformation and (2) classifier model training by pattern recognition of support vector machines and model validation. To demonstrate the utility of this program, a sample of 120 owlflies (Neuroptera: Ascalaphidae) was split into training and validation sets. After training, the sample was sorted into seven species using this tool. In five repeated experiments, the mean accuracy for identification of each species ranged from 90% to 98%. The accuracy increased to 99% when the samples were first divided into two groups based on features of their compound eyes. DAIIS can therefore be a useful tool for developing a system of automated insect identification. PMID:26251292

  4. An Intelligent Gear Fault Diagnosis Methodology Using a Complex Wavelet Enhanced Convolutional Neural Network.

    PubMed

    Sun, Weifang; Yao, Bin; Zeng, Nianyin; Chen, Binqiang; He, Yuchao; Cao, Xincheng; He, Wangpeng

    2017-07-12

    As a typical example of large and complex mechanical systems, rotating machinery is prone to diversified sorts of mechanical faults. Among these faults, one of the prominent causes of malfunction is generated in gear transmission chains. Although they can be collected via vibration signals, the fault signatures are always submerged in overwhelming interfering contents. Therefore, identifying the critical fault's characteristic signal is far from an easy task. In order to improve the recognition accuracy of a fault's characteristic signal, a novel intelligent fault diagnosis method is presented. In this method, a dual-tree complex wavelet transform (DTCWT) is employed to acquire the multiscale signal's features. In addition, a convolutional neural network (CNN) approach is utilized to automatically recognise a fault feature from the multiscale signal features. The experiment results of the recognition for gear faults show the feasibility and effectiveness of the proposed method, especially in the gear's weak fault features.

  5. Klusters, NeuroScope, NDManager: a free software suite for neurophysiological data processing and visualization.

    PubMed

    Hazan, Lynn; Zugaro, Michaël; Buzsáki, György

    2006-09-15

    Recent technological advances now allow for simultaneous recording of large populations of anatomically distributed neurons in behaving animals. The free software package described here was designed to help neurophysiologists process and view recorded data in an efficient and user-friendly manner. This package consists of several well-integrated applications, including NeuroScope (http://neuroscope.sourceforge.net), an advanced viewer for electrophysiological and behavioral data with limited editing capabilities; Klusters (http://klusters.sourceforge.net), a graphical cluster cutting application for manual and semi-automatic spike sorting; and NDManager, an experimental parameter and data processing manager. All of these programs are distributed under the GNU General Public License (GPL, see http://www.gnu.org/licenses/gpl.html), which gives its users legal permission to copy, distribute and/or modify the software. Also included are extensive user manuals and sample data, as well as source code and documentation.

  6. Sorting nexin-2 is associated with tubular elements of the early endosome, but is not essential for retromer-mediated endosome-to-TGN transport

    PubMed Central

    Carlton, Jez G.; Bujny, Miriam V.; Peter, Brian J.; Oorschot, Viola M. J.; Rutherford, Anna; Arkell, Rebecca S.; Klumperman, Judith; McMahon, Harvey T.; Cullen, Peter J.

    2006-01-01

    Summary Sorting nexins are a large family of phox-homology-domain-containing proteins that have been implicated in the control of endosomal sorting. Sorting nexin-1 is a component of the mammalian retromer complex that regulates retrieval of the cation-independent mannose 6-phosphate receptor from endosomes to the trans-Golgi network. In yeast, retromer is composed of Vps5p (the orthologue of sorting nexin-1), Vps17p (a related sorting nexin) and a cargo selective subcomplex composed of Vps26p, Vps29p and Vps35p. With the exception of Vps17p, mammalian orthologues of all yeast retromer components have been identified. For Vps17p, one potential mammalian orthologue is sorting nexin-2. Here we show that, like sorting nexin-1, sorting nexin-2 binds phosphatidylinositol 3-monophosphate and phosphatidylinositol 3,5-bisphosphate, and possesses a Bin/Amphiphysin/Rvs domain that can sense membrane curvature. However, in contrast to sorting nexin-1, sorting nexin-2 could not induce membrane tubulation in vitro or in vivo. Functionally, we show that endogenous sorting nexin-1 and sorting nexin-2 co-localise on high curvature tubular elements of the 3-phosphoinositide-enriched early endosome, and that suppression of sorting nexin-2 does not perturb the degradative sorting of receptors for epidermal growth factor or transferrin, nor the steady-state distribution of the cation-independent mannose 6-phosphate receptor. However, suppression of sorting nexin-2 results in a subtle alteration in the kinetics of cation-independent mannose 6-phosphate receptor retrieval. These data suggest that although sorting nexin-2 may be a component of the retromer complex, its presence is not essential for the regulation of endosome-to-trans Golgi network retrieval of the cation-independent mannose 6-phosphate receptor. PMID:16179610

  7. Computer aided stress analysis of long bones utilizing computer tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marom, S.A.

    1986-01-01

    A computer-aided analysis method utilizing computed tomography (CT) has been developed which, together with a finite element program, determines the stress-displacement pattern in a long bone section. The CT data file provides the geometry, the density, and the material properties for the generated finite element model. A three-dimensional finite element model of a tibial shaft is automatically generated from the CT file by a pre-processing procedure for a finite element program. The developed pre-processor includes an edge detection algorithm which determines the boundaries of the reconstructed cross-sectional images of the scanned bone. A mesh generation procedure then automatically generates a three-dimensional mesh of a user-selected refinement. The elastic properties needed for the stress analysis are individually determined for each model element using the radiographic density (CT number) of each pixel within the elemental borders. The elastic modulus is determined from the CT radiographic density by using an empirical relationship from the literature. The generated finite element model, together with applied loads determined from existing gait analysis and initial displacements, comprises a formatted input for the SAP IV finite element program. The output of this program, stresses and displacements at the model elements and nodes, is sorted and displayed by a developed post-processor to provide maximum and minimum values at selected locations in the model.
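
    The density-to-stiffness step described above is commonly expressed as an empirical power law: the CT number is mapped to an apparent density and the density to an elastic modulus. The sketch below shows such a mapping for a per-element mean CT number; the calibration constants are placeholders rather than the values used in the cited work.

    ```python
    def element_modulus(mean_hu, rho_slope=0.001, rho_offset=1.0, c=6850.0, p=1.49):
        """Map an element's mean CT number (HU) to an elastic modulus (MPa).

        rho = rho_offset + rho_slope * HU      (apparent density, g/cm^3)
        E   = c * rho**p                       (empirical power law)
        All four constants are placeholder calibration values.
        """
        rho = rho_offset + rho_slope * mean_hu
        return c * rho ** p

    for hu in (200, 600, 1200):   # roughly trabecular to cortical range (illustrative)
        print(hu, round(element_modulus(hu), 1), "MPa")
    ```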

  8. Automatic detection and classification of EOL-concrete and resulting recovered products by hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Palmieri, Roberta; Bonifazi, Giuseppe; Serranti, Silvia

    2014-05-01

    The recovery of materials from Demolition Waste (DW) represents one of the main targets of the recycling industry, and its characterization is important in order to set up efficient sorting and/or quality control systems. End-Of-Life (EOL) concrete materials identification is necessary to maximize DW conversion into useful secondary raw materials, so it is fundamental to develop strategies for the implementation of an automatic recognition system for the recovered products. In this paper, a HyperSpectral Imaging (HSI) technique was applied in order to detect DW composition. Hyperspectral images were acquired by a laboratory device equipped with an HSI sensor working in the near infrared range (1000-1700 nm): a NIR Spectral Camera™ embedding an ImSpector™ N17E (SPECIM Ltd, Finland). Acquired spectral data were analyzed with the PLS_Toolbox (Version 7.5, Eigenvector Research, Inc.) under the Matlab® environment (Version 7.11.1, The Mathworks, Inc.), applying different chemometric methods: Principal Component Analysis (PCA) for exploratory data analysis and Partial Least Squares Discriminant Analysis (PLS-DA) to build classification models. Results showed that it is possible to recognize DW materials, distinguishing recycled aggregates from contaminants (e.g., bricks, gypsum, plastics, wood, and foam). The developed procedure is cheap, fast, and non-destructive: it could be used to make some steps of the recycling process more efficient and less expensive.
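
    The chemometric pipeline described (PCA for exploration, PLS-DA for classification) can be sketched with scikit-learn, where PLS-DA is commonly implemented as PLS regression onto one-hot class labels followed by an argmax. The spectra below are synthetic stand-ins for NIR reflectance data; this is not the authors' PLS_Toolbox workflow.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_bands = 120  # synthetic stand-in for 1000-1700 nm reflectance spectra

    def fake_spectra(center, n):
        """Toy class signature: a Gaussian bump at a given band plus noise."""
        base = np.exp(-0.5 * ((np.arange(n_bands) - center) / 10.0) ** 2)
        return base + 0.05 * rng.normal(size=(n, n_bands))

    X = np.vstack([fake_spectra(30, 40), fake_spectra(80, 40)])   # e.g., aggregate vs. brick
    y = np.array([0] * 40 + [1] * 40)

    # Exploratory PCA: how much variance the first components capture.
    pca = PCA(n_components=3).fit(X)
    print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

    # PLS-DA stand-in: PLS regression on one-hot labels, class = argmax of prediction.
    Y = np.eye(2)[y]
    plsda = PLSRegression(n_components=3).fit(X, Y)
    pred = plsda.predict(X).argmax(axis=1)
    print("training accuracy:", (pred == y).mean())
    ```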

  9. Data Acquisition for a Patient-directed Intervention Protocol in the Dynamic Intensive Care Unit Setting

    PubMed Central

    Chlan, Linda; Patterson, Robert P.; Heiderscheit, Annie

    2011-01-01

    Methods to easily, accurately, and efficiently obtain data in an ICU-based clinical trial can be challenging in this high-tech setting. Patient medical status and the dynamic nature of this clinical setting further complicates data collection. The purpose of this paper is to describe the modifications of commercially available headphones and the application of a data logging device to capture frequency and length of protocol use (music listening or headphones only for noise cancellation) without burdening participants or busy ICU nurses. With the automatic capture of protocol use by research participants, there have been no instances of lost data for this clinical trial. PMID:21382515

  10. The constant displacement scheme for tracking particles in heterogeneous aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wen, X.H.; Gomez-Hernandez, J.J.

    1996-01-01

    Simulation of mass transport by particle tracking or random walk in highly heterogeneous media may be inefficient from a computational point of view if the traditional constant time step scheme is used. A new scheme which automatically adjusts the time step for each particle according to the local pore velocity, so that each particle always travels a constant distance, is shown to be computationally faster for the same degree of accuracy than the constant time step method. Using the constant displacement scheme, transport calculations in a 2-D aquifer model, with a natural-log transmissivity variance of 4, can be 8.6 times faster than using the constant time step scheme.
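
    The essential idea of the scheme (each particle's time step is chosen so that it advances a fixed distance rather than a fixed time) can be shown in a few lines. The velocity field and step length below are arbitrary placeholders, and dispersion/random-walk terms are omitted.

    ```python
    import numpy as np

    def velocity(p):
        """Placeholder heterogeneous velocity field v(x, y)."""
        x, y = p
        return np.array([1.0 + 0.5 * np.sin(y), 0.3 * np.cos(x)])

    def track(p0, step_length=0.05, n_steps=200):
        """Constant-displacement tracking: dt = ds / |v|, so fast particles take
        small time steps and slow particles take large ones."""
        p, t = np.array(p0, dtype=float), 0.0
        path = [(t, *p)]
        for _ in range(n_steps):
            v = velocity(p)
            speed = np.linalg.norm(v)
            dt = step_length / speed          # adapt the time step locally
            p = p + v * dt                    # each move covers ~step_length
            t += dt
            path.append((t, *p))
        return path

    print(track((0.0, 0.0))[-1])   # final (time, x, y)
    ```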

  11. Data acquisition for a patient-directed intervention protocol in the dynamic intensive care unit setting.

    PubMed

    Chlan, Linda; Patterson, Robert P; Heiderscheit, Annie

    2011-07-01

    Methods to easily, accurately, and efficiently obtain data in an ICU-based clinical trial can be challenging in this high-tech setting. Patient medical status and the dynamic nature of this clinical setting further complicate data collection. The purpose of this paper is to describe the modifications of commercially available headphones and the application of a data logging device to capture frequency and length of protocol use (music listening or headphones only for noise cancellation) without burdening participants or busy ICU nurses. With the automatic capture of protocol use by research participants, there have been no instances of lost data for this clinical trial. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Dissecting Stop Transfer versus Conservative Sorting Pathways for Mitochondrial Inner Membrane Proteins in Vivo*

    PubMed Central

    Park, Kwangjin; Botelho, Salomé Calado; Hong, Joonki; Österberg, Marie; Kim, Hyun

    2013-01-01

    Mitochondrial inner membrane proteins that carry an N-terminal presequence are sorted by one of two pathways: stop transfer or conservative sorting. However, the sorting pathway is known for only a small number of proteins, in part due to the lack of robust experimental tools with which to study. Here we present an approach that facilitates determination of inner membrane protein sorting pathways in vivo by fusing a mitochondrial inner membrane protein to the C-terminal part of Mgm1p containing the rhomboid cleavage region. We validated the Mgm1 fusion approach using a set of proteins for which the sorting pathway is known, and determined sorting pathways of inner membrane proteins for which the sorting mode was previously uncharacterized. For Sdh4p, a multispanning membrane protein, our results suggest that both conservative sorting and stop transfer mechanisms are required for insertion. Furthermore, the sorting process of Mgm1 fusion proteins was analyzed under different growth conditions and yeast mutant strains that were defective in the import motor or the m-AAA protease function. Our results show that the sorting of mitochondrial proteins carrying moderately hydrophobic transmembrane segments is sensitive to cellular conditions, implying that mitochondrial import and membrane sorting in the physiological environment may be dynamically tuned. PMID:23184936

  13. GISentinel: a software platform for automatic ulcer detection on capsule endoscopy videos

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Jiao, Heng; Meng, Fan; Leighton, Jonathon A.; Shabana, Pasha; Rentz, Lauri

    2014-03-01

    In this paper, we present a novel and clinically valuable software platform for automatic ulcer detection in the gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos run about 8 hours and must be reviewed manually by physicians to detect and locate diseases such as ulcers and bleeding. The process is time consuming, and the lengthy manual review makes missed findings likely. Working with our collaborators, we focused on developing a software platform called GISentinel for fully automated GI tract ulcer detection and classification. This software includes 3 parts: frequency-based Log-Gabor filter extraction of regions of interest (ROI); a unique feature selection and validation method (e.g., illumination-invariant features, color-independent features, and symmetrical texture features); and cascade SVM classification for handling "ulcer vs. non-ulcer" cases. In the experiments, this software gave decent results: frame-wise, the ulcer detection rate is 69.65% (319/458); instance-wise, the ulcer detection rate is 82.35% (28/34); the false alarm rate is 16.43% (34/207). This work is part of our 2D/3D-based GI tract disease detection software platform. The final goal of this software is to intelligently find and classify major GI tract diseases, such as bleeding, ulcers, and polyps, from CE videos. This paper mainly describes the automatic ulcer detection module.
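
    The ROI-extraction stage relies on Log-Gabor filtering in the frequency domain, where the standard radial transfer function is G(f) = exp(-(ln(f/f0))^2 / (2 (ln k)^2)). The sketch below builds such a filter for a grayscale frame and applies it via the FFT; the parameter values are illustrative, and the angular component and the rest of the GISentinel pipeline are omitted.

    ```python
    import numpy as np

    def log_gabor_filter(shape, f0=0.1, k=0.55):
        """Radial Log-Gabor transfer function on a normalized frequency grid."""
        rows, cols = shape
        fy = np.fft.fftfreq(rows)[:, None]
        fx = np.fft.fftfreq(cols)[None, :]
        radius = np.sqrt(fx ** 2 + fy ** 2)
        radius[0, 0] = 1.0                      # avoid log(0); DC is zeroed below
        g = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(k) ** 2))
        g[0, 0] = 0.0                           # Log-Gabor has no DC component
        return g

    def apply_log_gabor(image, f0=0.1, k=0.55):
        """Filter a 2D grayscale image with a radial Log-Gabor in the Fourier domain."""
        G = log_gabor_filter(image.shape, f0, k)
        return np.real(np.fft.ifft2(np.fft.fft2(image) * G))

    img = np.random.default_rng(0).random((64, 64))   # stand-in for a CE video frame
    print(apply_log_gabor(img).shape)
    ```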

  14. Seminal plasma affects sperm sex sorting in boars.

    PubMed

    Alkmin, Diego V; Parrilla, Inmaculada; Tarantini, Tatiana; Del Olmo, David; Vazquez, Juan M; Martinez, Emilio A; Roca, Jordi

    2016-04-01

    Two experiments were conducted in boar semen samples to evaluate how both holding time (24h) and the presence of seminal plasma (SP) before sorting affect sperm sortability and the ability of sex-sorted spermatozoa to tolerate liquid storage. Whole ejaculate samples were divided into three aliquots immediately after collection: one was diluted (1:1, v/v) in Beltsville thawing solution (BTS; 50% SP); the SP of the other two aliquots was removed and the sperm pellets were diluted with BTS + 10% of their own SP (10% SP) or BTS alone (0% SP). The three aliquots of each ejaculate were divided into two portions, one that was processed immediately for sorting and a second that was sorted after 24h storage at 15-17°C. In the first experiment, the ability to exhibit well-defined X- and Y-chromosome-bearing sperm peaks (split) in the cytometry histogram and the subsequent sorting efficiency were assessed (20 ejaculates). In contrast with holding time, the SP proportion influenced the parameters examined, as evidenced by the higher number of ejaculates exhibiting split and better sorting efficiency (P<0.05) in semen samples with 0-10% SP compared with those with 50% SP. In a second experiment, the quality (viability, total and progressive motility) and functionality (plasma membrane fluidity and intracellular generation of reactive oxygen species) of sex-sorted spermatozoa were evaluated after 0, 72 and 120h storage at 15-17°C (10 ejaculates). Holding time and SP proportion did not influence the quality or functionality of stored sex-sorted spermatozoa. In conclusion, a holding time as long as 24h before sorting did not negatively affect sex sorting efficiency or the ability of sorted boar spermatozoa to tolerate long-term liquid storage. A high proportion of SP (50%) in the semen samples before sorting reduced the number of ejaculates to be sorted and negatively influenced the sorting efficiency, but did not affect the ability of sex-sorted spermatozoa to tolerate liquid storage.

  15. SU-E-J-141: Comparison of Dose Calculation On Automatically Generated MRBased ED Maps and Corresponding Patient CT for Clinical Prostate EBRT Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schadewaldt, N; Schulz, H; Helle, M

    2014-06-01

    Purpose: To analyze the effect of computing radiation dose on automatically generated MR-based simulated CT images compared to true patient CTs. Methods: Six prostate cancer patients received a regular planning CT for RT planning as well as a conventional 3D fast-field dual-echo scan on a Philips 3.0T Achieva, adding approximately 2 min of scan time to the clinical protocol. Simulated CTs (simCT) were synthesized by assigning known average CT values to the tissue classes air, water, fat, cortical and cancellous bone. For this, Dixon reconstruction of the nearly out-of-phase (echo 1) and in-phase images (echo 2) allowed for water and fat classification. Model-based bone segmentation was performed on a combination of the Dixon images. A subsequent automatic threshold divides bone into cortical and cancellous classes. For validation, the simCT was registered to the true CT and clinical treatment plans were re-computed on the simCT in Pinnacle³. To differentiate effects related to the 5 tissue classes and changes in the patient anatomy not compensated by rigid registration, we also calculate the dose on a stratified CT, where HU values are sorted into the same 5 tissue classes as the simCT. Results: Dose and volume parameters on PTV and risk organs as used for the clinical approval were compared. All deviations are below 1.1%, except the anal sphincter mean dose, which is at most 2.2%, but well below the clinical acceptance threshold. Average deviations are below 0.4% for PTV and risk organs and 1.3% for the anal sphincter. The deviations of the stratified CT are in the same range as for the simCT. All plans would have passed clinical acceptance thresholds on the simulated CT images. Conclusion: This study demonstrated the clinical usability of MR-based dose calculation with the presented Dixon acquisition and subsequent fully automatic image processing. N. Schadewaldt, H. Schulz, M. Helle and S. Renisch are employed by Philips Technologie Innovative Technologies, a subsidiary of Royal Philips NV.

  16. Difficulties in everyday life: young persons with attention-deficit/hyperactivity disorder and autism spectrum disorders perspectives. A chat-log analysis.

    PubMed

    Ahlström, Britt H; Wentz, Elisabet

    2014-01-01

    This study focuses on the everyday life of young persons with attention-deficit/hyperactivity disorder (ADHD) and autism spectrum disorder (ASD). There are follow-up studies describing ADHD, and ASD in adults, and residual impairments that affect life. Few qualitative studies have been conducted on the subject of their experiences of everyday life, and even fewer are from young persons' perspectives. This study's aim was to describe how young persons with ADHD and ASD function and how they manage their everyday life based on analyses of Internet-based chat logs. Twelve young persons (7 males and 5 females aged 15-26) diagnosed with ADHD and ASD were included consecutively and offered 8 weeks of Internet-based Support and Coaching (IBSC). Data were collected from 12 chat logs (445 pages of text) produced interactively by the participants and the coaches. Qualitative content analysis was applied. The text was coded and sorted into subthemes and further interpreted into themes. The findings revealed two themes: "fighting against an everyday life lived in vulnerability" with the following subthemes: "difficult things," "stress and rest," and "when feelings and thoughts are a concern"; and the theme "struggling to find a life of one's own" with the following subthemes: "decide and carry out," "making life choices," and "taking care of oneself." Dealing with the problematic situations that everyday encompasses requires personal strength and a desire to find adequate solutions, as well as to discover a role in society. This study, into the provision of support and coaching over the Internet, led to more in-depth knowledge about these young persons' everyday lives and revealed their ability to use IBSC to express the complexity of everyday life for young persons with ADHD and ASD. The implications of the findings are that using online coaching makes available new opportunities for healthcare professionals to acknowledge these young persons' problems.

  17. Reservoir assessment of the Nubian sandstone reservoir in South Central Gulf of Suez, Egypt

    NASA Astrophysics Data System (ADS)

    El-Gendy, Nader; Barakat, Moataz; Abdallah, Hamed

    2017-05-01

    The Gulf of Suez is considered one of the most important petroleum provinces in Egypt and contains the Saqqara and Edfu oil fields, located in the South Central portion of the Gulf of Suez. The Nubian sandstone reservoir in the Gulf of Suez basin is well known for its great capability to store and produce large volumes of hydrocarbons. The Nubian sandstone overlies basement rocks throughout most of the Gulf of Suez region and consists of a sequence of sandstones and shales of Paleozoic to Cretaceous age. The Nubian sandstone intersected in most wells has excellent reservoir characteristics. Its porosity is controlled by sedimentation style and diagenesis, and the cementation materials are mainly kaolinite and quartz overgrowths. The permeability of the Nubian sandstone is mainly controlled by grain size, sorting, porosity, and clay content (especially kaolinite) and decreases as kaolinite content increases. The permeability of the Nubian Sandstone was evaluated using Nuclear Magnetic Resonance (NMR) technology and formation pressure data in addition to conventional logs, and the results were calibrated using core data. In this work, the Nubian sandstone was investigated and evaluated using complete suites of conventional and advanced logging techniques to understand its reservoir characteristics, which have an impact on the economics of oil recovery. The Nubian reservoir has a complicated wettability nature that affects petrophysical evaluation and reservoir productivity, so understanding the reservoir wettability is very important for managing well performance, productivity, and oil recovery.

  18. Stochastic Model of Vesicular Sorting in Cellular Organelles

    NASA Astrophysics Data System (ADS)

    Vagne, Quentin; Sens, Pierre

    2018-02-01

    The proper sorting of membrane components by regulated exchange between cellular organelles is crucial to intracellular organization. This process relies on the budding and fusion of transport vesicles, and should be strongly influenced by stochastic fluctuations, considering the relatively small size of many organelles. We identify the perfect sorting of two membrane components initially mixed in a single compartment as a first passage process, and we show that the mean sorting time exhibits two distinct regimes as a function of the ratio of vesicle fusion to budding rates. Low ratio values lead to fast sorting but result in a broad size distribution of sorted compartments dominated by small entities. High ratio values result in two well-defined sorted compartments but sorting is exponentially slow. Our results suggest an optimal balance between vesicle budding and fusion for the rapid and efficient sorting of membrane components and highlight the importance of stochastic effects for the steady-state organization of intracellular compartments.
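
    A highly simplified, event-driven (Gillespie-style) toy version of such a budding/fusion model is sketched below: budding ejects a random molecule into a new single-molecule vesicle, fusion merges two compartments enriched in the same species, and the "sorting time" is the first time every compartment is pure. The rules and rates here are illustrative assumptions, not the authors' model.

    ```python
    import math
    import random

    def sorting_time(n_a=20, n_b=20, k_bud=1.0, k_fuse=0.1, max_events=100000, seed=0):
        """Toy first-passage time to a fully sorted state (every compartment pure)."""
        rng = random.Random(seed)
        comps = [[n_a, n_b]]          # each compartment holds [count_A, count_B]
        t = 0.0
        for _ in range(max_events):
            if all(a == 0 or b == 0 for a, b in comps):
                return t              # sorted: first-passage time reached
            rate_bud = k_bud * len(comps)
            # Fusion only between compartments enriched in the same species
            # (a crude proxy for homotypic fusion).
            like_pairs = [(i, j) for i in range(len(comps)) for j in range(i + 1, len(comps))
                          if (comps[i][0] >= comps[i][1]) == (comps[j][0] >= comps[j][1])]
            rate_fuse = k_fuse * len(like_pairs)
            total = rate_bud + rate_fuse
            t += -math.log(1.0 - rng.random()) / total    # exponential waiting time
            if rng.random() * total < rate_bud:
                # Budding: a random molecule leaves its compartment as a new vesicle.
                i = rng.choices(range(len(comps)), weights=[a + b for a, b in comps])[0]
                a, b = comps[i]
                if rng.random() * (a + b) < a:
                    comps[i][0] -= 1; comps.append([1, 0])
                else:
                    comps[i][1] -= 1; comps.append([0, 1])
                if comps[i] == [0, 0]:
                    comps.pop(i)
            else:
                # Fusion: merge two like-enriched compartments.
                i, j = rng.choice(like_pairs)
                comps[i][0] += comps[j][0]
                comps[i][1] += comps[j][1]
                comps.pop(j)
        return float("inf")           # not sorted within the event budget

    print(sorting_time())
    ```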

  19. Identification and genetic analysis of cancer cells with PCR-activated cell sorting

    PubMed Central

    Eastburn, Dennis J.; Sciambi, Adam; Abate, Adam R.

    2014-01-01

    Cell sorting is a central tool in life science research for analyzing cellular heterogeneity or enriching rare cells out of large populations. Although methods like FACS and FISH-FC can characterize and isolate cells from heterogeneous populations, they are limited by their reliance on antibodies, or the requirement to chemically fix cells. We introduce a new cell sorting technology that robustly sorts based on sequence-specific analysis of cellular nucleic acids. Our approach, PCR-activated cell sorting (PACS), uses TaqMan PCR to detect nucleic acids within single cells and trigger their sorting. With this method, we identified and sorted prostate cancer cells from a heterogeneous population by performing >132 000 simultaneous single-cell TaqMan RT-PCR reactions targeting vimentin mRNA. Following vimentin-positive droplet sorting and downstream analysis of recovered nucleic acids, we found that cancer-specific genomes and transcripts were significantly enriched. Additionally, we demonstrate that PACS can be used to sort and enrich cells via TaqMan PCR reactions targeting single-copy genomic DNA. PACS provides a general new technical capability that expands the application space of cell sorting by enabling sorting based on cellular information not amenable to existing approaches. PMID:25030902

  20. Correlating behavioral responses to FMRI signals from human prefrontal cortex: examining cognitive processes using task analysis.

    PubMed

    DeSouza, Joseph F X; Ovaysikia, Shima; Pynn, Laura

    2012-06-20

    The aim of this methods paper is to describe how to implement a neuroimaging technique to examine complementary brain processes engaged by two similar tasks. Participants' behavior during task performance in an fMRI scanner can then be correlated to brain activity using the blood-oxygen-level-dependent signal. We measure behavior so that we can sort correct trials, in which the subject performed the task correctly, and then examine the brain signals related to correct performance. Conversely, if subjects do not perform the task correctly and these trials are included in the same analysis as the correct trials, we would introduce trials that do not reflect correct performance. In many cases these error trials can themselves be correlated with brain activity. We describe two complementary tasks that are used in our lab to examine the brain during suppression of automatic responses: the Stroop(1) and anti-saccade tasks. The emotional Stroop paradigm instructs participants to report either the superimposed emotional 'word' across the affective faces or the facial 'expressions' of the face stimuli(1,2). When the word and the facial expression refer to different emotions, a conflict occurs between what must be said and what is automatically read. The participant has to resolve the conflict between the two simultaneously competing processes of word reading and facial expression recognition. Our urge to read out a word leads to strong stimulus-response (SR) associations; hence inhibiting these strong SRs is difficult, and participants are prone to making errors. Overcoming this conflict and directing attention away from the face or the word requires the subject to inhibit bottom-up processes that typically direct attention to the more salient stimulus. Similarly, in the anti-saccade task(3,4,5,6), an instruction cue directs attention to a peripheral stimulus location, but the eye movement must be made to the mirror-opposite position. Again, we measure behavior by recording participants' eye movements, which allows the behavioral responses to be sorted into correct and error trials(7) that can then be correlated with brain activity. Neuroimaging thus allows researchers to measure the different behaviors of correct and error trials, which are indicative of different cognitive processes, and to pinpoint the different neural networks involved.

  1. Nonlocal Means Denoising of Self-Gated and k-Space Sorted 4-Dimensional Magnetic Resonance Imaging Using Block-Matching and 3-Dimensional Filtering: Implications for Pancreatic Tumor Registration and Segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Jun; McKenzie, Elizabeth; Fan, Zhaoyang

    Purpose: To denoise self-gated k-space sorted 4-dimensional magnetic resonance imaging (SG-KS-4D-MRI) by applying a nonlocal means denoising filter, block-matching and 3-dimensional filtering (BM3D), to test its impact on the accuracy of 4D image deformable registration and automated tumor segmentation for pancreatic cancer patients. Methods and Materials: Nine patients with pancreatic cancer and abdominal SG-KS-4D-MRI were included in the study. Block-matching and 3D filtering was adapted to search in the axial slices/frames adjacent to the reference image patch in the spatial and temporal domains. The patches with high similarity to the reference patch were used to collectively denoise the 4D-MRI image. The pancreas tumor was manually contoured on the first end-of-exhalation phase for both the raw and the denoised 4D-MRI. B-spline deformable registration was applied to the subsequent phases for contour propagation. The consistency of tumor volume defined by the standard deviation of gross tumor volumes from 10 breathing phases (σ-GTV), tumor motion trajectories in 3 cardinal motion planes, 4D-MRI imaging noise, and image contrast-to-noise ratio were compared between the raw and denoised groups. Results: Block-matching and 3D filtering visually and quantitatively reduced image noise by 52% and improved image contrast-to-noise ratio by 56%, without compromising soft tissue edge definitions. Automatic tumor segmentation is statistically more consistent on the denoised 4D-MRI (σ-GTV = 0.6 cm³) than on the raw 4D-MRI (σ-GTV = 0.8 cm³). Tumor end-of-exhalation location is also more reproducible on the denoised 4D-MRI than on the raw 4D-MRI in all 3 cardinal motion planes. Conclusions: Block-matching and 3D filtering can significantly reduce random image noise while maintaining structural features in the SG-KS-4D-MRI datasets. In this study of pancreatic tumor segmentation, automatic segmentation of GTV in the registered image sets is shown to be more consistent on the denoised 4D-MRI than on the raw 4D-MRI.
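
    BM3D builds on the nonlocal-means principle: a pixel (or patch) is denoised by averaging others weighted by patch similarity. The sketch below is a minimal, unoptimized 2D nonlocal means on a small array to illustrate that principle only; it is not BM3D (no block matching, 3D transform, or collaborative filtering) and is not intended for 4D-MRI volumes.

    ```python
    import numpy as np

    def nonlocal_means(img, patch=3, search=7, h=0.1):
        """Minimal (slow) nonlocal means: each pixel becomes a similarity-weighted
        average of pixels whose surrounding patches resemble its own."""
        pad = patch // 2
        half = search // 2
        padded = np.pad(img, pad, mode="reflect")
        out = np.zeros_like(img)
        rows, cols = img.shape
        for i in range(rows):
            for j in range(cols):
                ref = padded[i:i + patch, j:j + patch]
                weights, values = [], []
                for di in range(-half, half + 1):
                    for dj in range(-half, half + 1):
                        ii, jj = i + di, j + dj
                        if 0 <= ii < rows and 0 <= jj < cols:
                            cand = padded[ii:ii + patch, jj:jj + patch]
                            d2 = np.mean((ref - cand) ** 2)   # patch dissimilarity
                            weights.append(np.exp(-d2 / (h ** 2)))
                            values.append(img[ii, jj])
                out[i, j] = np.average(values, weights=weights)
        return out

    rng = np.random.default_rng(0)
    noisy = np.clip(np.tile([[0.2, 0.8]], (16, 8)) + 0.1 * rng.normal(size=(16, 16)), 0, 1)
    print(float(np.std(noisy - nonlocal_means(noisy))))  # how much was smoothed away
    ```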

  2. Evaluation of motility, membrane status and DNA integrity of frozen-thawed bottlenose dolphin (Tursiops truncatus) spermatozoa after sex-sorting and recryopreservation.

    PubMed

    Montano, G A; Kraemer, D C; Love, C C; Robeck, T R; O'Brien, J K

    2012-06-01

    Artificial insemination (AI) with sex-sorted frozen-thawed spermatozoa has led to enhanced management of ex situ bottlenose dolphin populations. Extended distance of animals from the sorting facility can be overcome by the use of frozen-thawed, sorted and recryopreserved spermatozoa. Although one bottlenose dolphin calf had been born using sexed frozen-thawed spermatozoa derived from frozen semen, a critical evaluation of in vitro sperm quality is needed to justify the routine use of such samples in AI programs. Sperm motility parameters and plasma membrane integrity were influenced by stage of the sex-sorting process, sperm type (non-sorted and sorted) and freezing method (straw and directional) (P<0.05). After recryopreservation, sorted spermatozoa frozen with the directional freezing method maintained higher (P<0.05) motility parameters over a 24-h incubation period compared to spermatozoa frozen using straws. Quality of sperm DNA of non-sorted spermatozoa, as assessed by the sperm chromatin structure assay (SCSA), was high and remained unchanged throughout freeze-thawing and incubation processes. Though a possible interaction between Hoechst 33342 and the SCSA-derived acridine orange was observed in stained and sorted samples, the proportion of sex-sorted, recryopreserved spermatozoa exhibiting denatured DNA was low (6.6±4.1%) at 6 h after the second thawing step and remained unchanged (P>0.05) at 24 h. The viability of sorted spermatozoa was higher (P<0.05) than that of non-sorted spermatozoa across all time points after recryopreservation. Collective results indicate that bottlenose dolphin spermatozoa undergoing cryopreservation, sorting and recryopreservation are of adequate quality for use in AI.

  3. A flexible system for vital signs monitoring in hospital general care wards based on the integration of UNIX-based workstations, standard networks and portable vital signs monitors.

    PubMed Central

    Welch, J. P.; Sims, N.; Ford-Carlton, P.; Moon, J. B.; West, K.; Honore, G.; Colquitt, N.

    1991-01-01

    The article describes a study conducted on general surgical and thoracic surgical floors of a 1000-bed hospital to assess the impact of a new network for portable patient care devices. This network was developed to address the needs of hospital patients who need constant, multi-parameter, vital signs surveillance, but do not require intensive nursing care. Bedside wall jacks were linked to UNIX-based workstations using standard digital network hardware, creating a flexible system (for general care floors of the hospital) that allowed the number of monitored locations to increase and decrease as patient census and acuity levels varied. It also allowed the general care floors to provide immediate, centralized vital signs monitoring for patients who unexpectedly became unstable, and permitted portable monitors to travel with patients as they were transferred between hospital departments. A disk-based log within the workstation automatically collected performance data, including patient demographics, monitor alarms, and network status for analysis. The log has allowed the developers to evaluate the use and performance of the system. PMID:1807720

  4. Multi-agent integrated password management (MIPM) application secured with encryption

    NASA Astrophysics Data System (ADS)

    Awang, Norkhushaini; Zukri, Nurul Hidayah Ahmad; Rashid, Nor Aimuni Md; Zulkifli, Zuhri Arafah; Nazri, Nor Afifah Mohd

    2017-10-01

Users use weak passwords and reuse them on different websites and applications. Password managers are a solution to store login information for websites and help users log in automatically. This project developed a system that acts as an agent managing passwords. Multi-Agent Integrated Password Management (MIPM) is an application using encryption that provides users with secure storage of their login account information such as their username, emails and passwords. This project was developed on an Android platform with an encryption agent using the Java Agent Development Environment (JADE). The purpose of the embedded agents is to act as third-party software to ease the encryption process, and in the future the developed encryption agents can form part of the security system. This application can be used by both computer and mobile users. Currently, users log into many applications, which requires them to use unique passwords to prevent password leakage. The crypto agent handles the encryption process using an Advanced Encryption Standard (AES) 128-bit encryption algorithm. As a whole, MIPM is developed as an Android application to provide a secure platform to store passwords and has high potential to be commercialised for public use.
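    For readers unfamiliar with the cipher named above, the sketch below shows authenticated AES encryption with a 128-bit key using the widely used Python cryptography package. It illustrates the algorithm class only; MIPM itself is an Android/JADE application, and its exact mode of operation is not specified in the abstract.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # 128-bit AES key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # unique nonce for each encryption

ciphertext = aesgcm.encrypt(nonce, b"my-secret-password", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"my-secret-password"
```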

  5. Microfluidic cell sorting: a review of the advances in the separation of cells from debulking to rare cell isolation.

    PubMed

    Shields, C Wyatt; Reyes, Catherine D; López, Gabriel P

    2015-03-07

    Accurate and high throughput cell sorting is a critical enabling technology in molecular and cellular biology, biotechnology, and medicine. While conventional methods can provide high efficiency sorting in short timescales, advances in microfluidics have enabled the realization of miniaturized devices offering similar capabilities that exploit a variety of physical principles. We classify these technologies as either active or passive. Active systems generally use external fields (e.g., acoustic, electric, magnetic, and optical) to impose forces to displace cells for sorting, whereas passive systems use inertial forces, filters, and adhesion mechanisms to purify cell populations. Cell sorting on microchips provides numerous advantages over conventional methods by reducing the size of necessary equipment, eliminating potentially biohazardous aerosols, and simplifying the complex protocols commonly associated with cell sorting. Additionally, microchip devices are well suited for parallelization, enabling complete lab-on-a-chip devices for cellular isolation, analysis, and experimental processing. In this review, we examine the breadth of microfluidic cell sorting technologies, while focusing on those that offer the greatest potential for translation into clinical and industrial practice and that offer multiple, useful functions. We organize these sorting technologies by the type of cell preparation required (i.e., fluorescent label-based sorting, bead-based sorting, and label-free sorting) as well as by the physical principles underlying each sorting mechanism.

  6. Microfluidic Cell Sorting: A Review of the Advances in the Separation of Cells from Debulking to Rare Cell Isolation

    PubMed Central

    Shields, C. Wyatt; Reyes, Catherine D.; López, Gabriel P.

    2015-01-01

    Accurate and high throughput cell sorting is a critical enabling technology in molecular and cellular biology, biotechnology, and medicine. While conventional methods can provide high efficiency sorting in short timescales, advances in microfluidics have enabled the realization of miniaturized devices offering similar capabilities that exploit a variety of physical principles. We classify these technologies as either active or passive. Active systems generally use external fields (e.g., acoustic, electric, magnetic, and optical) to impose forces to displace cells for sorting, whereas passive systems use inertial forces, filters, and adhesion mechanisms to purify cell populations. Cell sorting on microchips provides numerous advantages over conventional methods by reducing the size of necessary equipment, eliminating potentially biohazardous aerosols, and simplifying the complex protocols commonly associated with cell sorting. Additionally, microchip devices are well suited for parallelization, enabling complete lab-on-a-chip devices for cellular isolation, analysis, and experimental processing. In this review, we examine the breadth of microfluidic cell sorting technologies, while focusing on those that offer the greatest potential for translation into clinical and industrial practice and that offer multiple, useful functions. We organize these sorting technologies by the type of cell preparation required (i.e., fluorescent label-based sorting, bead-based sorting, and label-free sorting) as well as by the physical principles underlying each sorting mechanism. PMID:25598308

  7. A Binary Array Asynchronous Sorting Algorithm with Using Petri Nets

    NASA Astrophysics Data System (ADS)

    Voevoda, A. A.; Romannikov, D. O.

    2017-01-01

Nowadays, the tasks of speeding up computations and/or optimizing them are highly relevant. Among the approaches to these tasks, this paper considers a method that applies parallelization and asynchronization to a sorting algorithm. Sorting methods are elementary algorithms used in a huge number of different applications. In the paper, we offer a method of array sorting based on dividing the array into a set of independent adjacent pairs of numbers and comparing them in parallel and asynchronously; this distinguishes the offered method from traditional sorting algorithms (such as quick sort, merge sort, insertion sort and others). The algorithm is implemented with Petri nets, as the most suitable tool for describing asynchronous systems.
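    The pairwise-comparison scheme described above is reminiscent of odd-even transposition sort, in which the adjacent pairs compared in each phase are disjoint and could therefore be evaluated in parallel or asynchronously. The sketch below is a sequential Python illustration of that idea, not the authors' Petri-net implementation.

```python
def odd_even_transposition_sort(a):
    """Sort a list by repeatedly comparing independent adjacent pairs.

    Within each phase the pairs are disjoint, so in a parallel or
    asynchronous setting all comparisons of a phase could run concurrently.
    """
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = phase % 2                  # alternate between even and odd pairs
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```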

  8. Post-consumer contamination in high-density polyethylene (HDPE) milk bottles and the design of a bottle-to-bottle recycling process.

    PubMed

    Welle, F

    2005-10-01

Six hundred conventional recycled HDPE flake samples, which were recollected and sorted in the UK, were screened for post-consumer contamination levels. Each analysed sample consisted of 40-50 individual flakes, so that the number of analysed individual containers was in the range of 24,000-30,000 post-consumer milk bottles. Predominant contaminants in hot-washed flake samples were unsaturated oligomers, which can also be found in virgin high-density polyethylene (HDPE) pellet samples used for milk bottle production. In addition, the flavour compound limonene, the antioxidant-additive degradation product di-tert-butylphenol and low amounts of saturated oligomers were found in higher concentrations in the post-consumer samples in comparison with virgin HDPE. However, the overall concentrations in post-consumer recycled samples were similar to or lower than the concentration ranges of virgin HDPE. Contamination with compounds untypical of HDPE was rare and was in most cases related to non-milk bottles, which make up <2.1% of the input material of the recycling process. The maximum concentration found in one 1 g sample was estimated as 130 mg kg(-1), which corresponds to a contamination of 5200-6500 mg kg(-1) in the individual bottle. The recycling process investigated was based on an efficient sorting process, hot-washing of the ground bottles, and a further deep-cleaning of the flakes at high temperatures under vacuum. Given that the contamination levels of post-consumer flake samples are similar to those of virgin HDPE and that the super-clean recycling process has high cleaning efficiency, especially for highly volatile compounds, the recycling process investigated is suitable for producing recycled post-consumer HDPE bottles for direct food-contact applications. However, hand-picking after automatic sorting is recommended to decrease the amount of non-milk bottles. The conclusions on suitability are valid provided that migration testing of the recyclate contains up to 100% milk bottles and that both shelf-life testing and sensorial testing of the products are successful, which are topics of further investigations.

  9. Provider risk factors for medication administration error alerts: analyses of a large-scale closed-loop medication administration system using RFID and barcode.

    PubMed

    Hwang, Yeonsoo; Yoon, Dukyong; Ahn, Eun Kyoung; Hwang, Hee; Park, Rae Woong

    2016-12-01

To determine the risk factors and rate of medication administration error (MAE) alerts by analyzing large-scale medication administration data and related error logs automatically recorded in a closed-loop medication administration system using radio-frequency identification and barcodes. The subject hospital adopted a closed-loop medication administration system. All medication administrations in the general wards were automatically recorded in real-time using radio-frequency identification, barcodes, and hand-held point-of-care devices. MAE alert logs were analyzed for the full year of 2012. We evaluated risk factors for MAE alerts including administration time, order type, medication route, the number of medication doses administered, and factors associated with nurse practices by logistic regression analysis. A total of 2 874 539 medication dose records from 30 232 patients (882.6 patient-years) were included in 2012. We identified 35 082 MAE alerts (1.22% of total medication doses). The MAE alerts were significantly related to administration at non-standard time [odds ratio (OR) 1.559, 95% confidence interval (CI) 1.515-1.604], emergency order (OR 1.527, 95%CI 1.464-1.594), and the number of medication doses administered (OR 0.993, 95%CI 0.992-0.993). Medication route, nurse's employment duration, and working schedule were also significantly related. The MAE alert rate was 1.22% over the 1-year observation period in the hospital examined in this study. The MAE alerts were significantly related to administration time, order type, medication route, the number of medication doses administered, nurse's employment duration, and working schedule. The real-time closed-loop medication administration system contributed to improving patient safety by preventing potential MAEs. Copyright © 2016 John Wiley & Sons, Ltd.
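    Odds ratios of the kind reported above are typically obtained from a logistic regression. The sketch below shows the shape of such an analysis with statsmodels on synthetic data; the variable names and coefficients are illustrative, not the study's records.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
non_standard_time = rng.integers(0, 2, n)          # hypothetical risk factor
emergency_order = rng.integers(0, 2, n)            # hypothetical risk factor
logit = -4 + 0.45 * non_standard_time + 0.4 * emergency_order
alert = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # simulated MAE alert

X = sm.add_constant(np.column_stack([non_standard_time, emergency_order]))
result = sm.Logit(alert, X).fit(disp=0)
odds_ratios = np.exp(result.params)                # OR per predictor (incl. intercept)
conf_int = np.exp(result.conf_int())               # 95% confidence intervals
print(odds_ratios, conf_int, sep="\n")
```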

  10. An automated full waveform logging system for high-resolution P-wave profiles in marine sediments

    NASA Astrophysics Data System (ADS)

Breitzke, Monika; Spieß, Volkhard

    1993-11-01

An automated, PC-based logging system has been developed to investigate marine sediment cores by full waveform transmission seismograms. High-resolution P-wave velocity and amplitude attenuation profiles are simultaneously derived from the transmission data to characterize the acoustic properties of the sediment column. A pair of ultrasonic, piezoelectric wheel probes is used to generate and record the transmission signals travelling radially through the sediment core. Both unsplit and split cores are allowed. Mounted in a carriage driven by a stepping motor via a shaft, the probes automatically move along the core liner, stopping at equidistant spacings to provide a quasi-continuous inspection of the core by the transmission data. The axial travel distance and the core diameter are determined by digital measuring tools. First arrivals are picked automatically from the transmission seismograms using either a threshold in the seismogram's envelope or a cross-correlation algorithm taking the ‘zero-offset’ signal of both wheel probes into account. Combined with the core diameter these first arrivals lead to a P-wave velocity profile with a relative precision of 1 to 2 m s-1. Simultaneously, the maximum peak-to-peak amplitudes of the transmission seismograms are evaluated to get a first estimate of the amplitude attenuation along the sediment core. Two examples of gravity cores taken during a recent cruise of R.V. METEOR in the Western Equatorial Atlantic are presented. They show that the P-wave profiles can be used for locating strong and fine-scale lithological changes, e.g. turbidite layers and slight variations in the sand, silt or clay content. In addition, the transmission seismograms and their amplitude spectra appear to reveal a correlation between the relative amount of low-frequency spectral components and the sediment grain size, and thus provide a tool for the determination of additional, related physical or sedimentological parameters in future investigations.
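    The envelope-threshold picking mentioned above can be sketched with SciPy's Hilbert transform; the threshold fraction, sampling rate and synthetic wavelet below are placeholders rather than the published implementation.

```python
import numpy as np
from scipy.signal import hilbert

def pick_first_arrival(trace, fs, threshold_fraction=0.2):
    """Return the time (s) at which the signal envelope first exceeds a
    fraction of its maximum -- a simple envelope-threshold picker."""
    envelope = np.abs(hilbert(trace))
    threshold = threshold_fraction * envelope.max()
    first_index = np.argmax(envelope >= threshold)
    return first_index / fs

# Synthetic delayed wavelet sampled at 1 MHz:
fs = 1_000_000
t = np.arange(0, 0.001, 1 / fs)
trace = np.where(t > 0.0004, np.sin(2 * np.pi * 100_000 * (t - 0.0004)), 0.0)
print(pick_first_arrival(trace, fs))   # approximately 0.0004 s
```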

  11. Performance evaluation of firefly algorithm with variation in sorting for non-linear benchmark problems

    NASA Astrophysics Data System (ADS)

    Umbarkar, A. J.; Balande, U. T.; Seth, P. D.

    2017-06-01

The field of nature-inspired computing and optimization techniques has evolved to solve difficult optimization problems in diverse fields of engineering, science and technology. The firefly attraction process is mimicked in the algorithm for solving optimization problems. In the Firefly Algorithm (FA), the fireflies are ranked using a sorting algorithm; the original FA uses bubble sort for this ranking. In this paper, quick sort replaces bubble sort to decrease the time complexity of FA. The dataset used is the unconstrained benchmark functions from CEC 2005 [22]. FA using bubble sort and FA using quick sort are compared with respect to best, worst, mean, standard deviation, number of comparisons and execution time. The experimental results show that FA using quick sort requires fewer comparisons but more execution time. Increasing the number of fireflies helps convergence to the optimal solution, and when the dimension is varied the algorithm performs better at lower dimensions than at higher ones.
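    The comparison-count trade-off reported above can be reproduced with a small instrumented sketch; this counts comparisons for bubble sort and a simple quicksort on random keys and is an illustration only, not the authors' FA code.

```python
import random

def bubble_sort_comparisons(a):
    """Count element comparisons made by bubble sort."""
    a, count = list(a), 0
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            count += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return count

def quick_sort_comparisons(a):
    """Count element-vs-pivot comparisons made by a simple quicksort."""
    if len(a) <= 1:
        return 0
    pivot, less, more, count = a[0], [], [], 0
    for x in a[1:]:
        count += 1
        (less if x < pivot else more).append(x)
    return count + quick_sort_comparisons(less) + quick_sort_comparisons(more)

fireflies = [random.random() for _ in range(50)]   # stand-in for firefly ranking keys
print(bubble_sort_comparisons(fireflies), quick_sort_comparisons(fireflies))
```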

  12. Quantitative structure-property relationships for predicting sorption of pharmaceuticals to sewage sludge during waste water treatment processes.

    PubMed

    Berthod, L; Whitley, D C; Roberts, G; Sharpe, A; Greenwood, R; Mills, G A

    2017-02-01

Understanding the sorption of pharmaceuticals to sewage sludge during waste water treatment processes is important for understanding their environmental fate and in risk assessments. The degree of sorption is defined by the sludge/water partition coefficient (Kd). Experimental Kd values (n=297) for active pharmaceutical ingredients (n=148) in primary and activated sludge were collected from literature. The compounds were classified by their charge at pH 7.4 (44 uncharged, 60 positively and 28 negatively charged, and 16 zwitterions). Univariate models relating log Kd to log Kow for each charge class showed weak correlations (maximum R²=0.51 for positively charged) with no overall correlation for the combined dataset (R²=0.04). Weaker correlations were found when relating log Kd to log Dow. Three sets of molecular descriptors (Molecular Operating Environment, VolSurf and ParaSurf) encoding a range of physico-chemical properties were used to derive multivariate models using stepwise regression, partial least squares and Bayesian artificial neural networks (ANN). The best predictive performance was obtained with ANN, with R²=0.62-0.69 for these descriptors using the complete dataset. Use of the more complex VolSurf and ParaSurf descriptors showed little improvement over Molecular Operating Environment descriptors. The most influential descriptors in the ANN models, identified by automatic relevance determination, highlighted the importance of hydrophobicity, charge and molecular shape effects in these sorbate-sorbent interactions. The heterogeneous nature of the different sewage sludges used to measure Kd limited the predictability of sorption from physico-chemical properties of the pharmaceuticals alone. Standardization of test materials for the measurement of Kd would improve comparability of data from different studies, in the long-term leading to better quality environmental risk assessments. Copyright © 2016 British Geological Survey, NERC. Published by Elsevier B.V. All rights reserved.
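    The weak univariate correlations described above correspond to fitting log Kd against log Kow within a charge class; a minimal sketch of such a fit (with placeholder values, not the collected dataset) follows.

```python
import numpy as np
from scipy.stats import linregress

# Placeholder values standing in for one charge class of the collected dataset.
log_kow = np.array([0.5, 1.2, 2.0, 2.8, 3.5, 4.1])
log_kd = np.array([1.0, 1.3, 1.9, 2.2, 2.6, 3.1])

fit = linregress(log_kow, log_kd)
print(f"slope={fit.slope:.2f}, intercept={fit.intercept:.2f}, R2={fit.rvalue**2:.2f}")
```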

  13. Influence of uncorrected refractive error and unmet refractive error on visual impairment in a Brazilian population.

    PubMed

    Ferraz, Fabio H; Corrente, José E; Opromolla, Paula; Schellini, Silvana A

    2014-06-25

    The World Health Organization (WHO) definitions of blindness and visual impairment are widely based on best-corrected visual acuity excluding uncorrected refractive errors (URE) as a visual impairment cause. Recently, URE was included as a cause of visual impairment, thus emphasizing the burden of visual impairment due to refractive error (RE) worldwide is substantially higher. The purpose of the present study is to determine the reversal of visual impairment and blindness in the population correcting RE and possible associations between RE and individual characteristics. A cross-sectional study was conducted in nine counties of the western region of state of São Paulo, using systematic and random sampling of households between March 2004 and July 2005. Individuals aged more than 1 year old were included and were evaluated for demographic data, eye complaints, history, and eye exam, including no corrected visual acuity (NCVA), best corrected vision acuity (BCVA), automatic and manual refractive examination. The definition adopted for URE was applied to individuals with NCVA > 0.15 logMAR and BCVA ≤ 0.15 logMAR after refractive correction and unmet refractive error (UREN), individuals who had visual impairment or blindness (NCVA > 0.5 logMAR) and BCVA ≤ 0.5 logMAR after optical correction. A total of 70.2% of subjects had normal NCVA. URE was detected in 13.8%. Prevalence of 4.6% of optically reversible low vision and 1.8% of blindness reversible by optical correction were found. UREN was detected in 6.5% of individuals, more frequently observed in women over the age of 50 and in higher RE carriers. Visual impairment related to eye diseases is not reversible with spectacles. Using multivariate analysis, associations between URE and UREN with regard to sex, age and RE was observed. RE is an important cause of reversible blindness and low vision in the Brazilian population.

  14. A fractal concentration area method for assigning a color palette for image representation

    NASA Astrophysics Data System (ADS)

    Cheng, Qiuming; Li, Qingmou

    2002-05-01

Displaying the remotely sensed image with a proper color palette is the first task in any kind of image processing and pattern recognition in GIS and image processing environments. The purpose of displaying the image should be not only to provide a visual representation of the variance of the image, although this has been the primary objective of most conventional methods, but also the color palette should reflect real-world features on the ground which must be the primary objective of employing remotely sensed data. Although most conventional methods focus only on the first purpose of image representation, the concentration-area (C-A plot) fractal method proposed in this paper aims to meet both purposes on the basis of pixel values and pixel value frequency distribution as well as spatial and geometrical properties of image patterns. The C-A method can be used to establish power-law relationships between the area A(≥s) with the pixel values greater than s and the pixel value s itself after plotting these values on log-log paper. A number of straight-line segments can be manually or automatically fitted to the points on the log-log paper, each representing a power-law relationship between the area A and the cutoff pixel value for s in a particular range. These straight-line segments can yield a group of cutoff values on the basis of which the image can be classified into discrete classes or zones. These zones usually correspond to the real-world features on the ground. A Windows program has been prepared in ActiveX format for implementing the C-A method and integrating it into other GIS and image processing systems. A case study of Landsat TM band 5 has been used to demonstrate the application of the method and the flexibility of the computer program.
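    The core of the C-A method, computing the area A(≥s) of pixels with values above each threshold s and examining the relationship on a log-log scale, can be sketched as follows. This is a simplified illustration; the published program fits multiple straight-line segments to the log-log curve to obtain the class cutoffs.

```python
import numpy as np

def concentration_area_curve(image, num_thresholds=50):
    """Return thresholds s and areas A(>=s), i.e. pixel counts with value >= s."""
    values = image.ravel()
    thresholds = np.linspace(values.min(), values.max(), num_thresholds)
    areas = np.array([(values >= s).sum() for s in thresholds])
    return thresholds, areas

rng = np.random.default_rng(1)
image = rng.lognormal(mean=3.0, sigma=0.6, size=(256, 256))   # stand-in for a TM band
s, area = concentration_area_curve(image)

# Straight-line segments fitted to (log_s, log_area) yield the cutoff values
# used to assign discrete classes (color palette entries).
log_s, log_area = np.log10(s), np.log10(area)
```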

  15. Influence of uncorrected refractive error and unmet refractive error on visual impairment in a Brazilian population

    PubMed Central

    2014-01-01

    Background The World Health Organization (WHO) definitions of blindness and visual impairment are widely based on best-corrected visual acuity excluding uncorrected refractive errors (URE) as a visual impairment cause. Recently, URE was included as a cause of visual impairment, thus emphasizing the burden of visual impairment due to refractive error (RE) worldwide is substantially higher. The purpose of the present study is to determine the reversal of visual impairment and blindness in the population correcting RE and possible associations between RE and individual characteristics. Methods A cross-sectional study was conducted in nine counties of the western region of state of São Paulo, using systematic and random sampling of households between March 2004 and July 2005. Individuals aged more than 1 year old were included and were evaluated for demographic data, eye complaints, history, and eye exam, including no corrected visual acuity (NCVA), best corrected vision acuity (BCVA), automatic and manual refractive examination. The definition adopted for URE was applied to individuals with NCVA > 0.15 logMAR and BCVA ≤ 0.15 logMAR after refractive correction and unmet refractive error (UREN), individuals who had visual impairment or blindness (NCVA > 0.5 logMAR) and BCVA ≤ 0.5 logMAR after optical correction. Results A total of 70.2% of subjects had normal NCVA. URE was detected in 13.8%. Prevalence of 4.6% of optically reversible low vision and 1.8% of blindness reversible by optical correction were found. UREN was detected in 6.5% of individuals, more frequently observed in women over the age of 50 and in higher RE carriers. Visual impairment related to eye diseases is not reversible with spectacles. Using multivariate analysis, associations between URE and UREN with regard to sex, age and RE was observed. Conclusion RE is an important cause of reversible blindness and low vision in the Brazilian population. PMID:24965318

  16. Miniaturized Water Flow and Level Monitoring System for Flood Disaster Early Warning

    NASA Astrophysics Data System (ADS)

    Ifedapo Abdullahi, Salami; Hadi Habaebi, Mohamed; Surya Gunawan, Teddy; Rafiqul Islam, MD

    2017-11-01

This study presents the performance of a prototype miniaturised water flow and water level monitoring sensor designed to support flood disaster early warning systems. The design involved selection of sensors, coding to control the system mechanism, and automatic data logging and storage. During the design phase, the apparatus was constructed and all the components were assembled using locally sourced items. Subsequently, under a controlled laboratory environment, the system was tested by running water through the inlet, during which the flow rate and rising water levels were automatically recorded and stored in a database via Microsoft Excel using Coolterm software. The system is simulated such that the water level readings, measured in centimeters, are output in meters using a multiplier of 10. A total of 80 readings were analyzed to evaluate the performance of the system. The results show that the system is sensitive to water level rise and yielded accurate measurement of water level. However, the flow rate fluctuated due to the manual water supply, which produced inconsistent flow. It was also observed that the flow sensor has a duty cycle of 50% of operating time under normal conditions, which implies that the performance of the flow sensor is optimal.

  17. A cost analysis comparing xeroradiography to film technics for intraoral radiography.

    PubMed

    Gratt, B M; Sickles, E A

    1986-01-01

In the United States during 1978 $730 million was spent on dental radiographic services. Currently there are three alternatives for the processing of intraoral radiographs: manual wet-tanks, automatic film units, or xeroradiography. It was the intent of this study to determine which processing system is the most economical. Cost estimates were based on a usage rate of 750 patient images per month and included a calculation of the average cost per radiograph over a five-year period. Capital costs included initial processing equipment and site preparation. Operational costs included labor, supplies, utilities, darkroom rental, and breakdown costs. Clinical time trials were employed to measure examination times. Maintenance logs were employed to assess labor costs. Indirect costs of training were estimated. Results indicated that xeroradiography was the most cost effective ($0.81 per image) compared to either automatic film processing ($1.14 per image) or manual processing ($1.35 per image). Variations in projected costs indicated that if a dental practice performs primarily complete-mouth surveys, exposes less than 120 radiographs per month, and pays less than $6.50 per hour in wages, then manual (wet-tank) processing is the most economical method for producing intraoral radiographs.
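    The per-image figures above follow from dividing total five-year costs by the total number of images; a tiny worked sketch (with made-up capital and operating figures, not the study's data) shows the shape of the calculation.

```python
def cost_per_image(capital_cost, monthly_operating_cost, images_per_month=750, years=5):
    """Average cost per radiograph over the amortization period."""
    months = years * 12
    total_cost = capital_cost + monthly_operating_cost * months
    return total_cost / (images_per_month * months)

# Hypothetical figures only, chosen to illustrate the arithmetic:
print(round(cost_per_image(capital_cost=15000, monthly_operating_cost=400), 2))  # 0.87
```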

  18. Writing in dyslexia: product and process.

    PubMed

    Morken, Frøydis; Helland, Turid

    2013-08-01

    Research on dyslexia has largely centred on reading. The aim of this study was to assess the writing of 13 children with and 28 without dyslexia at age 11 years. A programme for keystroke logging was used to allow recording of typing activity as the children performed a sentence dictation task. Five sentences were read aloud twice each. The task was to type the sentence as correctly as possible, with no time constraints. The data were analysed from a product (spelling, grammar and semantics) and process (transcription fluency and revisions) perspective, using repeated measures ANOVA and t-tests to investigate group differences. Furthermore, the data were correlated with measures of rapid automatic naming and working memory. Results showed that the group with dyslexia revised their texts as much as the typical group, but they used more time, and the result was poorer. Moreover, rapid automatic naming correlated with transcription fluency, and working memory correlated with the number of semantic errors. This shows that dyslexia is generally not an issue of effort and that cognitive skills that are known to be important for reading also affect writing. Copyright © 2013 John Wiley & Sons, Ltd.

  19. Combining user logging with eye tracking for interactive and dynamic applications.

    PubMed

    Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise

    2015-12-01

    User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions cause changes in the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained in this particular position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking with user logging (mouse and keyboard actions) with cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking-namely, marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and static stimuli in multiple research fields.
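    The screen-to-geographic referencing step described above amounts to an affine mapping between pixel coordinates and the map extent visible at the moment of the gaze sample; a minimal sketch (screen size and extent are hypothetical) follows.

```python
def screen_to_geo(px, py, screen_w, screen_h, extent):
    """Map a screen pixel (px, py) to geographic coordinates.

    `extent` is the (min_x, min_y, max_x, max_y) of the map currently shown,
    which the user log (zoom and pan events) makes known at every timestamp.
    Screen y grows downward while geographic y grows upward, hence the flip.
    """
    min_x, min_y, max_x, max_y = extent
    geo_x = min_x + (px / screen_w) * (max_x - min_x)
    geo_y = max_y - (py / screen_h) * (max_y - min_y)
    return geo_x, geo_y

# Gaze at pixel (960, 540) on a 1920x1080 screen showing a hypothetical extent:
print(screen_to_geo(960, 540, 1920, 1080, (3.0, 50.0, 4.0, 51.0)))  # (3.5, 50.5)
```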

  20. Choriocapillaris Flow Features Follow a Power Law Distribution: Implications for Characterization and Mechanisms of Disease Progression.

    PubMed

    Spaide, Richard F

    2016-10-01

    To investigate flow characteristics of the choriocapillaris using optical coherence tomography angiography. Retrospective observational case series. Visualization of flow in individual choriocapillary vessels is below the current resolution limit of optical coherence tomography angiography instruments, but areas of absent flow signal, called flow voids, are resolvable. The central macula was imaged with the Optovue RTVue XR Avanti using a 10-μm slab thickness in 104 eyes of 80 patients who ranged in age from 24 to 99 years of age. Automatic local thresholding of the resultant raw data with the Phansalkar method was analyzed with generalized estimating equations. The distribution of flow voids vs size of the voids was highly skewed. The data showed a linear log-log plot and goodness-of-fit methods showed the data followed a power law distribution over the relevant range. A slope intercept relationship was also evaluated for the log transform and significant predictors for variables included age, hypertension, pseudodrusen, and the presence of late age-related macular degeneration (AMD) in the fellow eye. The pattern of flow voids forms a scale invariant pattern in the choriocapillaris starting at a size much smaller than a choroidal lobule. Age and hypertension affect the choriocapillaris, a flat layer of capillaries that may serve as an observable surrogate for the neural or systemic microvasculature. Significant alterations detectable in the flow pattern in eyes with pseudodrusen and in eyes with late AMD in the fellow eye offer diagnostic possibilities and impact theories of disease pathogenesis. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Learning cellular sorting pathways using protein interactions and sequence motifs.

    PubMed

    Lin, Tien-Ho; Bar-Joseph, Ziv; Murphy, Robert F

    2011-11-01

    Proper subcellular localization is critical for proteins to perform their roles in cellular functions. Proteins are transported by different cellular sorting pathways, some of which take a protein through several intermediate locations until reaching its final destination. The pathway a protein is transported through is determined by carrier proteins that bind to specific sequence motifs. In this article, we present a new method that integrates protein interaction and sequence motif data to model how proteins are sorted through these sorting pathways. We use a hidden Markov model (HMM) to represent protein sorting pathways. The model is able to determine intermediate sorting states and to assign carrier proteins and motifs to the sorting pathways. In simulation studies, we show that the method can accurately recover an underlying sorting model. Using data for yeast, we show that our model leads to accurate prediction of subcellular localization. We also show that the pathways learned by our model recover many known sorting pathways and correctly assign proteins to the path they utilize. The learned model identified new pathways and their putative carriers and motifs and these may represent novel protein sorting mechanisms. Supplementary results and software implementation are available from http://murphylab.web.cmu.edu/software/2010_RECOMB_pathways/.
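    Since the method represents sorting pathways with a hidden Markov model, a compact reminder of the HMM forward recursion may help; the sketch below is a generic numpy implementation with toy parameters, not the authors' pathway model.

```python
import numpy as np

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of an observation sequence under a discrete HMM.

    start: (S,) initial state probabilities; trans: (S, S) transition matrix;
    emit: (S, O) emission probabilities; obs: sequence of observation indices.
    (No scaling is applied, so this is only suitable for short sequences.)
    """
    alpha = start * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return np.log(alpha.sum())

start = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3], [0.2, 0.8]])
emit = np.array([[0.9, 0.1], [0.3, 0.7]])
print(forward_log_likelihood([0, 1, 1], start, trans, emit))
```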

  2. A microfluidic device for automated, high-speed microinjection of Caenorhabditis elegans

    PubMed Central

    Song, Pengfei; Dong, Xianke; Liu, Xinyu

    2016-01-01

    The nematode worm Caenorhabditis elegans has been widely used as a model organism in biological studies because of its short and prolific life cycle, relatively simple body structure, significant genetic overlap with human, and facile/inexpensive cultivation. Microinjection, as an established and versatile tool for delivering liquid substances into cellular/organismal objects, plays an important role in C. elegans research. However, the conventional manual procedure of C. elegans microinjection is labor-intensive and time-consuming and thus hinders large-scale C. elegans studies involving microinjection of a large number of C. elegans on a daily basis. In this paper, we report a novel microfluidic device that enables, for the first time, fully automated, high-speed microinjection of C. elegans. The device is automatically regulated by on-chip pneumatic valves and allows rapid loading, immobilization, injection, and downstream sorting of single C. elegans. For demonstration, we performed microinjection experiments on 200 C. elegans worms and demonstrated an average injection speed of 6.6 worm/min (average worm handling time: 9.45 s/worm) and a success rate of 77.5% (post-sorting success rate: 100%), both much higher than the performance of manual operation (speed: 1 worm/4 min and success rate: 30%). We conducted typical viability tests on the injected C. elegans and confirmed that the automated injection system does not impose significant adverse effect on the physiological condition of the injected C. elegans. We believe that the developed microfluidic device holds great potential to become a useful tool for facilitating high-throughput, large-scale worm biology research. PMID:26958099

  3. k(+)-buffer: An Efficient, Memory-Friendly and Dynamic k-buffer Framework.

    PubMed

    Vasilakis, Andreas-Alexandros; Papaioannou, Georgios; Fudos, Ioannis

    2015-06-01

Depth-sorted fragment determination is fundamental for a host of image-based techniques that simulate complex rendering effects. It is also a challenging task in terms of the time and space required when rasterizing scenes with high depth complexity. When low graphics memory requirements are of utmost importance, the k-buffer can be considered the preferred framework, as it ensures the correct depth order on a subset of all generated fragments. Although various alternatives have been introduced to partially or completely alleviate the noticeable quality artifacts produced by the initial k-buffer algorithm, at the expense of increased memory or degraded performance, appropriate tools to automatically and dynamically compute the most suitable value of k are still missing. To this end, we introduce k(+)-buffer, a fast framework that accurately simulates the behavior of k-buffer in a single rendering pass. Two memory-bounded data structures, (i) the max-array and (ii) the max-heap, are developed on the GPU to concurrently maintain the k-foremost fragments per pixel by exploiting pixel synchronization and fragment culling. Memory-friendly strategies are further introduced to dynamically (a) lessen the wasteful memory allocation of individual pixels with low depth complexity frequencies, (b) minimize the allocated size of the k-buffer according to different application goals and hardware limitations via a straightforward depth histogram analysis and (c) manage the local GPU cache with a fixed-memory depth-sorting mechanism. Finally, an extensive experimental evaluation is provided demonstrating the advantages of our work over all prior k-buffer variants in terms of memory usage, performance cost and image quality.
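    The max-heap idea, keeping only the k foremost (nearest) fragments per pixel by evicting the deepest stored fragment whenever a closer one arrives, can be sketched in a few lines with Python's heapq; this is a CPU illustration of the data structure, not the GPU implementation.

```python
import heapq

def insert_fragment(heap, depth, k):
    """Maintain the k smallest depths using a max-heap of negated depths."""
    if len(heap) < k:
        heapq.heappush(heap, -depth)        # room left: just insert
    elif depth < -heap[0]:                  # closer than the deepest kept fragment
        heapq.heapreplace(heap, -depth)     # evict deepest, insert new fragment
    return heap

heap, k = [], 4
for depth in [0.9, 0.2, 0.5, 0.7, 0.1, 0.8, 0.3]:
    insert_fragment(heap, depth, k)
print(sorted(-d for d in heap))   # the 4 foremost fragments: [0.1, 0.2, 0.3, 0.5]
```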

  4. Synopsis of a computer program designed to interface a personal computer with the fast data acquisition system of a time-of-flight mass spectrometer

    NASA Technical Reports Server (NTRS)

    Bechtel, R. D.; Mateos, M. A.; Lincoln, K. A.

    1988-01-01

    Briefly described are the essential features of a computer program designed to interface a personal computer with the fast, digital data acquisition system of a time-of-flight mass spectrometer. The instrumentation was developed to provide a time-resolved analysis of individual vapor pulses produced by the incidence of a pulsed laser beam on an ablative material. The high repetition rate spectrometer coupled to a fast transient recorder captures complete mass spectra every 20 to 35 microsecs, thereby providing the time resolution needed for the study of this sort of transient event. The program enables the computer to record the large amount of data generated by the system in short time intervals, and it provides the operator the immediate option of presenting the spectral data in several different formats. Furthermore, the system does this with a high degree of automation, including the tasks of mass labeling the spectra and logging pertinent instrumental parameters.

  5. On the statistical mechanics of species abundance distributions.

    PubMed

    Bowler, Michael G; Kelly, Colleen K

    2012-09-01

    A central issue in ecology is that of the factors determining the relative abundance of species within a natural community. The proper application of the principles of statistical physics to species abundance distributions (SADs) shows that simple ecological properties could account for the near universal features observed. These properties are (i) a limit on the number of individuals in an ecological guild and (ii) per capita birth and death rates. They underpin the neutral theory of Hubbell (2001), the master equation approach of Volkov et al. (2003, 2005) and the idiosyncratic (extreme niche) theory of Pueyo et al. (2007); they result in an underlying log series SAD, regardless of neutral or niche dynamics. The success of statistical mechanics in this application implies that communities are in dynamic equilibrium and hence that niches must be flexible and that temporal fluctuations on all sorts of scales are likely to be important in community structure. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Hydrothermal Vents of Juan de Fuca Ridge

    NASA Astrophysics Data System (ADS)

    Stark, Joyce

As a member of REVEL (Research and Education: Volcanoes, Exploration and Life), I had an opportunity to participate in a scientific research cruise focused on the active volcanoes along the Juan de Fuca Ridge, the submarine spreading center off the Washington- Oregon-Canada coast. REVEL was sponsored by the National Science Foundation, University of Washington, Pennsylvania State University and the American Museum of Natural History. We studied the geological, chemical and biological processes associated with active hydrothermal systems and my research focused on the biological communities of the sulfide structures. We worked on board the Woods Hole Oceanographic Institution Vessel, R/V Atlantis and the submersible ALVIN was used to sample the "Black Smokers". As a member of the scientific party, I participated in collection and sorting of biological specimens from the vent communities, attended lectures by scientists, contributed to the cruise log website, maintained a journal and developed my own research project. It was my responsibility to bring this cutting-edge research back to the classroom.

  7. Comparative transcriptional profiling of two wheat genotypes, with contrasting levels of minerals in grains, shows expression differences during grain filling.

    PubMed

    Singh, Sudhir P; Jeet, Raja; Kumar, Jitendra; Shukla, Vishnu; Srivastava, Rakesh; Mantri, Shrikant S; Tuli, Rakesh

    2014-01-01

    Wheat is one of the most important cereal crops in the world. To identify the candidate genes for mineral accumulation, it is important to examine differential transcriptome between wheat genotypes, with contrasting levels of minerals in grains. A transcriptional comparison of developing grains was carried out between two wheat genotypes- Triticum aestivum Cv. WL711 (low grain mineral), and T. aestivum L. IITR26 (high grain mineral), using Affymetrix GeneChip Wheat Genome Array. The study identified a total of 580 probe sets as differentially expressed (with log2 fold change of ≥2 at p≤0.01) between the two genotypes, during grain filling. Transcripts with significant differences in induction or repression between the two genotypes included genes related to metal homeostasis, metal tolerance, lignin and flavonoid biosynthesis, amino acid and protein transport, vacuolar-sorting receptor, aquaporins, and stress responses. Meta-analysis revealed spatial and temporal signatures of a majority of the differentially regulated transcripts.

  8. Comparative Transcriptional Profiling of Two Wheat Genotypes, with Contrasting Levels of Minerals in Grains, Shows Expression Differences during Grain Filling

    PubMed Central

    Singh, Sudhir P.; Jeet, Raja; Kumar, Jitendra; Shukla, Vishnu; Srivastava, Rakesh; Mantri, Shrikant S.; Tuli, Rakesh

    2014-01-01

    Wheat is one of the most important cereal crops in the world. To identify the candidate genes for mineral accumulation, it is important to examine differential transcriptome between wheat genotypes, with contrasting levels of minerals in grains. A transcriptional comparison of developing grains was carried out between two wheat genotypes- Triticum aestivum Cv. WL711 (low grain mineral), and T. aestivum L. IITR26 (high grain mineral), using Affymetrix GeneChip Wheat Genome Array. The study identified a total of 580 probe sets as differentially expressed (with log2 fold change of ≥2 at p≤0.01) between the two genotypes, during grain filling. Transcripts with significant differences in induction or repression between the two genotypes included genes related to metal homeostasis, metal tolerance, lignin and flavonoid biosynthesis, amino acid and protein transport, vacuolar-sorting receptor, aquaporins, and stress responses. Meta-analysis revealed spatial and temporal signatures of a majority of the differentially regulated transcripts. PMID:25364903

  9. Birth of kids after artificial insemination with sex-sorted, frozen-thawed goat spermatozoa.

    PubMed

    Bathgate, R; Mace, N; Heasman, K; Evans, G; Maxwell, W M C; de Graaf, S P

    2013-12-01

    Successful sex-sorting of goat spermatozoa and subsequent birth of pre-sexed kids have yet to be reported. As such, a series of experiments were conducted to develop protocols for sperm-sorting (using a modified flow cytometer, MoFlo SX(®) ) and cryopreservation of goat spermatozoa. Saanen goat spermatozoa (n = 2 males) were (i) collected into Salamon's or Tris catch media post-sorting and (ii) frozen in Tris-citrate-glucose media supplemented with 5, 10 or 20% egg yolk in (iii) 0.25 ml pellets on dry ice or 0.25 ml straws in a controlled-rate freezer. Post-sort and post-thaw sperm quality were assessed by motility (CASA), viability and acrosome integrity (PI/FITC-PNA). Sex-sorted goat spermatozoa frozen in pellets displayed significantly higher post-thaw motility and viability than spermatozoa frozen in straws. Catch media and differing egg yolk concentration had no effect on the sperm parameters tested. The in vitro and in vivo fertility of sex-sorted goat spermatozoa produced with this optimum protocol were then tested by means of a heterologous ova binding assay and intrauterine artificial insemination of Saanen goat does, respectively. Sex-sorted goat spermatozoa bound to sheep ova zona pellucidae in similar numbers (p > 0.05) to non-sorted goat spermatozoa, non-sorted ram spermatozoa and sex-sorted ram spermatozoa. Following intrauterine artificial insemination with sex-sorted spermatozoa, 38% (5/13) of does kidded with 83% (3/5) of kids being of the expected sex. Does inseminated with non-sorted spermatozoa achieved a 50% (3/6) kidding rate and a sex ratio of 3 : 1 (F : M). This study demonstrates for the first time that goat spermatozoa can be sex-sorted by flow cytometry, successfully frozen and used to produce pre-sexed kids. © 2013 Blackwell Verlag GmbH.

  10. Sorting drops and cells with acoustics: acoustic microfluidic fluorescence-activated cell sorter.

    PubMed

    Schmid, Lothar; Weitz, David A; Franke, Thomas

    2014-10-07

    We describe a versatile microfluidic fluorescence-activated cell sorter that uses acoustic actuation to sort cells or drops at ultra-high rates. Our acoustic sorter combines the advantages of traditional fluorescence-activated cell (FACS) and droplet sorting (FADS) and is applicable for a multitude of objects. We sort aqueous droplets, at rates as high as several kHz, into two or even more outlet channels. We can also sort cells directly from the medium without prior encapsulation into drops; we demonstrate this by sorting fluorescently labeled mouse melanoma cells in a single phase fluid. Our acoustic microfluidic FACS is compatible with standard cell sorting cytometers, yet, at the same time, enables a rich variety of more sophisticated applications.

  11. Surface acoustic wave actuated cell sorting (SAWACS).

    PubMed

    Franke, T; Braunmüller, S; Schmid, L; Wixforth, A; Weitz, D A

    2010-03-21

    We describe a novel microfluidic cell sorter which operates in continuous flow at high sorting rates. The device is based on a surface acoustic wave cell-sorting scheme and combines many advantages of fluorescence activated cell sorting (FACS) and fluorescence activated droplet sorting (FADS) in microfluidic channels. It is fully integrated on a PDMS device, and allows fast electronic control of cell diversion. We direct cells by acoustic streaming excited by a surface acoustic wave which deflects the fluid independently of the contrast in material properties of deflected objects and the continuous phase; thus the device underlying principle works without additional enhancement of the sorting by prior labelling of the cells with responsive markers such as magnetic or polarizable beads. Single cells are sorted directly from bulk media at rates as fast as several kHz without prior encapsulation into liquid droplet compartments as in traditional FACS. We have successfully directed HaCaT cells (human keratinocytes), fibroblasts from mice and MV3 melanoma cells. The low shear forces of this sorting method ensure that cells survive after sorting.

  12. Research of grasping algorithm based on scara industrial robot

    NASA Astrophysics Data System (ADS)

    Peng, Tao; Zuo, Ping; Yang, Hai

    2018-04-01

As the tobacco industry grows and faces the challenge of international tobacco giants, efficient logistics service is one of the key success factors. Completing the tobacco sorting task efficiently and economically is the goal of tobacco sorting and optimization research. Current cigarette distribution systems use a single line to carry out a single-brand sorting task; this article adopts a single line to realize the cigarette sorting task for different brands. Using a special grasping algorithm for the SCARA robot for sorting and packaging, the optimization scheme significantly improves the performance indicators of the cigarette sorting system, reducing labor requirements and clearly improving production efficiency.

  13. Learning Cellular Sorting Pathways Using Protein Interactions and Sequence Motifs

    PubMed Central

    Lin, Tien-Ho; Bar-Joseph, Ziv

    2011-01-01

    Abstract Proper subcellular localization is critical for proteins to perform their roles in cellular functions. Proteins are transported by different cellular sorting pathways, some of which take a protein through several intermediate locations until reaching its final destination. The pathway a protein is transported through is determined by carrier proteins that bind to specific sequence motifs. In this article, we present a new method that integrates protein interaction and sequence motif data to model how proteins are sorted through these sorting pathways. We use a hidden Markov model (HMM) to represent protein sorting pathways. The model is able to determine intermediate sorting states and to assign carrier proteins and motifs to the sorting pathways. In simulation studies, we show that the method can accurately recover an underlying sorting model. Using data for yeast, we show that our model leads to accurate prediction of subcellular localization. We also show that the pathways learned by our model recover many known sorting pathways and correctly assign proteins to the path they utilize. The learned model identified new pathways and their putative carriers and motifs and these may represent novel protein sorting mechanisms. Supplementary results and software implementation are available from http://murphylab.web.cmu.edu/software/2010_RECOMB_pathways/. PMID:21999284

  14. A New Algorithm Using the Non-Dominated Tree to Improve Non-Dominated Sorting.

    PubMed

    Gustavsson, Patrik; Syberfeldt, Anna

    2018-01-01

    Non-dominated sorting is a technique often used in evolutionary algorithms to determine the quality of solutions in a population. The most common algorithm is the Fast Non-dominated Sort (FNS). This algorithm, however, has the drawback that its performance deteriorates when the population size grows. The same drawback applies also to other non-dominating sorting algorithms such as the Efficient Non-dominated Sort with Binary Strategy (ENS-BS). An algorithm suggested to overcome this drawback is the Divide-and-Conquer Non-dominated Sort (DCNS) which works well on a limited number of objectives but deteriorates when the number of objectives grows. This article presents a new, more efficient algorithm called the Efficient Non-dominated Sort with Non-Dominated Tree (ENS-NDT). ENS-NDT is an extension of the ENS-BS algorithm and uses a novel Non-Dominated Tree (NDTree) to speed up the non-dominated sorting. ENS-NDT is able to handle large population sizes and a large number of objectives more efficiently than existing algorithms for non-dominated sorting. In the article, it is shown that with ENS-NDT the runtime of multi-objective optimization algorithms such as the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) can be substantially reduced.
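    As a reference point for the algorithms compared above, the Pareto dominance test and a naive peel-off non-dominated sort (the kind of baseline these algorithms are designed to outperform) can be sketched as follows.

```python
def dominates(p, q):
    """True if solution p dominates q (minimization on every objective)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def naive_non_dominated_sort(points):
    """Assign points to Pareto fronts by repeatedly peeling off the
    currently non-dominated set -- simple but slow for large populations."""
    remaining, fronts = list(points), []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(naive_non_dominated_sort(pts))  # [[(1, 5), (2, 3), (4, 1)], [(3, 4)], [(5, 5)]]
```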

  15. Particle Transport and Size Sorting in Bubble Microstreaming Flow

    NASA Astrophysics Data System (ADS)

    Thameem, Raqeeb; Rallabandi, Bhargav; Wang, Cheng; Hilgenfeldt, Sascha

    2014-11-01

    Ultrasonic driving of sessile semicylindrical bubbles results in powerful steady streaming flows that are robust over a wide range of driving frequencies. In a microchannel, this flow field pattern can be fine-tuned to achieve size-sensitive sorting and trapping of particles at scales much smaller than the bubble itself; the sorting mechanism has been successfully described based on simple geometrical considerations. We investigate the sorting process in more detail, both experimentally (using new parameter variations that allow greater control over the sorting) and theoretically (incorporating the device geometry as well as the superimposed channel flow into an asymptotic theory). This results in optimized criteria for size sorting and a theoretical description that closely matches the particle behavior close to the bubble, the crucial region for size sorting.

  16. Infrared machine vision system for the automatic detection of olive fruit quality.

    PubMed

    Guzmán, Elena; Baeten, Vincent; Pierna, Juan Antonio Fernández; García-Mesa, José A

    2013-11-15

    External quality is an important factor in the extraction of olive oil and the marketing of olive fruits. The appearance and presence of external damage are factors that influence the quality of the oil extracted and the perception of consumers, determining the level of acceptance prior to purchase in the case of table olives. The aim of this paper is to report on artificial vision techniques developed for the online estimation of olive quality and to assess the effectiveness of these techniques in evaluating quality based on detecting external defects. This method of classifying olives according to the presence of defects is based on an infrared (IR) vision system. Images of defects were acquired using a digital monochrome camera with band-pass filters on near-infrared (NIR). The original images were processed using segmentation algorithms, edge detection and pixel value intensity to classify the whole fruit. The detection of the defect involved a pixel classification procedure based on nonparametric models of the healthy and defective areas of olives. Classification tests were performed on olives to assess the effectiveness of the proposed method. This research showed that the IR vision system is a useful technology for the automatic assessment of olives that has the potential for use in offline inspection and for online sorting for defects and the presence of surface damage, easily distinguishing those that do not meet minimum quality requirements. Crown Copyright © 2013 Published by Elsevier B.V. All rights reserved.
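    A much simplified version of the described pipeline (NIR image, intensity-based segmentation, region measurement) can be sketched with scikit-image; the Otsu threshold and the defect criterion below are crude stand-ins for the authors' nonparametric pixel models.

```python
import numpy as np
from skimage.filters import threshold_otsu

def defect_fraction(nir_image, fruit_mask):
    """Fraction of the fruit area whose NIR intensity falls below an Otsu
    threshold computed inside the fruit -- a crude stand-in for defect pixels."""
    threshold = threshold_otsu(nir_image[fruit_mask])
    defect_mask = (nir_image < threshold) & fruit_mask
    return defect_mask.sum() / fruit_mask.sum()

rng = np.random.default_rng(0)
img = rng.normal(0.8, 0.05, (64, 64))    # synthetic bright, healthy fruit surface
img[20:30, 20:30] = 0.4                  # darker patch standing in for a defect
mask = np.ones_like(img, dtype=bool)
print(f"defect fraction: {defect_fraction(img, mask):.3f}")   # roughly 0.024
```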

  17. A software tool to automatically assure and report daily treatment deliveries by a cobalt‐60 radiation therapy device

    PubMed Central

    Wooten, H. Omar; Green, Olga; Li, Harold H.; Liu, Shi; Li, Xiaoling; Rodriguez, Vivian; Mutic, Sasa; Kashani, Rojano

    2016-01-01

    The aims of this study were to develop a method for automatic and immediate verification of treatment delivery after each treatment fraction in order to detect and correct errors, and to develop a comprehensive daily report which includes delivery verification results, daily image‐guided radiation therapy (IGRT) review, and information for weekly physics reviews. After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a commercial MRI‐guided radiotherapy treatment machine, we designed a procedure to use 1) treatment plan files, 2) delivery log files, and 3) beam output information to verify the accuracy and completeness of each daily treatment delivery. The procedure verifies the correctness of delivered treatment plan parameters including beams, beam segments and, for each segment, the beam‐on time and MLC leaf positions. For each beam, composite primary fluence maps are calculated from the MLC leaf positions and segment beam‐on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. A daily treatment delivery report is designed to include all required information for IGRT and weekly physics reviews including the plan and treatment fraction information, daily beam output information, and the treatment delivery verification results. A computer program was developed to implement the proposed procedure of the automatic delivery verification and daily report generation for an MRI guided radiation therapy system. The program was clinically commissioned. Sensitivity was measured with simulated errors. The final version has been integrated into the commercial version of the treatment delivery system. The method automatically verifies the EBRT treatment deliveries and generates the daily treatment reports. Already in clinical use for over one year, it is useful to facilitate delivery error detection, and to expedite physician daily IGRT review and physicist weekly chart review. PACS number(s): 87.55.km PMID:27167269
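
    A minimal numpy sketch of the final comparison step, computing error statistics on the difference between a planned fluence map and one reconstructed from delivery logs; the arrays, normalization and tolerance below are illustrative placeholders rather than the commissioned implementation.

    ```python
    import numpy as np

    def fluence_error_stats(planned, delivered, tolerance=0.03):
        """Summarize a fluence difference map: per-pixel error relative to the plan maximum."""
        diff = delivered - planned
        rel = np.abs(diff) / planned.max()
        return {
            "max_rel_error": float(rel.max()),
            "mean_rel_error": float(rel.mean()),
            "fraction_within_tol": float((rel <= tolerance).mean()),
        }

    # Illustrative 2D fluence maps (e.g., reconstructed from MLC positions and beam-on times).
    rng = np.random.default_rng(1)
    planned = rng.uniform(0.0, 1.0, (40, 40))
    delivered = planned + rng.normal(0.0, 0.01, planned.shape)  # small delivery noise
    print(fluence_error_stats(planned, delivered))
    ```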

  18. Automatic Identification of Critical Data Items in a Database to Mitigate the Effects of Malicious Insiders

    NASA Astrophysics Data System (ADS)

    White, Jonathan; Panda, Brajendra

    A major concern for computer system security is the threat from malicious insiders who target and abuse critical data items in the system. In this paper, we propose a solution to enable automatic identification of critical data items in a database by way of data dependency relationships. This identification of critical data items is necessary because insider threats often target mission critical data in order to accomplish malicious tasks. Unfortunately, currently available systems fail to address this problem in a comprehensive manner. It is more difficult for non-experts to identify these critical data items because of their lack of familiarity and due to the fact that data systems are constantly changing. By identifying the critical data items automatically, security engineers will be better prepared to protect what is critical to the mission of the organization and also have the ability to focus their security efforts on these critical data items. We have developed an algorithm that scans the database logs and forms a directed graph showing which items influence a large number of other items and at what frequency this influence occurs. This graph is traversed to reveal the data items which have a large influence throughout the database system by using a novel metric-based formula. These items are critical to the system because if they are maliciously altered or stolen, the malicious alterations will spread throughout the system, delaying recovery and causing a much more malignant effect. As these items have significant influence, they are deemed to be critical and worthy of extra security measures. Our proposal is not intended to replace existing intrusion detection systems, but rather is intended to complement current and future technologies. To our knowledge, this approach has not been attempted before, and our experimental results show that it is very effective in revealing critical data items automatically.
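
    A small Python sketch of the general idea: scan read/write dependencies from a transaction log, build a directed influence graph, and rank items by how widely and how frequently their values propagate. The log format and the simple breadth-times-frequency score are assumptions for illustration, not the paper's exact metric-based formula.

    ```python
    from collections import defaultdict

    # Illustrative log: each transaction writes items using the values it read.
    log = [
        {"reads": ["a"], "writes": ["b"]},
        {"reads": ["a", "b"], "writes": ["c"]},
        {"reads": ["b"], "writes": ["d"]},
        {"reads": ["a"], "writes": ["e"]},
    ]

    # Directed graph: edge x -> y weighted by how often x influenced y.
    influence = defaultdict(lambda: defaultdict(int))
    for entry in log:
        for src in entry["reads"]:
            for dst in entry["writes"]:
                influence[src][dst] += 1

    def reachable(graph, start):
        """All items transitively influenced by `start`."""
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            for nxt in graph.get(node, {}):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    # Simple criticality score: breadth of influence times total influence frequency.
    scores = {
        item: len(reachable(influence, item)) * sum(influence[item].values())
        for item in influence
    }
    for item, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(item, score)   # items with the widest, most frequent influence rank first
    ```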

  19. An ontology-based personalization of health-care knowledge to support clinical decisions for chronically ill patients.

    PubMed

    Riaño, David; Real, Francis; López-Vallverdú, Joan Albert; Campana, Fabio; Ercolani, Sara; Mecocci, Patrizia; Annicchiarico, Roberta; Caltagirone, Carlo

    2012-06-01

    Chronically ill patients are complex health care cases that require the coordinated interaction of multiple professionals. Correct intervention for this sort of patient entails the accurate analysis of the conditions of each concrete patient and the adaptation of evidence-based standard intervention plans to those conditions. There are some other clinical circumstances such as wrong diagnoses, unobserved comorbidities, missing information, unobserved related diseases or prevention, whose detection depends on the capacities of deduction of the professionals involved. In this paper, we introduce an ontology for the care of chronically ill patients and implement two personalization processes and a decision support tool. The first personalization process adapts the contents of the ontology to the particularities observed in the health-care record of a given concrete patient, automatically providing a personalized ontology containing only the clinical information that is relevant for health-care professionals to manage that patient. The second personalization process uses the personalized ontology of a patient to automatically transform intervention plans describing health-care general treatments into individual intervention plans. For comorbid patients, this process concludes with the semi-automatic integration of several individual plans into a single personalized plan. Finally, the ontology is also used as the knowledge base of a decision support tool that helps health-care professionals to detect anomalous circumstances such as wrong diagnoses, unobserved comorbidities, missing information, unobserved related diseases, or preventive actions. Seven health-care centers participating in the K4CARE project, together with the group SAGESA and the Local Health System in the town of Pollenza, have served as the validation platform for these two processes and the tool. Health-care professionals participating in the evaluation rated the average quality of the tools at 84% (5.9/7.0) and their utility at 90% (6.3/7.0), and agreed that the decision support tool reasons correctly according to clinical standards. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Classification and Lateralization of Temporal Lobe Epilepsies with and without Hippocampal Atrophy Based on Whole-Brain Automatic MRI Segmentation

    PubMed Central

    Keihaninejad, Shiva; Heckemann, Rolf A.; Gousias, Ioannis S.; Hajnal, Joseph V.; Duncan, John S.; Aljabar, Paul; Rueckert, Daniel; Hammers, Alexander

    2012-01-01

    Brain images contain information suitable for automatically sorting subjects into categories such as healthy controls and patients. We sought to identify morphometric criteria for distinguishing controls (n = 28) from patients with unilateral temporal lobe epilepsy (TLE), 60 with and 20 without hippocampal atrophy (TLE-HA and TLE-N, respectively), and for determining the presumed side of seizure onset. The framework employs multi-atlas segmentation to estimate the volumes of 83 brain structures. A kernel-based separability criterion was then used to identify structures whose volumes discriminate between the groups. Next, we applied support vector machines (SVM) to the selected set for classification on the basis of volumes. We also computed pairwise similarities between all subjects and used spectral analysis to convert these into per-subject features. SVM was again applied to these feature data. After training on a subgroup, all TLE-HA patients were correctly distinguished from controls, achieving an accuracy of 96 ± 2% in both classification schemes. For TLE-N patients, the accuracy was 86 ± 2% based on structural volumes and 91 ± 3% using spectral analysis. Structures discriminating between patients and controls were mainly localized ipsilaterally to the presumed seizure focus. For the TLE-HA group, they were mainly in the temporal lobe; for the TLE-N group they included orbitofrontal regions, as well as the ipsilateral substantia nigra. Correct lateralization of the presumed seizure onset zone was achieved using hippocampi and parahippocampal gyri in all TLE-HA patients using either classification scheme; in the TLE-N patients, lateralization was accurate based on structural volumes in 86 ± 4%, and in 94 ± 4% with the spectral analysis approach. Unilateral TLE has imaging features that can be identified automatically, even when they are invisible to human experts. Such morphometric image features may serve as classification and lateralization criteria. The technique also detects unsuspected distinguishing features like the substantia nigra, warranting further study. PMID:22523539
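
    A toy scikit-learn sketch of the volume-based classification scheme: per-subject structure volumes are reduced to a discriminative subset and fed to an SVM. The synthetic volumes and the univariate feature selection stand in for the multi-atlas segmentation and kernel-based separability criterion used in the study.

    ```python
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic stand-in for 83 structure volumes per subject (controls vs. patients).
    rng = np.random.default_rng(0)
    n_controls, n_patients, n_structures = 28, 60, 83
    controls = rng.normal(1.0, 0.05, (n_controls, n_structures))
    patients = rng.normal(1.0, 0.05, (n_patients, n_structures))
    patients[:, :5] -= 0.08          # simulated atrophy in a few discriminative structures
    X = np.vstack([controls, patients])
    y = np.array([0] * n_controls + [1] * n_patients)

    # Select discriminative volumes, then classify with an SVM.
    clf = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=10), SVC(kernel="linear"))
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```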

  1. Sperm sex-sorting and preservation for managing the sex ratio and genetic diversity of the southern white rhinoceros (Ceratotherium simum simum).

    PubMed

    O'Brien, J K; Roth, T L; Stoops, M A; Ball, R L; Steinman, K J; Montano, G A; Love, C C; Robeck, T R

    2015-01-01

    White rhinoceros ejaculates (n=9) collected by electroejaculation from four males were shipped (10°C, 12 h) to develop procedures for the production of chilled and frozen-thawed sex-sorted spermatozoa of adequate quality for artificial insemination (AI). Of all electroejaculate fractions, 39.7% (31/78) exhibited high quality post-collection (≥70% total motility and membrane integrity) and of those, 54.8% (17/31) presented reduced in vitro quality after transport and were retrospectively determined to exhibit urine-contamination (≥21.0 μg creatinine/ml). Of fractions analyzed for creatinine concentration, 69% (44/64) were classified as urine-contaminated. For high quality non-contaminated fractions, in vitro parameters (motility, velocity, membrane, acrosome and DNA integrity) of chilled non-sorted and sorted spermatozoa were well-maintained at 5°C up to 54 h post-collection, whereby >70% of post-transport (non-sorted) or post-sort (sorted) values were retained. By 54 h post-collection, some motility parameters were higher (P<0.05) for non-sorted spermatozoa (total motility, rapid velocity, average path velocity) whereas all remaining motion parameters as well as membrane, acrosome and DNA integrity were similar between sperm types. In comparison with a straw method, directional freezing resulted in enhanced (P<0.05) motility and velocity of non-sorted and sorted spermatozoa, with comparable overall post-thaw quality between sperm types. High purity enrichment of X-bearing (89 ± 6%) or Y-bearing (86 ± 3%) spermatozoa was achieved using moderate sorting rates (2540 ± 498 X-spermatozoa/s; 1800 ± 557 Y-spermatozoa/s). Collective in vitro characteristics of sorted-chilled or sorted-frozen-thawed spermatozoa derived from high quality electroejaculates indicate acceptable fertility potential for use in AI. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Encapsulation of sex sorted boar semen: sperm membrane status and oocyte penetration parameters.

    PubMed

    Spinaci, Marcella; Chlapanidas, Theodora; Bucci, Diego; Vallorani, Claudia; Perteghella, Sara; Lucconi, Giulia; Communod, Ricardo; Vigo, Daniele; Galeati, Giovanna; Faustini, Massimo; Torre, Maria Luisa

    2013-03-01

    Although sorted semen is experimentally used for artificial, intrauterine, and intratubal insemination and in vitro fertilization, its commercial application in swine species is still far from a reality. This is because of the low sort rate and the large number of sperm required for routine artificial insemination in the pig, compared with other production animals, and the greater susceptibility of porcine spermatozoa to stress induced by the different sex sorting steps and the postsorting handling protocols. The encapsulation technology could overcome this limitation in vivo, protecting and allowing the slow release of low-dose sorted semen. The aim of this work was to evaluate the impact of the encapsulation process on viability, acrosome integrity, and on the in vitro fertilizing potential of sorted boar semen. Our results indicate that the encapsulation technique does not damage sorted boar semen; in fact, during a 72-hour storage, no differences were observed between liquid-stored sorted semen and encapsulated sorted semen in terms of plasma membrane (39.98 ± 14.38% vs. 44.32 ± 11.72%, respectively) and acrosome integrity (74.32 ± 12.17% vs. 66.07 ± 10.83%, respectively). Encapsulated sorted spermatozoa presented a lower penetration potential than nonencapsulated ones (47.02% vs. 24.57%, respectively, P < 0.0001), and a significant reduction of polyspermic fertilization (60.76% vs. 36.43%, respectively, polyspermic ova/total ova; P < 0.0001). However, no difference (P > 0.05) was observed in terms of total efficiency of fertilization expressed as normospermic oocytes/total oocytes (18.45% vs. 15.43% for sorted diluted and sorted encapsulated semen, respectively). Encapsulation could be an alternative method for storing pig sex-sorted spermatozoa and is potentially a promising technique for optimizing the use of low doses of sexed spermatozoa in vivo. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. Sortilin 1 Loss-of-Function Protects Against Cholestatic Liver Injury by Attenuating Hepatic Bile Acid Accumulation in Bile Duct Ligated Mice.

    PubMed

    Li, Jibiao; Woolbright, Benjamin L; Zhao, Wen; Wang, Yifeng; Matye, David; Hagenbuch, Bruno; Jaeschke, Hartmut; Li, Tiangang

    2018-01-01

    Sortilin 1 (Sort1) is an intracellular trafficking receptor that mediates protein sorting in the endocytic or secretory pathways. Recent studies revealed a role of Sort1 in the regulation of cholesterol and bile acid (BA) metabolism. This study further investigated the role of Sort1 in modulating BA detoxification and cholestatic liver injury in bile duct ligated mice. We found that Sort1 knockout (KO) mice had attenuated liver injury 24 h after bile duct ligation (BDL), which was mainly attributed to less bile infarct formation. Sham-operated Sort1 KO mice had an approximately 20% larger BA pool size than sham-operated wildtype (WT) mice, but 24 h after BDL Sort1 KO mice had significantly attenuated hepatic BA accumulation and a smaller BA pool size. After 14 days of BDL, Sort1 KO mice showed significantly lower hepatic BA concentration and reduced expression of inflammatory and fibrotic marker genes, but a similar degree of liver fibrosis compared with WT mice. Unbiased quantitative proteomics revealed that Sort1 KO mice had increased hepatic BA sulfotransferase 2A1, but unaltered phase-I BA metabolizing cytochrome P450s or phase-III BA efflux transporters. Consistently, Sort1 KO mice showed elevated plasma sulfated taurocholate after BDL. Finally, we found that liver Sort1 was repressed after BDL, which may be due to BA activation of the farnesoid X receptor. In conclusion, we report a role of Sort1 in the regulation of hepatic BA detoxification and cholestatic liver injury in mice. The mechanisms underlying increased hepatic BA elimination in Sort1 KO mice after BDL require further investigation. © The Author 2017. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. Reducing 4D CT artifacts using optimized sorting based on anatomic similarity.

    PubMed

    Johnston, Eric; Diehn, Maximilian; Murphy, James D; Loo, Billy W; Maxim, Peter G

    2011-05-01

    Four-dimensional (4D) computed tomography (CT) has been widely used as a tool to characterize respiratory motion in radiotherapy. The two most commonly used 4D CT algorithms sort images by the associated respiratory phase or displacement into a predefined number of bins, and are prone to image artifacts at transitions between bed positions. The purpose of this work is to demonstrate a method of reducing motion artifacts in 4D CT by incorporating anatomic similarity into phase or displacement based sorting protocols. Ten patient datasets were retrospectively sorted using both the displacement and phase based sorting algorithms. Conventional sorting methods allow selection of only the nearest-neighbor image in time or displacement within each bin. In our method, for each bed position either the displacement or the phase defines the center of a bin range about which several candidate images are selected. The two-dimensional correlation coefficients between slices bordering the interface between adjacent couch positions are then calculated for all candidate pairings. Two slices have a high correlation if they are anatomically similar. Candidates from each bin are then selected to maximize the slice correlation over the entire data set using Dijkstra's shortest path algorithm. To assess the reduction of artifacts, two thoracic radiation oncologists independently compared the resorted 4D datasets pairwise with conventionally sorted datasets, blinded to the sorting method, to choose which had the least motion artifacts. Agreement between reviewers was evaluated using the weighted kappa score. Anatomically based image selection resulted in 4D CT datasets with significantly reduced motion artifacts with both displacement (P = 0.0063) and phase sorting (P = 0.00022). There was good agreement between the two reviewers, with complete agreement 34 times and complete disagreement 6 times. Optimized sorting using anatomic similarity significantly reduces 4D CT motion artifacts compared to conventional phase or displacement based sorting. This improved sorting algorithm is a straightforward extension of the two most common 4D CT sorting algorithms.
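
    A simplified Python sketch of the optimized sorting idea: keep several candidate images per bin at each couch position and choose the sequence that maximizes the summed border-slice correlation. Because the graph is layered, a dynamic-programming pass is equivalent to a shortest-path search here; the candidate arrays are illustrative.

    ```python
    import numpy as np

    def corr(a, b):
        """2D correlation coefficient between border slices of adjacent couch positions."""
        return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

    def select_candidates(candidates):
        """candidates[p] is a list of candidate images for couch position p.
        Returns one index per position maximizing the summed border-slice correlation."""
        n = len(candidates)
        best = [np.zeros(len(candidates[0]))]          # accumulated score per candidate
        back = []
        for p in range(1, n):
            prev, cur = candidates[p - 1], candidates[p]
            score = np.full(len(cur), -np.inf)
            choice = np.zeros(len(cur), dtype=int)
            for j, img in enumerate(cur):
                for i, prev_img in enumerate(prev):
                    s = best[-1][i] + corr(prev_img, img)
                    if s > score[j]:
                        score[j], choice[j] = s, i
            best.append(score)
            back.append(choice)
        # Trace the best path back to the first couch position.
        path = [int(np.argmax(best[-1]))]
        for choice in reversed(back):
            path.append(int(choice[path[-1]]))
        return path[::-1]

    # Illustrative candidates: 3 couch positions, 2 candidate images each.
    rng = np.random.default_rng(2)
    base = rng.random((16, 16))
    cands = [[base + rng.normal(0, 0.05, base.shape) for _ in range(2)] for _ in range(3)]
    print(select_candidates(cands))
    ```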

  5. NIH Toolbox Cognition Battery (NIHTB-CB): list sorting test to measure working memory.

    PubMed

    Tulsky, David S; Carlozzi, Noelle; Chiaravalloti, Nancy D; Beaumont, Jennifer L; Kisala, Pamela A; Mungas, Dan; Conway, Kevin; Gershon, Richard

    2014-07-01

    The List Sorting Working Memory Test was designed to assess working memory (WM) as part of the NIH Toolbox Cognition Battery. List Sorting is a sequencing task requiring children and adults to sort and sequence stimuli that are presented visually and auditorily. Validation data are presented for 268 participants ages 20 to 85 years. A subset of participants (N=89) was retested 7 to 21 days later. As expected, the List Sorting Test had moderately high correlations with other measures of working memory and executive functioning (convergent validity) but a low correlation with a test of receptive vocabulary (discriminant validity). Furthermore, List Sorting demonstrates expected changes over the age span and has excellent test-retest reliability. Collectively, these results provide initial support for the construct validity of the List Sorting Working Memory Measure as a measure of working memory. However, the relationship between the List Sorting Test and general executive function has yet to be determined.

  6. Manual sorting to eliminate aflatoxin from peanuts.

    PubMed

    Galvez, F C F; Francisco, M L D L; Villarino, B J; Lustre, A O; Resurreccion, A V A

    2003-10-01

    A manual sorting procedure was developed to eliminate aflatoxin contamination from peanuts. The efficiency of the sorting process in eliminating aflatoxin-contaminated kernels from lots of raw peanuts was verified. The blanching of 20 kg of peanuts at 140 degrees C for 25 min in preheated roasters facilitated the manual sorting of aflatoxin-contaminated kernels after deskinning. The manual sorting of raw materials with initially high aflatoxin contents (300 ppb) resulted in aflatoxin-free peanuts (i.e., peanuts in which no aflatoxin was detected). Verification procedures showed that the sorted sound peanuts contained no aflatoxin or contained low levels (<15 ppb) of aflatoxin. The results obtained confirmed that the sorting process was effective in separating contaminated peanuts whether or not contamination was extensive. At the commercial level, when roasters were not preheated, the dry blanching of 50 kg of peanuts for 45 to 55 min facilitated the proper deskinning and subsequent manual sorting of aflatoxin-contaminated peanut kernels from sound kernels.

  7. A Simple Deep Learning Method for Neuronal Spike Sorting

    NASA Astrophysics Data System (ADS)

    Yang, Kai; Wu, Haifeng; Zeng, Yu

    2017-10-01

    Spike sorting is one of the key techniques for understanding brain activity. With the development of modern electrophysiology technology, recent multi-electrode technologies are able to record the activity of thousands of neurons simultaneously. Spike sorting in this setting increases the computational complexity of conventional sorting algorithms. In this paper, we focus on how to reduce this complexity and introduce a deep learning algorithm, the principal component analysis network (PCANet), to spike sorting. The introduced method starts from a conventional model and establishes a Toeplitz matrix. From the column vectors of this matrix, we train a PCANet, from which eigenvectors of the spikes can be extracted. Finally, a support vector machine (SVM) is used to sort the spikes. In experiments, we choose two groups of simulated data from publicly available databases and compare the introduced method with conventional methods. The results indicate that the introduced method indeed has lower complexity, with the same sorting errors as the conventional methods.
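
    A compact scikit-learn sketch of the overall pipeline the abstract describes, with ordinary PCA standing in for the PCANet feature extractor and an SVM classifier on top; the synthetic waveforms and labels are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    # Synthetic spike waveforms: two units with different shapes plus noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 48)
    unit_a = np.exp(-((t - 0.3) ** 2) / 0.005)
    unit_b = 0.7 * np.exp(-((t - 0.5) ** 2) / 0.02)
    X = np.vstack([unit_a + rng.normal(0, 0.1, (300, t.size)),
                   unit_b + rng.normal(0, 0.1, (300, t.size))])
    y = np.array([0] * 300 + [1] * 300)

    # Low-dimensional features (PCA stand-in for PCANet) followed by an SVM sorter.
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    sorter = make_pipeline(PCA(n_components=5), SVC(kernel="rbf"))
    sorter.fit(X_train, y_train)
    print("sorting accuracy:", sorter.score(X_test, y_test))
    ```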

  8. NIH Toolbox Cognition Battery (NIHTB-CB): The List Sorting Test to Measure Working Memory

    PubMed Central

    Tulsky, David S.; Carlozzi, Noelle; Chiaravalloti, Nancy D.; Beaumont, Jennifer L.; Kisala, Pamela A.; Mungas, Dan; Conway, Kevin; Gershon, Richard

    2015-01-01

    The List Sorting Working Memory Test was designed to assess working memory (WM) as part of the NIH Toolbox Cognition Battery. List Sorting is a sequencing task requiring children and adults to sort and sequence stimuli that are presented visually and auditorily. Validation data are presented for 268 participants ages 20 to 85 years. A subset of participants (N=89) was retested 7 to 21 days later. As expected, the List Sorting Test had moderately high correlations with other measures of working memory and executive functioning (convergent validity) but a low correlation with a test of receptive vocabulary (discriminant validity). Furthermore, List Sorting demonstrates expected changes over the age span and has excellent test-retest reliability. Collectively, these results provide initial support for the construct validity of the List Sorting Working Memory Measure as a measure of working memory. However, the relation between the List Sorting Test and general executive function has yet to be determined. PMID:24959983

  9. CONCH: A Visual Basic program for interactive processing of ion-microprobe analytical data

    NASA Astrophysics Data System (ADS)

    Nelson, David R.

    2006-11-01

    A Visual Basic program for flexible, interactive processing of ion-microprobe data acquired for quantitative trace element, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni and U-Th-Pb geochronology applications is described. Default but editable run-tables enable software identification of the secondary ion species analyzed and characterization of the standard used. Counts obtained for each species may be displayed in plots against analysis time and edited interactively. Count outliers can be automatically identified via a set of editable count-rejection criteria and displayed for assessment. Standard analyses are distinguished from Unknowns by matching of the analysis label with a string specified in the Set-up dialog, and processed separately. A generalized routine writes background-corrected count rates, ratios and uncertainties, plus weighted means and uncertainties for Standards and Unknowns, to a spreadsheet that may be saved as a text-delimited file. Specialized routines process trace-element concentration, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni, and Th-U disequilibrium analysis types, and U-Th-Pb isotopic data obtained for zircon, titanite, perovskite, monazite, xenotime and baddeleyite. Correction to measured Pb-isotopic, Pb/U and Pb/Th ratios for the presence of common Pb may be made using measured 204Pb counts, or the 207Pb or 208Pb counts following subtraction from these of the radiogenic component. Common-Pb corrections may be made automatically, using a (user-specified) common-Pb isotopic composition appropriate for that on the sample surface, or for that incorporated within the mineral at the time of its crystallization, depending on whether the 204Pb count rate determined for the Unknown is substantially higher than the average 204Pb count rate for all session standards. Pb/U inter-element fractionation corrections are determined using an interactive loge-loge plot of common-Pb-corrected 206Pb/238U ratios against any nominated fractionation-sensitive species pair (commonly 238U16O+/238U+) for session standards. Also displayed with this plot are calculated Pb/U and Pb/Th calibration line regression slopes, y-intercepts, calibration uncertainties, standard 204Pb- and 208Pb-corrected 207Pb/206Pb dates and other parameters useful for assessment of the calibration-line data. Calibrated data for Unknowns may be automatically grouped according to calculated date and displayed in color on interactive Wetherill Concordia, Tera-Wasserburg Concordia, Linearized Gaussian ("Probability Paper") and Gaussian-summation probability density diagrams.

  10. Regulation of synaptic activity by snapin-mediated endolysosomal transport and sorting

    PubMed Central

    Di Giovanni, Jerome; Sheng, Zu-Hang

    2015-01-01

    Recycling synaptic vesicles (SVs) transit through early endosomal sorting stations, which raises a fundamental question: are SVs sorted toward endolysosomal pathways? Here, we used snapin mutants as tools to assess how endolysosomal sorting and trafficking impact presynaptic activity in wild-type and snapin−/− neurons. Snapin acts as a dynein adaptor that mediates the retrograde transport of late endosomes (LEs) and interacts with dysbindin, a subunit of the endosomal sorting complex BLOC-1. Expressing dynein-binding defective snapin mutants induced SV accumulation at presynaptic terminals, mimicking the snapin−/− phenotype. Conversely, over-expressing snapin reduced SV pool size by enhancing SV trafficking to the endolysosomal pathway. Using a SV-targeted Ca2+ sensor, we demonstrate that snapin–dysbindin interaction regulates SV positional priming through BLOC-1/AP-3-dependent sorting. Our study reveals a bipartite regulation of presynaptic activity by endolysosomal trafficking and sorting: LE transport regulates SV pool size, and BLOC-1/AP-3-dependent sorting fine-tunes the Ca2+ sensitivity of SV release. Therefore, our study provides new mechanistic insights into the maintenance and regulation of SV pool size and synchronized SV fusion through snapin-mediated LE trafficking and endosomal sorting. PMID:26108535

  11. CellSort: a support vector machine tool for optimizing fluorescence-activated cell sorting and reducing experimental effort.

    PubMed

    Yu, Jessica S; Pertusi, Dante A; Adeniran, Adebola V; Tyo, Keith E J

    2017-03-15

    High throughput screening by fluorescence activated cell sorting (FACS) is a common task in protein engineering and directed evolution. It can also be a rate-limiting step if high false positive or negative rates necessitate multiple rounds of enrichment. Current FACS software requires the user to define sorting gates by intuition and is practically limited to two dimensions. In cases when multiple rounds of enrichment are required, the software cannot forecast the enrichment effort required. We have developed CellSort, a support vector machine (SVM) algorithm that identifies optimal sorting gates based on machine learning using positive and negative control populations. CellSort can take advantage of more than two dimensions to enhance the ability to distinguish between populations. We also present a Bayesian approach to predict the number of sorting rounds required to enrich a population from a given library size. This Bayesian approach allowed us to determine strategies for biasing the sorting gates in order to reduce the required number of enrichment rounds. This algorithm should be generally useful for improving sorting outcomes and reducing effort when using FACS. Source code available at http://tyolab.northwestern.edu/tools/ . k-tyo@northwestern.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
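
    A minimal sketch of an SVM-derived sorting gate in the spirit of CellSort: train on positive and negative control populations in more than two fluorescence dimensions, then gate new events with the learned classifier. The simulated channels and populations are illustrative; the actual tool is at the URL above.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Illustrative 3-channel fluorescence data for negative and positive control populations.
    rng = np.random.default_rng(3)
    negative = rng.lognormal(mean=1.0, sigma=0.4, size=(2000, 3))
    positive = rng.lognormal(mean=1.6, sigma=0.4, size=(2000, 3))
    X = np.log10(np.vstack([negative, positive]))
    y = np.array([0] * 2000 + [1] * 2000)

    # Learn a gate in all three dimensions from the control populations.
    gate = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    gate.fit(X, y)

    # Apply the learned gate to new events: keep those classified as positive.
    events = np.log10(rng.lognormal(mean=1.3, sigma=0.5, size=(5000, 3)))
    keep = gate.predict(events) == 1
    print("fraction of events sorted into the positive gate:", keep.mean())
    ```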

  12. Effects of Intertidal Harvest Practices on Levels of Vibrio parahaemolyticus and Vibrio vulnificus Bacteria in Oysters.

    PubMed

    Jones, J L; Kinsey, T P; Johnson, L W; Porso, R; Friedman, B; Curtis, M; Wesighan, P; Schuster, R; Bowers, J C

    2016-08-01

    Vibrio parahaemolyticus and Vibrio vulnificus can grow rapidly in shellfish subjected to ambient air conditions, such as during intertidal exposure. In this study, levels of total and pathogenic (tdh(+) and/or trh(+)) V. parahaemolyticus and total V. vulnificus were determined in oysters collected from two study locations where intertidal harvest practices are common. Samples were collected directly off intertidal flats, after exposure (ambient air [Washington State] or refrigerated [New Jersey]), and after reimmersion by natural tidal cycles. Samples were processed using a most-probable-number (MPN) real-time PCR method for total and pathogenic V. parahaemolyticus or V. vulnificus. In Washington State, the mean levels of V. parahaemolyticus increased 1.38 log MPN/g following intertidal exposure and dropped 1.41 log MPN/g after reimmersion for 1 day, but the levels were dependent upon the container type utilized. Pathogenic V. parahaemolyticus levels followed a similar trend. However, V. vulnificus levels increased 0.10 log MPN/g during intertidal exposure in Washington but decreased by >1 log MPN/g after reimmersion. In New Jersey, initial levels of all vibrios studied were not significantly altered during the refrigerated sorting and containerizing process. However, there was an increase in levels after the first day of reimmersion by 0.79, 0.72, 0.92, and 0.71 log MPN/g for total, tdh(+) and trh(+) V. parahaemolyticus, and V. vulnificus, respectively. The levels of all targets decreased to those similar to background after a second day of reimmersion. These data indicate that the intertidal harvest and handling practices for oysters that were studied in Washington and New Jersey do not increase the risk of illness from V. parahaemolyticus or V. vulnificus. Vibrio parahaemolyticus and Vibrio vulnificus are the leading causes of seafood-associated infectious morbidity and mortality in the United States. Vibrio spp. can grow rapidly in shellfish subjected to ambient air conditions, such as during periods of intertidal exposure. When oysters are submersed with the incoming tide, the vibrios can be purged. However, data on the rates of increase and purging during intertidal harvest are scarce, which limits the accuracy of risk assessments. The objective of this study was to help fill these data gaps by determining the levels of total and pathogenic (tdh(+) and/or trh(+)) V. parahaemolyticus and V. vulnificus in oysters from two locations where intertidal harvest practices are common, using the current industry practices. The data generated provide insight into the responses of Vibrio spp. to relevant practices of the industry and public health, which can be incorporated into risk management decisions. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  13. Designing a monitoring network for contaminated ground water in fractured chalk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nativ, R.; Adar, E.M.; Becker, A.

    1999-01-01

    One of the challenges of monitoring network design in a fractured rock setting is the heterogeneity of the rocks. This paper summarizes the activities and problems associated with the monitoring of contaminated groundwater in porous, low-permeability fractured chalk in the Negev Desert, Israel. Preferential flow documented in the study area required siting the monitoring boreholes in the predominant fracture systems. Lineaments traced from aerial photographs were examined in the field to sort out the large-extension, through-going, multilayer fracture systems crossing the study area. At each proposed drilling site, these fractures were exposed below the sediment cover using trenches. Slanted boreholes were drilled at a distance from the fracture systems so that each borehole would intersect the targeted fracture plane below the water table. Based on their short recovery period and contaminated ground water, these newly drilled, fracture-oriented boreholes appeared to be better connected to preferential flowpaths crossing the industrial site than the old boreholes existing on site. Other considerations concerning the drilling and logging of monitoring boreholes in a fractured media were: (1) coring provides better documentation of the vertical fracture distribution, but dry augering is less costly and enables immediate ground water sampling and the sampling of vadose rock for contaminant analysis; (2) caliper and TV camera logs appear to provide only partial information regarding the vertical fracture distribution; and (3) the information gained by deepening the monitoring boreholes and testing fractures crossing their uncased walls has to be carefully weighed against the risk of potential cross-contamination through the monitoring boreholes, which is enhanced in fractured media.

  14. Categorizing Variations of Student-Implemented Sorting Algorithms

    ERIC Educational Resources Information Center

    Taherkhani, Ahmad; Korhonen, Ari; Malmi, Lauri

    2012-01-01

    In this study, we examined freshmen students' sorting algorithm implementations in data structures and algorithms' course in two phases: at the beginning of the course before the students received any instruction on sorting algorithms, and after taking a lecture on sorting algorithms. The analysis revealed that many students have insufficient…

  15. COST EVALUATION OF AUTOMATED AND MANUAL POST- CONSUMER PLASTIC BOTTLE SORTING SYSTEMS

    EPA Science Inventory

    This project evaluates, on the basis of performance and cost, two Automated BottleSort® sorting systems for post-consumer commingled plastic containers developed by Magnetic Separation Systems. This study compares the costs to sort mixed bales of post-consumer plastic at these t...

  16. Application of visible spectroscopy in waste sorting

    NASA Astrophysics Data System (ADS)

    Spiga, Philippe; Bourely, Antoine

    2011-10-01

    Today, waste recycling (bottles, papers, ...) is a mechanical operation: the waste is crushed, fused and agglomerated in order to obtain new manufactured products (e.g. new bottles, clothes, ...). Plastics recycling is the main application of the color sorting process. The colorless plastics recovered are more valuable than the colored plastics. Other emerging applications are in paper sorting, where the main goal is to sort dyed paper from white paper. Up to now, Pellenc Selective Technologies has manufactured color sorting machines based on RGB cameras. Three dimensions (red, green and blue) are no longer sufficient to detect low quantities of dye in the considered waste. In order to increase the efficiency of color detection, a new sorting machine, based on visible spectroscopy, has been developed. This paper presents the principles of the two approaches and their difference in terms of sorting performance, making visible spectroscopy a clear winner.

  17. MetaSort untangles metagenome assembly by reducing microbial community complexity

    PubMed Central

    Ji, Peifeng; Zhang, Yanming; Wang, Jinfeng; Zhao, Fangqing

    2017-01-01

    Most current approaches to analyse metagenomic data rely on reference genomes. Novel microbial communities extend far beyond the coverage of reference databases and de novo metagenome assembly from complex microbial communities remains a great challenge. Here we present a novel experimental and bioinformatic framework, metaSort, for effective construction of bacterial genomes from metagenomic samples. MetaSort provides a sorted mini-metagenome approach based on flow cytometry and single-cell sequencing methodologies, and employs new computational algorithms to efficiently recover high-quality genomes from the sorted mini-metagenome complemented by the original metagenome. Through extensive evaluations, we demonstrated that metaSort has an excellent and unbiased performance on genome recovery and assembly. Furthermore, we applied metaSort to an unexplored microflora colonizing the surface of marine kelp and successfully recovered 75 high-quality genomes at one time. This approach will greatly improve access to microbial genomes from complex or novel communities. PMID:28112173

  18. Ubiquitin-dependent sorting of integral membrane proteins for degradation in lysosomes

    PubMed Central

    Piper, Robert C.

    2007-01-01

    Summary The pathways that deliver newly synthesized proteins that reside in lysosomes are well understood by comparison with our knowledge of how integral membrane proteins are sorted and delivered to the lysosome for degradation. Many membrane proteins are sorted to lysosomes following ubiquitination, which provides a sorting signal that can operate for sorting at the TGN (trans-Golgi network), at the plasma membrane or at the endosome for delivery into lumenal vesicles. Candidate multicomponent machines that can potentially move ubiquitinated integral membrane cargo proteins have been identified, but much work is still required to ascertain which of these candidates directly recognizes ubiquitinated cargo and what they do with cargo after recognition. In the case of the machinery required for sorting into the lumenal vesicles of endosomes, other functions have also been determined including a link between sorting and movement of endosomes along microtubules. PMID:17689064

  19. A Computerized English-Spanish Correlation Index to Five Biomedical Library Classification Schemes Based on MeSH*

    PubMed Central

    Muench, Eugene V.

    1971-01-01

    A computerized English/Spanish correlation index to five biomedical library classification schemes and computerized English/Spanish and Spanish/English listings of MeSH are described. The index was accomplished by supplying appropriate classification numbers of five classification schemes (National Library of Medicine; Library of Congress; Dewey Decimal; Cunningham; Boston Medical) to MeSH and a Spanish translation of MeSH. The data were keypunched, merged on magnetic tape, and sorted in a computer alphabetically by English and Spanish subject headings and sequentially by classification number. Some benefits and uses of the index are: a complete index to classification schemes based on MeSH terms; a tool for conversion of classification numbers when reclassifying collections; a Spanish index and a crude Spanish translation of five classification schemes; a data base for future applications, e.g., automatic classification. Other classification schemes, such as the UDC, and translations of MeSH into other languages can be added. PMID:5172471

  20. Web application for automatic prediction of gene translation elongation efficiency.

    PubMed

    Sokolov, Vladimir; Zuraev, Bulat; Lashin, Sergei; Matushkin, Yury

    2015-09-03

    Expression efficiency is one of the major characteristics describing genes in various modern investigations. Expression efficiency of genes is regulated at various stages: transcription, translation, posttranslational protein modification and others. In this study, a special EloE (Elongation Efficiency) web application is described. The EloE sorts an organism's genes in descending order of their theoretical rate of the elongation stage of translation, based on the analysis of their nucleotide sequences. The obtained theoretical data have a significant correlation with available experimental data on gene expression in various organisms. In addition, the program identifies preferential codons in the organism's genes and defines the distribution of potential secondary-structure energy in the 5' and 3' regions of mRNA. The EloE can be useful in preliminary estimation of translation elongation efficiency for genes for which experimental data are not available yet. Some results can be used, for instance, in other programs modeling artificial genetic structures in genetic engineering experiments.
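
    A hypothetical Python sketch of the sorting step, using a crude genome-wide codon-frequency score as a stand-in for EloE's theoretical elongation rate; the gene sequences and scoring are illustrative assumptions, not the program's actual model.

    ```python
    from collections import Counter

    def codons(seq):
        """Split a coding sequence into codons, ignoring any trailing partial codon."""
        return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

    def rank_by_elongation_proxy(genes):
        """Sort genes in descending order of a simple codon-usage score
        (a stand-in for a theoretical elongation rate)."""
        usage = Counter(c for seq in genes.values() for c in codons(seq))
        total = sum(usage.values())
        freq = {c: n / total for c, n in usage.items()}
        score = {name: sum(freq[c] for c in codons(seq)) / max(len(codons(seq)), 1)
                 for name, seq in genes.items()}
        return sorted(score, key=score.get, reverse=True)

    # Illustrative toy genes (not real sequences).
    genes = {
        "geneA": "ATGAAAGAAGAAGAATAA",
        "geneB": "ATGCGTCGCCGGAGATAA",
        "geneC": "ATGAAAGAATTGTAA",
    }
    print(rank_by_elongation_proxy(genes))
    ```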

  1. Big Data Analysis of Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  2. Characterization of Scots pine stump-root biomass as feed-stock for gasification.

    PubMed

    Eriksson, Daniel; Weiland, Fredrik; Hedman, Henry; Stenberg, Martin; Öhrman, Olov; Lestander, Torbjörn A; Bergsten, Urban; Öhman, Marcus

    2012-01-01

    The main objective was to explore the potential for gasifying Scots pine stump-root biomass (SRB). Washed thin roots, coarse roots, stump heartwood and stump sapwood were characterized (solid wood, milling and powder characteristics) before and during industrial processing. Non-slagging gasification of the SRB fuels and a reference stem wood was successful, and the gasification parameters (synthesis gas and bottom ash characteristics) were similar. However, the heartwood fuel had high levels of extractives (≈19%) compared to the other fuels (2-8%) and thereby ≈16% higher energy contents but caused disturbances during milling, storage, feeding and gasification. SRB fuels could be sorted automatically according to their extractives and moisture contents using near-infrared spectroscopy, and their amounts and quality in forests can be predicted using routinely collected stand data, biomass functions and drill core analyses. Thus, SRB gasification has great potential and the proposed characterizations exploit it. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Sound and speech detection and classification in a Health Smart Home.

    PubMed

    Fleury, A; Noury, N; Vacher, M; Glasson, H; Seri, J F

    2008-01-01

    Improvements in medicine increase life expectancy in the world and create a new bottleneck at the entrance of specialized and equipped institutions. To allow elderly people to stay at home, researchers work on ways to monitor them in their own environment, with non-invasive sensors. To meet this goal, smart homes, equipped with lots of sensors, deliver information on the activities of the person and can help detect distress situations. In this paper, we present a global speech and sound recognition system that can be set up in a flat. We placed eight microphones in the Health Smart Home of Grenoble (a real living flat of 47 m²) and we automatically analyze and sort out the different sounds recorded in the flat and the speech uttered (to detect normal or distress French sentences). We introduce the methods for the sound and speech recognition, the post-processing of the data and finally the experimental results obtained in real conditions in the flat.

  4. An Intelligent Gear Fault Diagnosis Methodology Using a Complex Wavelet Enhanced Convolutional Neural Network

    PubMed Central

    Sun, Weifang; Yao, Bin; Zeng, Nianyin; He, Yuchao; Cao, Xincheng; He, Wangpeng

    2017-01-01

    As a typical example of large and complex mechanical systems, rotating machinery is prone to diversified sorts of mechanical faults. Among these faults, one of the prominent sources of malfunction is the gear transmission chain. Although fault signatures can be collected via vibration signals, they are always submerged in overwhelming interfering content. Therefore, identifying the critical fault's characteristic signal is far from an easy task. In order to improve the recognition accuracy of a fault's characteristic signal, a novel intelligent fault diagnosis method is presented. In this method, a dual-tree complex wavelet transform (DTCWT) is employed to acquire the multiscale signal's features. In addition, a convolutional neural network (CNN) approach is utilized to automatically recognise a fault feature from the multiscale signal features. The experimental results for gear fault recognition show the feasibility and effectiveness of the proposed method, especially for weak gear fault features. PMID:28773148

  5. Nurse practitioner-based sign-out system to facilitate patient communication on a neurosurgical service: a pilot study with recommendations.

    PubMed

    Rabinovitch, Deborah L; Hamill, Melinda; Zanchetta, Clauda; Bernstein, Mark

    2009-12-01

    Failure to communicate important patient information between physicians causes medical errors and adverse patient events. On-call neurosurgery physicians at the Toronto Western Hospital do not know the medical details of all the patients that they are covering at night because they do not care for the entire service of patients during the day. Because there is no formal handover system to transfer patient information to the on-call physician, a nurse practitioner-based sign-out system was recently introduced. Its effectiveness for communication was evaluated with preintervention-postintervention questionnaires and by recording daily logins. There was a statistically significant decrease in number of logins after 8 weeks of use (p = .05, Fisher's exact test), and the tool was abandoned after 16 weeks. Modifications identified to improve the system include the ability to sort by attending physician and to automatically populate the list with new patients. Effective communication is important for reducing medical errors, and perhaps these modifications will facilitate this important endeavor.

  6. A novel scene-based non-uniformity correction method for SWIR push-broom hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Hu, Bin-Lin; Hao, Shi-Jing; Sun, De-Xin; Liu, Yin-Nian

    2017-09-01

    A novel scene-based non-uniformity correction (NUC) method for short-wavelength infrared (SWIR) push-broom hyperspectral sensors is proposed and evaluated. This method relies on the assumption that for each band there will be ground objects with similar reflectance to form uniform regions when a sufficient number of scanning lines are acquired. The uniform regions are extracted automatically through a sorting algorithm, and are used to compute the corresponding NUC coefficients. SWIR hyperspectral data from airborne experiment are used to verify and evaluate the proposed method, and results show that stripes in the scenes have been well corrected without any significant information loss, and the non-uniformity is less than 0.5%. In addition, the proposed method is compared to two other regular methods, and they are evaluated based on their adaptability to the various scenes, non-uniformity, roughness and spectral fidelity. It turns out that the proposed method shows strong adaptability, high accuracy and efficiency.
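
    A simplified numpy sketch of one-point scene-based NUC for a single band of a push-broom sensor: the mid-range samples of each cross-track detector act as a pseudo-uniform region from which per-detector gains are derived. The quantile-based uniform-region criterion is a crude stand-in for the sorting algorithm described in the paper.

    ```python
    import numpy as np

    def scene_based_nuc(band, low=0.25, high=0.75):
        """band: (n_lines, n_detectors) raw push-broom data for one spectral band.
        Returns gain-corrected data and the per-detector gains."""
        # For each detector, keep only its mid-range samples as a pseudo-uniform region.
        lo = np.quantile(band, low, axis=0)
        hi = np.quantile(band, high, axis=0)
        masked = np.where((band >= lo) & (band <= hi), band, np.nan)
        col_mean = np.nanmean(masked, axis=0)     # per-detector response in the uniform region
        gain = col_mean.mean() / col_mean         # normalize each detector to the band average
        return band * gain, gain

    # Illustrative band: smooth along-track scene plus fixed per-detector striping.
    rng = np.random.default_rng(4)
    scene = np.outer(np.linspace(80, 120, 500), np.ones(64))
    stripes = rng.normal(1.0, 0.05, 64)           # column-wise non-uniformity
    raw = scene * stripes + rng.normal(0, 0.5, scene.shape)
    corrected, gain = scene_based_nuc(raw)
    print("residual column-to-column std:", corrected.mean(axis=0).std())
    ```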

  7. Design of a mobile brain computer interface-based smart multimedia controller.

    PubMed

    Tseng, Kevin C; Lin, Bor-Shing; Wong, Alice May-Kuen; Lin, Bor-Shyh

    2015-03-06

    Music is a way of expressing our feelings and emotions. Suitable music can positively affect people. However, current multimedia control methods, such as manual selection or automatic random mechanisms, which are now applied broadly in MP3 and CD players, cannot adaptively select suitable music according to the user's physiological state. In this study, a brain computer interface-based smart multimedia controller was proposed to select music in different situations according to the user's physiological state. Here, a commercial mobile tablet was used as the multimedia platform, and a wireless multi-channel electroencephalograph (EEG) acquisition module was designed for real-time EEG monitoring. A smart multimedia control program built in the multimedia platform was developed to analyze the user's EEG features and select music according to his/her state. The relationship between the user's state and music sorted by listener's preference was also examined in this study. The experimental results show that real-time music biofeedback according to a user's EEG features may positively improve the user's attention state.

  8. Automated mango fruit assessment using fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit product next to pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified to the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders, a method that is inconsistent, inefficient and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making procedure based on a minimum entropy formulation is used to analyse the data and classify the mango fruit. The proposed method is capable of differentiating three different grades of mango fruit automatically with an overall accuracy of 77.78% compared with sorting by human graders. This method was found to be helpful for application in the current agricultural industry.
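
    A toy Python sketch of fuzzy grading with triangular membership functions; the size ranges and grade labels are invented for illustration and do not reproduce the minimum-entropy formulation used in the paper.

    ```python
    def triangular(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def grade_mango(size_mm):
        """Return the grade with the highest membership for a measured fruit size."""
        memberships = {
            "grade C": triangular(size_mm, 60, 75, 90),
            "grade B": triangular(size_mm, 80, 95, 110),
            "grade A": triangular(size_mm, 100, 115, 130),
        }
        return max(memberships, key=memberships.get), memberships

    print(grade_mango(104))   # classified as grade B, with partial membership in grade A
    ```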

  9. A general method for generating bathymetric data for hydrodynamic computer models

    USGS Publications Warehouse

    Burau, J.R.; Cheng, R.T.

    1989-01-01

    To generate water depth data from randomly distributed bathymetric data for numerical hydrodynamic models, raw input data from field surveys, water depth data digitized from nautical charts, or a combination of the two are sorted to give an ordered data set on which a search algorithm is used to isolate data for interpolation. Water depths at locations required by hydrodynamic models are interpolated from the bathymetric data base using linear or cubic shape functions used in the finite-element method. The bathymetric database organization and preprocessing, the search algorithm used in finding the bounding points for interpolation, the mathematics of the interpolation formulae, and the features of the automatic generation of water depths at hydrodynamic model grid points are included in the analysis. This report includes documentation of two computer programs which are used to (1) organize the input bathymetric data and (2) interpolate depths for hydrodynamic models. An example of computer program operation is drawn from a realistic application to the San Francisco Bay estuarine system. (Author's abstract)
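
    A small Python sketch of the interpolation step with linear (finite-element) shape functions on a triangle: the depth at a grid point is the barycentric-weighted average of the depths at the three bounding bathymetric points. The coordinates and depths are illustrative.

    ```python
    import numpy as np

    def linear_shape_functions(p, tri):
        """Barycentric (linear FE shape-function) weights of point p inside triangle tri (3x2)."""
        a, b, c = tri
        m = np.array([[b[0] - a[0], c[0] - a[0]],
                      [b[1] - a[1], c[1] - a[1]]])
        s, t = np.linalg.solve(m, np.asarray(p) - a)
        return np.array([1.0 - s - t, s, t])

    def interpolate_depth(p, tri, depths):
        """Depth at grid point p from the depths at the triangle vertices."""
        return float(linear_shape_functions(p, tri) @ np.asarray(depths))

    # Illustrative bathymetric points (x, y) and depths at the three vertices.
    tri = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    depths = [5.0, 12.0, 8.0]
    print(interpolate_depth((3.0, 3.0), tri, depths))   # weighted average of vertex depths
    ```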

  10. What every teacher needs to know about clinical reasoning.

    PubMed

    Eva, Kevin W

    2005-01-01

    One of the core tasks assigned to clinical teachers is to enable students to sort through a cluster of features presented by a patient and accurately assign a diagnostic label, with the development of an appropriate treatment strategy being the end goal. Over the last 30 years there has been considerable debate within the health sciences education literature regarding the model that best describes how expert clinicians generate diagnostic decisions. The purpose of this essay is to provide a review of the research literature on clinical reasoning for frontline clinical teachers. The strengths and weaknesses of different approaches to clinical reasoning will be examined using one of the core divides between various models (that of analytic (i.e. conscious/controlled) versus non-analytic (i.e. unconscious/automatic) reasoning strategies) as an orienting framework. Recent work suggests that clinical teachers should stress the importance of both forms of reasoning, thereby enabling students to marshal reasoning processes in a flexible and context-specific manner. Specific implications are drawn from this overview for clinical teachers.

  11. Word Sorts for General Music Classes

    ERIC Educational Resources Information Center

    Cardany, Audrey Berger

    2015-01-01

    Word sorts are standard practice for aiding children in acquiring skills in English language arts. When included in the general music classroom, word sorts may aid students in acquiring a working knowledge of music vocabulary. The author shares a word sort activity drawn from vocabulary in John Lithgow's children's book "Never Play…

  12. Geometry-aware multiscale image registration via OBBTree-based polyaffine log-demons.

    PubMed

    Seiler, Christof; Pennec, Xavier; Reyes, Mauricio

    2011-01-01

    Non-linear image registration is an important tool in many areas of image analysis. For instance, in morphometric studies of a population of brains, free-form deformations between images are analyzed to describe the structural anatomical variability. Such a simple deformation model is justified by the absence of an easy expressible prior about the shape changes. Applying the same algorithms used in brain imaging to orthopedic images might not be optimal due to the difference in the underlying prior on the inter-subject deformations. In particular, using an un-informed deformation prior often leads to local minima far from the expected solution. To improve robustness and promote anatomically meaningful deformations, we propose a locally affine and geometry-aware registration algorithm that automatically adapts to the data. We build upon the log-domain demons algorithm and introduce a new type of OBBTree-based regularization in the registration with a natural multiscale structure. The regularization model is composed of a hierarchy of locally affine transformations via their logarithms. Experiments on mandibles show improved accuracy and robustness when used to initialize the demons, and even similar performance by direct comparison to the demons, with a significantly lower degree of freedom. This closes the gap between polyaffine and non-rigid registration and opens new ways to statistically analyze the registration results.

  13. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper.

    PubMed

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-06-10

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of a borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and remote reflecting interface detection capability of the system. To reduce the blind zone length and detect the near reflecting interface, a full-bridge acoustic emission technique based on a bootstrap gate driver (BGD) and metal-oxide-semiconductor field effect transistor (MOSFET) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and propagation distance of ultrasonic waves are determined. A signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time, is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved.
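
    The compensation step can be pictured as a time-varying gain: the later an echo arrives, the more it is amplified, so that near and far reflections end up in a similar amplitude range. The sketch below is a generic software illustration of that idea; the sound speed, attenuation coefficient, and signal values are invented for the example and are not the paper's circuit parameters.

```python
import numpy as np

def time_varying_gain(echo, fs, c=1480.0, alpha_db_per_m=0.5):
    """Apply a gain that grows with arrival time so near and far echoes
    end up with similar amplitudes (reduced dynamic range).

    echo : sampled receive signal
    fs   : sampling rate (Hz)
    c    : assumed sound speed in the borehole fluid (m/s)
    alpha_db_per_m : assumed attenuation coefficient
    """
    t = np.arange(len(echo)) / fs
    one_way_range = c * t / 2.0                       # round-trip time -> range
    # Compensate geometric spreading (~1/r) plus exponential attenuation.
    spreading = np.maximum(one_way_range, 1e-3)       # avoid divide-by-zero near t = 0
    attenuation_db = alpha_db_per_m * 2.0 * one_way_range
    gain = spreading * 10.0 ** (attenuation_db / 20.0)
    return echo * gain

# Example: a weak late echo is boosted relative to a strong early one
fs = 1_000_000
sig = np.zeros(2000)
sig[200], sig[1800] = 1.0, 0.05
compensated = time_varying_gain(sig, fs)
print(compensated[200], compensated[1800])
```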

  14. Acoustic Emission and Echo Signal Compensation Techniques Applied to an Ultrasonic Logging-While-Drilling Caliper

    PubMed Central

    Yao, Yongchao; Ju, Xiaodong; Lu, Junqiang; Men, Baiyong

    2017-01-01

    A logging-while-drilling (LWD) caliper is a tool used for the real-time measurement of a borehole diameter in oil drilling engineering. This study introduces the mechanical structure and working principle of a new LWD caliper based on ultrasonic distance measurement (UDM). The detection range is a major performance index of a UDM system. This index is determined by the blind zone length and remote reflecting interface detection capability of the system. To reduce the blind zone length and detect the near reflecting interface, a full-bridge acoustic emission technique based on a bootstrap gate driver (BGD) and metal-oxide-semiconductor field effect transistor (MOSFET) is designed by analyzing the working principle and impedance characteristics of a given piezoelectric transducer. To detect the remote reflecting interface and reduce the dynamic range of the received echo signals, the relationships between the echo amplitude and propagation distance of ultrasonic waves are determined. A signal compensation technique based on time-varying amplification theory, which can automatically change the gain according to the echo arrival time, is designed. Lastly, the aforementioned techniques and corresponding circuits are experimentally verified. Results show that the blind zone length in the UDM system of the LWD caliper is significantly reduced and the capability to detect the remote reflecting interface is considerably improved. PMID:28604603

  15. Comparison between deterministic and statistical wavelet estimation methods through predictive deconvolution: Seismic to well tie example from the North Sea

    NASA Astrophysics Data System (ADS)

    de Macedo, Isadora A. S.; da Silva, Carolina B.; de Figueiredo, J. J. S.; Omoboya, Bode

    2017-01-01

    Wavelet estimation as well as seismic-to-well tie procedures are at the core of every seismic interpretation workflow. In this paper we perform a comparative study of wavelet estimation methods for seismic-to-well tie. Two approaches to wavelet estimation are discussed: a deterministic estimation, based on both seismic and well log data, and a statistical estimation, based on predictive deconvolution and the classical assumptions of the convolutional model, which provides a minimum-phase wavelet. Our algorithms for both wavelet estimation methods introduce a semi-automatic approach to determine the optimum parameters of deterministic and statistical wavelet estimation and, further, to estimate the optimum seismic wavelets by searching for the highest correlation coefficient between the recorded trace and the synthetic trace, when the time-depth relationship is accurate. Tests with numerical data, in which deterministic and statistical wavelet estimation are compared in detail, yield qualitative conclusions that are useful for seismic inversion and interpretation of field data, especially for the field data example. The feasibility of this approach is verified on real seismic and well data from the Viking Graben field, North Sea, Norway. Our results also show the influence of washout zones in the well log data on the quality of the well-to-seismic tie.
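
    The wavelet-selection criterion described above (maximize the correlation coefficient between the recorded trace and a synthetic trace) can be sketched in a few lines. Everything below, including the Ricker-wavelet stand-in and the toy reflectivity series, is an illustrative assumption rather than the authors' algorithm.

```python
import numpy as np

def ricker(f_peak, dt, length=0.128):
    """Zero-phase Ricker wavelet (illustrative stand-in for the estimated wavelets)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f_peak * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def best_wavelet(reflectivity, recorded, dt, candidate_peaks=(15, 25, 35, 45)):
    """Pick the candidate wavelet whose synthetic trace best matches the recorded trace."""
    best = None
    for f in candidate_peaks:
        w = ricker(f, dt)
        synthetic = np.convolve(reflectivity, w, mode="same")
        # Correlation coefficient between synthetic and recorded trace
        r = np.corrcoef(synthetic, recorded)[0, 1]
        if best is None or r > best[1]:
            best = (f, r)
    return best  # (peak frequency, correlation)

# Toy example: the "recorded" trace was built with a 25 Hz wavelet
dt = 0.002
refl = np.zeros(500)
refl[[100, 230, 360]] = [0.3, -0.2, 0.25]
recorded = np.convolve(refl, ricker(25, dt), mode="same")
print(best_wavelet(refl, recorded, dt))   # -> (25, ~1.0)
```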

  16. The effects of hoechst 33342 staining and the male sample donor on the sorting efficiency of canine spermatozoa.

    PubMed

    Rodenas, C; Lucas, X; Tarantini, T; Del Olmo, D; Roca, J; Vazquez, J M; Martinez, E A; Parrilla, I

    2014-02-01

    The aim of this study was to evaluate the influence of Hoechst 33342 (H-42) concentration and of the male donor on the efficiency of sex-sorting procedure in canine spermatozoa. Semen samples from six dogs (three ejaculates/dog) were diluted to 100 × 10(6) sperm/ml, split into four aliquots, stained with increasing H-42 concentrations (5, 7.5, 10 and 12.5 μl, respectively) and sorted by flow cytometry. The rates of non-viable (FDA+), oriented (OS) and selected spermatozoa (SS), as well as the average sorting rates (SR, sorted spermatozoa/s), were used to determine the sorting efficiency. The effects of the sorting procedure on the quality of sorted spermatozoa were evaluated in terms of total motility (TM), percentage of viable spermatozoa (spermatozoa with membrane and acrosomal integrity) and percentage of spermatozoa with reacted/damaged acrosomes. X- and Y-chromosome-bearing sperm populations were identified in all of the samples stained with 7.5, 10 and 12.5 μl of H-42, while these two populations were only identified in 77.5% of samples stained with 5 μl. The values of OS, SS and SR were influenced by the male donor (p < 0.01) but not by the H-42 concentration used. The quality of sorted sperm samples immediately after sorting was similar to that of fresh samples, while centrifugation resulted in significant reduction (p < 0.05) in TM and in the percentage of viable spermatozoa and a significant increase (p < 0.01) in the percentage of spermatozoa with damage/reacted acrosomes. In conclusion, the sex-sorting of canine spermatozoa by flow cytometry can be performed successfully using H-42 concentrations between 7.5 and 12.5 μl. The efficiency of the sorting procedure varies based on the dog from which the sperm sample derives. © 2013 Blackwell Verlag GmbH.

  17. 3-D Vp/Vs Ratio Distribution in the Geothermal Reservoir at Basel, Switzerland, from Microseismic Data

    NASA Astrophysics Data System (ADS)

    Kummerow, J.; Reshetnikov, A.; Häring, M.; Asanuma, H.

    2012-12-01

    Thousands of microseismic events occurred during and after the stimulation of the 4.5km deep Basel 1 well at the Deep Heat Mining Project in Basel, Switzerland, in December 2006. The located seismicity extends about 1km in the vertical direction and about 1km in the NNW-SSE direction, consistent with the orientation of the maximum horizontal stress. In this study, we analyze 2100 events with magnitudes Mw>0.0, which were recorded by six borehole seismometers between December 2, 2006, and June 7, 2007. We first identify event multiplets based on waveform similarity and apply an automatic, iterative arrival time optimization to calculate high-precision P and S time picks for the multiplet events. Local estimates of the Vp/Vs ratio in the stimulated Basel geothermal reservoir are then obtained from the slope of the demeaned differential S versus P arrival times. The average value of Vp/Vs=1.70 is close to the characteristic reservoir value of 1.72, which was determined independently from sonic log measurements. Also, in the vicinity of the borehole, the depth distribution of Vp/Vs correlates well with the low-pass filtered sonic log data: Vp/Vs values are less than 1.70 at the top of the seismicity cloud at <3.9km depth, close to average at 4.0-4.4km depth, and exceed the value of 1.75 at larger depth (4.4-4.6km), consistent with the sonic log data. Furthermore, we observe a correlation of anomalous Vp/Vs values with zones of enhanced seismic reflectivity which were resolved by microseismic reflection imaging. Away from the borehole, increased Vp/Vs ratios also seem to correlate with domains of high event density, possibly indicating fluid migration paths.
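
    The slope-based estimate mentioned above can be reproduced with a short least-squares fit: after demeaning, the differential S arrival times are regressed against the differential P arrival times, and the slope is the Vp/Vs ratio. The synthetic arrival-time data below are invented for illustration.

```python
import numpy as np

def vp_vs_from_differential_times(dtp, dts):
    """Estimate Vp/Vs as the least-squares slope of demeaned differential
    S arrival times against demeaned differential P arrival times."""
    dtp = np.asarray(dtp) - np.mean(dtp)
    dts = np.asarray(dts) - np.mean(dts)
    # Slope of the line dts = (Vp/Vs) * dtp, forced through the origin
    return float(np.dot(dtp, dts) / np.dot(dtp, dtp))

# Synthetic check: differential times generated with Vp/Vs = 1.70
rng = np.random.default_rng(0)
dtp = rng.normal(0.0, 0.05, 200)
dts = 1.70 * dtp + rng.normal(0.0, 0.002, 200)
print(round(vp_vs_from_differential_times(dtp, dts), 2))   # ~1.70
```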

  18. A genetic meta-algorithm-assisted inversion approach: hydrogeological study for the determination of volumetric rock properties and matrix and fluid parameters in unsaturated formations

    NASA Astrophysics Data System (ADS)

    Szabó, Norbert Péter

    2018-03-01

    An evolutionary inversion approach is suggested for the interpretation of nuclear and resistivity logs measured by direct-push tools in shallow unsaturated sediments. The efficiency of formation evaluation is improved by estimating simultaneously (1) the petrophysical properties that vary rapidly along a drill hole with depth and (2) the zone parameters that can be treated as constant, in one inversion procedure. In the workflow, the fractional volumes of water, air, matrix and clay are estimated in adjacent depths by linearized inversion, whereas the clay and matrix properties are updated using a float-encoded genetic meta-algorithm. The proposed inversion method provides an objective estimate of the zone parameters that appear in the tool response equations applied to solve the forward problem, which can significantly increase the reliability of the petrophysical model as opposed to setting these parameters arbitrarily. The global optimization meta-algorithm not only assures the best fit between the measured and calculated data but also gives a reliable solution, practically independent of the initial model, as laboratory data are unnecessary in the inversion procedure. The feasibility test uses engineering geophysical sounding logs observed in an unsaturated loessy-sandy formation in Hungary. The multi-borehole extension of the inversion technique is developed to determine the petrophysical properties and their estimation errors along a profile of drill holes. The genetic meta-algorithmic inversion method is recommended for hydrogeophysical logging applications of various kinds to automatically extract the volumetric ratios of rock and fluid constituents as well as the most important zone parameters in a reliable inversion procedure.
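
    As a rough illustration of the float-encoded genetic part of such a workflow, the sketch below evolves two constant zone parameters so that a hypothetical linear tool-response equation reproduces a synthetic log. The response equation, parameter names, and GA settings are assumptions made for the example, not the method of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear tool-response equation: a reading mixes a matrix value and
# a fluid value weighted by porosity (stand-in for the paper's response equations).
def forward(zone, porosity):
    matrix_value, fluid_value = zone
    return matrix_value * (1.0 - porosity) + fluid_value * porosity

def misfit(zone, porosity, measured):
    return float(np.mean((forward(zone, porosity) - measured) ** 2))

def float_encoded_ga(porosity, measured, pop=60, gens=150, bounds=(0.0, 3.0)):
    """Minimal float-encoded GA: truncation selection, blend crossover, Gaussian mutation."""
    population = rng.uniform(bounds[0], bounds[1], size=(pop, 2))
    for _ in range(gens):
        scores = np.array([misfit(ind, porosity, measured) for ind in population])
        parents = population[np.argsort(scores)[: pop // 2]]   # keep the better half
        children = []
        while len(children) < pop - len(parents):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            w = rng.uniform(0.0, 1.0)
            child = w * p1 + (1.0 - w) * p2                     # blend crossover
            child += rng.normal(0.0, 0.02, size=2)              # Gaussian mutation
            children.append(np.clip(child, *bounds))
        population = np.vstack([parents, np.array(children)])
    scores = np.array([misfit(ind, porosity, measured) for ind in population])
    return population[np.argmin(scores)]

# Synthetic test: data generated with matrix value 2.65 and fluid value 1.00
porosity = rng.uniform(0.05, 0.35, 40)
measured = forward((2.65, 1.00), porosity) + rng.normal(0.0, 0.005, 40)
print(float_encoded_ga(porosity, measured).round(2))   # should approach [2.65, 1.00]
```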

  19. P and S automatic picks for 3D earthquake tomography in NE Italy

    NASA Astrophysics Data System (ADS)

    Lovisa, L.; Bragato, P.; Gentili, S.

    2006-12-01

    Earthquake tomography is useful for studying structural and geological features of the crust. In particular, it uses P and S arrival times to reconstruct wave velocity fields and locate earthquake hypocenters. However, tomography requires a large effort to provide a high number of manual picks. On the other hand, many automatic picking methods have been proposed, but they are usually applied to preliminary processing of the data (fast alert and automatic bulletin generation) and are generally considered not reliable enough for tomography. In this work, we present and discuss the results of Vp, Vs and Vp/Vs tomographies obtained using automatic picks generated by the system TAPNEI (Gentili and Bragato, 2006), applied in NE Italy. Preliminarily, in order to estimate the error in comparison with the unknown true arrival times, an analysis of the picking quality is carried out. The tests have been performed using two datasets: the first is made up of 240 earthquakes automatically picked by TAPNEI; the second comprises the same earthquakes but manually picked (OGS database). The grid and the software used to perform the tomography (Sim28; Michelini and McEvilly, 1991) are the same in the two cases. Vp, Vs and Vp/Vs fields of the two tomographies and their differences are shown on vertical sections. In addition, the differences in earthquake locations are studied; in particular, the accuracy of the locations has been analyzed by estimating the distance of the hypocenter distributions with respect to the manual locations. The analysis also includes a qualitative comparison with an independent tomography (Gentile et al., 2000) performed using Simulps (Evans et al., 1994) on a set of 224 earthquakes accurately selected and manually relocated. The quality of the picks and the comparison with the tomography obtained from manual data suggest that earthquake tomography with automatic data can provide reliable results. We suggest the use of such data when a large quantity of recordings must be quickly analyzed to provide preliminary results (e.g., to decide about further data acquisition when using temporary networks) or when a sort of "real-time tomography" is required (e.g., continuous imaging of volcanoes during their activity). References: Evans, J.R., Eberhart-Phillips, D., and Thurber, C.H. (1994). User's manual for simulps12 for imaging Vp and Vp/Vs: a derivative of the Thurber tomographic inversion simul3 for local earthquake locations and explosions. U.S. Geol. Surv. Open File Report, 7 pp. Gentile, G.F., Bressan, G., Burlini, L., De Franco, R. (2000). Three-dimensional Vp and Vp/Vs models of the upper crust in the Friuli area (Northeastern Italy). Geophys. J. Int., 141, 457-478. Gentili, S. and Bragato, P.L. (2006). A neural-tree-based system for automatic location of earthquakes in Northeastern Italy. Journal of Seismology, 10(1), 73-89. Michelini, A., McEvilly, T.V. (1991). Seismological studies at Parkfield; I, Simultaneous inversion for velocity structure and hypocenters using cubic B-splines parameterization. Bulletin of the Seismological Society of America, 81(2), 524-552.

  20. Sorting out Ideas about Function

    ERIC Educational Resources Information Center

    Hillen, Amy F.; Malik, LuAnn

    2013-01-01

    Card sorting has the potential to provide opportunities for exploration of a variety of topics and levels. In a card-sorting task, each participant is presented with a set of cards--each of which depicts a relationship--and is asked to sort the cards into categories that make sense to him or her. The concept of function is critical to…

  1. Gender Sorting across K-12 Schools in the United States

    ERIC Educational Resources Information Center

    Long, Mark C.; Conger, Dylan

    2013-01-01

    This article documents evidence of nonrandom gender sorting across K-12 schools in the United States. The sorting exists among coed schools and at all grade levels, and it is highest in the secondary school grades. We observe some gender sorting across school sectors and types: for instance, males are slightly underrepresented in private schools…

  2. Lazarus's BASIC ID: Making Initial Client Assessments Using Q-Sorts.

    ERIC Educational Resources Information Center

    Miller, Mark J.

    1987-01-01

    Presents overview of Lazarus's multimodal therapy model and the Q-sort, an observer-evaluation scoring instrument. Outlines feasibility of integrating Q-sort within multimodal model. Describes both a preliminary attempt using expert raters to categorize Q-sort cards within the model and a case study on how to assess client by incorporating Q-sort…

  3. Flankers Facilitate 3-Year-Olds' Performance in a Card-Sorting Task

    ERIC Educational Resources Information Center

    Jordan, Patricia L.; Morton, J. Bruce

    2008-01-01

    Three-year-old children often act inflexibly in card-sorting tasks by continuing to sort by an old rule after being asked to switch and sort by a new rule. This inflexibility has been variously attributed to age-related constraints on higher order rule use, object redescription, and attention shifting. In 2 experiments, flankers that were…

  4. My eSorts and Digital Extensions of Word Study

    ERIC Educational Resources Information Center

    Zucker, Tricia A.; Invernizzi, Marcia

    2008-01-01

    "My eSorts" is a strategy for helping children learn to read and spell in a socially motivated context. It is based on developmental spelling research and the word study approach to teaching phonics and spelling. "eSorting" employs digital desktop publishing tools that allow children to author their own electronic word sorts and then share these…

  5. Continuous sorting of Brownian particles using coupled photophoresis and asymmetric potential cycling.

    PubMed

    Ng, Tuck Wah; Neild, Adrian; Heeraman, Pascal

    2008-03-15

    Feasible sorters need to function rapidly and permit the input and delivery of particles continuously. Here, we describe a scheme that incorporates (i) restricted spatial input location and (ii) orthogonal sort and movement direction features. Sorting is achieved using an asymmetric potential that is cycled on and off, whereas movement is accomplished using photophoresis. Simulations with 0.2 and 0.5 microm diameter spherical particles indicate that sorting can commence quickly from a continuous stream. Procedures to optimize the sorting scheme are also described.

  6. Application of Raman spectroscopy to identification and sorting of post-consumer plastics for recycling

    DOEpatents

    Sommer, Edward J.; Rich, John T.

    2001-01-01

    A high accuracy rapid system for sorting a plurality of waste products by polymer type. The invention involves the application of Raman spectroscopy and complex identification techniques to identify and sort post-consumer plastics for recycling. The invention reads information unique to the molecular structure of the materials to be sorted to identify their chemical compositions and uses rapid high volume sorting techniques to sort them into product streams at commercially viable throughput rates. The system employs a laser diode (20) for irradiating the material sample (10), a spectrograph (50) is used to determine the Raman spectrum of the material sample (10) and a microprocessor based controller (70) is employed to identify the polymer type of the material sample (10).

  7. PFAAT version 2.0: a tool for editing, annotating, and analyzing multiple sequence alignments.

    PubMed

    Caffrey, Daniel R; Dana, Paul H; Mathur, Vidhya; Ocano, Marco; Hong, Eun-Jong; Wang, Yaoyu E; Somaroo, Shyamal; Caffrey, Brian E; Potluri, Shobha; Huang, Enoch S

    2007-10-11

    By virtue of their shared ancestry, homologous sequences are similar in their structure and function. Consequently, multiple sequence alignments are routinely used to identify trends that relate to function. This type of analysis is particularly productive when it is combined with structural and phylogenetic analysis. Here we describe the release of PFAAT version 2.0, a tool for editing, analyzing, and annotating multiple sequence alignments. Support for multiple annotations is a key component of this release as it provides a framework for most of the new functionalities. The sequence annotations are accessible from the alignment and tree, where they are typically used to label sequences or hyperlink them to related databases. Sequence annotations can be created manually or extracted automatically from UniProt entries. Once a multiple sequence alignment is populated with sequence annotations, sequences can be easily selected and sorted through a sophisticated search dialog. The selected sequences can be further analyzed using statistical methods that explicitly model relationships between the sequence annotations and residue properties. Residue annotations are accessible from the alignment viewer and are typically used to designate binding sites or properties for a particular residue. Residue annotations are also searchable, and allow one to quickly select alignment columns for further sequence analysis, e.g. computing percent identities. Other features include: novel algorithms to compute sequence conservation, mapping conservation scores to a 3D structure in Jmol, displaying secondary structure elements, and sorting sequences by residue composition. PFAAT provides a framework whereby end-users can specify knowledge for a protein family in the form of annotation. The annotations can be combined with sophisticated analysis to test hypotheses that relate to sequence, structure and function.

  8. Automatic identification of high impact articles in PubMed to support clinical decision making.

    PubMed

    Bian, Jiantao; Morid, Mohammad Amin; Jonnalagadda, Siddhartha; Luo, Gang; Del Fiol, Guilherme

    2017-09-01

    The practice of evidence-based medicine involves integrating the latest best available evidence into patient care decisions. Yet, critical barriers exist for clinicians' retrieval of evidence that is relevant for a particular patient from primary sources such as randomized controlled trials and meta-analyses. To help address those barriers, we investigated machine learning algorithms that find clinical studies with high clinical impact from PubMed®. Our machine learning algorithms use a variety of features including bibliometric features (e.g., citation count), social media attention, journal impact factors, and citation metadata. The algorithms were developed and evaluated with a gold standard composed of 502 high impact clinical studies that are referenced in 11 clinical evidence-based guidelines on the treatment of various diseases. We tested the following hypotheses: (1) our high impact classifier outperforms a state-of-the-art classifier based on citation metadata and citation terms, and PubMed's® relevance sort algorithm; and (2) the performance of our high impact classifier does not decrease significantly after removing proprietary features such as citation count. The mean top 20 precision of our high impact classifier was 34% versus 11% for the state-of-the-art classifier and 4% for PubMed's® relevance sort (p=0.009); and the performance of our high impact classifier did not decrease significantly after removing proprietary features (mean top 20 precision=34% vs. 36%; p=0.085). The high impact classifier, using features such as bibliometrics, social media attention and MEDLINE® metadata, outperformed previous approaches and is a promising alternative to identifying high impact studies for clinical decision support. Copyright © 2017 Elsevier Inc. All rights reserved.
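
    The top-20 precision figure used above is simply the fraction of the 20 highest-ranked articles that appear in the gold standard. A minimal illustration follows, with made-up identifiers and an invented gold-standard set.

```python
def top_k_precision(ranked_pmids, high_impact_pmids, k=20):
    """Fraction of the top-k ranked articles that are in the gold standard."""
    top_k = ranked_pmids[:k]
    return sum(1 for pmid in top_k if pmid in high_impact_pmids) / k

# Illustrative: classifier output sorted by descending score; 7 of the top 20 are gold-standard hits
ranked = [f"pmid{i}" for i in range(100)]
gold = {f"pmid{i}" for i in (0, 2, 3, 8, 11, 15, 19, 40, 55)}
print(top_k_precision(ranked, gold))   # 0.35 (7 of the first 20 are in the gold set)
```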

  9. [Experiences with an anesthesia protocol written by computer].

    PubMed

    Karliczek, G F; Brenken, U; van den Broeke, J J; Mooi, B; de Geus, A F; Wiersma, G; Oosterhaven, S

    1988-04-01

    Since December 1983, we have used a computer system for charting and data logging in cardiac and thoracic anesthesia. These computers, designed as stand-alone units, were developed at our hospital based on Motorola 6809 microprocessor systems. All measurements derived from anesthetic monitoring, ventilator, and heart-lung machine are automatically sampled at regular intervals and stored for later data management. Laboratory results are automatically received from the hospital computer system. The user communicates with the system via a terminal and a keyboard; this also facilitates the entering of all comments, medications, infusions, and fluid losses. All data are continuously displayed on an A3 format anesthetic chart using a multi-pen, flat-bed plotter. The operation of the system has proved to be simple and needs less time than charting by hand, while the result, the display on the chart, is far clearer and more complete than any handwritten document. Up to now, 3,200 operations (corresponding to 12,500 anesthetic h) have been documented. The failure rate of the system, defined as an interruption of the documentation for more than 30 min, is 2.1%. Further development of the system is discussed. A database for processing the stored data has been developed and is being tested at present.

  10. Quality assurance in the production of pipe fittings by automatic laser-based material identification

    NASA Astrophysics Data System (ADS)

    Moench, Ingo; Peter, Laszlo; Priem, Roland; Sturm, Volker; Noll, Reinhard

    1999-09-01

    In plants of the chemical, nuclear and off-shore industry, application specific high-alloyed steels are used for pipe fittings. Mixing of different steel grades can lead to corrosion with severe consequential damages. Growing quality requirements and environmental responsibilities demand a 100% material control in the production of the pipe fittings. Therefore, LIFT, an automatic inspection machine, was developed to insure against any mix of material grades. LIFT is able to identify more than 30 different steel grades. The inspection method is based on Laser-Induced Breakdown Spectrometry (LIBS). An expert system, which can be easily trained and recalibrated, was developed for the data evaluation. The result of the material inspection is transferred to an external handling system via a PLC interface. The duration of the inspection process is 2 seconds. The graphical user interface was developed with respect to the requirements of an unskilled operator. The software is based on a realtime operating system and provides a safe and reliable operation. An interface for the remote maintenance by modem enables a fast operational support. Logged data are retrieved and evaluated. This is the basis for an adaptive improvement of the configuration of LIFT with respect to changing requirements in the production line. Within the first six months of routine operation, about 50000 pipe fittings were inspected.

  11. Purge- and intensive-purge decontamination of dental units contaminated with biofilm

    PubMed Central

    Kramer, Axel; Assadian, Ojan; Bachfeld, Danny; Meyer, Georg

    2012-01-01

    Introduction: During hygienic-microbiological monitoring of the water quality in dental units, the total bacterial colony count was found to exceed the limits for drinking water quality; in addition, mold contamination was detected. The presumed cause was irregular decontamination of the units through purging and intensive decontamination. Methods: To decontaminate the units, the manufacturer’s recommended program for cleaning and intensive decontamination was intensified by shortened intervals over a 2-week period. For Sirona units, instead of once a day, the automatic purge program was run every morning and evening for 20 min each time, and instead of once a month, intensive decontamination was performed every two weeks; this schedule has been maintained since then. For KaVo units, cleaning with the hydroclean function was carried out for 2.5 min every morning and evening. The automatic intensive decontamination was run daily instead of weekly. A maintenance log book was introduced, in which decontamination/cleaning was confirmed by the operator’s signature. Results: Within 5 weeks, all previously contaminated units were decontaminated. Discussion: By shortening the cleaning and intensive decontamination intervals in a 2-week period with subsequent control that the recommended maintenance intervals were kept, it was possible to guarantee drinking-water quality in the dental units of both manufacturers. PMID:22558045

  12. Unsupervised neural spike sorting for high-density microelectrode arrays with convolutive independent component analysis.

    PubMed

    Leibig, Christian; Wachtler, Thomas; Zeck, Günther

    2016-09-15

    Unsupervised identification of action potentials in multi-channel extracellular recordings, in particular from high-density microelectrode arrays with thousands of sensors, is an unresolved problem. While independent component analysis (ICA) achieves rapid unsupervised sorting, it ignores the convolutive structure of extracellular data, thus limiting the unmixing to a subset of neurons. Here we present a spike sorting algorithm based on convolutive ICA (cICA) to retrieve a larger number of accurately sorted neurons than with instantaneous ICA while accounting for signal overlaps. Spike sorting was applied to datasets with varying signal-to-noise ratios (SNR: 3-12) and 27% spike overlaps, sampled at either 11.5 or 23kHz on 4365 electrodes. We demonstrate how the instantaneity assumption in ICA-based algorithms has to be relaxed in order to improve the spike sorting performance for high-density microelectrode array recordings. Reformulating the convolutive mixture as an instantaneous mixture by modeling several delayed samples jointly is necessary to increase signal-to-noise ratio. Our results emphasize that different cICA algorithms are not equivalent. Spike sorting performance was assessed with ground-truth data generated from experimentally derived templates. The presented spike sorter was able to extract ≈90% of the true spike trains with an error rate below 2%. It was superior to two alternative (c)ICA methods (≈80% accurately sorted neurons) and comparable to a supervised sorting. Our new algorithm represents a fast solution to overcome the current bottleneck in spike sorting of large datasets generated by simultaneous recording with thousands of electrodes. Copyright © 2016 Elsevier B.V. All rights reserved.
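
    The central trick described above, recasting a convolutive mixture as an instantaneous one by jointly modeling several delayed samples, amounts to stacking delayed copies of each channel before running an ordinary ICA. The toy below uses two synthetic channels and scikit-learn's FastICA; it is a sketch of the embedding idea under those assumptions, not the authors' cICA implementation or their preprocessing.

```python
import numpy as np
from sklearn.decomposition import FastICA

def delay_embed(x, n_delays):
    """Stack delayed copies of each channel so an instantaneous ICA can
    unmix a (short) convolutive mixture. x has shape (channels, samples)."""
    ch, n = x.shape
    rows = []
    for c in range(ch):
        for d in range(n_delays):
            rows.append(np.roll(x[c], d))
    return np.array(rows)[:, n_delays:]           # drop wrapped-around samples

rng = np.random.default_rng(0)
s = rng.laplace(size=(2, 20000))                  # two super-Gaussian sources
# Convolutive mixing: each channel sees both sources through short FIR filters
h = rng.normal(size=(2, 2, 3))
x = np.zeros((2, 20000))
for i in range(2):
    for j in range(2):
        x[i] += np.convolve(s[j], h[i, j], mode="same")

embedded = delay_embed(x, n_delays=4)             # shape (8, samples)
ica = FastICA(n_components=4, max_iter=500, random_state=0)
components = ica.fit_transform(embedded.T).T      # unmixed (delayed) source estimates
print(components.shape)
```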

  13. Development and implementation of the software for visualization and analysis of data geophysical loggers

    NASA Astrophysics Data System (ADS)

    Gordeev, V. F.; Malyshkov, S. Yu.; Botygin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.

    2017-11-01

    The general trend of modern ecological geophysics is changing priorities towards rapid assessment, management and prediction of ecological and engineering soil stability, as well as developing brand new geophysical technologies. The article describes research conducted using the multi-channel geophysical logger MGR-01 (developed by IMCES SB RAS), which allows the flux density of very low-frequency electromagnetic radiation to be measured. It is shown that natural pulsed electromagnetic fields of the Earth's lithosphere can be a source of new information on the Earth's crust and the processes in it, including earthquakes. The device is intended for logging electromagnetic processes in the Earth's crust, geophysical exploration, finding structural and lithological inhomogeneities, monitoring the geodynamic movement of the Earth's crust, and express assessment of seismic hazards. The data are gathered automatically from an observation point network in Siberia.

  14. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965

  15. Two Types of Perseveration in the Dimension Change Card Sort Task

    ERIC Educational Resources Information Center

    Hanania, Rima

    2010-01-01

    In the Dimension Change Card Sort (DCCS) task, 3-year-olds can sort cards well by one dimension but have difficulty in switching to sort the same cards by another dimension when asked; that is, they perseverate on the first relevant information. What is the information that children perseverate on? Using a new version of the DCCS, the experiments…

  16. Cell-Free Reconstitution of Multivesicular Body Formation and Receptor Sorting

    PubMed Central

    Sun, Wei; Vida, Thomas A.; Sirisaengtaksin, Natalie; Merrill, Samuel A.; Hanson, Phyllis I.; Bean, Andrew J.

    2010-01-01

    The number of surface membrane proteins and their residence time on the plasma membrane are critical determinants of cellular responses to cues that can control plasticity, growth and differentiation. After internalization, the ultimate fate of many plasma membrane proteins is dependent on whether they are sorted for internalization into the lumenal vesicles of multivesicular bodies (MVBs), an obligate step prior to lysosomal degradation. To help to elucidate the mechanisms underlying MVB sorting, we have developed a novel cell-free assay that reconstitutes the sorting of a prototypical membrane protein, the epidermal growth factor receptor, with which we have probed some of its molecular requirements. The sorting event measured is dependent on cytosol, ATP, time, temperature and an intact proton gradient. Depletion of Hrs inhibited biochemical and morphological measures of sorting that were rescued by inclusion of recombinant Hrs in the assay. Moreover, depletion of signal-transducing adaptor molecule (STAM), or addition of mutated ATPase-deficient Vps4, also inhibited sorting. This assay reconstitutes the maturation of late endosomes, including the formation of internal vesicles and the sorting of a membrane protein, and allows biochemical investigation of this process. PMID:20214752

  17. An Unsupervised Online Spike-Sorting Framework.

    PubMed

    Knieling, Simeon; Sridharan, Kousik S; Belardinelli, Paolo; Naros, Georgios; Weiss, Daniel; Mormann, Florian; Gharabaghi, Alireza

    2016-08-01

    Extracellular neuronal microelectrode recordings can include action potentials from multiple neurons. To separate spikes from different neurons, they can be sorted according to their shape, a procedure referred to as spike-sorting. Several algorithms have been reported to solve this task. However, when clustering outcomes are unsatisfactory, most of them are difficult to adjust to achieve the desired results. We present an online spike-sorting framework that uses feature normalization and weighting to maximize the distinctiveness between different spike shapes. Furthermore, multiple criteria are applied to either facilitate or prevent cluster fusion, thereby enabling experimenters to fine-tune the sorting process. We compare our method to established unsupervised offline (Wave_Clus (WC)) and online (OSort (OS)) algorithms by examining their performance in sorting various test datasets using two different scoring systems (AMI and the Adamos metric). Furthermore, we evaluate sorting capabilities on intra-operative recordings using established quality metrics. Compared to WC and OS, our algorithm achieved comparable or higher scores on average and produced more convincing sorting results for intra-operative datasets. Thus, the presented framework is suitable for both online and offline analysis and could substantially improve the quality of microelectrode-based data evaluation for research and clinical application.

  18. Low power and high accuracy spike sorting microprocessor with on-line interpolation and re-alignment in 90 nm CMOS process.

    PubMed

    Chen, Tung-Chien; Ma, Tsung-Chuan; Chen, Yun-Yu; Chen, Liang-Gee

    2012-01-01

    Accurate spike sorting is an important issue for neuroscientific and neuroprosthetic applications. The sorting of spikes depends on the features extracted from the neural waveforms, and better sorting performance usually comes with a higher sampling rate (SR). However, for long-duration experiments on free-moving subjects, miniaturized and wireless neural recording ICs are the current trend, and sorting accuracy is usually compromised by choosing a lower SR to reduce power consumption. In this paper, we implement an on-chip spike sorting processor with integrated interpolation hardware in order to improve the performance in terms of power versus accuracy. According to the fabrication results in a 90 nm process, if interpolation is appropriately performed during spike sorting, a system operated at an SR of 12.5 k samples per second (sps) can outperform one without interpolation at 25 ksps on both accuracy and power.
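
    The benefit of interpolation comes largely from re-aligning spikes on a finer time grid, which removes sub-sample alignment jitter before feature extraction. Below is a minimal software sketch of that step, with an invented spike waveform, upsampling factor, and window size; the paper describes dedicated hardware, not this code.

```python
import numpy as np
from scipy.signal import resample

def interpolate_and_realign(spike, upsample=4, window=48):
    """Upsample a detected spike waveform and re-center it on the
    interpolated (negative) peak to reduce alignment jitter."""
    fine = resample(spike, len(spike) * upsample)      # band-limited interpolation
    peak = int(np.argmin(fine))                        # align on the trough
    half = window // 2
    start = max(peak - half, 0)
    segment = fine[start:start + window]
    # Pad if the window runs off the end of the snippet
    if len(segment) < window:
        segment = np.pad(segment, (0, window - len(segment)))
    return segment

# Toy spike sampled at a low rate, with its trough falling between two samples
t = np.arange(32)
spike = -np.exp(-0.5 * ((t - 10.3) / 1.5) ** 2)
aligned = interpolate_and_realign(spike)
print(len(aligned), int(np.argmin(aligned)))           # 48, trough near the window center
```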

  19. Algorithm Sorts Groups Of Data

    NASA Technical Reports Server (NTRS)

    Evans, J. D.

    1987-01-01

    For efficient sorting, algorithm finds set containing minimum or maximum most significant data. Sets of data sorted as desired. Sorting process simplified by reduction of each multielement set of data to single representative number. First, each set of data expressed as polynomial with suitably chosen base, using elements of set as coefficients. Most significant element placed in term containing largest exponent. Base selected by examining range in value of data elements. Resulting series summed to yield single representative number. Numbers easily sorted, and each such number converted back to original set of data by successive division. Program written in BASIC.
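
    The original program was written in BASIC; the sketch below restates the same encode-sort-decode idea in Python. The base choice and the sample groups are illustrative only.

```python
def encode(group, base):
    """Collapse a group into one number: most significant element first."""
    key = 0
    for element in group:
        key = key * base + element
    return key

def decode(key, base, length):
    """Recover the group by successive division (remainders come out reversed)."""
    out = []
    for _ in range(length):
        key, rem = divmod(key, base)
        out.append(rem)
    return out[::-1]

groups = [[3, 7, 2], [3, 1, 9], [0, 8, 8]]
base = max(max(g) for g in groups) + 1          # base must exceed every element
keys = sorted(encode(g, base) for g in groups)
print([decode(k, base, 3) for k in keys])       # [[0, 8, 8], [3, 1, 9], [3, 7, 2]]
```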

  20. An investigation into the design and performance of an automatic shape control system for a Sendzimir cold rolling mill

    NASA Astrophysics Data System (ADS)

    Dutton, Kenneth

    Shape (or flatness) control for rolled steel strip is becoming increasingly important as customer requirements become more stringent. Automatic shape control is now more or less mandatory on all new four-high cold mills, but no comprehensive scheme yet exists on a Sendzimir mill. This is due to the complexity of the control system design on such a mill, where many more degrees of freedom for control exist than is the case with the four-high mills. The objective of the current work is to develop, from first principles, such a system, including automatic control of the As-U-Roll and first intermediate roll actuators in response to the measured strip shape. This thesis concerns itself primarily with the As-U-Roll control system. The material presented is extremely wide-ranging. Areas covered include the development of original static and dynamic mathematical models of the mill systems, and testing of the plant by data-logging to tune these models. A basic control system philosophy proposed by other workers is modified and developed to suit the practical system requirements and the data provided by the models. The control strategy is tested by comprehensive multivariable simulation studies. Finally, details are given of the practical problems faced when installing the system on the plant. These include problems of manual control interaction, bumpless transfer and integral desaturation. At the time of presentation of the thesis, system commissioning is still in progress and production results are therefore not yet available. Nevertheless, the simulation studies predict a successful outcome, although performance is expected to be limited until the first intermediate roll actuators are eventually included in the scheme also.

  1. Chip-based droplet sorting

    DOEpatents

    Beer, Neil Reginald; Lee, Abraham; Hatch, Andrew

    2014-07-01

    A non-contact system for sorting monodisperse water-in-oil emulsion droplets in a microfluidic device based on the droplet's contents and their interaction with an applied electromagnetic field or by identification and sorting.

  2. A QR code identification technology in package auto-sorting system

    NASA Astrophysics Data System (ADS)

    di, Yi-Juan; Shi, Jian-Ping; Mao, Guo-Yong

    2017-07-01

    Traditional manual sorting operations are not suited to the development of Chinese logistics. To sort packages better, a QR code recognition technology is proposed to identify the QR code label on packages in a package auto-sorting system. The experimental results, compared with other algorithms in the literature, demonstrate that the proposed method is valid and its performance is superior to that of other algorithms.

  3. When Seeing Is Knowing: The Role of Visual Cues in the Dissociation between Children's Rule Knowledge and Rule Use

    ERIC Educational Resources Information Center

    Buss, Aaron T.; Spencer, John P.

    2012-01-01

    The Dimensional Change Card Sort (DCCS) task requires children to switch from sorting cards based on shape or color to sorting based on the other dimension. Typically, 3-year-olds perseverate, whereas 4-year-olds flexibly sort by different dimensions. Zelazo and colleagues (1996, Cognitive Development, 11, 37-63) asked children questions about the…

  4. BayesMotif: de novo protein sorting motif discovery from impure datasets.

    PubMed

    Hu, Jianjun; Zhang, Fan

    2010-01-18

    Protein sorting is the process by which newly synthesized proteins are transported to their target locations within or outside of the cell. This process is precisely regulated by protein sorting signals in different forms. A major category of sorting signals consists of amino acid sub-sequences usually located at the N-terminals or C-terminals of protein sequences. Genome-wide experimental identification of protein sorting signals is extremely time-consuming and costly. Effective computational algorithms for de novo discovery of protein sorting signals are needed to improve the understanding of protein sorting mechanisms. We formulated the protein sorting motif discovery problem as a classification problem and proposed a Bayesian classifier based algorithm (BayesMotif) for de novo identification of a common type of protein sorting motifs in which a highly conserved anchor is present along with less conserved motif regions. A false positive removal procedure is developed to iteratively remove sequences that are unlikely to contain true motifs so that the algorithm can identify motifs from impure input sequences. Experiments on both implanted motif datasets and real-world datasets showed that the enhanced BayesMotif algorithm can identify anchored sorting motifs from pure or impure protein sequence datasets. It also shows that the false positive removal procedure can help to identify true motifs even when only 20% of the input sequences contain true motif instances. We proposed BayesMotif, a novel Bayesian classification based algorithm for de novo discovery of a special category of anchored protein sorting motifs from impure datasets. Compared to conventional motif discovery algorithms such as MEME, our algorithm can find less-conserved motifs with short highly conserved anchors. Our algorithm also has the advantage of easy incorporation of additional meta-sequence features such as hydrophobicity or charge of the motifs, which may help to overcome the limitations of the PWM (position weight matrix) motif model.

  5. Sorted bedforms developed on sandy lobes fed by small ephemeral streams (Catalan continental shelf)

    NASA Astrophysics Data System (ADS)

    Durán, R.; Guillén, J.; Muñoz, A.; Guerrero, Q.

    2016-12-01

    The morphology and sedimentological characteristics of sorted bedforms identified in the Catalan continental shelf (NW Mediterranean Sea) have been characterized using multibeam echosounder data and sediment samples collected in 2013 within the FORMED project. Bathymetric data were compared with previous data gathered in 2004 within the ESPACE project to assess the decadal stability of these bedforms. The sorted bedforms were observed on the inner shelf at water depths from 10 to 40 m, along a coastal stretch of more than 3 km. They are associated with elongated patches of low backscatter, corresponding to fine sand. The fine-grained sediment patches are located off small bays fed by short, intermittent streams, extending down to 40 m water depth. The sorted bedforms exhibit elongated shapes with subtle relief (up to 1 m) and are oriented nearly perpendicular to the shoreline. In cross-section, the sorted bedforms display lateral symmetry in bathymetric relief and backscatter, with high backscatter corresponding to poorly sorted coarse sand (median size of 0.55-0.96 mm) centered on the bathymetric depression, and low backscatter consisting of well-sorted fine to medium sand (median size of 0.22-0.35 mm) on the crest. The local input of well-sorted fine sand supplied by ephemeral streams over the coarse sand domain of the infralittoral prograding wedge contributes to the bed sediment heterogeneity (mixture of sediment), which is further reorganized into sorted bedforms. The sorted bedforms are better developed in deeper waters (20-40 m) than near the shoreline, probably due to stronger wave forcing in the shallower shelf that prevents the maintenance of these morphologies. At a decadal time scale, the morphological evolution of these bedforms indicates that they are persistent features, showing small changes in their boundaries, which is in agreement with previous observations and numerical simulations that highlighted the persistence and long-term stability of sorted bedforms at water depths greater than 15-20 m over annual or even decadal timescales.

  6. Put your hands up! Gesturing improves preschoolers' executive function.

    PubMed

    Rhoads, Candace L; Miller, Patricia H; Jaeger, Gina O

    2018-09-01

    This study addressed the causal direction of a previously reported relation between preschoolers' gesturing and their executive functioning on the Dimensional Change Card Sort (DCCS) sorting-switch task. Gesturing the relevant dimension for sorting was induced in a Gesture group through instructions, imitation, and prompts. In contrast, the Control group was instructed to "think hard" when sorting. Preschoolers (N = 50) performed two DCCS tasks: (a) sort by size and then spatial orientation of two objects and (b) sort by shape and then proximity of the two objects. An examination of performance over trials permitted a fine-grained depiction of patterns of younger and older children in the Gesture and Control conditions. After the relevant dimension was switched, the Gesture group had more accurate sorts than the Control group, particularly among younger children on the second task. Moreover, the amount of gesturing predicted the number of correct sorts among younger children on the second task. The overall association between gesturing and sorting was not reflected at the level of individual trials, perhaps indicating covert gestural representation on some trials or the triggering of a relevant verbal representation by the gesturing. The delayed benefit of gesturing, until the second task, in the younger children may indicate a utilization deficiency. Results are discussed in terms of theories of gesturing and thought. The findings open up a new avenue of research and theorizing about the possible role of gesturing in emerging executive function. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. [CD34(+)/CD123(+) cell sorting from the patients with leukemia by Midi MACS method].

    PubMed

    Wang, Guang-Ping; Cao, Xin-Yu; Xin, Hong-Ya; Li, Qun; Qi, Zhen-Hua; Chen, Fang-Ping

    2006-10-01

    The aim of this study was to sort CD34(+)/CD123(+) cells from the bone marrow cells of patients with acute myeloid leukemia (AML) by the Midi MACS method. Firstly, bone marrow mononuclear cells (BMMNC) were isolated from the patients with AML with Ficoll Paque; CD34(+) cells were then isolated by the Midi MACS method, followed by the isolation of CD34(+)/CD123(+) cells from the CD34(+) fraction. The enrichment and recovery of CD34(+) and CD34(+)/CD123(+) cells were assayed by FACS. The results showed that after the first round of sorting the enrichment of CD34(+) cells was up to 98.73%, with an average enrichment of 95.6%, and the recovery of CD34(+) cells was up to 84.6%, with an average recovery of 51%; after the second round of sorting, the enrichment of CD34(+)/CD123(+) cells was up to 99.23%, with an average enrichment of 83%. Relative to the BMMNC before sorting, the recovery of CD34(+)/CD123(+) cells was 34%; relative to the CD34(+) cells obtained in the first round of sorting, it was 56%. In conclusion, these results confirm that Midi MACS sorting can be applied to sort CD34(+)/CD123(+) cells from the bone marrow cells of AML patients, yielding enrichment and recovery of the sorted cells similar to those reported in the literature for FACS sorting.

  8. Electrophysiological properties of prion-positive cardiac progenitors derived from murine embryonic stem cells.

    PubMed

    Fujii, Hiroshi; Ikeuchi, Yu; Kurata, Yasutaka; Ikeda, Nobuhito; Bahrudin, Udin; Li, Peili; Nakayama, Yuji; Endo, Ryo; Hasegawa, Akira; Morikawa, Kumi; Miake, Junichiro; Yoshida, Akio; Hidaka, Kyoko; Morisaki, Takayuki; Ninomiya, Haruaki; Shirayoshi, Yasuaki; Yamamoto, Kazuhiro; Hisatome, Ichiro

    2012-01-01

    The prion protein (PrP) has been reported to serve as a surface marker for isolation of cardiomyogenic progenitors from murine embryonic stem (ES) cells. Although PrP-positive cells exhibited automaticity, their electrophysiological characteristics remain unresolved. The aim of the present study was therefore to investigate the electrophysiological properties of PrP-positive cells in comparison with those of HCN4p- or Nkx2.5-positive cells. Differentiation of AB1, HCN5p-EGFP and hcgp7 ES cells into cardiac progenitors was induced by embryoid body (EB) formation. EBs were dissociated and cells expressing PrP, HCN4-EGFP and/or Nkx2.5-GFP were collected via flow cytometry. Sorted cells were subjected to reverse transcriptase-polymerase chain reaction, immunostaining and patch-clamp experiments. PrP-positive cells expressed mRNA of undifferentiation markers, first and second heart field markers, and cardiac-specific genes and ion channels, indicating their commitment to cardiomyogenic progenitors. PrP-positive cells with automaticity showed positive and negative chronotropic responses to isoproterenol and carbamylcholine, respectively. Hyperpolarization-activated cation current (I(f)) was barely detectable, whereas Na(+) and L-type Ca(2+) channel currents were frequently observed. Their spontaneous activity was slowed by inhibition of sarcoplasmic reticulum Ca(2+) uptake and release but not by blocking I(f). The maximum diastolic potential of their spontaneous firings was more depolarized than that of Nkx2.5-GFP-positive cells. PrP-positive cells contained cardiac progenitors that separated from the lineage of sinoatrial node cells. PrP can be used as a marker to enrich nascent cardiac progenitors.

  9. A public resource facilitating clinical use of genomes

    PubMed Central

    Ball, Madeleine P.; Thakuria, Joseph V.; Zaranek, Alexander Wait; Clegg, Tom; Rosenbaum, Abraham M.; Wu, Xiaodi; Angrist, Misha; Bhak, Jong; Bobe, Jason; Callow, Matthew J.; Cano, Carlos; Chou, Michael F.; Chung, Wendy K.; Douglas, Shawn M.; Estep, Preston W.; Gore, Athurva; Hulick, Peter; Labarga, Alberto; Lee, Je-Hyuk; Lunshof, Jeantine E.; Kim, Byung Chul; Kim, Jong-Il; Li, Zhe; Murray, Michael F.; Nilsen, Geoffrey B.; Peters, Brock A.; Raman, Anugraha M.; Rienhoff, Hugh Y.; Robasky, Kimberly; Wheeler, Matthew T.; Vandewege, Ward; Vorhaus, Daniel B.; Yang, Joyce L.; Yang, Luhan; Aach, John; Ashley, Euan A.; Drmanac, Radoje; Kim, Seong-Jin; Li, Jin Billy; Peshkin, Leonid; Seidman, Christine E.; Seo, Jeong-Sun; Zhang, Kun; Rehm, Heidi L.; Church, George M.

    2012-01-01

    Rapid advances in DNA sequencing promise to enable new diagnostics and individualized therapies. Achieving personalized medicine, however, will require extensive research on highly reidentifiable, integrated datasets of genomic and health information. To assist with this, participants in the Personal Genome Project choose to forgo privacy via our institutional review board-approved “open consent” process. The contribution of public data and samples facilitates both scientific discovery and standardization of methods. We present our findings after enrollment of more than 1,800 participants, including whole-genome sequencing of 10 pilot participant genomes (the PGP-10). We introduce the Genome-Environment-Trait Evidence (GET-Evidence) system. This tool automatically processes genomes and prioritizes both published and novel variants for interpretation. In the process of reviewing the presumed healthy PGP-10 genomes, we find numerous literature references implying serious disease. Although it is sometimes impossible to rule out a late-onset effect, stringent evidence requirements can address the high rate of incidental findings. To that end we develop a peer production system for recording and organizing variant evaluations according to standard evidence guidelines, creating a public forum for reaching consensus on interpretation of clinically relevant variants. Genome analysis becomes a two-step process: using a prioritized list to record variant evaluations, then automatically sorting reviewed variants using these annotations. Genome data, health and trait information, participant samples, and variant interpretations are all shared in the public domain—we invite others to review our results using our participant samples and contribute to our interpretations. We offer our public resource and methods to further personalized medical research. PMID:22797899

  10. Perseveration and the Status of 3-Year-Olds' Knowledge in a Card-Sorting Task: Evidence from Studies Involving Congruent Flankers

    ERIC Educational Resources Information Center

    Jordan, Patricia L.; Morton, J. Bruce

    2012-01-01

    Infants and young children often perseverate despite apparent knowledge of the correct response. Two Experiments addressed questions concerning the status of such knowledge in the context of a card-sorting task. In Experiment 1, three groups of 3-year-olds sorted bivalent cards one way and then were instructed to switch and sort the same cards…

  11. Activation of epidermal growth factor receptor mediates receptor axon sorting and extension in the developing olfactory system of the moth Manduca sexta.

    PubMed

    Gibson, Nicholas J; Tolbert, Leslie P

    2006-04-10

    During development of the adult olfactory system of the moth Manduca sexta, olfactory receptor neurons extend axons from the olfactory epithelium in the antenna into the brain. As they arrive at the brain, interactions with centrally derived glial cells cause axons to sort and fasciculate with other axons destined to innervate the same glomeruli. Here we report studies indicating that activation of the epidermal growth factor receptor (EGFR) is involved in axon ingrowth and targeting. Blocking the EGFR kinase domain pharmacologically leads to stalling of many axons in the sorting zone and nerve layer as well as abnormal axonal fasciculation in the sorting zone. We also find that neuroglian, an IgCAM known to activate the EGFR through homophilic interactions in other systems, is transiently present on olfactory receptor neuron axons and on glia during the critical stages of the sorting process. The neuroglian is resistant to extraction with Triton X-100 in the sorting zone and nerve layer, possibly indicating its stabilization by homophilic binding in these regions. Our results suggest a mechanism whereby neuroglian molecules on axons and possibly sorting zone glia bind homophilically, leading to activation of EGFRs, with subsequent effects on axon sorting, pathfinding, and extension, and glomerulus development. Copyright 2006 Wiley-Liss, Inc.

  12. Activation of EGF Receptor Mediates Receptor Axon Sorting and Extension in the Developing Olfactory System of the Moth Manduca sexta

    PubMed Central

    Gibson, Nicholas J.; Tolbert, Leslie P.

    2008-01-01

    During development of the adult olfactory system of the moth Manduca sexta, olfactory receptor neurons extend axons from the olfactory epithelium in the antenna into the brain. As they arrive at the brain, interactions with centrally-derived glial cells cause axons to sort and fasciculate with other axons destined to innervate the same glomeruli. Here we report studies that indicate that activation of the epidermal growth factor receptor (EGFR) is involved in axon ingrowth and targeting. Blocking the EGFR kinase domain pharmacologically leads to stalling of many axons in the sorting zone and nerve layer, as well as abnormal axonal fasciculation in the sorting zone. We also find that neuroglian, an IgCAM known to activate the EGFR through homophilic interactions in other systems, is transiently present on olfactory receptor neuron axons and on glia during the critical stages of the sorting process. The neuroglian is resistant to extraction with Triton X-100 in the sorting zone and nerve layer, possibly indicating its stabilization by homophilic binding in these regions. Our results suggest a mechanism whereby neuroglian molecules on axons and possibly sorting zone glia bind homophilically, leading to activation of EGFRs with subsequent effects on axon sorting, pathfinding, and extension, and glomerulus development. PMID:16498681

  13. Effect of staining and freezing media on sortability of stallion spermatozoa and their post-thaw viability after sex-sorting and cryopreservation.

    PubMed

    Clulow, J R; Buss, H; Evans, G; Sieme, H; Rath, D; Morris, L H A; Maxwell, W M C

    2012-02-01

    Sex-sorted, frozen-thawed stallion spermatozoa remain out of reach of commercial horse breeders because of the low efficiency of the sex-sorting process and unacceptable fertility rates after insemination. Two experiments were designed to test the effects of alternative staining and freezing media to improve the viability of sex-sorted, frozen-thawed stallion spermatozoa. Experiment 1 compared two freezing media, INRA 82® and a modified lactose-ethylenediaminetetraacetic acid (EDTA), for the cryopreservation of sex-sorted stallion spermatozoa. No significant differences between the two freezing media could be identified, suggesting that both cryodiluents would be suitable for incorporation into a sex-preselection protocol for stallion spermatozoa. Experiment 2 compared Kenney's modified Tyrode's (KMT) and Sperm TALP (Sp-TALP) as the staining and incubation medium for stallion spermatozoa prior to sex-sorting. A significant increase in the percentage of acrosome-reacted spermatozoa occurred after staining and incubation in the clarified Sp-TALP compared with KMT. As no improvements in sorting rates were achieved using Sp-TALP, it was concluded that stallion sorting protocols could include KMT as the staining and incubation medium, while either INRA 82® or lactose-EDTA could be employed as the cryodiluent. © 2011 Blackwell Verlag GmbH.

  14. Sorted bedform pattern evolution: Persistence, destruction and self-organized intermittency

    NASA Astrophysics Data System (ADS)

    Goldstein, Evan B.; Murray, A. Brad; Coco, Giovanni

    2011-12-01

    We investigate the long-term evolution of inner continental shelf sorted bedform patterns. Numerical modeling suggests that a range of behaviors are possible, from pattern persistence to spatial-temporal intermittency. Sorted bedform persistence results from a robust sorting feedback that operates when the seabed features a sufficient concentration of coarse material. In the absence of storm events, pattern maturation processes such as defect dynamics and pattern migration tend to cause the burial of coarse material and excavation of fine material, leading to the fining of the active layer. Vertical sorting occurs until a critical state of active layer coarseness is reached. This critical state results in the local cessation of the sorting feedback, leading to a self-organized spatially intermittent pattern, a hallmark of observed sorted bedforms. Bedforms in shallow conditions and those subject to high wave climates may be temporally intermittent features as a result of increased wave orbital velocity during storms. Erosion, or deposition of bimodal sediment, similarly leads to a spatially intermittent pattern, with individual coarse domains exhibiting temporal intermittence. Recurring storm events cause coarsening of the seabed (strengthening the sorting feedback) and the development of large wavelength patterns. Cessation of storm events leads to the superposition of storm (large wavelength) and inter-storm (small wavelength) patterns and spatial heterogeneity of pattern modes.

  15. The smartphone as a platform for wearable cameras in health research.

    PubMed

    Gurrin, Cathal; Qiu, Zhengwei; Hughes, Mark; Caprani, Niamh; Doherty, Aiden R; Hodges, Steve E; Smeaton, Alan F

    2013-03-01

    The Microsoft SenseCam, a small camera worn on the chest via a lanyard, is increasingly being deployed in health research. However, the SenseCam and other wearable cameras are not yet in widespread use because of a variety of factors. It is proposed that the ubiquitous smartphone can provide a more accessible alternative to the SenseCam and similar devices. The aim of this study was to perform an initial evaluation of the potential of smartphones to become an alternative to a wearable camera such as the SenseCam. In 2012, adults were supplied with a smartphone running life-logging software, which they wore on a lanyard. Participants wore the smartphone for up to 1 day, and the resulting life-log data were both manually annotated and automatically analyzed for the presence of visual concepts. The results were compared to prior work using the SenseCam. In total, 166,000 smartphone photos were gathered from 47 individuals, along with associated sensor readings. The average time spent wearing the device across all users was 5 hours 39 minutes (SD=4 hours 11 minutes). A subset of 36,698 photos was selected for manual annotation by five researchers. Software analysis of these photos supports the automatic identification of activities to a similar level of accuracy as for SenseCam images in a previous study. Many aspects of the functionality of a SenseCam can largely be replicated, and in some cases enhanced, by the ubiquitous smartphone platform. This makes smartphones good candidates for a new generation of wearable sensing devices in health research because of their widespread use across many populations. It is envisioned that smartphones will provide a compelling alternative to the dedicated SenseCam hardware for a number of users and application areas. This will be achieved by integrating new types of sensor data, leveraging the smartphone's real-time connectivity and rich user interface, and providing support for a range of relatively sophisticated applications. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  16. A Sequence of Sorting Strategies.

    ERIC Educational Resources Information Center

    Duncan, David R.; Litwiller, Bonnie H.

    1984-01-01

    Describes eight increasingly sophisticated and efficient sorting algorithms including linear insertion, binary insertion, shellsort, bubble exchange, shakersort, quick sort, straight selection, and tree selection. Provides challenges for the reader and the student to program these efficiently. (JM)
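    The strategies named in this record are standard textbook algorithms rather than anything specific to the ERIC entry itself. As a rough, illustrative sketch (assuming only the algorithm's common definition), the Python snippet below shows one of them, binary insertion, which locates each element's position in the already-sorted prefix with a binary search before inserting it:

    ```python
    import bisect

    def binary_insertion_sort(items):
        """Binary insertion sort: for each element, find its slot in the
        already-sorted prefix via binary search, then insert it there."""
        result = []
        for item in items:
            # bisect_right finds the insertion point in O(log n) comparisons;
            # the list insert itself still costs O(n) element moves.
            position = bisect.bisect_right(result, item)
            result.insert(position, item)
        return result

    if __name__ == "__main__":
        print(binary_insertion_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
    ```

    Binary search cuts the number of comparisons relative to linear insertion, which is one sense in which the listed strategies become progressively more efficient.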

  17. UBE4B Protein Couples Ubiquitination and Sorting Machineries to Enable Epidermal Growth Factor Receptor (EGFR) Degradation*

    PubMed Central

    Sirisaengtaksin, Natalie; Gireud, Monica; Yan, Qing; Kubota, Yoshihisa; Meza, Denisse; Waymire, Jack C.; Zage, Peter E.; Bean, Andrew J.

    2014-01-01

    The signaling of plasma membrane proteins is tuned by internalization and sorting in the endocytic pathway prior to recycling or degradation in lysosomes. Ubiquitin modification allows recognition and association of cargo with endosomally associated protein complexes, enabling sorting of proteins to be degraded from those to be recycled. The mechanism that provides coordination between the cellular machineries that mediate ubiquitination and endosomal sorting is unknown. We report that the ubiquitin ligase UBE4B is recruited to endosomes in response to epidermal growth factor receptor (EGFR) activation by binding to Hrs, a key component of endosomal sorting complex required for transport (ESCRT) 0. We identify the EGFR as a substrate for UBE4B, establish UBE4B as a regulator of EGFR degradation, and describe a mechanism by which UBE4B regulates endosomal sorting, affecting cellular levels of the EGFR and its downstream signaling. We propose a model in which the coordinated action of UBE4B, ESCRT-0, and the deubiquitinating enzyme USP8 enable the endosomal sorting and lysosomal degradation of the EGFR. PMID:24344129

  18. Preserving and Archiving Astronomical Photographic Plates

    NASA Astrophysics Data System (ADS)

    Castelaz, M. W.; Cline, J. D.

    2005-05-01

    Astronomical objects change with time. New observations complement past observations recorded on photographic plates. Analyses of changes provide essential routes to information about an object's formation, constitution and evolution. Preserving a century of photographic plate observations is thus of paramount importance. Plate collections are presently widely dispersed; plates may be stored in poor conditions, and are effectively inaccessible to both researchers and historians. We describe a planned project at Pisgah Astronomical Research Institute to preserve the collections of astronomical plates in the United States by gathering them into a single storage location. Collections will be sorted, cleaned, and cataloged on-line so as to provide access to researchers. Full scientific and historic use of the material then requires the observations themselves to be accessible digitally. The project's goal will be the availability of these data as a unique, fully-maintained scientific and educational resource. The new archive will support trans-disciplinary research such as the chemistry of the Earth's atmosphere, library information science, trends in local weather patterns, and impacts of urbanization on telescope use, while the hand-written observatory logs will be a valuable resource for science historians and biographers.

  19. Drying and decontamination of raw pistachios with sequential infrared drying, tempering and hot air drying.

    PubMed

    Venkitasamy, Chandrasekar; Brandl, Maria T; Wang, Bini; McHugh, Tara H; Zhang, Ruihong; Pan, Zhongli

    2017-04-04

    Pistachio nuts have been associated with outbreaks of foodborne disease, and the industry has been impacted by numerous product recalls due to contamination with Salmonella enterica. The current hot air drying of pistachios has low energy efficiency and drying rates, and also does not guarantee the microbial safety of products. In the study described herein, dehulled and water-sorted pistachios with a moisture content (MC) of 38.14% (wet basis) were dried in a sequential infrared and hot air (SIRHA) drier to <9% MC. The decontamination efficacy was assessed by inoculating pistachios with Enterococcus faecium, a surrogate of Salmonella enterica used for quality control in the almond industry. Drying with infrared (IR) alone saved 105 min (34.4%) of drying time compared with hot air drying. SIRHA drying of pistachios for 2 h with IR heat, followed by tempering at a product temperature of 70°C for 2 h and then by hot air drying, shortened the drying time by 40 min (9.1%) compared with drying by hot air only. This SIRHA method also reduced the E. faecium cell population by 6.1 log CFU/g on kernels and 5.41 log CFU/g on shells of pistachios. The free fatty acid contents of SIRHA-dried pistachios were on par with those of hot air dried samples. Despite significant differences in peroxide values (PV) of pistachio kernels dried with the SIRHA method compared with hot air drying at 70°C, the PV were within the permissible limit of 5 Meq/kg for edible oils. Our findings demonstrate the efficacy of SIRHA drying in achieving simultaneous drying and decontamination of pistachios. Published by Elsevier B.V.
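    As a point of reference for the log-reduction figures above, a 6.1-log reduction means the viable E. faecium population fell by a factor of about 10^6.1 (roughly a million-fold). The short Python sketch below illustrates how such a figure is computed; the plate counts in it are purely hypothetical and are not data from this study:

    ```python
    import math

    def log_reduction(initial_cfu_per_g, final_cfu_per_g):
        """Log reduction = log10(N0 / N): the number of orders of magnitude
        by which the viable cell count drops during treatment."""
        return math.log10(initial_cfu_per_g / final_cfu_per_g)

    # Hypothetical counts for illustration only: an initial load of 1e8 CFU/g
    # reduced to about 7.9e1 CFU/g corresponds to a 6.1-log reduction.
    print(round(log_reduction(1e8, 7.9e1), 1))  # ~6.1
    ```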

  20. Construction of Various γ34.5 Deleted Fluorescent-Expressing Oncolytic Herpes Simplex Type 1 (oHSV) for Generation and Isolation of HSV-Based Vectors

    PubMed

    Abdoli, Shahriyar; Roohvand, Farzin; Teimoori-Toolabi, Ladan; Shokrgozar, Mohammad Ali; Bahrololoumi, Mina; Azadmanesh, Kayhan

    2017-07-01

    Oncolytic herpes simplex virus (oHSV)-based vectors lacking the γ34.5 gene are considered ideal templates for constructing efficient vectors for (targeted) cancer gene therapy. Herein, we report the construction of three single/dually fluorescence-labeled, γ34.5-deleted recombinant HSV-1 vectors for rapid generation and easy selection/isolation of different HSV-based vectors. Generation of recombinant viruses was performed with conventional homologous recombination methods using green fluorescent protein (GFP) and BleCherry harboring shuttle vectors. Viruses were isolated by direct fluorescence observation and standard plaque purification methods and confirmed by PCR, sequencing, and flow cytometry. XTT and plaque assay titration were performed on Vero, U87MG, and T98 GBM cell lines. We generated three recombinant viruses: HSV-GFP, HSV-GR (Green-Red), and HSV-Red. The HSV-GFP showed a two-log higher titer (10^10 PFU) than wild type (10^8 PFU). In contrast, HSV-GR and HSV-Red showed a one-log lower titer (10^7 PFU) than parental HSV. Cytotoxicity analysis showed that HSV-GR and HSV-Red can lyse target tumor cells at multiplicities of infection of 10 and 1 (P<0.001). Moreover, HSV-GFP showed higher infection potency (98%) in comparison with HSV-GR (82%). Our oHSVs provide a simple and efficient platform for construction and rapid isolation of 2nd- and 3rd-generation oHSVs by replacing the inserted dyes with transgenes, and also for rapid identification via fluorescence-activated cell sorting. These vectors can also be used for tracing the efficacy of therapeutic agents on target cells, imaging of neural or tumoral cells in vitro/in vivo, and as oncolytic agents in cancer therapy.
