Sample records for the query: scale automatic preparative

  1. Addressing case specific biogas plant tasks: industry oriented methane yields derived from 5L Automatic Methane Potential Test Systems in batch or semi-continuous tests using realistic inocula, substrate particle sizes and organic loading.

    PubMed

    Kolbl, Sabina; Paloczi, Attila; Panjan, Jože; Stres, Blaž

    2014-02-01

    The primary aim of the study was to develop and validate an in-house upscale of the Automatic Methane Potential Test System II for studying real-time inocula and real-scale substrates in batch, codigestion and enzyme-enhanced hydrolysis experiments, in addition to semi-continuous operation of the developed equipment and experiments testing inoculum functional quality. The successful upscale to 5 L enabled comparison of different process configurations with shorter preparation times, acceptable accuracy and the high throughput needed for industrial decision making. Adopting the same scales, equipment and methodologies in batch and semi-continuous tests mirroring those at full-scale biogas plants resulted in matching methane yields between the two laboratory tests and full scale, thus confirming the increased decision-making value of the approach for industrial operations.

  2. An automatic tooth preparation technique: A preliminary study

    NASA Astrophysics Data System (ADS)

    Yuan, Fusong; Wang, Yong; Zhang, Yaopeng; Sun, Yuchun; Wang, Dangxiao; Lyu, Peijun

    2016-04-01

    The aim of this study is to validate the feasibility and accuracy of a new automatic tooth preparation technique in dental healthcare. An automatic tooth preparation robotic device with three-dimensional motion planning software was developed, which controlled an ultra-short pulse laser (USPL) beam (wavelength 1,064 nm, pulse width 15 ps, output power 30 W, and repeat frequency rate 100 kHz) to complete the tooth preparation process. A total of 15 freshly extracted human intact first molars were collected and fixed into a phantom head, and the target preparation shapes of these molars were designed using customised computer-aided design (CAD) software. The accuracy of tooth preparation was evaluated using the Geomagic Studio and Imageware software, and the preparation time for each tooth was recorded. Compared with the target preparation shape, the average shape error of the 15 prepared molars was 0.05-0.17 mm, the preparation depth error of the occlusal surface was approximately 0.097 mm, and the error of the convergence angle was approximately 1.0°. The average preparation time was 17 minutes. These results validated the accuracy and feasibility of the automatic tooth preparation technique.

  3. An automatic tooth preparation technique: A preliminary study.

    PubMed

    Yuan, Fusong; Wang, Yong; Zhang, Yaopeng; Sun, Yuchun; Wang, Dangxiao; Lyu, Peijun

    2016-04-29

    The aim of this study is to validate the feasibility and accuracy of a new automatic tooth preparation technique in dental healthcare. An automatic tooth preparation robotic device with three-dimensional motion planning software was developed, which controlled an ultra-short pulse laser (USPL) beam (wavelength 1,064 nm, pulse width 15 ps, output power 30 W, and repeat frequency rate 100 kHz) to complete the tooth preparation process. A total of 15 freshly extracted human intact first molars were collected and fixed into a phantom head, and the target preparation shapes of these molars were designed using customised computer-aided design (CAD) software. The accuracy of tooth preparation was evaluated using the Geomagic Studio and Imageware software, and the preparation time for each tooth was recorded. Compared with the target preparation shape, the average shape error of the 15 prepared molars was 0.05-0.17 mm, the preparation depth error of the occlusal surface was approximately 0.097 mm, and the error of the convergence angle was approximately 1.0°. The average preparation time was 17 minutes. These results validated the accuracy and feasibility of the automatic tooth preparation technique.

  4. Taxonomic classification of soils using digital information from LANDSAT data. Huayllamarca and eucaliptus areas. M.S. Thesis - Bolivia Univ.

    NASA Technical Reports Server (NTRS)

    Quiroga, S. Q.

    1977-01-01

    The applicability of LANDSAT digital information to soil mapping is described. A compilation of all cartographic information and bibliography of the study area is made. LANDSAT MSS images on a scale of 1:250,000 are interpreted and a physiographic map with legend is prepared. The study area is inspected and a selection of the sample areas is made. A digital map of the different soil units is produced and the computer mapping units are checked against the soil units encountered in the field. The soil boundaries obtained by automatic mapping were not substantially changed by field work. The accuracy of the automatic mapping is rather high.

  5. Automatic Text Structuring and Summarization.

    ERIC Educational Resources Information Center

    Salton, Gerard; And Others

    1997-01-01

    Discussion of the use of information retrieval techniques for automatic generation of semantic hypertext links focuses on automatic text summarization. Topics include World Wide Web links, text segmentation, and evaluation of text summarization by comparing automatically generated abstracts with manually prepared abstracts. (Author/LRW)

  6. Automatically Preparing Safe SQL Queries

    NASA Astrophysics Data System (ADS)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
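    The transformation this abstract describes, replacing string-concatenated SQL with parameterized prepared statements, can be illustrated with a minimal sketch. The table, column names, and payload below are invented for illustration; only the before/after query patterns reflect the technique.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

    def find_user_unsafe(name):
        # Vulnerable legacy pattern: attacker-controlled input is
        # concatenated directly into the query string.
        return conn.execute(
            "SELECT role FROM users WHERE name = '" + name + "'"
        ).fetchall()

    def find_user_safe(name):
        # Transformed pattern: the query is a fixed template and the
        # input is bound as a parameter, so it cannot alter the SQL.
        return conn.execute(
            "SELECT role FROM users WHERE name = ?", (name,)
        ).fetchall()

    # A classic injection payload returns every row from the unsafe
    # version but matches nothing in the parameterized version.
    payload = "' OR '1'='1"
    print(find_user_unsafe(payload))  # [('admin',)]
    print(find_user_safe(payload))    # []
    ```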

  7. [Study on the appropriate parameters of automatic full crown tooth preparation for dental tooth preparation robot].

    PubMed

    Yuan, F S; Wang, Y; Zhang, Y P; Sun, Y C; Wang, D X; Lyu, P J

    2017-05-09

    Objective: To determine suitable parameters for automatic full crown preparation using an oral clinical micro robot, in order to improve the quality of automated tooth preparation by the system and to lay the foundation for clinical application. Methods: Twenty selected artificial resin teeth were used as sample teeth. The micro robot automatic tooth preparation system was used to control the picosecond laser beam to perform two-dimensional cutting on the resin tooth samples according to the planned motion path. Using a laser scanning measuring microscope, the cutting depth of each layer was measured and averaged to determine the single-layer cutting depth. The three-dimensional (3D) data of the target resin teeth were obtained with an intraoral scanner, and the CAD data of the full-crown tooth preparations were designed with self-developed CAD software. Based on the single-layer depth, 11 complete resin teeth in a phantom head were automatically prepared by the robot, which controlled the focused laser spot in a layer-by-layer cutting fashion, and the accuracy of the resin tooth preparations was evaluated with the software. Using the same method, the single-layer cutting depth for dental hard tissue was obtained, and 15 extracted mandibular and maxillary first molars underwent automatic full crown preparation. The 3D data of the tooth preparations were obtained with an intraoral scanner, and the software was used to evaluate their accuracy. Results: The single-layer cutting depths of the picosecond laser in resin teeth and extracted teeth were (60.0±2.6) and (45.0±3.6) μm, respectively. Using the tooth preparation robot, 11 artificial resin teeth and 15 complete natural teeth were automatically prepared, with average times of (13.0±0.7) and (17.0±1.8) min, respectively.
Software evaluation showed that the average preparation depth of the occlusal surface of the 11 resin teeth was approximately (2.089±0.026) mm, an error of about (0.089±0.026) mm; the average convergence angle was about 6.56°±0.30°, an error of about 0.56°±0.30°. Compared with the target preparation shape, the average shape error of the 11 resin tooth preparations was about 0.02-0.11 mm. The average preparation depth of the occlusal surface of the 15 natural teeth was approximately (2.097±0.022) mm, an error of about (0.097±0.022) mm; the average convergence angle was about 6.98°±0.35°, an error of about 0.98°±0.35°. Compared with the target preparation shape, the average shape error of the 15 natural tooth preparations was about 0.05-0.17 mm. Conclusions: With the determined single-layer cutting depth parameters, the micro robot controlling the picosecond laser completed automatic tooth preparation on both resin teeth and natural teeth with an accuracy that met clinical needs, confirming the suitability of the parameters.

  8. Combining contour detection algorithms for the automatic extraction of the preparation line from a dental 3D measurement

    NASA Astrophysics Data System (ADS)

    Ahlers, Volker; Weigl, Paul; Schachtzabel, Hartmut

    2005-04-01

    Due to the increasing demand for high-quality ceramic crowns and bridges, the CAD/CAM-based production of dental restorations has been a subject of intensive research during the last fifteen years. A prerequisite for the efficient processing of the 3D measurement of prepared teeth with a minimal amount of user interaction is the automatic determination of the preparation line, which defines the sealing margin between the restoration and the prepared tooth. Current dental CAD/CAM systems mostly require the interactive definition of the preparation line by the user, at least by means of giving a number of start points. Previous approaches to the automatic extraction of the preparation line rely on single contour detection algorithms. In contrast, we use a combination of different contour detection algorithms to find several independent potential preparation lines from a height profile of the measured data. The different algorithms (gradient-based, contour-based, and region-based) show their strengths and weaknesses in different clinical situations. A classifier consisting of three stages (range check, decision tree, support vector machine), which is trained by human experts with real-world data, finally decides which is the correct preparation line. In a test with 101 clinical preparations, a success rate of 92.0% has been achieved. Thus the combination of different contour detection algorithms yields a reliable method for the automatic extraction of the preparation line, which enables the setup of a turn-key dental CAD/CAM process chain with a minimal amount of interactive screen work.
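    The selection idea in this record, several independent detectors each proposing a candidate preparation line, followed by a staged classifier (range check, decision tree, support vector machine), can be sketched schematically. The features, thresholds, and scoring weights below are invented stand-ins for the trained components, not the paper's actual models.

    ```python
    # Hypothetical cascade: each contour-detection algorithm proposes one
    # candidate preparation line; a staged filter picks the best survivor.

    def range_check(candidate):
        # Stage 1: reject candidates with implausible geometry.
        return 0.5 <= candidate["length_mm"] <= 60.0 and candidate["closed"]

    def decision_tree(candidate):
        # Stage 2: hand-written rules standing in for the trained tree
        # (gradient support along the contour, contour smoothness).
        if candidate["mean_gradient"] < 0.2:
            return False
        return candidate["smoothness"] > 0.6

    def svm_score(candidate):
        # Stage 3: a linear score standing in for the trained SVM;
        # higher means more likely to be the true preparation line.
        return 2.0 * candidate["mean_gradient"] + candidate["smoothness"]

    def select_preparation_line(candidates):
        survivors = [c for c in candidates
                     if range_check(c) and decision_tree(c)]
        return max(survivors, key=svm_score, default=None)

    candidates = [
        {"name": "gradient", "length_mm": 28.0, "closed": True,
         "mean_gradient": 0.8, "smoothness": 0.7},
        {"name": "contour",  "length_mm": 31.0, "closed": True,
         "mean_gradient": 0.3, "smoothness": 0.9},
        {"name": "region",   "length_mm": 80.0, "closed": True,
         "mean_gradient": 0.9, "smoothness": 0.8},  # fails range check
    ]
    best = select_preparation_line(candidates)
    print(best["name"])  # gradient
    ```

    The cascade ordering matters: cheap hard constraints run first, so the expensive learned scorer only ranks plausible candidates.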

  9. [Automated analysis of bacterial preparations manufactured on automatic heat fixation and staining equipment].

    PubMed

    2012-01-01

    Heat fixation of preparations was performed in a fixation bath designed by EMKO (Russia). A programmable "Emkosteiner" (EMKO, Russia) was used for trial staining, and the Micko-GRAM-NITsF reagent set was applied for Gram staining. It was demonstrated that automatic smear fixation equipment and programmable staining ensure high-quality imaging (1% chromaticity variation), good enough for standardization of Gram staining of microbial preparations.

  10. Regulation and Measurement of the Heat Generated by Automatic Tooth Preparation in a Confined Space.

    PubMed

    Yuan, Fusong; Zheng, Jianqiao; Sun, Yuchun; Wang, Yong; Lyu, Peijun

    2017-06-01

    The aim of this study was to assess and regulate heat generation in the dental pulp cavity and the circumambient temperature around a tooth during laser ablation with a femtosecond laser in a confined space. The automatic tooth preparation technique is an innovation over traditional oral clinical technology: a robot controls an ultrashort pulse laser to complete three-dimensional tooth preparation automatically in a confined space, and temperature control is the main measure for protecting the tooth nerve. Ten tooth specimens were irradiated with a robot-controlled femtosecond laser in a confined space to generate 10 tooth preparations. During the process, four thermocouple sensors recorded the pulp cavity and circumambient temperatures with and without air cooling, and the temperatures under the two conditions were compared statistically (p < 0.05). The recordings showed that the temperature with air cooling was lower than that without air cooling and that the heat generated in the pulp cavity was below the threshold for dental pulp damage. These results indicate that femtosecond laser ablation with air cooling might be an appropriate method for automatic tooth preparation.

  11. Lithology and aggregate quality attributes for the digital geologic map of Colorado

    USGS Publications Warehouse

    Knepper, Daniel H.; Green, Gregory N.; Langer, William H.

    1999-01-01

    This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map.

  12. Automatic image enhancement based on multi-scale image decomposition

    NASA Astrophysics Data System (ADS)

    Feng, Lu; Wu, Zhuangzhi; Pei, Luo; Long, Xiong

    2014-01-01

    In image processing and computational photography, automatic image enhancement is a long-standing objective. Recent automatic enhancement methods take account not only of global semantics, such as correcting color hue and brightness imbalances, but also of the local content of the image, such as human faces or the sky in a landscape. In this paper we describe a new scheme for automatic image enhancement that considers both the global semantics and the local content of the image. Our method employs a multi-scale edge-aware image decomposition to detect underexposed regions and enhance the detail of salient content. Experimental results demonstrate the effectiveness of our approach compared to existing automatic enhancement methods.
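    The base/detail idea behind this kind of enhancement can be sketched with a simple two-scale decomposition: smooth the image to get a base layer, treat the residual as detail, brighten the base, and re-add amplified detail. A plain box filter stands in here for the paper's edge-aware decomposition, and the gain and gamma values are arbitrary illustrative choices.

    ```python
    import numpy as np

    def box_blur(img, k=5):
        # Separable-style box filter; a crude stand-in for the
        # edge-aware smoothing used in multi-scale decompositions.
        pad = k // 2
        padded = np.pad(img, pad, mode="edge")
        out = np.zeros_like(img)
        for dy in range(k):
            for dx in range(k):
                out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return out / (k * k)

    def enhance(img, detail_gain=1.5, gamma=0.8):
        # Two-scale decomposition: base (smoothed) + detail (residual).
        base = box_blur(img)
        detail = img - base
        # Lift underexposed regions with a gamma curve on the base
        # layer, then re-add amplified detail.
        base = np.clip(base, 0.0, 1.0) ** gamma
        return np.clip(base + detail_gain * detail, 0.0, 1.0)

    rng = np.random.default_rng(0)
    dark = rng.uniform(0.0, 0.3, size=(32, 32))  # underexposed test image
    out = enhance(dark)
    print(out.mean() > dark.mean())  # True
    ```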

  13. Near-real-time simulation and internet-based delivery of forecast-flood inundation maps using two-dimensional hydraulic modeling--A pilot study for the Snoqualmie River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Fulford, Janice M.; Voss, Frank D.

    2002-01-01

    A system of numerical hydraulic modeling, geographic information system processing, and Internet map serving, supported by new data sources and application automation, was developed that generates inundation maps for forecast floods in near real time and makes them available through the Internet. Forecasts for flooding are generated by the National Weather Service (NWS) River Forecast Center (RFC); these forecasts are retrieved automatically by the system and prepared for input to a hydraulic model. The model, TrimR2D, is a new, robust, two-dimensional model capable of simulating wide varieties of discharge hydrographs and relatively long stream reaches. TrimR2D was calibrated for a 28-kilometer reach of the Snoqualmie River in Washington State, and is used to estimate flood extent, depth, arrival time, and peak time for the RFC forecast. The results of the model are processed automatically by a Geographic Information System (GIS) into maps of flood extent, depth, and arrival and peak times. These maps subsequently are processed into formats acceptable by an Internet map server (IMS). The IMS application is a user-friendly interface to access the maps over the Internet; it allows users to select what information they wish to see presented and allows the authors to define scale-dependent availability of map layers and their symbology (appearance of map features). For example, the IMS presents a background of a digital USGS 1:100,000-scale quadrangle at smaller scales, and automatically switches to an ortho-rectified aerial photograph (a digital photograph that has camera angle and tilt distortions removed) at larger scales so viewers can see ground features that help them identify their area of interest more effectively. For the user, the option exists to select either background at any scale. Similar options are provided for both the map creator and the viewer for the various flood maps. 
This combination of a robust model, emerging IMS software, and application interface programming should allow the technology developed in the pilot study to be applied to other river systems where NWS forecasts are provided routinely.

  14. Estimating spatial travel times using automatic vehicle identification data

    DOT National Transportation Integrated Search

    2001-01-01

    Prepared ca. 2001. The paper describes an algorithm that was developed for estimating reliable and accurate average roadway link travel times using Automatic Vehicle Identification (AVI) data. The algorithm presented is unique in two aspects. First, ...

  15. Enhanced Automatic Question Creator--EAQC: Concept, Development and Evaluation of an Automatic Test Item Creation Tool to Foster Modern e-Education

    ERIC Educational Resources Information Center

    Gutl, Christian; Lankmayr, Klaus; Weinhofer, Joachim; Hofler, Margit

    2011-01-01

    Research in automated creation of test items for assessment purposes became increasingly important during the recent years. Due to automatic question creation it is possible to support personalized and self-directed learning activities by preparing appropriate and individualized test items quite easily with relatively little effort or even fully…

  16. Graph-Based Semantic Web Service Composition for Healthcare Data Integration.

    PubMed

    Arch-Int, Ngamnij; Arch-Int, Somjit; Sonsilphong, Suphachoke; Wanchai, Paweena

    2017-01-01

    Within the numerous and heterogeneous web services offered through different sources, automatic web services composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. The current solutions in functional web services composition lack autonomous queries of semantic matches within the parameters of web services, which are necessary in the composition of large-scale related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and nonredundant web services composition of a user's query using a graph-based searching algorithm. The proposed approach was applied to healthcare data integration in different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement.
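    The run-time search this abstract describes, walking a dependency graph of services to build a non-redundant composition answering a query, can be sketched as a breadth-first search over sets of known concepts. The service names, input/output concepts, and healthcare-flavored data below are invented for illustration.

    ```python
    from collections import deque

    # Hypothetical dependency graph: each service maps a set of input
    # concepts to a set of output concepts.
    SERVICES = {
        "PatientLookup": ({"patient_id"}, {"patient_record"}),
        "LabFetcher":    ({"patient_record"}, {"lab_results"}),
        "RiskScorer":    ({"lab_results"}, {"risk_score"}),
        "Billing":       ({"patient_record"}, {"invoice"}),
    }

    def compose(available, goal):
        # BFS over states (sets of known concepts); returns the shortest
        # sequence of services that produces the goal concept. Skipping
        # services whose outputs are already known keeps the plan
        # non-redundant.
        start = frozenset(available)
        queue = deque([(start, [])])
        seen = {start}
        while queue:
            known, plan = queue.popleft()
            if goal in known:
                return plan
            for name, (inputs, outputs) in SERVICES.items():
                if inputs <= known and not outputs <= known:
                    nxt = frozenset(known | outputs)
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, plan + [name]))
        return None  # goal not reachable from the available concepts

    print(compose({"patient_id"}, "risk_score"))
    # ['PatientLookup', 'LabFetcher', 'RiskScorer']
    ```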

  17. Graph-Based Semantic Web Service Composition for Healthcare Data Integration

    PubMed Central

    2017-01-01

    Within the numerous and heterogeneous web services offered through different sources, automatic web services composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. The current solutions in functional web services composition lack autonomous queries of semantic matches within the parameters of web services, which are necessary in the composition of large-scale related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and nonredundant web services composition of a user's query using a graph-based searching algorithm. The proposed approach was applied to healthcare data integration in different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement. PMID:29065602

  18. Mosaic construction, processing, and review of very large electron micrograph composites

    NASA Astrophysics Data System (ADS)

    Vogt, Robert C., III; Trenkle, John M.; Harmon, Laurel A.

    1996-11-01

    A system of programs is described for acquisition, mosaicking, cueing and interactive review of large-scale transmission electron micrograph composite images. This work was carried out as part of a final-phase clinical analysis study of a drug for the treatment of diabetic peripheral neuropathy. More than 500 nerve biopsy samples were prepared, digitally imaged, processed, and reviewed. For a given sample, typically 1000 or more 1.5 megabyte frames were acquired, for a total of between 1 and 2 gigabytes of data per sample. These frames were then automatically registered and mosaicked together into a single virtual image composite, which was subsequently used to perform automatic cueing of axons and axon clusters, as well as review and marking by qualified neuroanatomists. Statistics derived from the review process were used to evaluate the efficacy of the drug in promoting regeneration of myelinated nerve fibers. This effort demonstrates a new, entirely digital capability for doing large-scale electron micrograph studies, in which all of the relevant specimen data can be included at high magnification, as opposed to simply taking a random sample of discrete locations. It opens up the possibility of a new era in electron microscopy--one which broadens the scope of questions that this imaging modality can be used to answer.

  19. Machine for Automatic Bacteriological Pour Plate Preparation

    PubMed Central

    Sharpe, A. N.; Biggs, D. R.; Oliver, R. J.

    1972-01-01

    A fully automatic system for preparing poured plates for bacteriological analyses has been constructed and tested. The machine can make decimal dilutions of bacterial suspensions, dispense measured amounts into petri dishes, add molten agar, mix the dish contents, and label the dishes with sample and dilution numbers at the rate of 2,000 dishes per 8-hr day. In addition, the machine can be programmed to select different media so that plates for different types of bacteriological analysis may be made automatically from the same sample. The machine uses only the components of the media and sterile polystyrene petri dishes; requirements for all other materials, such as sterile pipettes and capped bottles of diluents and agar, are eliminated. PMID:4560475

  20. Comparison between manual scaling and Autoscala automatic scaling applied to Sodankylä Geophysical Observatory ionograms

    NASA Astrophysics Data System (ADS)

    Enell, Carl-Fredrik; Kozlovsky, Alexander; Turunen, Tauno; Ulich, Thomas; Välitalo, Sirkku; Scotto, Carlo; Pezzopane, Michael

    2016-03-01

    This paper presents a comparison between standard ionospheric parameters manually and automatically scaled from ionograms recorded at the high-latitude Sodankylä Geophysical Observatory (SGO, ionosonde SO166, 64.1° geomagnetic latitude), located in the vicinity of the auroral oval. The study is based on 2610 ionograms recorded during the period June-December 2013. The automatic scaling was made by means of the Autoscala software. A few typical examples are shown to outline the method, and statistics are presented regarding the differences between manually and automatically scaled values of F2, F1, E and sporadic E (Es) layer parameters. We draw the conclusions that: 1. The F2 parameters scaled by Autoscala, foF2 and M(3000)F2, are reliable. 2. F1 is identified by Autoscala in significantly fewer cases (about 50 %) than in the manual routine, but if identified the values of foF1 are reliable. 3. Autoscala frequently (30 % of the cases) detects an E layer when the manual scaling process does not. When identified by both methods, the Autoscala E-layer parameters are close to those manually scaled, foE agreeing to within 0.4 MHz. 4. Es and parameters of Es identified by Autoscala are in many cases different from those of the manual scaling. Scaling of Es at auroral latitudes is often a difficult task.

  1. Automated imaging of cellular spheroids with selective plane illumination microscopy on a chip (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Paiè, Petra; Bassi, Andrea; Bragheri, Francesca; Osellame, Roberto

    2017-02-01

    Selective plane illumination microscopy (SPIM) is an optical sectioning technique that allows imaging of biological samples at high spatio-temporal resolution. Standard SPIM devices require dedicated set-ups, complex sample preparation and accurate system alignment, thus limiting the automation of the technique, its accessibility and throughput. We present a millimeter-scaled optofluidic device that incorporates selective plane illumination and fully automatic sample delivery and scanning. To this end an integrated cylindrical lens and a three-dimensional fluidic network were fabricated by femtosecond laser micromachining into a single glass chip. This device can upgrade any standard fluorescence microscope to a SPIM system. We used SPIM on a CHIP to automatically scan biological samples under a conventional microscope, without the need of any motorized stage: tissue spheroids expressing fluorescent proteins were flowed in the microchannel at constant speed and their sections were acquired while passing through the light sheet. We demonstrate high-throughput imaging of the entire sample volume (with a rate of 30 samples/min), segmentation and quantification in thick (100-300 μm diameter) cellular spheroids. This optofluidic device gives access to SPIM analyses to non-expert end-users, opening the way to automatic and fast screening of a high number of samples at subcellular resolution.

  2. Theory of wavelet-based coarse-graining hierarchies for molecular dynamics.

    PubMed

    Rinderspacher, Berend Christopher; Bardhan, Jaydeep P; Ismail, Ahmed E

    2017-07-01

    We present a multiresolution approach to compressing the degrees of freedom and potentials associated with molecular dynamics, such as the bond potentials. The approach suggests a systematic way to accelerate large-scale molecular simulations with more than two levels of coarse graining, particularly for applications to polymeric materials. In particular, we derive explicit models for (arbitrarily large) linear (homo)polymers and iterative methods to compute large-scale wavelet decompositions from fragment solutions. This approach does not require explicit preparation of atomistic-to-coarse-grained mappings, but instead uses the theory of diffusion wavelets for graph Laplacians to develop system-specific mappings. Our methodology leads to a hierarchy of system-specific coarse-grained degrees of freedom that provides a conceptually clear and mathematically rigorous framework for modeling chemical systems at relevant model scales. The approach is capable of automatically generating as many coarse-grained model scales as necessary, that is, of going beyond the two scales in conventional coarse-graining strategies; moreover, the wavelet-based coarse-grained models explicitly link time and length scales. Finally, a straightforward method for the reintroduction of omitted degrees of freedom is presented, which plays a major role in maintaining model fidelity in long-time simulations and in capturing emergent behaviors.

  3. Automatic recognition of holistic functional brain networks using iteratively optimized convolutional neural networks (IO-CNN) with weak label initialization.

    PubMed

    Zhao, Yu; Ge, Fangfei; Liu, Tianming

    2018-07-01

    fMRI data decomposition techniques have advanced significantly from shallow models such as Independent Component Analysis (ICA) and Sparse Coding and Dictionary Learning (SCDL) to deep learning models such as Deep Belief Networks (DBN) and Deep Convolutional Autoencoders (DCAE). However, interpretation of the decomposed networks remains an open question due to the lack of functional brain atlases, the lack of correspondence between decomposed or reconstructed networks across different subjects, and significant individual variability. Recent studies showed that deep learning, especially deep convolutional neural networks (CNN), has an extraordinary ability to accommodate spatial object patterns; e.g., our recent work using 3D CNNs for fMRI-derived network classification achieved high accuracy with a remarkable tolerance for mistakenly labelled training brain networks. However, training data preparation is one of the biggest obstacles in these supervised deep learning models for functional brain network map recognition, since manual labelling requires tedious and time-consuming labour and sometimes even introduces labelling mistakes. Especially for mapping functional networks in large-scale datasets, such as the hundreds of thousands of brain networks used in this paper, manual labelling becomes almost infeasible. In response, in this work, we tackled both the network recognition and training data labelling tasks by proposing a new iteratively optimized deep learning CNN (IO-CNN) framework with automatic weak label initialization, which turns functional brain network recognition into a fully automatic large-scale classification procedure. Our extensive experiments based on ABIDE-II 1099 brains' fMRI data showed the great promise of our IO-CNN framework.

  4. Negative Life Events and Antenatal Depression among Pregnant Women in Rural China: The Role of Negative Automatic Thoughts.

    PubMed

    Wang, Yang; Wang, Xiaohua; Liu, Fangnan; Jiang, Xiaoning; Xiao, Yun; Dong, Xuehan; Kong, Xianglei; Yang, Xuemei; Tian, Donghua; Qu, Zhiyong

    2016-01-01

    Few studies have looked at the relationship between psychological factors and the mental health status of pregnant women in rural China. The current study aims to explore the potential mediating effect of negative automatic thoughts between negative life events and antenatal depression. Data were collected in June 2012 and October 2012, and 495 rural pregnant women were interviewed. Depressive symptoms were measured by the Edinburgh postnatal depression scale, stresses of pregnancy were measured by the pregnancy pressure scale, negative automatic thoughts were measured by the automatic thoughts questionnaire, and negative life events were measured by the life events scale for pregnant women. We used logistic regression and path analysis to test the mediating effect. The prevalence of antenatal depression was 13.7%. In the logistic regression, the only socio-demographic and health behavior factor significantly related to antenatal depression was sleep quality. Negative life events were not associated with depression in the fully adjusted model. Path analysis showed that the eventual direct and general effects of negative automatic thoughts were 0.39 and 0.51, which were larger than the effects of negative life events. This study suggested that there was a potentially significant mediating effect of negative automatic thoughts. Pregnant women who had lower scores of negative automatic thoughts were more likely to suffer less from negative life events which might lead to antenatal depression.

  5. Depression, automatic thoughts, alexithymia, and assertiveness in patients with tension-type headache.

    PubMed

    Yücel, Basak; Kora, Kaan; Ozyalçín, Süleyman; Alçalar, Nilüfer; Ozdemir, Ozay; Yücel, Aysen

    2002-03-01

    The role of psychological factors related to headache has long been a focus of investigation. The aim of this study was to evaluate depression, automatic thoughts, alexithymia, and assertiveness in persons with tension-type headache and to compare the results with those from healthy controls. One hundred five subjects with tension-type headache (according to the criteria of the International Headache Society classification) and 70 controls were studied. The Beck Depression Inventory, Automatic Thoughts Scale, Toronto Alexithymia Scale, and Rathus Assertiveness Schedule were administered to both groups. Sociodemographic variables and headache features were evaluated via a semistructured scale. Compared with healthy controls, the subjects with headache had significantly higher scores on measures of depression, automatic thoughts, and alexithymia and lower scores on assertiveness. Subjects with chronic tension-type headache had higher depression and automatic thoughts scores than those with episodic tension-type headache. These findings suggested that persons with tension-type headache have high depression scores and also may have difficulty with expression of their emotions. Headache frequency appears to influence the likelihood of coexisting depression.

  6. An automatic robotic system for three-dimensional tooth crown preparation using a picosecond laser.

    PubMed

    Wang, Lei; Wang, Dangxiao; Zhang, Yuru; Ma, Lei; Sun, Yuchun; Lv, Peijun

    2014-09-01

    Laser techniques have been introduced into dentistry to overcome the drawbacks of traditional treatment methods. The existing methods used in clinical tooth crown preparation have several drawbacks which affect the long-term success of the dental treatment. Our objective was to develop an improved robotic system that manipulates the laser beam to achieve safe and accurate three-dimensional (3D) tooth ablation, and thus to realize automatic tooth crown preparation in clinical operations. We present an automatic laser ablation system for tooth crown preparation in dental restorative operations. The system, combining robotics and laser technology, controls the laser focus in three-dimensional motion, aiming at high-speed, high-accuracy crown preparation. It consists of an end-effector, a real-time monitor and a tooth fixture. A layer-by-layer ablation method is developed to control the laser focus during the crown preparation. Experiments were carried out with a picosecond laser on wax resin and teeth. The accuracy of the system is satisfactory, achieving average linear errors of 0.06 mm for wax resin and 0.05 mm for dentin. The angle errors are 4.33° for wax resin and 0.5° for dentin, and the depth errors for both are within 0.1 mm. The ablation time is 1.5 hours for wax resin and 3.5 hours for dentin. The ablation experimental results show that the movement range and resolution of the robotic system meet the requirements of typical dental operations for tooth crown preparation, and that the errors of tooth shape and preparation angle satisfy the requirements of clinical crown preparation. Although the experimental results illustrate the potential of using picosecond lasers for 3D tooth crown preparation, many research issues still need to be studied before the system can be applied to clinical operations. © 2014 Wiley Periodicals, Inc.

  7. Integrating personalized medical test contents with XML and XSL-FO.

    PubMed

    Toddenroth, Dennis; Dugas, Martin; Frankewitsch, Thomas

    2011-03-01

    In 2004 the adoption of a modular curriculum at the medical faculty in Muenster led to the introduction of centralized examinations based on multiple-choice questions (MCQs). We report on how organizational challenges of realizing faculty-wide personalized tests were addressed by implementation of a specialized software module to automatically generate test sheets from individual test registrations and MCQ contents. Key steps of the presented method for preparing personalized test sheets are (1) the compilation of relevant item contents and graphical media from a relational database with database queries, (2) the creation of Extensible Markup Language (XML) intermediates, and (3) the transformation into paginated documents. Using an open-source print formatter, the software module consistently produced high-quality test sheets, while the blending of vectorized textual contents and pixel graphics resulted in efficient output file sizes. Concomitantly, the module permitted an individual randomization of item sequences to prevent illicit collusion. The automatic generation of personalized MCQ test sheets is feasible using freely available open source software libraries, and can be efficiently deployed on a faculty-wide scale.
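    The pipeline above, compiling item contents into an XML intermediate with a per-student randomized item sequence, can be sketched roughly as follows. Element names, the tuple layout of `questions`, and the seeding scheme are illustrative assumptions, not the paper's actual schema:

```python
import random
import xml.etree.ElementTree as ET

def build_test_sheet_xml(student_id, questions, seed):
    """Build the XML intermediate for one personalized test sheet,
    shuffling the item sequence per student to prevent collusion.
    `questions` is a list of (item_id, stem) tuples."""
    rng = random.Random(seed)              # reproducible per-student order
    order = list(questions)
    rng.shuffle(order)
    sheet = ET.Element("testsheet", attrib={"student": student_id})
    for item_id, stem in order:
        item = ET.SubElement(sheet, "item", attrib={"id": item_id})
        ET.SubElement(item, "stem").text = stem
    return ET.tostring(sheet, encoding="unicode")
```

    A real deployment would then push this intermediate through an XSL-FO stylesheet and a print formatter such as Apache FOP to obtain the paginated sheets.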

  8. Automated Deployment of Advanced Controls and Analytics in Buildings

    NASA Astrophysics Data System (ADS)

    Pritoni, Marco

    Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.

  9. WOLF; automatic typing program

    USGS Publications Warehouse

    Evenden, G.I.

    1982-01-01

    A FORTRAN IV program for the Hewlett-Packard 1000 series computer provides for automatic typing operations and can, when employed with the manufacturer's text editor, provide a system to greatly facilitate preparation of reports, letters and other text. The input text and embedded control data can perform nearly all of the functions of a typist. A few of the features available are centering, titles, footnotes, indentation, page numbering (including Roman numerals), automatic paragraphing, and two forms of tab operations. This documentation contains both user and technical description of the program.
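    Two of the listed features, Roman-numeral page numbering and line centering, are easy to illustrate. This is a minimal sketch of what such typing directives compute, not WOLF's FORTRAN implementation:

```python
def to_roman(n):
    """Convert a positive page number to a Roman numeral."""
    numerals = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
                (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
                (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in numerals:
        while n >= value:        # greedy subtraction of the largest value
            out.append(symbol)
            n -= value
    return "".join(out)

def center(text, width=60):
    """Centre a title line within a fixed page width, as a typing
    program's centering directive would."""
    pad = max(0, (width - len(text)) // 2)
    return " " * pad + text
```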

  10. Grinding Parts For Automatic Welding

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  11. Automated single-slide staining device. [in clinical bacteriology

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.; Mills, S. M.

    1975-01-01

    An automatic single-slide Gram staining device is described. A timer-actuated solenoid controls the dispensing of gentian violet, Gram iodine solution, decolorizer, and 1% aqueous safranin in proper sequence and for the time required for optimum staining. The amount of stain or reagent delivered is controlled by means of stopcocks below each solenoid. Used stains and reagents can be flushed automatically or manually. Smears Gram stained automatically are equal in quality to those prepared manually. The time to complete one Gram cycle is 4.80 min.

  12. A Method for the Seamlines Network Automatic Selection Based on Building Vector

    NASA Astrophysics Data System (ADS)

    Li, P.; Dong, Y.; Hu, Y.; Li, X.; Tan, P.

    2018-04-01

    In order to improve the efficiency of large-scale orthophoto production for cities, this paper presents a method for the automatic selection of a seamline network in large-scale orthophotos based on building vectors. Firstly, a simple model of each building is built by combining the building's vector, its height and the DEM, and the imaging area of the building on a single DOM is obtained. Then, the initial Voronoi network of the measurement area is automatically generated based on the positions of the bottoms of all images. Finally, the final seamline network is obtained by automatically optimizing all nodes and seamlines in the network based on the imaging areas of the buildings. The experimental results show that the proposed method can not only route the seamline network around buildings quickly, but also retain the Voronoi network's minimum-projection-distortion property, effectively solving the problem of automatic seamline network selection in orthophoto mosaicking.
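    The initial Voronoi partition can be illustrated with a discrete (grid-based) nearest-seed assignment; a minimal sketch in which seed points stand in for the per-image positions (the subsequent optimization against building footprints is not shown):

```python
import numpy as np

def discrete_voronoi(centers, height, width):
    """Assign every grid cell to its nearest seed point, yielding a
    discrete Voronoi partition of the measurement area.
    `centers` is a sequence of (row, col) seed positions."""
    rows, cols = np.mgrid[0:height, 0:width]
    grid = np.stack([rows, cols], axis=-1).astype(float)    # (H, W, 2)
    seeds = np.asarray(centers, dtype=float)                # (n, 2)
    # squared distance from every cell to every seed: shape (H, W, n)
    d2 = ((grid[:, :, None, :] - seeds[None, None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=-1)                               # label image
```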

  13. Towards native-state imaging in biological context in the electron microscope

    PubMed Central

    Weston, Anne E.; Armer, Hannah E. J.

    2009-01-01

    Modern cell biology is reliant on light and fluorescence microscopy for analysis of cells, tissues and protein localisation. However, these powerful techniques are ultimately limited in resolution by the wavelength of light. Electron microscopes offer much greater resolution due to the shorter effective wavelength of electrons, allowing direct imaging of sub-cellular architecture. The harsh environment of the electron microscope chamber and the properties of the electron beam have led to complex chemical and mechanical preparation techniques, which distance biological samples from their native state and complicate data interpretation. Here we describe recent advances in sample preparation and instrumentation, which push the boundaries of high-resolution imaging. Cryopreparation, cryoelectron microscopy and environmental scanning electron microscopy strive to image samples in near native state. Advances in correlative microscopy and markers enable high-resolution localisation of proteins. Innovation in microscope design has pushed the boundaries of resolution to atomic scale, whilst automatic acquisition of high-resolution electron microscopy data through large volumes is finally able to place ultrastructure in biological context. PMID:19916039

  14. Pilot-scale cooling tower to evaluate corrosion, scaling, and biofouling control strategies for cooling system makeup water.

    PubMed

    Chien, S H; Hsieh, M K; Li, H; Monnell, J; Dzombak, D; Vidic, R

    2012-02-01

    Pilot-scale cooling towers can be used to evaluate corrosion, scaling, and biofouling control strategies when using particular cooling system makeup water and particular operating conditions. To study the potential for using a number of different impaired waters as makeup water, a pilot-scale system capable of generating a 27,000 kJ/h heat load and maintaining recirculating water flow with a Reynolds number of 1.92 × 10^4 was designed to study these critical processes under conditions similar to full-scale systems. The pilot-scale cooling tower was equipped with an automatic makeup water control system, automatic blowdown control system, semi-automatic biocide feeding system, and corrosion, scaling, and biofouling monitoring systems. Observed operational data revealed that the major operating parameters, including temperature change (6.6 °C), cycles of concentration (N = 4.6), water flow velocity (0.66 m/s), and air mass velocity (3660 kg/(h·m^2)), were controlled quite well for an extended period of time (up to 2 months). Overall, the performance of the pilot-scale cooling towers using treated municipal wastewater was shown to be suitable for studying critical processes (corrosion, scaling, biofouling) and evaluating cooling water management strategies for makeup waters of complex quality.

  15. Spectral saliency via automatic adaptive amplitude spectrum analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
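    Amplitude-spectrum smoothing of the kind described above can be sketched as a spectral-residual-style computation. The box filter and fixed scale parameter here are simplifying assumptions, not the paper's automatic adaptive scale selection:

```python
import numpy as np

def spectral_saliency(image, k=3):
    """Saliency by smoothing the log-amplitude spectrum: repeated
    background patterns yield smooth amplitude spectra, so the
    residual after smoothing highlights salient regions.
    `k` is the box-filter half-width, i.e. the smoothing scale."""
    f = np.fft.fft2(image)
    log_amp = np.log1p(np.abs(f))
    phase = np.angle(f)
    # box-filter smoothing of the log-amplitude spectrum at scale k
    h, w = log_amp.shape
    padded = np.pad(log_amp, k, mode="wrap")
    smooth = np.zeros_like(log_amp)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            smooth += padded[k + dy:k + dy + h, k + dx:k + dx + w]
    smooth /= (2 * k + 1) ** 2
    residual = log_amp - smooth
    # transform the residual spectrum back to the spatial domain
    sal = np.abs(np.fft.ifft2(np.exp(residual) * np.exp(1j * phase))) ** 2
    return sal / sal.max()
```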

  16. Automatic Scaling of Digisonde Ionograms Computer Program and Numerical Analysis Documentation,

    DTIC Science & Technology

    1983-02-01

    Automatic scaling of Digisonde ionograms by the ARTIST system is discussed. The method is ideally suited to autoscaled results, and ARTIST output is written to a standard format. The autoscaling routine has been tested on some 8000 ionograms from Goose Bay, Labrador, for the months of January, April, July, and September of 1980, and reference is made to ARTIST's success in scaling these ionograms.

  17. Automatic Scoring of Paper-and-Pencil Figural Responses. Research Report.

    ERIC Educational Resources Information Center

    Martinez, Michael E.; And Others

    Large-scale testing is dominated by the multiple-choice question format. Widespread use of the format is due, in part, to the ease with which multiple-choice items can be scored automatically. This paper examines automatic scoring procedures for an alternative item type: figural response. Figural response items call for the completion or…

  18. [Development of a Software for Automatically Generated Contours in Eclipse TPS].

    PubMed

    Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin

    2015-03-01

    The automatic generation of planning target and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop a software tool for automatically generating contours in Eclipse TPS. The software is named Contour Auto Margin (CAM) and is composed of contour operation functions, script-generation visualization, and script file operations. Ten cases of different cancers were selected, and in Eclipse TPS 11.0 the scripts generated by the software could not only automatically generate contours but also perform contour post-processing. For the different cancers, there was no difference between automatically generated contours and manually created ones. CAM is a user-friendly and powerful tool that can generate contours quickly and automatically in Eclipse TPS 11.0. With the help of CAM, plan preparation time is greatly reduced and the working efficiency of radiation therapy physicists improved.

  19. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    PubMed

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
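    The core of heartbeat-parameter extraction, estimating the dominant beat frequency from a per-frame intensity trace, can be sketched as follows. This is a minimal frequency-domain sketch; the published method works on video and also handles beat-to-beat intervals and arrhythmicity, which this does not:

```python
import numpy as np

def heart_rate_bpm(trace, fps):
    """Estimate heartbeat frequency from a mean-intensity trace of a
    video (one sample per frame) by locating the dominant peak of the
    amplitude spectrum; returns beats per minute."""
    x = np.asarray(trace, float)
    x = x - x.mean()                       # drop the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    return 60.0 * freqs[spectrum.argmax()]
```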

  20. Method and apparatus for converting static in-ground vehicle scales into weigh-in-motion systems

    DOEpatents

    Muhs, Jeffrey D.; Scudiere, Matthew B.; Jordan, John K.

    2002-01-01

    An apparatus and method for converting in-ground static weighing scales for vehicles to weigh-in-motion systems. The apparatus upon conversion includes the existing in-ground static scale, peripheral switches and an electronic module for automatic computation of the weight. By monitoring the velocity, tire position, axle spacing, and real time output from existing static scales as a vehicle drives over the scales, the system determines when an axle of a vehicle is on the scale at a given time, monitors the combined weight output from any given axle combination on the scale(s) at any given time, and from these measurements automatically computes the weight of each individual axle and gross vehicle weight by an integration, integration approximation, and/or signal averaging technique.
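    The signal-averaging variant of the weight computation can be sketched as below. The switch-derived mask and single-axle assumption are simplifications of the patented system, which also handles overlapping axle combinations:

```python
import numpy as np

def axle_weight(signal, on_scale):
    """Signal-averaging estimate of one axle's static weight from a
    dynamic scale trace. `signal` is the scale output per sample and
    `on_scale` a boolean mask, derived from the peripheral switches,
    marking when the axle is fully on the platform."""
    s = np.asarray(signal, float)
    m = np.asarray(on_scale, bool)
    return s[m].mean()     # averaging suppresses dynamic oscillation

def gross_weight(axle_weights):
    """Gross vehicle weight is the sum of the individual axle weights."""
    return sum(axle_weights)
```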

  1. Research directions in large scale systems and decentralized control

    NASA Technical Reports Server (NTRS)

    Tenney, R. R.

    1980-01-01

    Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.

  2. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    PubMed

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data of different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates data integration of research data as well as metadata by performing necessary procedures automatically. Also, Toolbox modules allow the integration of device data. Moreover, separation of personally identifiable information and medical data by using only pseudonyms for storing medical data ensures compliance with data protection regulations. This pseudonymized data can then be exported in SPSS format in order to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and automatically installed due to the use of Docker technology.
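    The separation of personally identifiable information from medical data via pseudonyms can be sketched as below. Field names are illustrative and the keyed-hash pseudonym is an assumption, not the Toolbox's actual scheme:

```python
import hmac
import hashlib

def pseudonymize(record, secret, id_fields=("name", "birthdate")):
    """Split one study record into an identity part and a medical part
    linked only by a pseudonym, so medical data can be stored and
    analyzed without direct identifiers. The pseudonym is a keyed hash
    and cannot be reversed without the secret."""
    identity = {k: record[k] for k in id_fields}
    medical = {k: v for k, v in record.items() if k not in id_fields}
    token = hmac.new(secret,
                     "|".join(identity[k] for k in id_fields).encode(),
                     hashlib.sha256).hexdigest()[:16]
    return ({"pseudonym": token, **identity},
            {"pseudonym": token, **medical})
```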

  3. University of São Paulo Reasons for Smoking Scale: a new tool for the evaluation of smoking motivation.

    PubMed

    Souza, Elisa Sebba Tosta de; Crippa, José Alexandre de Souza; Pasian, Sonia Regina; Martinez, José Antônio Baddini

    2010-01-01

    To develop a new scale aimed at evaluating smoking motivation by incorporating questions and domains from the 68-item Wisconsin Inventory of Smoking Dependence Motives (WISDM-68) into the Modified Reasons for Smoking Scale (MRSS). Nine WISDM-68 questions regarding affiliative attachment, cue exposure/associative processes, and weight control were added to the 21 questions of the MRSS. The new scale, together with the Fagerström Test for Nicotine Dependence (FTND), was administered to 311 smokers (214 males; mean age = 37.6 ± 10.8 years; mean number of cigarettes smoked per day = 15.0 ± 9.2), who also provided additional information. We used exploratory factor analysis in order to determine the factor structure of the scale. The influence that certain clinical features had on the scores of the final factor solution was also analyzed. The factor analysis revealed a 21-question solution grouped into nine factors: addiction, pleasure from smoking, tension reduction, stimulation, automatism, handling, social smoking, weight control, and affiliative attachment. For the overall scale, the Cronbach's alpha coefficient was 0.83. Females scored significantly higher for addiction, tension reduction, handling, weight control, and affiliative attachment than did males. The FTND score correlated positively with addiction, tension reduction, stimulation, automatism, social smoking, and affiliative attachment. The number of cigarettes smoked per day was associated with addiction, tension reduction, stimulation, automatism, affiliative attachment, and handling. The level of exhaled CO correlated positively with addiction, automatism, and affiliative attachment. The new scale provides an acceptable framework of motivational factors for smoking, with satisfactory psychometric properties and reliability.
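    The reported reliability coefficient (Cronbach's alpha of 0.83) follows the standard formula, which can be computed from an item-score matrix as:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    x = np.asarray(items, float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)
```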

  4. Automatic approach to deriving fuzzy slope positions

    NASA Astrophysics Data System (ADS)

    Zhu, Liang-Jun; Zhu, A.-Xing; Qin, Cheng-Zhi; Liu, Jun-Zhi

    2018-03-01

    Fuzzy characterization of slope positions is important for geographic modeling. Most of the existing fuzzy classification-based methods for fuzzy characterization require extensive user intervention in data preparation and parameter setting, which is tedious and time-consuming. This paper presents an automatic approach to overcoming these limitations in the prototype-based inference method for deriving fuzzy membership values (or similarity) to slope positions. The key contribution is a procedure for finding the typical locations and setting the fuzzy inference parameters for each slope position type. Instead of being determined entirely by users, as in the prototype-based inference method, in the proposed approach the typical locations and fuzzy inference parameters for each slope position type are automatically determined by a rule set based on prior domain knowledge and the frequency distributions of topographic attributes. Furthermore, the preparation of topographic attributes (e.g., slope gradient, curvature, and relative position index) is automated, so the proposed automatic approach has only one necessary input, i.e., the gridded digital elevation model of the study area. All compute-intensive algorithms in the proposed approach were accelerated by parallel computing. Two study cases were provided to demonstrate that this approach can properly, conveniently and quickly derive fuzzy slope positions.
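    Prototype-based fuzzy membership can be sketched with a Gaussian-shaped similarity function. This is a minimal single-attribute sketch under assumed prototypes; the paper's rule set derives prototypes and parameters automatically from attribute frequency distributions:

```python
import math

def fuzzy_membership(value, prototype, width):
    """Similarity of a topographic attribute value (e.g. slope
    gradient) to a slope-position prototype, via a Gaussian-shaped
    membership function; `width` controls how quickly similarity
    decays away from the typical value."""
    return math.exp(-0.5 * ((value - prototype) / width) ** 2)

def classify(value, prototypes):
    """Hardened label: the slope position with the highest membership.
    `prototypes` maps position name -> (typical value, width)."""
    def score(name):
        proto, width = prototypes[name]
        return fuzzy_membership(value, proto, width)
    return max(prototypes, key=score)
```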

  5. Research Directory for Manpower, Personnel, Training, and Human Factors.

    DTIC Science & Technology

    1991-01-01

    Representative directory entries include research to enhance automatic recognition of speech in noisy, highly stressful environments (Lica Systems Inc), the Smart Contract Preparation Expediter (Human Engineering Lab), and impulse noise hazard information processing R&D, each listed with investigator and contact details.

  6. Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valencia, Jayson F.; Dirks, James A.

    2008-08-29

    EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation depending on the size of the building. To manually create these files is a time-consuming process that would not be practical when trying to create input files for thousands of buildings needed to simulate national building energy performance. To streamline the process needed to create the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine while the second method carries out all of the preprocessing on the Linux cluster by using an in-house built utility called Generalized Parametrics (GPARM). A comma delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called “make”, the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the result to national energy consumption estimates.
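    The CSV-to-idf expansion can be sketched with a template substitution. The template below is a hypothetical stand-in for the preprocessor's macro templates; real EnergyPlus idf objects are far richer:

```python
import csv
import io
from string import Template

# Hypothetical macro template standing in for a building template;
# $name and $north_axis are the high-level parameters.
IDF_TEMPLATE = Template(
    "Building,\n"
    "  $name,        !- Name\n"
    "  $north_axis;  !- North Axis\n"
)

def batch_idf(csv_text):
    """Expand one idf fragment per row of a comma-delimited file of
    high-level building parameters."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [IDF_TEMPLATE.substitute(row) for row in rows]
```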

  7. Challenges and Opportunities: One Stop Processing of Automatic Large-Scale Base Map Production Using Airborne LIDAR Data Within GIS Environment. Case Study: Makassar City, Indonesia

    NASA Astrophysics Data System (ADS)

    Widyaningrum, E.; Gorte, B. G. H.

    2017-05-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographic base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating large-scale topographic base map provision by the Geospatial Information Agency in Indonesia. As a progressive advanced technology, Geographic Information Systems (GIS) open possibilities for automatic geospatial data processing and analysis. Considering further needs for spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach for base map provision. The quality of the automated topographic base map is assessed and analysed based on its completeness, correctness, quality, and the confusion matrix.

  8. Floating-point scaling technique for sources separation automatic gain control

    NASA Astrophysics Data System (ADS)

    Fermas, A.; Belouchrani, A.; Ait-Mohamed, O.

    2012-07-01

    Based on the floating-point representation, and taking advantage of the scaling factor indeterminacy in blind source separation (BSS) processing, we propose a scaling technique applied to the separation matrix to avoid saturation or weakness in the recovered source signals. This technique performs automatic gain control in an on-line BSS environment. We demonstrate its effectiveness using the implementation of a division-free BSS algorithm with two inputs and two outputs. The proposed technique is computationally cheaper and more efficient for a hardware implementation than Euclidean normalisation.
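    The idea of power-of-two scaling can be sketched as below: because a power-of-two factor only adjusts the floating-point exponent, it needs no multiplier in hardware, and BSS's scale indeterminacy makes the rescaling harmless. This is an illustrative sketch, not the paper's division-free algorithm:

```python
import math

def scale_rows_pow2(matrix, target=1.0):
    """Rescale each row of a separation matrix by a power of two so its
    largest coefficient lands in [target/2, target), preventing both
    saturation and vanishing output levels."""
    scaled = []
    for row in matrix:
        peak = max(abs(c) for c in row)
        # frexp returns (m, e) with peak/target = m * 2**e, 0.5 <= m < 1,
        # so multiplying by 2**-e is a pure exponent adjustment
        _, e = math.frexp(peak / target)
        factor = 2.0 ** (-e)
        scaled.append([c * factor for c in row])
    return scaled
```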

  9. Integration and segregation of large-scale brain networks during short-term task automatization

    PubMed Central

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F.; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-01-01

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes. PMID:27808095

  10. Partial polygon pruning of hydrographic features in automated generalization

    USGS Publications Warehouse

    Stum, Alexander K.; Buttenfield, Barbara P.; Stanislawski, Larry V.

    2017-01-01

    This paper demonstrates a working method to automatically detect and prune portions of waterbody polygons to support creation of a multi-scale hydrographic database. Water features are known to be sensitive to scale change; and thus multiple representations are required to maintain visual and geographic logic at smaller scales. Partial pruning of polygonal features—such as long and sinuous reservoir arms, stream channels that are too narrow at the target scale, and islands that begin to coalesce—entails concurrent management of the length and width of polygonal features as well as integrating pruned polygons with other generalized point and linear hydrographic features to maintain stream network connectivity. The implementation follows data representation standards developed by the U.S. Geological Survey (USGS) for the National Hydrography Dataset (NHD). Portions of polygonal rivers, streams, and canals are automatically characterized for width, length, and connectivity. This paper describes an algorithm for automatic detection and subsequent processing, and shows results for a sample of NHD subbasins in different landscape conditions in the United States.

  11. Computer-assisted image processing to detect spores from the fungus Pandora neoaphidis.

    PubMed

    Korsnes, Reinert; Westrum, Karin; Fløistad, Erling; Klingen, Ingeborg

    2016-01-01

This contribution demonstrates an example of experimental automatic image analysis to detect spores prepared on microscope slides derived from trapping. The application is to monitor aerial spore counts of the entomopathogenic fungus Pandora neoaphidis, which may serve as a biological control agent for aphids. Automatic detection of such spores can therefore play a role in plant protection. The present approach for such detection is a modification of traditional manual microscopy of prepared slides, where autonomous image recording precedes computerised image analysis. The purpose of the present image analysis is to support human visual inspection of imagery data - not to replace it. The workflow has three components:
•Preparation of slides for microscopy.
•Image recording.
•Computerised image processing, where the initial part is, as usual, segmentation depending on the actual data product, followed by identification of blobs, calculation of the principal axes of the blobs, symmetry operations, and projection onto a three-parameter egg-shape space.
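
As a hedged illustration of the blob-analysis step above (identification of blobs and calculation of their principal axes), the NumPy sketch below derives a blob's principal axes from the eigen-decomposition of its pixel-coordinate covariance; the `principal_axes` helper and the synthetic mask are hypothetical, not the authors' code:

```python
import numpy as np

def principal_axes(mask):
    """Principal axes of a binary blob via eigen-decomposition of its
    (mean-centred) pixel-coordinate covariance matrix."""
    ys, xs = np.nonzero(mask)
    coords = np.stack([ys, xs], axis=1).astype(float)
    coords -= coords.mean(axis=0)
    cov = coords.T @ coords / len(coords)
    vals, vecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return vals, vecs

# Synthetic elongated blob: a 3-row by 11-column rectangle.
mask = np.zeros((20, 20), dtype=bool)
mask[8:11, 4:15] = True
vals, vecs = principal_axes(mask)
major = vecs[:, np.argmax(vals)]       # major-axis direction as (y, x)
print(abs(major[1]) > abs(major[0]))   # True: the blob is elongated along x
```

The eigenvector for the largest eigenvalue gives the blob's elongation direction; the ratio of eigenvalues is one simple shape descriptor that such a pipeline could feed into later classification.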

  12. Determination of vanadium(V) by direct automatic potentiometric titration with EDTA using a chemically modified electrode as a potentiometric sensor.

    PubMed

    Quintar, S E; Santagata, J P; Cortinez, V A

    2005-10-15

    A chemically modified electrode (CME) was prepared and studied as a potentiometric sensor for the end-point detection in the automatic titration of vanadium(V) with EDTA. The CME was constructed with a paste prepared by mixing spectral-grade graphite powder, Nujol oil and N-2-naphthoyl-N-p-tolylhydroxamic acid (NTHA). Buffer systems, pH effects and the concentration range were studied. Interference ions were separated by applying a liquid-liquid extraction procedure. The CME did not require any special conditioning before using. The electrode was constructed with very inexpensive materials and was easily made. It could be continuously used, at least two months without removing the paste. Automatic potentiometric titration curves were obtained for V(V) within 5 x 10(-5) to 2 x 10(-3)M with acceptable accuracy and precision. The developed method was applied to V(V) determination in alloys for hip prosthesis.

  13. PACE 2: Pricing and Cost Estimating Handbook

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.; Shepherd, T.

    1977-01-01

An automatic data processing system has been established for the preparation of industrial-engineering-type manhour and material cost estimates. This computer system has evolved into a highly versatile and flexible tool which significantly reduces computation time, eliminates computational errors, and reduces typing and reproduction time for estimators and pricers, since all mathematical and clerical functions are automatic once the basic inputs are derived.

  14. [The maintenance of automatic analysers and associated documentation].

    PubMed

    Adjidé, V; Fournier, P; Vassault, A

    2010-12-01

The maintenance of automatic analysers and the associated documentation form part of the requirements of the ISO 15189 Standard and of French regulation, and must be defined in the laboratory's policy. The management of periodic maintenance and its documentation shall be implemented and fulfilled. Corrective maintenance has to be organised so as to avoid interrupting the work of the laboratory. The recommendations concern the identification of materials, including automatic analysers; the environmental conditions to take into account; the documentation provided by the manufacturer; and the documents prepared by the laboratory, including maintenance procedures.

  15. Automatic alignment method for calibration of hydrometers

    NASA Astrophysics Data System (ADS)

    Lee, Y. J.; Chang, K. H.; Chon, J. C.; Oh, C. Y.

    2004-04-01

    This paper presents a new method to automatically align specific scale-marks for the calibration of hydrometers. A hydrometer calibration system adopting the new method consists of a vision system, a stepping motor, and software to control the system. The vision system is composed of a CCD camera and a frame grabber, and is used to acquire images. The stepping motor moves the camera, which is attached to the vessel containing a reference liquid, along the hydrometer. The operating program has two main functions: to process images from the camera to find the position of the horizontal plane and to control the stepping motor for the alignment of the horizontal plane with a particular scale-mark. Any system adopting this automatic alignment method is a convenient and precise means of calibrating a hydrometer. The performance of the proposed method is illustrated by comparing the calibration results using the automatic alignment method with those obtained using the manual method.
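
A minimal sketch of the image-processing idea above, locating a horizontal liquid surface as the image row with the strongest vertical intensity change. This is a generic illustration under assumed imaging conditions, not the published algorithm, and `find_horizontal_line` is a hypothetical helper:

```python
import numpy as np

def find_horizontal_line(img):
    """Return the row index of the strongest horizontal edge: the row where
    the summed absolute intensity difference to the next row is largest."""
    grad = np.abs(np.diff(img.astype(float), axis=0))  # adjacent-row differences
    return int(np.argmax(grad.sum(axis=1)))

# Synthetic camera frame: bright region above row 30, darker liquid below.
img = np.full((60, 80), 200.0)
img[30:] = 50.0
print(find_horizontal_line(img))  # 29: the edge lies between rows 29 and 30
```

In a real system the detected row would then be compared against the scale-mark position while the stepping motor adjusts the camera, closing the alignment loop.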

  16. When the brain changes its mind: Oscillatory dynamics of conflict processing and response switching in a flanker task during alcohol challenge.

    PubMed

    Beaton, Lauren E; Azma, Sheeva; Marinkovic, Ksenija

    2018-01-01

    Despite the subjective experience of being in full and deliberate control of our actions, our daily routines rely on a continuous and interactive engagement of sensory evaluation and response preparation streams. They unfold automatically and unconsciously and are seamlessly integrated with cognitive control which is mobilized by stimuli that evoke ambiguity or response conflict. Methods with high spatio-temporal sensitivity are needed to provide insight into the interplay between automatic and controlled processing. This study used anatomically-constrained MEG to examine the underlying neural dynamics in a flanker task that manipulated S-R incongruity at the stimulus (SI) and response levels (RI). Though irrelevant, flankers evoked automatic preparation of motor plans which had to be suppressed and reversed following the target presentation on RI trials. Event-related source power estimates in beta (15-25 Hz) frequency band in the sensorimotor cortex tracked motor preparation and response in real time and revealed switching from the incorrectly-primed to the correctly-responding hemisphere. In contrast, theta oscillations (4-7 Hz) were sensitive to the levels of incongruity as the medial and ventrolateral frontal cortices were especially activated by response conflict. These two areas are key to cognitive control and their integrated contributions to response inhibition and switching were revealed by phase-locked co-oscillations. These processes were pharmacologically manipulated with a moderate alcohol beverage or a placebo administered to healthy social drinkers. Alcohol selectively decreased accuracy to response conflict. It strongly attenuated theta oscillations during decision making and partly re-sculpted relative contributions of the frontal network without affecting the motor switching process subserved by beta band. 
Our results indicate that motor preparation is initiated automatically even when counterproductive but that it is monitored and regulated by the prefrontal cognitive control processes under conflict. They further confirm that the regulative top-down functions are particularly vulnerable to alcohol intoxication.


  18. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  19. Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.

    PubMed

    Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo

    2015-11-17

Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can expand without limit under undifferentiated conditions and be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we designed a new fully automated cell culture system for human iPS cell maintenance. Using the automated culture system, hiPS cells maintained their undifferentiated state for 60 days. Automatically prepared hiPS cells retained the potency to differentiate into cells of all three germ layers, including dopaminergic neurons and pancreatic cells.

  20. Development of satellite borne nickel hydrogen battery experiment equipment for ETS-6

    NASA Astrophysics Data System (ADS)

    Kuwashima, Saburou; Kamimori, Norimitsu; Kusawake, Hiroaki; Takahashi, Kazumichi

    1992-08-01

An overview is presented of the support rendered to the Engineering Test Satellite-6 (ETS-6) system integration test and protoflight test by the development group for the ETS-6 experimental nickel hydrogen battery. Articles in the ETS-6 specifications and procedures related to the experimental battery were prepared, or their preparation was supported, because of the battery's special characteristics, such as the dependency of its automatic control on the bus voltage and a thermal sensitivity equivalent to that of the other batteries. System tests were witnessed and the acquired data were evaluated. Charging characteristics from 0 V were verified at the trickle charging rate, using a flight-scale model of the Nickel Hydrogen (Ni-H2) Battery (NHB) after long-term storage and an engineering model of the Ni-H2 Battery Controller (NHC). Requests for approval were submitted to the relevant self-governing bodies in accordance with the Explosives Control Law when the NHBs were charged and discharged. Installation and calibration-data acquisition of the internal pressure sensors of the Ni-H2 battery cells for the flight-model NHB were conducted, and assembly of the battery was started.

  1. Proceedings of the International Conference on Stiff Computation, April 12-14, 1982, Park City, Utah. Volume I.

    DTIC Science & Technology

    1982-01-01

physical reasoning and based on computational experience with similar equations. There is another non-automatic way: through proper scaling of all...1979) for an automatic scheme for this scaling on a digital computer. Shampine (1980) reports a special definition of stiffness appropriate for...an analog for a laboratory that typically already has a digital computer. The digital is much more versatile. Also there does not yet exist "software

  2. Chem Ed Compacts

    ERIC Educational Resources Information Center

    Wolf, Walter A., Ed.

    1977-01-01

    Presents classroom and laboratory teaching and demonstration ideas, including a demonstration of optical rotation, automatic potentiometric titration, preparation of radioactive lead, and an organic lab practical in library resources. (SL)

  3. Very Large Scale Integrated Circuits for Military Systems.

    DTIC Science & Technology

    1981-01-01

ABBREVIATIONS: A/D Analog-to-digital; AGC Automatic Gain Control; A/J Anti-jam; ASP Advanced Signal Processor; AU Arithmetic Units; CAD Computer-Aided...ESM) equipment (Ref. 23); in lieu of an adequate automatic processing capability, the function is now performed manually (Ref. 24), which involves...a human operator, displays, etc., and a sacrifice in performance (acquisition speed, saturation signal density). Various automatic processing

  4. Use of automatic door closers improves fire safety.

    PubMed

    Waterman, T E

    1979-01-01

In a series of 16 full-scale fire tests, investigators at the IIT Research Institute concluded that automatic door control in the room of fire origin can significantly reduce the spread of toxic smoke and gases. The researchers also investigated the effects of sprinkler actuation and the functional relationship between sprinklers and automatic door closers. This report presents the results of the study and offers recommendations for health-care facilities.

  5. An eigenvalue approach for the automatic scaling of unknowns in model-based reconstructions: Application to real-time phase-contrast flow MRI.

    PubMed

    Tan, Zhengguo; Hohage, Thorsten; Kalentev, Oleksandr; Joseph, Arun A; Wang, Xiaoqing; Voit, Dirk; Merboldt, K Dietmar; Frahm, Jens

    2017-12-01

    The purpose of this work is to develop an automatic method for the scaling of unknowns in model-based nonlinear inverse reconstructions and to evaluate its application to real-time phase-contrast (RT-PC) flow magnetic resonance imaging (MRI). Model-based MRI reconstructions of parametric maps which describe a physical or physiological function require the solution of a nonlinear inverse problem, because the list of unknowns in the extended MRI signal equation comprises multiple functional parameters and all coil sensitivity profiles. Iterative solutions therefore rely on an appropriate scaling of unknowns to numerically balance partial derivatives and regularization terms. The scaling of unknowns emerges as a self-adjoint and positive-definite matrix which is expressible by its maximal eigenvalue and solved by power iterations. The proposed method is applied to RT-PC flow MRI based on highly undersampled acquisitions. Experimental validations include numerical phantoms providing ground truth and a wide range of human studies in the ascending aorta, carotid arteries, deep veins during muscular exercise and cerebrospinal fluid during deep respiration. For RT-PC flow MRI, model-based reconstructions with automatic scaling not only offer velocity maps with high spatiotemporal acuity and much reduced phase noise, but also ensure fast convergence as well as accurate and precise velocities for all conditions tested, i.e. for different velocity ranges, vessel sizes and the simultaneous presence of signals with velocity aliasing. In summary, the proposed automatic scaling of unknowns in model-based MRI reconstructions yields quantitatively reliable velocities for RT-PC flow MRI in various experimental scenarios. Copyright © 2017 John Wiley & Sons, Ltd.
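
The scaling step described above, finding the maximal eigenvalue of a self-adjoint positive-definite matrix by power iterations, can be sketched generically as follows. This is an illustration of the standard power method, not the authors' reconstruction code:

```python
import numpy as np

def max_eigenvalue(A, iters=500, tol=1e-12):
    """Largest eigenvalue of a symmetric positive-definite matrix A by power
    iteration, read off as the Rayleigh quotient of the iterated vector."""
    v = np.random.default_rng(0).standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = v @ A @ v
    for _ in range(iters):
        w = A @ v
        v = w / np.linalg.norm(w)
        lam_new = v @ A @ v
        if abs(lam_new - lam) < tol:  # converged
            return lam_new
        lam = lam_new
    return lam

# Symmetric positive-definite test matrix with known spectrum {1, 2, 5}.
Q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))
A = Q @ np.diag([1.0, 2.0, 5.0]) @ Q.T
print(round(max_eigenvalue(A), 6))  # 5.0
```

Power iteration converges at a rate set by the ratio of the two largest eigenvalues, which makes it cheap enough to rerun whenever the scaling matrix changes between reconstructions.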

  6. Automatic three-dimensional measurement of large-scale structure based on vision metrology.

    PubMed

    Zhu, Zhaokun; Guan, Banglei; Zhang, Xiaohu; Li, Daokui; Yu, Qifeng

    2014-01-01

    All relevant key techniques involved in photogrammetric vision metrology for fully automatic 3D measurement of large-scale structure are studied. A new kind of coded target consisting of circular retroreflective discs is designed, and corresponding detection and recognition algorithms based on blob detection and clustering are presented. Then a three-stage strategy starting with view clustering is proposed to achieve automatic network orientation. As for matching of noncoded targets, the concept of matching path is proposed, and matches for each noncoded target are found by determination of the optimal matching path, based on a novel voting strategy, among all possible ones. Experiments on a fixed keel of airship have been conducted to verify the effectiveness and measuring accuracy of the proposed methods.

  7. Large-scale high-resolution non-invasive geophysical archaeological prospection for the investigation of entire archaeological landscapes

    NASA Astrophysics Data System (ADS)

    Trinks, Immo; Neubauer, Wolfgang; Hinterleitner, Alois; Kucera, Matthias; Löcker, Klaus; Nau, Erich; Wallner, Mario; Gabler, Manuel; Zitz, Thomas

    2014-05-01

Over the past three years, the Ludwig Boltzmann Institute for Archaeological Prospection and Virtual Archaeology (http://archpro.lbg.ac.at), founded in Vienna in 2010, has in collaboration with its ten European partner organizations made considerable progress in the development and application of near-surface geophysical survey technology and methodology, mapping square kilometres rather than hectares at unprecedented spatial resolution. The use of multiple novel motorized multichannel GPR and magnetometer systems (both Förster/Fluxgate and Cesium type), in combination with advanced, centimetre-precise positioning systems (robotic total stations and real-time kinematic GPS) permitting efficient navigation in open fields, has resulted in comprehensive blanket-coverage archaeological prospection surveys of important cultural heritage sites, such as the landscape surrounding Stonehenge in the framework of the Stonehenge Hidden Landscape Project, the mapping of the World Cultural Heritage site Birka-Hovgården in Sweden, and the detailed investigation of the Roman urban landscape of Carnuntum near Vienna. Efficient state-of-the-art archaeological prospection survey solutions require adequate fieldwork methodologies and appropriate data processing tools for timely quality control of the data in the field and large-scale data visualisations after arrival back in the office. The processed and optimized visualisations of the geophysical measurement data provide the basis for subsequent archaeological interpretation. Integration of the high-resolution geophysical prospection data with remote sensing data acquired through aerial photography, airborne laser and hyperspectral scanning, terrestrial laser scanning, or detailed digital terrain models derived through photogrammetric methods permits improved understanding and spatial analysis, as well as the preparation of comprehensible presentations for the stakeholders (scientific community, cultural heritage managers, public). 
Of paramount importance for large-scale high-resolution data acquisition with motorized survey systems are exact data positioning and the removal of any measurement effects caused by the survey vehicle. The large amount of generated data requires efficient semi-automatic and automated tools for the extraction and rendering of the important information. Semi-automatic data segmentation and classification precede the detailed 3D archaeological interpretation, which still requires considerable manual input. We present the latest technological and methodological developments in motorized near-surface GPR and magnetometer prospection, as well as application examples from different iconic European archaeological sites.

  8. 10 CFR Appendix O to Part 110 - Illustrative List of Fuel Element Fabrication Plant Equipment and Components Under NRC's Export...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... performance and safety during reactor operation. Also, in all cases precise control of processes, procedures... performance. (a) Items that are considered especially designed or prepared for the fabrication of fuel... pellets; (2) Automatic welding machines especially designed or prepared for welding end caps onto the fuel...

  9. Back-and-Forth Methodology for Objective Voice Quality Assessment: From/to Expert Knowledge to/from Automatic Classification of Dysphonia

    NASA Astrophysics Data System (ADS)

    Fredouille, Corinne; Pouchoulin, Gilles; Ghio, Alain; Revis, Joana; Bonastre, Jean-François; Giovanni, Antoine

    2009-12-01

This paper addresses voice disorder assessment. It proposes an original back-and-forth methodology involving an automatic classification system as well as the knowledge of human experts (machine learning experts, phoneticians, and pathologists). The goal of this methodology is to bring a better understanding of the acoustic phenomena related to dysphonia. The automatic system was validated on a dysphonic corpus (80 female voices) rated according to the GRBAS perceptual scale by an expert jury. Firstly, focusing on the frequency domain, the classification system demonstrated the relevance of the 0-3000 Hz frequency band for the classification task based on the GRBAS scale. Subsequently, an automatic phonemic analysis underlined the significance of consonants and, more surprisingly, of unvoiced consonants for the same task. Submitted to the human experts, these observations led to a manual analysis of unvoiced plosives, which highlighted a lengthening of voice onset time (VOT) with dysphonia severity, validated by a preliminary statistical analysis.

  10. MatchGUI: A Graphical MATLAB-Based Tool for Automatic Image Co-Registration

    NASA Technical Reports Server (NTRS)

    Ansar, Adnan I.

    2011-01-01

MatchGUI software, based on MATLAB, automatically matches two images and displays the match result by superimposing one image on the other. A slider bar allows focus to shift between the two images. There are tools for zoom, auto-crop to the overlap region, and basic image markup. Given a pair of ortho-rectified images (focused primarily on Mars orbital imagery for now), this software automatically co-registers the imagery so that corresponding image pixels are aligned. MatchGUI requires minimal user input and performs a registration over scale and in-plane rotation fully automatically.

  11. Automatic recognition and analysis of synapses. [in brain tissue

    NASA Technical Reports Server (NTRS)

    Ungerleider, J. A.; Ledley, R. S.; Bloom, F. E.

    1976-01-01

    An automatic system for recognizing synaptic junctions would allow analysis of large samples of tissue for the possible classification of specific well-defined sets of synapses based upon structural morphometric indices. In this paper the three steps of our system are described: (1) cytochemical tissue preparation to allow easy recognition of the synaptic junctions; (2) transmitting the tissue information to a computer; and (3) analyzing each field to recognize the synapses and make measurements on them.

  12. A new method of automatic landmark tagging for shape model construction via local curvature scale

    NASA Astrophysics Data System (ADS)

    Rueda, Sylvia; Udupa, Jayaram K.; Bai, Li

    2008-03-01

    Segmentation of organs in medical images is a difficult task requiring very often the use of model-based approaches. To build the model, we need an annotated training set of shape examples with correspondences indicated among shapes. Manual positioning of landmarks is a tedious, time-consuming, and error prone task, and almost impossible in the 3D space. To overcome some of these drawbacks, we devised an automatic method based on the notion of c-scale, a new local scale concept. For each boundary element b, the arc length of the largest homogeneous curvature region connected to b is estimated as well as the orientation of the tangent at b. With this shape description method, we can automatically locate mathematical landmarks selected at different levels of detail. The method avoids the use of landmarks for the generation of the mean shape. The selection of landmarks on the mean shape is done automatically using the c-scale method. Then, these landmarks are propagated to each shape in the training set, defining this way the correspondences among the shapes. Altogether 12 strategies are described along these lines. The methods are evaluated on 40 MRI foot data sets, the object of interest being the talus bone. The results show that, for the same number of landmarks, the proposed methods are more compact than manual and equally spaced annotations. The approach is applicable to spaces of any dimensionality, although we have focused in this paper on 2D shapes.
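
One plausible reading of the c-scale computation above, the arc length of the largest contiguous boundary region around an element with homogeneous curvature, is sketched below. The discrete turning-angle curvature estimate, the tolerance `tol`, and the region-growing rule are assumptions for illustration, not the authors' definition:

```python
import numpy as np

def c_scale(points, idx, tol=0.2):
    """Arc length of the largest contiguous region of a closed contour around
    points[idx] whose curvature stays within `tol` of the curvature at idx."""
    n = len(points)
    curv = np.empty(n)  # turning angle per unit length at each vertex
    seg = np.empty(n)   # length of the segment leaving each vertex
    for i in range(n):
        v1 = points[i] - points[i - 1]
        v2 = points[(i + 1) % n] - points[i]
        turn = (np.arctan2(v2[1], v2[0]) - np.arctan2(v1[1], v1[0]) + np.pi) % (2 * np.pi) - np.pi
        seg[i] = np.linalg.norm(v2)
        curv[i] = turn / max(seg[i], 1e-12)
    # Grow the homogeneous-curvature region outward from idx in both directions.
    region = {idx}
    j = (idx + 1) % n
    while j != idx and abs(curv[j] - curv[idx]) <= tol:
        region.add(j)
        j = (j + 1) % n
    j = (idx - 1) % n
    while j not in region and abs(curv[j] - curv[idx]) <= tol:
        region.add(j)
        j = (j - 1) % n
    return sum(seg[i] for i in region)

# On a circle the curvature is constant, so the region spans the whole contour.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
print(round(c_scale(circle, 0), 2))  # 6.28, close to the perimeter 2*pi
```

Boundary elements on long smooth arcs thus get large c-scale values while corners get small ones, which is what makes the measure usable for picking landmarks at different levels of detail.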

  13. Apparatus enables automatic microanalysis of body fluids

    NASA Technical Reports Server (NTRS)

    Soffen, G. A.; Stuart, J. L.

    1966-01-01

    Apparatus will automatically and quantitatively determine body fluid constituents which are amenable to analysis by fluorometry or colorimetry. The results of the tests are displayed as percentages of full scale deflection on a strip-chart recorder. The apparatus can also be adapted for microanalysis of various other fluids.

  14. Decreased Load on General Motor Preparation and Visual-Working Memory while Preparing Familiar as Compared to Unfamiliar Movement Sequences

    ERIC Educational Resources Information Center

    De Kleine, Elian; Van der Lubbe, Rob H. J.

    2011-01-01

    Learning movement sequences is thought to develop from an initial controlled attentive phase to a more automatic inattentive phase. Furthermore, execution of sequences becomes faster with practice, which may result from changes at a general motor processing level rather than at an effector specific motor processing level. In the current study, we…

  15. Recent Research on the Automated Mass Measuring System

    NASA Astrophysics Data System (ADS)

    Yao, Hong; Ren, Xiao-Ping; Wang, Jian; Zhong, Rui-Lin; Ding, Jing-An

This paper reviews the development of robotic mass measurement systems, introduces representative automatic systems, and then discusses a sub-multiple calibration scheme adopted on the fully automatic CCR10 system. The robotic system can perform the dissemination of the mass scale without any manual intervention, as well as fast calibration of weight samples against a reference weight. Finally, an evaluation of the expanded uncertainty is given.

  16. Application of Magnetic Nanoparticles in Pretreatment Device for POPs Analysis in Water

    NASA Astrophysics Data System (ADS)

    Chu, Dongzhi; Kong, Xiangfeng; Wu, Bingwei; Fan, Pingping; Cao, Xuan; Zhang, Ting

    2018-01-01

To reduce the processing time and labour of POPs pretreatment, and to solve the problem of extraction columns clogging easily, this paper proposes a new extraction and enrichment technique based on magnetic nanoparticles. The automatic pretreatment system comprises an automatic sampling unit, an extraction/enrichment unit, and an elution/enrichment unit. The paper briefly introduces the preparation of the magnetic nanoparticles and describes in detail the structure and control system of the automatic pretreatment system. Mass-recovery experiments with the magnetic nanoparticles showed that the system is capable of POPs analysis preprocessing, with a nanoparticle recovery rate above 70%. In conclusion, three optimization recommendations are proposed.

  17. Quantitative Analysis of Heavy Metals in Water Based on LIBS with an Automatic Device for Sample Preparation

    NASA Astrophysics Data System (ADS)

    Hu, Li; Zhao, Nanjing; Liu, Wenqing; Meng, Deshuo; Fang, Li; Wang, Yin; Yu, Yang; Ma, Mingjun

    2015-08-01

Heavy metals in water can be deposited on graphite flakes, which can be used as an enrichment method for laser-induced breakdown spectroscopy (LIBS) and is studied in this paper. The graphite samples were prepared with an automatic device composed of a loading and unloading module, a quantitative solution-adding module, a rapid heating and drying module, and a precise rotating module. The experimental results showed that the sample preparation method had no significant effect on sample distribution, and the LIBS signal accumulated over 20 pulses was stable and repeatable. With an increasing amount of sample solution on the graphite flake, the peak intensity at Cu I 324.75 nm followed an exponential function with a correlation coefficient of 0.9963, while the background intensity remained unchanged. The limit of detection (LOD) was calculated through linear fitting of the peak intensity versus the concentration. The LOD decreased rapidly with an increasing amount of sample solution until the amount exceeded 20 mL, and the correlation coefficient of the exponential fit was 0.991. The LOD of Pb, Ni, Cd, Cr and Zn after evaporating different amounts of sample solution on the graphite flakes was measured, and the variation of their LOD with the amount of sample solution was similar to that of Cu. The experimental data and conclusions could provide a reference for automatic sample preparation and in situ detection of heavy metals. Supported by the National Natural Science Foundation of China (No. 60908018), the National High Technology Research and Development Program of China (No. 2013AA065502) and the Anhui Province Outstanding Youth Science Fund of China (No. 1108085J19).
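
The LOD-from-linear-fit step can be sketched with the common 3-sigma convention, LOD = 3·σ_background/slope. Note that the 3-sigma criterion and all numbers below are assumptions for illustration; the abstract does not give the exact formula or calibration data:

```python
import numpy as np

# Hypothetical calibration data: peak intensity (a.u.) versus concentration (mg/L).
conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
intensity = np.array([10.2, 30.1, 49.8, 90.5, 170.3])
sigma_bg = 2.0  # standard deviation of repeated background measurements (assumed)

# Least-squares linear fit: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc, intensity, 1)

# Common 3-sigma convention: LOD = 3 * sigma_background / slope
lod = 3.0 * sigma_bg / slope
print(f"slope = {slope:.3f} a.u. per mg/L, LOD = {lod:.3f} mg/L")
```

A steeper calibration slope (stronger enrichment, here from evaporating more sample solution onto the flake) directly lowers the LOD, which matches the trend reported in the abstract.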

  18. Automatic Real Time Ionogram Scaler with True Height Analysis - Artist

    DTIC Science & Technology

    1983-07-01

scaled. The corresponding autoscaled values were compared with the manually scaled h'F, h'F2, fminF, foE, foEs, h'E and h'Es. The ARTIST program... Scientific Report No. 7: Automatic Real Time Ionogram Scaler with True Height Analysis - ARTIST

  19. Self-aggregation in scaled principal component space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Chris H.Q.; He, Xiaofeng; Zha, Hongyuan

    2001-10-05

    Automatic grouping of voluminous data into meaningful structures is a challenging task frequently encountered in broad areas of science, engineering and information processing. These data clustering tasks are frequently performed in Euclidean space or a subspace chosen from principal component analysis (PCA). Here we describe a space obtained by a nonlinear scaling of PCA in which data objects self-aggregate automatically into clusters. Projection into this space gives sharp distinctions among clusters. Gene expression profiles of cancer tissue subtypes, Web hyperlink structure and Internet newsgroups are analyzed to illustrate interesting properties of the space.
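
    A minimal numerical sketch of the idea, assuming a simple unit-norm scaling of the PCA projection as the nonlinear step (one possible choice; the paper's exact transform may differ): objects from the same cluster collapse onto a common direction on the unit sphere.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated synthetic clusters (stand-ins for e.g. expression profiles).
a = rng.normal(loc=+5.0, scale=0.5, size=(50, 10))
b = rng.normal(loc=-5.0, scale=0.5, size=(50, 10))
x = np.vstack([a, b])

# PCA via SVD on the centered data.
xc = x - x.mean(axis=0)
u, s, vt = np.linalg.svd(xc, full_matrices=False)
proj = xc @ vt[:2].T  # projection onto the top-2 principal subspace

# Nonlinear scaling: map each object to the unit sphere in PCA space, so
# cluster members self-aggregate around a common direction and the
# distinction between clusters sharpens.
scaled = proj / np.linalg.norm(proj, axis=1, keepdims=True)
```

    After scaling, the two groups sit at nearly antipodal points of the sphere, so any simple partitioning (sign of the first component, k-means, etc.) recovers the clusters.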

  20. 21 CFR 876.5630 - Peritoneal dialysis system and accessories.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... peritoneal dialysis, a source of dialysate, and, in some cases, a water purification mechanism. After the...”) or dialysate prepared from dialysate concentrate and sterile purified water (for automatic peritoneal...

  1. 21 CFR 876.5630 - Peritoneal dialysis system and accessories.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... peritoneal dialysis, a source of dialysate, and, in some cases, a water purification mechanism. After the...”) or dialysate prepared from dialysate concentrate and sterile purified water (for automatic peritoneal...

  2. 21 CFR 876.5630 - Peritoneal dialysis system and accessories.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... peritoneal dialysis, a source of dialysate, and, in some cases, a water purification mechanism. After the...”) or dialysate prepared from dialysate concentrate and sterile purified water (for automatic peritoneal...

  3. 21 CFR 876.5630 - Peritoneal dialysis system and accessories.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... peritoneal dialysis, a source of dialysate, and, in some cases, a water purification mechanism. After the...”) or dialysate prepared from dialysate concentrate and sterile purified water (for automatic peritoneal...

  4. 21 CFR 876.5630 - Peritoneal dialysis system and accessories.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... peritoneal dialysis, a source of dialysate, and, in some cases, a water purification mechanism. After the...”) or dialysate prepared from dialysate concentrate and sterile purified water (for automatic peritoneal...

  5. Automatic delineation and 3D visualization of the human ventricular system using probabilistic neural networks

    NASA Astrophysics Data System (ADS)

    Hatfield, Fraser N.; Dehmeshki, Jamshid

    1998-09-01

    Neurosurgery is an extremely specialized area of medical practice, requiring many years of training. It has been suggested that virtual reality models of the complex structures within the brain may aid in the training of neurosurgeons as well as playing an important role in the preparation for surgery. This paper focuses on the application of a probabilistic neural network to the automatic segmentation of the ventricles from magnetic resonance images of the brain, and their three dimensional visualization.

  6. AdjScales: Visualizing Differences between Adjectives for Language Learners

    NASA Astrophysics Data System (ADS)

    Sheinman, Vera; Tokunaga, Takenobu

    In this study we introduce AdjScales, a method for scaling similar adjectives by their strength. It combines existing Web-based computational linguistic techniques to automatically differentiate between similar adjectives that describe the same property at different strengths. Though this kind of information is rarely present in lexical resources and dictionaries, it may be useful for language learners who try to distinguish between similar words. Additionally, learners might gain from a simple visualization of these differences using unidimensional scales. The method is evaluated by comparison with annotations of a subset of adjectives from WordNet by four native English speakers. It is also compared against two non-native speakers of English. The collected annotation is an interesting resource in its own right. This work is a first step toward automatic differentiation of meaning between similar words for language learners. AdjScales can be useful for lexical resource enhancement.

  7. Automatic analysis of stereoscopic satellite image pairs for determination of cloud-top height and structure

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Strong, J.; Woodward, R. H.; Pierce, H.

    1991-01-01

    Results are presented on an automatic stereo analysis of cloud-top heights from nearly simultaneous satellite image pairs from the GOES and NOAA satellites, using a massively parallel processor computer. Comparisons of computer-derived height fields and manually analyzed fields show that the automatic analysis technique shows promise for performing routine stereo analysis in a real-time environment, providing a useful forecasting tool by augmenting observational data sets of severe thunderstorms and hurricanes. Simulations using synthetic stereo data show that it is possible to automatically resolve small-scale features such as 4000-m-diam clouds to about 1500 m in the vertical.
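
    The core geometric relation behind stereo cloud-top height retrieval can be sketched with a simplified flat-earth model (this is an illustrative simplification, not the paper's massively parallel matching algorithm): the apparent displacement of a feature between the two satellite views is proportional to its height.

```python
import math

def cloud_top_height(parallax_m, view_zenith1_deg, view_zenith2_deg):
    """Height of a cloud feature from its apparent horizontal displacement
    (parallax) between two satellites at different view zenith angles.
    Simplified flat-earth geometry: parallax = h * (tan z1 - tan z2)."""
    t = (math.tan(math.radians(view_zenith1_deg))
         - math.tan(math.radians(view_zenith2_deg)))
    return parallax_m / t

# Hypothetical feature displaced 12 km between a 45-degree and a nadir view.
h = cloud_top_height(parallax_m=12000.0, view_zenith1_deg=45.0,
                     view_zenith2_deg=0.0)
```

    The automatic part of the analysis is then a matter of measuring the parallax by image correlation for every cloud feature in the pair.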

  8. New Data on the Topside Electron Density Distribution

    NASA Technical Reports Server (NTRS)

    Huang, Xue-Qin; Reinisch, Bodo; Bilitza, Dieter; Benson, Robert F.

    2001-01-01

    The existing uncertainties about the electron density profiles in the topside ionosphere, i.e., in the height region from hmF2 to approx. 2000 km, require the search for new data sources. The ISIS and Alouette topside sounder satellites from the sixties to the eighties recorded millions of ionograms and most were not analyzed in terms of electron density profiles. In recent years an effort started to digitize the analog recordings to prepare the ionograms for computerized analysis. As of November 2001 about 350,000 ionograms have been digitized from the original 7-track analog tapes. These data are available in binary and CDF format from the anonymous ftp site of the National Space Science Data Center. A search site and browse capabilities on CDAWeb assist the scientific usage of these data. All information and access links can be found at http://nssdc.gsfc.nasa.gov/space/isis/isis-status.html. This paper describes the ISIS data restoration effort and shows how the digital ionograms are automatically processed into electron density profiles from satellite orbit altitude (1400 km for ISIS-2) down to the F peak. Because of the large volume of data an automated processing algorithm is imperative. The automatic topside ionogram scaler with true height algorithm (TOPIST) software developed for this task is successfully scaling approx. 70% of the ionograms. An 'editing process' is available to manually scale the more difficult ionograms. The automated processing of the digitized ISIS ionograms is now underway, producing a much-needed database of topside electron density profiles for ionospheric modeling covering more than one solar cycle. The ISIS data restoration efforts are supported through NASA's Applied Systems and Information Research Program.

  9. Experience in connecting the power generating units of thermal power plants to automatic secondary frequency regulation within the united power system of Russia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhukov, A. V.; Komarov, A. N.; Safronov, A. N.

    The principles of central control of the power generating units of thermal power plants by automatic secondary frequency and active power overcurrent regulation systems, and the algorithms for interactions between automatic power control systems for the power production units in thermal power plants and centralized systems for automatic frequency and power regulation, are discussed. The order of switching the power generating units of thermal power plants over to control by a centralized system for automatic frequency and power regulation and by the Central Coordinating System for automatic frequency and power regulation is presented. The results of full-scale system tests of the control of power generating units of the Kirishskaya, Stavropol, and Perm GRES (State Regional Electric Power Plants) by the Central Coordinating System for automatic frequency and power regulation at the United Power System of Russia on September 23-25, 2008, are reported.

  10. [The effects of interpretation bias for social events and automatic thoughts on social anxiety].

    PubMed

    Aizawa, Naoki

    2015-08-01

    Many studies have demonstrated that individuals with social anxiety interpret ambiguous social situations negatively. It is, however, not clear whether the interpretation bias discriminatively contributes to social anxiety in comparison with depressive automatic thoughts. The present study investigated the effects of negative interpretation bias and automatic thoughts on social anxiety. The Social Intent Interpretation-Questionnaire, which measures the tendency to interpret ambiguous social events as implying other's rejective intents, the short Japanese version of the Automatic Thoughts Questionnaire-Revised, and the Anthropophobic Tendency Scale were administered to 317 university students. Covariance structure analysis indicated that both rejective intent interpretation bias and negative automatic thoughts contributed to mental distress in social situations mediated by a sense of powerlessness and excessive concern about self and others in social situations. Positive automatic thoughts reduced mental distress. These results indicate the importance of interpretation bias and negative automatic thoughts in the development and maintenance of social anxiety. Implications for understanding of the cognitive features of social anxiety were discussed.

  11. Automatic counting and classification of bacterial colonies using hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Detection and counting of bacterial colonies on agar plates is a routine microbiology practice to get a rough estimate of the number of viable cells in a sample. There have been a variety of different automatic colony counting systems and software algorithms mainly based on color or gray-scale pictu...

  12. Automatic detection of Parkinson's disease in running speech spoken in three different languages.

    PubMed

    Orozco-Arroyave, J R; Hönig, F; Arias-Londoño, J D; Vargas-Bonilla, J F; Daqrouq, K; Skodda, S; Rusz, J; Nöth, E

    2016-01-01

    The aim of this study is the analysis of continuous speech signals of people with Parkinson's disease (PD) considering recordings in different languages (Spanish, German, and Czech). A method for the characterization of the speech signals, based on the automatic segmentation of utterances into voiced and unvoiced frames, is addressed here. The energy content of the unvoiced sounds is modeled using 12 Mel-frequency cepstral coefficients and 25 bands scaled according to the Bark scale. Four speech tasks comprising isolated words, rapid repetition of the syllables /pa/-/ta/-/ka/, sentences, and read texts are evaluated. The method proves to be more accurate than classical approaches in the automatic classification of speech of people with PD and healthy controls. The accuracies range from 85% to 99% depending on the language and the speech task. Cross-language experiments are also performed confirming the robustness and generalization capability of the method, with accuracies ranging from 60% to 99%. This work comprises a step forward for the development of computer aided tools for the automatic assessment of dysarthric speech signals in multiple languages.
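
    The first step of the pipeline, segmenting utterances into voiced and unvoiced frames, can be illustrated with a naive short-time energy / zero-crossing-rate rule (a simplified stand-in for the paper's segmentation; frame length and thresholds here are arbitrary assumptions): unvoiced, noise-like frames show low energy and a high zero-crossing rate.

```python
import numpy as np

def unvoiced_frames(signal, frame_len=256, energy_thresh=0.01, zcr_thresh=0.3):
    """Mark each non-overlapping frame as unvoiced (True) when its
    short-time energy is low and its zero-crossing rate is high."""
    flags = []
    for i in range(0, len(signal) - frame_len + 1, frame_len):
        f = signal[i:i + frame_len]
        energy = float(np.mean(f ** 2))
        zcr = float(np.mean(np.abs(np.diff(np.sign(f))) > 0))
        flags.append(energy < energy_thresh and zcr > zcr_thresh)
    return flags

# Synthetic test signal: a quiet noise burst (unvoiced-like) followed by a
# strong low-frequency tone (voiced-like).
rng = np.random.default_rng(1)
sig = np.concatenate([0.005 * rng.standard_normal(256),
                      np.sin(2 * np.pi * 4 * np.arange(256) / 256)])
frames = unvoiced_frames(sig)
```

    In the paper, the energy content of the frames flagged unvoiced is then modeled with Mel-frequency cepstral coefficients and Bark-scaled band energies.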

  13. The evolution and devolution of cognitive control: The costs of deliberation in a competitive world

    PubMed Central

    Tomlin, Damon; Rand, David G.; Ludvig, Elliot A.; Cohen, Jonathan D.

    2015-01-01

    Dual-system theories of human cognition, under which fast automatic processes can complement or compete with slower deliberative processes, have not typically been incorporated into larger scale population models used in evolutionary biology, macroeconomics, or sociology. However, doing so may reveal important phenomena at the population level. Here, we introduce a novel model of the evolution of dual-system agents using a resource-consumption paradigm. By simulating agents with the capacity for both automatic and controlled processing, we illustrate how controlled processing may not always be selected over rigid, but rapid, automatic processing. Furthermore, even when controlled processing is advantageous, frequency-dependent effects may exist whereby the spread of control within the population undermines this advantage. As a result, the level of controlled processing in the population can oscillate persistently, or even go extinct in the long run. Our model illustrates how dual-system psychology can be incorporated into population-level evolutionary models, and how such a framework can be used to examine the dynamics of interaction between automatic and controlled processing that transpire over an evolutionary time scale. PMID:26078086

  14. The evolution and devolution of cognitive control: The costs of deliberation in a competitive world.

    PubMed

    Tomlin, Damon; Rand, David G; Ludvig, Elliot A; Cohen, Jonathan D

    2015-06-16

    Dual-system theories of human cognition, under which fast automatic processes can complement or compete with slower deliberative processes, have not typically been incorporated into larger scale population models used in evolutionary biology, macroeconomics, or sociology. However, doing so may reveal important phenomena at the population level. Here, we introduce a novel model of the evolution of dual-system agents using a resource-consumption paradigm. By simulating agents with the capacity for both automatic and controlled processing, we illustrate how controlled processing may not always be selected over rigid, but rapid, automatic processing. Furthermore, even when controlled processing is advantageous, frequency-dependent effects may exist whereby the spread of control within the population undermines this advantage. As a result, the level of controlled processing in the population can oscillate persistently, or even go extinct in the long run. Our model illustrates how dual-system psychology can be incorporated into population-level evolutionary models, and how such a framework can be used to examine the dynamics of interaction between automatic and controlled processing that transpire over an evolutionary time scale.
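
    The frequency-dependent effect described above can be illustrated with a toy replicator-dynamics model (the payoff constants are illustrative assumptions, not parameters from the paper): controlled processing pays more when rare, but its advantage erodes as it spreads through the population.

```python
def replicator_step(p, f_auto=1.0, base_ctrl=1.3, crowding=0.6):
    """One replicator-dynamics update for the fraction p of 'controlled'
    agents. Automatic agents earn a fixed payoff; the controlled payoff
    declines as controlled processing spreads (frequency dependence)."""
    f_ctrl = base_ctrl - crowding * p
    mean_fitness = p * f_ctrl + (1 - p) * f_auto
    return p * f_ctrl / mean_fitness

# Starting from a rare controlled minority, the population settles at the
# mixed equilibrium where the two payoffs are equal (p = 0.5 here).
p = 0.1
for _ in range(200):
    p = replicator_step(p)
```

    With these parameters the dynamics converge to a stable mix; other parameter choices (or added stochasticity) can produce the persistent oscillations or extinction of control mentioned in the abstract.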

  15. ADMAP (automatic data manipulation program)

    NASA Technical Reports Server (NTRS)

    Mann, F. I.

    1971-01-01

    Instructions are presented on the use of ADMAP (automatic data manipulation program), an aerospace data manipulation computer program. The program was developed to aid in processing, reducing, plotting, and publishing electric propulsion trajectory data generated by the low thrust optimization program, HILTOP. The program has the option of generating SC4020 electric plots, and therefore requires the SC4020 routines to be available at execution time (even if not used). Several general routines are present, including a cubic spline interpolation routine, electric plotter dash line drawing routine, and single parameter and double parameter sorting routines. Many routines are tailored for the manipulation and plotting of electric propulsion data, including an automatic scale selection routine, an automatic curve labelling routine, and an automatic graph titling routine. Data are accepted from either punched cards or magnetic tape.
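
    Automatic scale selection routines of the kind ADMAP provides typically snap axis limits to "nice" steps of 1, 2 or 5 times a power of ten; the sketch below is a generic version of that idea, not ADMAP's actual routine.

```python
import math

def nice_scale(lo, hi, target_ticks=5):
    """Choose rounded axis limits and a tick step of 1, 2 or 5 times a
    power of ten that yields roughly target_ticks tick marks."""
    raw = (hi - lo) / max(target_ticks - 1, 1)
    mag = 10 ** math.floor(math.log10(raw))
    for m in (1, 2, 5, 10):
        if m * mag >= raw:
            step = m * mag
            break
    lo_nice = math.floor(lo / step) * step
    hi_nice = math.ceil(hi / step) * step
    return lo_nice, hi_nice, step

# Data spanning 0..73 gets the readable axis 0..80 with ticks every 20.
lo, hi, step = nice_scale(0.0, 73.0)
```

    The same rounding logic underlies automatic scaling in most plotting libraries to this day.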

  16. Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS

    NASA Astrophysics Data System (ADS)

    Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.

    2015-12-01

    Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
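
    The effect of restoring perviousness on peak flow can be quantified even with the textbook rational method, Q = C·i·A (a standard first-order estimate, not the paper's GIS-based model); all values below are hypothetical.

```python
def peak_runoff_m3s(runoff_coeff, intensity_mm_hr, area_ha):
    """Rational-method peak discharge Q = C * i * A: C is the dimensionless
    runoff coefficient, i the rainfall intensity in mm/h, A the drainage
    area in hectares; the result is in cubic metres per second."""
    return runoff_coeff * (intensity_mm_hr / 1000.0 / 3600.0) * (area_ha * 10000.0)

# Hypothetical 10 ha city block in a 50 mm/h storm, before and after
# green-infrastructure retrofits lower the runoff coefficient.
q_paved = peak_runoff_m3s(0.9, 50.0, 10.0)  # mostly impervious
q_green = peak_runoff_m3s(0.4, 50.0, 10.0)  # partially restored perviousness
```

    GIS-derived flow paths refine where that water goes; the coefficient change alone already cuts the peak discharge by more than half in this example.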

  17. Reduction of ammonia and volatile organic compounds from food waste-composting facilities using a novel anti-clogging biofilter system.

    PubMed

    Ryu, Hee Wook; Cho, Kyung-Suk; Lee, Tae-Ho

    2011-04-01

    The performance of a pilot-scale anti-clogging biofilter system (ABS) was evaluated over a period of 125 days for treating ammonia and volatile organic compounds emitted from a full-scale food waste-composting facility. The pilot-scale ABS was designed to intermittently and automatically remove excess biomass using an agitator. When the pressure drop in the polyurethane filter bed increased to a set point (50 mm H(2)O m(-1)) due to excess biomass accumulation, the agitator was automatically activated by the differential pressure switch, without biofilter shutdown. A high removal efficiency (97-99%) was stably maintained for the 125 days after an acclimation period of 1 week, even though the inlet gas concentrations fluctuated from 0.16 to 0.55 g m(-3). Due to the intermittent automatic agitation of the filter bed, the biomass concentration and pressure drop in the biofilter were maintained within the ranges of 1.1-2.0 g-DCW g PU(-1) and below 50 mm H(2)O m(-1), respectively. Copyright © 2011 Elsevier Ltd. All rights reserved.
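
    The differential-pressure control logic described above can be sketched as a simple loop; the biomass-to-head-loss relation and the growth/removal rates below are hypothetical stand-ins, only the 50 mm H2O m(-1) set point comes from the abstract.

```python
SET_POINT = 50.0  # mm H2O per m of bed, as in the pilot-scale ABS

def pressure_drop(biomass):
    # Hypothetical monotone relation between attached biomass and head loss.
    return 20.0 + 15.0 * biomass

def run_cycle(biomass, growth=0.2, removal=0.5):
    """One control step: biomass grows; if the differential-pressure switch
    trips at the set point, the agitator sheds excess biomass while the
    biofilter stays online (no shutdown)."""
    biomass += growth
    agitated = pressure_drop(biomass) >= SET_POINT
    if agitated:
        biomass = max(biomass - removal, 0.0)
    return biomass, agitated
```

    Repeating this cycle keeps both the biomass load and the pressure drop oscillating below the set point, which is the behaviour the pilot test observed.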

  18. Breaking the Language Barrier

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Preparation for the Apollo Soyuz mission entailed large-scale informational exchange that was accomplished by a computerized translation system. Based on this technology of commercial machine translation, a system known as SYSTRAN II was developed by LATSEC, Inc. and the World Translation Company of Canada. This system increases the output of a human translator by five to eight times, affording cost savings by allowing a large increase in document production without hiring additional people. Extra savings accrue from automatic production of camera-ready copy. Applications include translation of service manuals, proposals and tenders, planning studies, catalogs, list of parts and prices, textbooks, technical reports and education/training materials. System is operational for six language pairs. Systran users include Xerox Corporation, General Motors of Canada, Bell Northern Research of Canada, the U.S. Air Force and the European Commission. The company responsible for the production of SYSTRAN II has changed its name to SYSTRAN.

  19. Observation of three-dimensional internal structure of steel materials by means of serial sectioning with ultrasonic elliptical vibration cutting.

    PubMed

    Fujisaki, K; Yokota, H; Nakatsuchi, H; Yamagata, Y; Nishikawa, T; Udagawa, T; Makinouchi, A

    2010-01-01

    A three-dimensional (3D) internal structure observation system based on serial sectioning was developed from an ultrasonic elliptical vibration cutting device and an optical microscope combined with a high-precision positioning device. For bearing steel samples, the cutting device created mirrored surfaces suitable for optical metallography, even for long cutting distances during serial sectioning of these ferrous materials. Serial sectioning progressed automatically by means of numerical control. The system was used to observe inclusions in steel materials on a scale of several tens of micrometers. Three specimens containing inclusions were prepared from bearing steels. These inclusions could be detected as two-dimensional (2D) sectional images with resolution better than 1 μm. A 3D model of each inclusion was reconstructed from the 2D serial images. The microscopic 3D models had sharp edges and complicated surfaces.

  20. Forest Clearcutting and Site Preparation on a Saline Soil in East Texas: Impacts on Water Quality

    Treesearch

    Matthew McBroom; Mingteh Chang; Alexander K. Sayok

    2002-01-01

    Three 0.02 hectare plot-watersheds were installed on a saline soil in the Davy Crockett National Forest near Apple Springs, Texas. Each plot was installed with an H-flume, FW-1 automatic water level recorder, Coshocton N-1 runoff sampler, and two storage tanks. One watershed was undisturbed forested and served a control, one was clearcut without any site-preparation,...

  1. Psychometric Properties of the Children's Automatic Thoughts Scale (CATS) in Chinese Adolescents.

    PubMed

    Sun, Ling; Rapee, Ronald M; Tao, Xuan; Yan, Yulei; Wang, Shanshan; Xu, Wei; Wang, Jianping

    2015-08-01

    The Children's Automatic Thoughts Scale (CATS) is a 40-item self-report questionnaire designed to measure children's negative thoughts. This study examined the psychometric properties of the Chinese translation of the CATS. Participants included 1,993 students (average age = 14.73) from three schools in Mainland China. A subsample of the participants was retested after 4 weeks. Confirmatory factor analysis replicated the original structure with four first-order factors loading on a single higher-order factor. The convergent and divergent validity of the CATS were good. The CATS demonstrated high internal consistency and test-retest reliability. Boys scored higher on the CATS-hostility subscale, but there were no other gender differences. Older adolescents (15-18 years) reported higher scores than younger adolescents (12-14 years) on the total score and on the physical threat, social threat, and hostility subscales. The CATS proved to be a reliable and valid measure of automatic thoughts in Chinese adolescents.
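
    The "high internal consistency" reported for the CATS is conventionally quantified with Cronbach's alpha; a minimal implementation of that statistic (the formula is standard, the toy data below are illustrative, not CATS responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Internal consistency of a k-item scale:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    `items` is a (respondents x items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

# Degenerate check: three perfectly correlated items give alpha = 1.
col = np.array([1.0, 2.0, 3.0, 4.0])
alpha = cronbach_alpha(np.column_stack([col, col, col]))
```

    On real questionnaire data alpha falls between 0 and 1, and values around 0.9, as typically reported for the 40-item CATS total score, indicate high internal consistency.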

  2. THE APPLICATION OF ENGLISH-WORD MORPHOLOGY TO AUTOMATIC INDEXING AND EXTRACTING. ANNUAL SUMMARY REPORT.

    ERIC Educational Resources Information Center

    DOLBY, J.L.; AND OTHERS

    THE STUDY IS CONCERNED WITH THE LINGUISTIC PROBLEM INVOLVED IN TEXT COMPRESSION--EXTRACTING, INDEXING, AND THE AUTOMATIC CREATION OF SPECIAL-PURPOSE CITATION DICTIONARIES. IN SPITE OF EARLY SUCCESS IN USING LARGE-SCALE COMPUTERS TO AUTOMATE CERTAIN HUMAN TASKS, THESE PROBLEMS REMAIN AMONG THE MOST DIFFICULT TO SOLVE. ESSENTIALLY, THE PROBLEM IS TO…

  3. Design Support System for Coloring Illustrations by Using the Colors Preferred by a User as Determined from the Hue Patterns of Illustrations Prepared by that User

    NASA Astrophysics Data System (ADS)

    Fukai, Hironobu; Mitsukura, Yasue

    We propose a new design support system that can color illustrations according to a person's color preferences, which are determined on the basis of the color patterns of illustrations prepared by that person. Recently, many design tools for promoting free design have been developed. However, preferences for various colors differ depending on individual personality. Therefore, a system that can automatically color various designs on the basis of human preference is required. In this study, we propose an automatic modeling system that can be used to color illustrations. To verify the effectiveness of the proposed system, we simulate a coloring design experiment to determine the color patterns preferred by several subjects using various design data. From the design data, we determine each subject's preferred color pattern and feed these individual color patterns back to the proposed system.

  4. A photodiode based on PbS nanocrystallites for FYTRONIX solar panel automatic tracking controller

    NASA Astrophysics Data System (ADS)

    Wageh, S.; Farooq, W. A.; Tataroğlu, A.; Dere, A.; Al-Sehemi, Abdullah G.; Al-Ghamdi, Ahmed A.; Yakuphanoglu, F.

    2017-12-01

    The structural, optical and photoelectrical properties of the fabricated Al/PbS/p-Si/Al photodiode based on PbS nanocrystallites were investigated. The PbS nanocrystallites were characterized by X-ray diffraction (XRD), UV-VIS-NIR, infrared and Raman spectroscopy. The XRD peaks show that the prepared PbS nanostructure is highly crystalline. Various electrical parameters of the photodiode were extracted from its I-V and C-V-G characteristics. The photodiode has a high rectification ratio of 5.85×10⁴ in the dark at ±4 V. Moreover, the photocurrent results indicate strong photovoltaic behavior. The frequency dependence of the capacitance and conductance characteristics was attributed to the depletion-region behavior of the photodiode. The diode was used to control a dual-axis solar panel automatic tracking controller: the fabricated photodiode works as the photosensor that steers the solar tracking system.

  5. Automatic assembly of micro-optical components

    NASA Astrophysics Data System (ADS)

    Gengenbach, Ulrich K.

    1996-12-01

    Automatic assembly becomes an important issue as hybrid micro systems enter industrial fabrication. Moving from a laboratory scale production with manual assembly and bonding processes to automatic assembly requires a thorough re- evaluation of the design, the characteristics of the individual components and of the processes involved. Parts supply for automatic operation, sensitive and intelligent grippers adapted to size, surface and material properties of the microcomponents gain importance when the superior sensory and handling skills of a human are to be replaced by a machine. This holds in particular for the automatic assembly of micro-optical components. The paper outlines these issues exemplified at the automatic assembly of a micro-optical duplexer consisting of a micro-optical bench fabricated by the LIGA technique, two spherical lenses, a wavelength filter and an optical fiber. Spherical lenses, wavelength filter and optical fiber are supplied by third party vendors, which raises the question of parts supply for automatic assembly. The bonding processes for these components include press fit and adhesive bonding. The prototype assembly system with all relevant components e.g. handling system, parts supply, grippers and control is described. Results of first automatic assembly tests are presented.

  6. Automatic, semi-automatic and manual validation of urban drainage data.

    PubMed

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step in detecting the wrong, anomaly data is called the data quality assessment or data validation. Data validation consists of three parts: data preparation, validation scores generation and scores interpretation. This paper will present the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, which is the scores interpretation, needs to be further investigated on the developed system.
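
    The "validation scores generation" step can be sketched with two classic checks, a range test and a rate-of-change (spike) test (illustrative thresholds; the paper evaluates several such methods on Belgrade sewer data):

```python
def validation_scores(series, lo, hi, max_jump):
    """Per-sample quality scores for a sensor time series: 1.0 when the
    value passes both the range check and the rate-of-change check,
    0.0 when it fails either (candidate anomaly for removal/interpolation)."""
    scores = []
    prev = None
    for v in series:
        ok = lo <= v <= hi and (prev is None or abs(v - prev) <= max_jump)
        scores.append(1.0 if ok else 0.0)
        prev = v
    return scores

# A spike of 9.9 fails the range check; the following sample fails the
# rate-of-change check relative to the spike.
scores = validation_scores([0.2, 0.25, 9.9, 0.3], lo=0.0, hi=5.0, max_jump=1.0)
```

    In the full framework such scores from several methods are combined and then interpreted, automatically or by an expert, before bad data are replaced.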

  7. Comparison of the efficiency between two sampling plans for aflatoxins analysis in maize

    PubMed Central

    Mallmann, Adriano Olnei; Marchioro, Alexandro; Oliveira, Maurício Schneider; Rauber, Ricardo Hummes; Dilkin, Paulo; Mallmann, Carlos Augusto

    2014-01-01

    Variance and performance of two sampling plans for aflatoxin quantification in maize were evaluated. Eight lots of maize were sampled using two plans: manual, using a sampling spear for kernels; and automatic, using a continuous flow to collect milled maize. Total variance and sampling, preparation, and analysis variance were determined and compared between plans through multifactor analysis of variance. Four theoretical distribution models were used to compare aflatoxin quantification distributions in eight maize lots. The acceptance and rejection probabilities for a lot under a certain aflatoxin concentration were determined using the variance and the information on the selected distribution model to build the operating characteristic (OC) curves. Sampling and total variance were lower in the automatic plan. The OC curve from the automatic plan reduced both consumer and producer risks in comparison to the manual plan. The automatic plan is more efficient than the manual one because it more accurately reflects the real aflatoxin contamination in maize. PMID:24948911
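
    A point on an operating characteristic curve can be computed directly once the plan's total variance is known; the sketch below assumes the measured concentration is normally distributed around the true value (a common simplification; the paper fits four candidate distribution models, and the numbers here are illustrative).

```python
import math

def p_accept(true_conc, limit, total_sd):
    """OC-curve point: probability that a lot with the given true aflatoxin
    concentration is accepted (measured value <= regulatory limit), with
    measurement error Normal(0, total_sd) covering sampling, preparation
    and analysis variance."""
    z = (limit - true_conc) / total_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical limit of 20 ug/kg: a lot exactly at the limit is accepted
# half the time; a clean lot at 10 ug/kg is almost always accepted.
p_at_limit = p_accept(20.0, 20.0, 5.0)
p_clean = p_accept(10.0, 20.0, 5.0)
```

    A smaller total standard deviation, as achieved by the automatic plan, steepens the curve around the limit, which is exactly the simultaneous reduction of consumer and producer risks reported above.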

  8. Machine Beats Experts: Automatic Discovery of Skill Models for Data-Driven Online Course Refinement

    ERIC Educational Resources Information Center

    Matsuda, Noboru; Furukawa, Tadanobu; Bier, Norman; Faloutsos, Christos

    2015-01-01

    How can we automatically determine which skills must be mastered for the successful completion of an online course? Large-scale online courses (e.g., MOOCs) often contain a broad range of contents frequently intended to be a semester's worth of materials; this breadth often makes it difficult to articulate an accurate set of skills and knowledge…

  9. An Algorithm for Automatic Checking of Exercises in a Dynamic Geometry System: iGeom

    ERIC Educational Resources Information Center

    Isotani, Seiji; de Oliveira Brandao, Leonidas

    2008-01-01

    One of the key issues in e-learning environments is the possibility of creating and evaluating exercises. However, the lack of tools supporting the authoring and automatic checking of exercises for specifics topics (e.g., geometry) drastically reduces advantages in the use of e-learning environments on a larger scale, as usually happens in Brazil.…

  10. [A wavelet-transform-based method for the automatic detection of late-type stars].

    PubMed

    Liu, Zhong-tian; Zhao, Rrui-zhen; Zhao, Yong-heng; Wu, Fu-chao

    2005-07-01

    The LAMOST project, the world's largest sky-survey project, urgently needs an automatic late-type star detection system. However, to our knowledge, no effective methods for automatic late-type star detection have been reported in the literature up to now. The present study is intended to explore possible ways to deal with this issue. Here, by "late-type stars" we mean those stars with strong molecular absorption bands, including oxygen-rich M, L and T type stars and carbon-rich C stars. Based on experimental results, the authors find that, after a wavelet transform with 5 scales on late-type star spectra, the frequency spectrum of the transformed coefficients on the 5th scale consistently manifests a unimodal distribution, with the energy of the frequency spectrum largely concentrated in a small neighborhood centered on the unique peak. For the spectra of other celestial bodies, however, the corresponding frequency spectrum is multimodal and its energy is dispersed. Based on this finding, the authors present a wavelet-transform-based automatic late-type star detection method. The proposed method is shown by extensive experiments to be practical and of good robustness.
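
    The detection criterion can be sketched numerically. The version below substitutes a Haar-style cascade of pairwise averages for the paper's 5-scale wavelet transform and measures how much spectral energy sits near the peak (the transform choice, window width and test signal are illustrative assumptions):

```python
import numpy as np

def coarse_scale_spectrum(signal, levels=5):
    """Approximate the level-5 coarse coefficients with a Haar-style
    cascade of pairwise averages, then return the magnitude spectrum."""
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        n = len(a) - len(a) % 2
        a = 0.5 * (a[:n:2] + a[1:n:2])
    return np.abs(np.fft.rfft(a - a.mean()))

def energy_concentration(spec, half_width=2):
    """Share of spectral energy in a small neighbourhood of the peak;
    values near 1 indicate a unimodal, concentrated spectrum."""
    p = int(np.argmax(spec))
    lo, hi = max(p - half_width, 0), p + half_width + 1
    total = (spec ** 2).sum()
    return (spec[lo:hi] ** 2).sum() / total if total > 0 else 0.0

# A smooth broad-band modulation (molecular-band-like) concentrates its
# coarse-scale energy in one peak.
s = np.sin(2 * np.pi * np.arange(1024) / 1024)
c = energy_concentration(coarse_scale_spectrum(s))
```

    Thresholding such a concentration measure separates spectra with one dominant coarse-scale mode from the multimodal, dispersed spectra of other objects.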

  11. A new fast scanning system for the measurement of large angle tracks in nuclear emulsions

    NASA Astrophysics Data System (ADS)

    Alexandrov, A.; Buonaura, A.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; Di Crescenzo, A.; Di Marco, N.; Galati, G.; Lauria, A.; Montesi, M. C.; Pupilli, F.; Shchedrina, T.; Tioukov, V.; Vladymyrov, M.

    2015-11-01

Nuclear emulsions have been widely used in particle physics to identify new particles through the observation of their decays, thanks to their unique spatial resolution. Nevertheless, before the advent of automatic scanning systems, emulsion analysis was very demanding in terms of well-trained manpower. For this reason, emulsions were gradually replaced by electronic detectors until the '90s, when automatic microscopes started to be developed in Japan and in Europe. Automatic scanning was essential to conceive large-scale emulsion-based neutrino experiments like CHORUS, DONUT and OPERA. Standard scanning systems were initially designed to recognize tracks within a limited angular acceptance (θ ≲ 30°), where θ is the track angle with respect to a line perpendicular to the emulsion plane. In this paper we describe the implementation of a novel fast automatic scanning system aimed at extending track recognition to the full angular range and improving the present scanning speed. Indeed, nuclear emulsions have no intrinsic limit on detecting particle direction. This improvement opens new perspectives for using nuclear emulsions in several fields beyond large-scale neutrino experiments, such as muon radiography, medical applications and directional dark matter detection.

  12. Stereophotogrammetry in studies of riparian vegetation dynamics

    NASA Astrophysics Data System (ADS)

    Hortobagyi, Borbala; Vautier, Franck; Corenblit, Dov; Steiger, Johannes

    2014-05-01

    Riparian vegetation responds to hydrogeomorphic disturbances and also controls sediment deposition and erosion. Spatio-temporal riparian vegetation dynamics within fluvial corridors have been quantified in many studies using aerial photographs and GIS. However, this approach does not allow the consideration of woody vegetation growth rates (i.e. vertical dimension) which are fundamental when studying feedbacks between the processes of fluvial landform construction and vegetation establishment and succession. We built 3D photogrammetric models of vegetation height based on aerial argentic and digital photographs from sites of the Allier and Garonne Rivers (France). The models were realized at two different spatial scales and with two different methods. The "large" scale corresponds to the reach of the river corridor on the Allier river (photograph taken in 2009) and the "small" scale to river bars of the Allier (photographs taken in 2002, 2009) and Garonne Rivers (photographs taken in 2000, 2002, 2006 and 2010). At the corridor scale, we generated vegetation height models using an automatic procedure. This method is fast but can only be used with digital photographs. At the bar scale, we constructed the models manually using a 3D visualization on the screen. This technique showed good results for digital and also argentic photographs but is very time-consuming. A diachronic study was performed in order to investigate vegetation succession by distinguishing three different classes according to the vegetation height: herbs (<1 m), shrubs (1-4 m) or trees (>4 m). Both methods, i.e. automatic and manual, were employed to study the evolution of the three vegetation classes and the recruitment of new vegetation patches. A comparison was conducted between the vegetation height given by models (automatic and manual) and the vegetation height measured in the field. 
The manually produced models (small scale) were of a precision of 0.5-1 m, allowing the quantification of woody vegetation growth rates. Thus, our results show that the manual method we developed is accurate to quantify vegetation growth rates at small scales, whereas the less accurate automatic method is appropriate to study vegetation succession at the corridor scale. Both methods are complementary and will contribute to a further exploration of the mutual relationships between hydrogeomorphic processes, topography and vegetation dynamics within alluvial systems, adding the quantification of the vertical dimension of riparian vegetation to their spatio-temporal characteristics.
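The three height classes used in the diachronic study map directly onto a small classifier. A sketch follows; the handling of the exact 1 m and 4 m boundaries is an assumption, since the abstract does not specify it.

```python
def vegetation_class(height_m):
    """Classify riparian vegetation by modelled height, following the three
    classes used in the study: herbs (<1 m), shrubs (1-4 m), trees (>4 m)."""
    if height_m < 1.0:
        return "herbs"
    if height_m <= 4.0:
        return "shrubs"
    return "trees"
```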

  13. Design of an automatic weight scale for an isolette

    NASA Technical Reports Server (NTRS)

    Peterka, R. J.; Griffin, W.

    1974-01-01

The design of an infant weight scale is reported that fits into an isolette without disturbing its controlled atmosphere. The scale platform uses strain gauges to electronically measure the deflections of cantilever beams positioned at its four corners. The weight of the infant is proportional to the sum of the output voltages produced by the gauges on each beam of the scale.
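The weighing principle (weight proportional to the sum of the four beam outputs, regardless of where the load sits) reduces to a one-line computation. The calibration gain and tare handling below are illustrative assumptions, not part of the reported design.

```python
def infant_weight(corner_voltages, gain, tare_voltage=0.0):
    """Estimate weight from the four cantilever-beam strain-gauge voltages.

    The platform design makes weight proportional to the voltage sum,
    independent of the infant's position on the platform.
    """
    if len(corner_voltages) != 4:
        raise ValueError("expected one voltage per corner beam")
    return gain * (sum(corner_voltages) - tare_voltage)
```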

  14. Prototype Implementation of Web and Desktop Applications for ALMA Science Verification Data and the Lessons Learned

    NASA Astrophysics Data System (ADS)

    Eguchi, S.; Kawasaki, W.; Shirasaki, Y.; Komiya, Y.; Kosugi, G.; Ohishi, M.; Mizumoto, Y.

    2013-10-01

ALMA is estimated to generate TB-scale data in a single observation; astronomers need to identify which part of the data they are really interested in. We have been developing new GUI software for this purpose utilizing the VO interface: the ALMA Web Quick Look System (ALMAWebQL) and the ALMA Desktop Application (Vissage). The former is written in JavaScript and HTML5 generated from Java code by the Google Web Toolkit, and the latter is in pure Java. An essential point of our approach is reducing network traffic: we prepare, in advance, “compressed” FITS files of 2 × 2 × 1 (horizontal, vertical, and spectral directions, respectively) binning, 2 × 2 × 2 binning, 4 × 4 × 2 binning, and so on. These files are hidden from users, and WebQL automatically chooses the proper one for each user operation. Through this work, we found that network traffic in our system is still a bottleneck for TB-scale data distribution; hence we have to develop alternative data containers for much faster data processing. In this paper, we introduce our data analysis systems and describe what we learned through their development.
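The pre-binned "pyramid" of FITS cubes can be emulated with a block average, plus a rule that picks the coarsest level still matching the viewport. The shapes, level set, and selection rule below are illustrative assumptions, not the ALMAWebQL internals.

```python
import numpy as np

def bin_cube(cube, bx, by, bz):
    """Average-bin a (z, y, x) data cube by integer factors (crops remainders)."""
    z, y, x = cube.shape
    c = cube[:z - z % bz, :y - y % by, :x - x % bx]
    return c.reshape(z // bz, bz, y // by, by, x // bx, bx).mean(axis=(1, 3, 5))

def choose_level(full_shape, view_pixels, levels):
    """Pick the coarsest (bx, by, bz) binning whose binned image still covers
    the on-screen viewport; fall back to the unbinned data."""
    for bx, by, bz in sorted(levels, key=lambda t: t[0] * t[1] * t[2],
                             reverse=True):
        if (full_shape[2] // bx >= view_pixels
                and full_shape[1] // by >= view_pixels):
            return (bx, by, bz)
    return (1, 1, 1)
```

A server following this scheme would store each binned cube once and serve only the level returned by `choose_level` for the current zoom.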

  15. Automatic Generation of English-Japanese Translation Pattern Utilizing Genetic Programming Technique

    NASA Astrophysics Data System (ADS)

    Matsumura, Koki; Tamekuni, Yuji; Kimura, Shuhei

Structural differences between English and Japanese phrase templates often make translation difficult. Moreover, the phrase templates and sentences that must be referred to are numerous and varied, and it is not easy to prepare a corpus that covers them all. Automatically generating translation patterns is therefore significant both for the translation success rate and for the capacity of the pattern dictionary. To realize such automatic generation, this paper proposes a new method based on genetic programming (GP). The technique attempts to generate translation patterns for sentences not registered in the phrase-template dictionary by applying genetic operations to the parse trees of basic patterns, where each tree represents an English-Japanese sentence pair in the initial population. Parse-tree databases with 50, 100, 150 and 200 pairs were prepared as initial populations, and the system was run on an input of 1,555 English sentences. As a result, the number of parse trees increased from 200 to 517, and the accuracy rate of the translation patterns improved from 42.57% to 70.10%. Furthermore, 86.71% of the generated translations were successful, with meanings that were acceptable and understandable. The proposed technique thus appears to be a promising way to raise the translation success rate while reducing the size of the parse-tree database.

  16. Automatic range selector

    DOEpatents

    McNeilly, Clyde E.

    1977-01-04

    A device is provided for automatically selecting from a plurality of ranges of a scale of values to which a meter may be made responsive, that range which encompasses the value of an unknown parameter. A meter relay indicates whether the unknown is of greater or lesser value than the range to which the meter is then responsive. The rotatable part of a stepping relay is rotated in one direction or the other in response to the indication from the meter relay. Various positions of the rotatable part are associated with particular scales. Switching means are sensitive to the position of the rotatable part to couple the associated range to the meter.
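The stepping behaviour of the patented selector can be mimicked in software: step through the available full-scale ranges until one encompasses the unknown value. The range values and the ascending-order step logic are assumptions for illustration.

```python
def select_range(value, full_scales):
    """Step upward through the available meter ranges, as the stepping relay
    does, until a range encompasses the unknown value."""
    for fs in sorted(full_scales):
        if abs(value) <= fs:
            return fs
    raise ValueError("value exceeds every available range")
```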

  17. The Center for Optimized Structural Studies (COSS) platform for automation in cloning, expression, and purification of single proteins and protein-protein complexes.

    PubMed

    Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos

    2014-06-01

Expression in Escherichia coli represents the simplest and most cost-effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry, where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel and subsequently screen at small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top-ranking targets for large-scale purification. In establishing our pipeline, emphasis was put on streamlining the processes so that they can be easily, though not necessarily, automated. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows for easy adoption, either partially or entirely, by any average hypothesis-driven laboratory in a manual or robot-assisted manner.
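The selection step (score small-scale expression, advance the top-ranking constructs to large scale) is straightforward to mirror in code. The scoring dictionary and the cut of six targets below are assumptions, not the pipeline's actual data model.

```python
def top_targets(screen_scores, n=6):
    """Rank constructs by their soluble-expression score (higher is better)
    and return the candidates advanced to large-scale purification."""
    ranked = sorted(screen_scores.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, _ in ranked[:n]]
```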

  18. Evaluating Accuracy of the Sunnova Pro Platform Shade Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sunnova's new solar energy design platform, Sunnova Pro, automatically generates a 3D model of a building and surrounding shading objects. The product is designed to automate the process of engineering a system, sizing batteries and preparing sales proposals.

  19. Finding Malicious Cyber Discussions in Social Media

    DTIC Science & Technology

    2015-12-11

    automatically filter cyber discussions from Stack Exchange, Reddit, and Twitter posts written in English. Criminal hackers often use social media...monitoring hackers on Facebook and in private chat rooms. As a result, system administrators were prepared to counter distributed denial-of-service

  20. Automated clinical system for chromosome analysis

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Friedan, H. J.; Johnson, E. T.; Rennie, P. A.; Wall, R. J. (Inventor)

    1978-01-01

    An automatic chromosome analysis system is provided wherein a suitably prepared slide with chromosome spreads thereon is placed on the stage of an automated microscope. The automated microscope stage is computer operated to move the slide to enable detection of chromosome spreads on the slide. The X and Y location of each chromosome spread that is detected is stored. The computer measures the chromosomes in a spread, classifies them by group or by type and also prepares a digital karyotype image. The computer system can also prepare a patient report summarizing the result of the analysis and listing suspected abnormalities.

  1. A scale space based algorithm for automated segmentation of single shot tagged MRI of shearing deformation.

    PubMed

    Sprengers, Andre M J; Caan, Matthan W A; Moerman, Kevin M; Nederveen, Aart J; Lamerichs, Rolf M; Stoker, Jaap

    2013-04-01

    This study proposes a scale space based algorithm for automated segmentation of single-shot tagged images of modest SNR. Furthermore the algorithm was designed for analysis of discontinuous or shearing types of motion, i.e. segmentation of broken tag patterns. The proposed algorithm utilises non-linear scale space for automatic segmentation of single-shot tagged images. The algorithm's ability to automatically segment tagged shearing motion was evaluated in a numerical simulation and in vivo. A typical shearing deformation was simulated in a Shepp-Logan phantom allowing for quantitative evaluation of the algorithm's success rate as a function of both SNR and the amount of deformation. For a qualitative in vivo evaluation tagged images showing deformations in the calf muscles and eye movement in a healthy volunteer were acquired. Both the numerical simulation and the in vivo tagged data demonstrated the algorithm's ability for automated segmentation of single-shot tagged MR provided that SNR of the images is above 10 and the amount of deformation does not exceed the tag spacing. The latter constraint can be met by adjusting the tag delay or the tag spacing. The scale space based algorithm for automatic segmentation of single-shot tagged MR enables the application of tagged MR to complex (shearing) deformation and the processing of datasets with relatively low SNR.
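The two applicability constraints stated above (SNR above 10, deformation not exceeding one tag spacing) amount to a simple pre-acquisition feasibility check. This helper is illustrative, not part of the published algorithm.

```python
def segmentation_feasible(snr, max_displacement_mm, tag_spacing_mm):
    """Check the stated conditions for reliable automated segmentation of
    single-shot tagged MR: SNR > 10 and deformation within one tag spacing."""
    return snr > 10 and max_displacement_mm <= tag_spacing_mm
```

When the displacement test fails, the abstract suggests the fix: adjust the tag delay or enlarge the tag spacing.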

  2. Quality Control of True Height Profiles Obtained Automatically from Digital Ionograms.

    DTIC Science & Technology

    1982-05-01

Keywords: Ionosphere, Digisonde, Electron Density Profile, Ionogram, Autoscaling, ARTIST. Abstract (excerpt): …analysis technique currently used with the ionogram traces scaled automatically by the ARTIST software [Reinisch and Huang, 1983; Reinisch et al., 1984], and the generalized polynomial analysis technique POLAN [Titheridge, 1985], using the same ARTIST-identified ionogram traces. 2. To determine how…

  3. The PR2D (Place, Route in 2-Dimensions) automatic layout computer program handbook

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1978-01-01

    Place, Route in 2-Dimensions is a standard cell automatic layout computer program for generating large scale integrated/metal oxide semiconductor arrays. The program was utilized successfully for a number of years in both government and private sectors but until now was undocumented. The compilation, loading, and execution of the program on a Sigma V CP-V operating system is described.

  4. The Modified Reasons for Smoking Scale: factorial structure, gender effects and relationship with nicotine dependence and smoking cessation in French smokers.

    PubMed

    Berlin, Ivan; Singleton, Edward G; Pedarriosse, Anne-Marie; Lancrenon, Sylvie; Rames, Alexis; Aubin, Henri-Jean; Niaura, Raymond

    2003-11-01

To assess the validity of the French version of the Modified Reasons for Smoking Scale (MRSS), and to identify which smoking patterns differentiate male and female smokers, which are related to tobacco dependence (as assessed by the Fagerström Test for Nicotine Dependence, FTND), to mood (Beck Depression Inventory II), to affect (Positive and Negative Affect Schedule) and which are predictors of successful quitting. Three hundred and thirty smokers [(mean ± SD) aged 40 ± 9 years, 145 (44%) women, mean FTND score: 6.2 ± 2], candidates for a smoking cessation programme and smoking at least 15 cigarettes/day. Factor analysis of the 21-item scale gave the optimal fit for a seven-factor model, which accounted for 62.3% of the total variance. The following factors were identified: 'addictive smoking', 'pleasure from smoking', 'tension reduction/relaxation', 'social smoking', 'stimulation', 'habit/automatism' and 'handling'. The 'addictive smoking' score increased in a dose-dependent manner with number of cigarettes smoked per day; the 'habit/automatism' score was significantly higher with more than 20 cigarettes per day than with ≤ 20 cigarettes per day. The reasons for smoking were different for males and females: females scored higher on 'tension reduction/relaxation', 'stimulation' and 'social smoking'. A high level of dependence (FTND ≥ 6) was associated with significantly higher scores only on 'addictive smoking', the association being stronger in females. Time to first cigarette after awakening was associated with higher 'addictive smoking' and 'habit/automatism' (P < 0.001). In a multivariate logistic regression, failed quitting was predicted by higher habit/automatism score (odds ratio = 1.44, 95% CI = 1.06-1.95, P = 0.02) and greater number of cigarettes smoked per day (odds ratio = 1.03, 95% CI = 1.01-1.06, P = 0.03).
The questionnaire yielded a coherent factor structure; women smoked more for tension reduction/relaxation, stimulation and for social reasons than men; addictive smoking and automatic smoking behaviour were similar in both sexes and were associated strongly with a high level of nicotine dependence; the 'habit/automatism' score predicted failure to quit over and above cigarettes per day.
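The odds ratios with 95% confidence intervals reported above follow from logistic-regression coefficients in the standard way. This is a generic sketch, not the authors' statistics code; the coefficient and standard error passed in would come from the fitted model.

```python
from math import exp

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% confidence interval from a logistic-regression
    coefficient `beta` with standard error `se`."""
    return exp(beta), (exp(beta - z * se), exp(beta + z * se))
```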

  5. Overt orienting of spatial attention and corticospinal excitability during action observation are unrelated

    PubMed Central

    Betti, Sonia; Castiello, Umberto; Guerra, Silvia

    2017-01-01

Observing moving body parts can automatically activate topographically corresponding motor representations in the primary motor cortex (M1), the so-called direct matching. Novel neurophysiological findings from social contexts are nonetheless showing that this process is not as automatic as previously thought. The motor system can flexibly shift from imitative to incongruent motor preparation when requested by a social gesture. In the present study we aim to add to this literature by assessing whether and how diverting overt spatial attention might affect motor preparation in contexts requiring interactive responses from the onlooker. Experiment 1 shows that overt attention, although anchored to an observed biological movement, can be captured by a target object as soon as a social request for it becomes evident. Experiment 2 reveals that the appearance of a short-lasting red dot in the contralateral space can divert attention from the target, but not from the biological movement. Nevertheless, transcranial magnetic stimulation (TMS) over M1 combined with electromyography (EMG) recordings (Experiment 3) indicates that attentional interference reduces corticospinal excitability related to the observed movement, but not motor preparation for a complementary action on the target. This work provides evidence that social motor preparation is impermeable to attentional interference and that a double dissociation is present between overt orienting of spatial attention and neurophysiological markers of action observation. PMID:28319191

  6. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    PubMed

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. This test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  7. Speedup for quantum optimal control from automatic differentiation based on graphics processing units

    NASA Astrophysics Data System (ADS)

    Leung, Nelson; Abdelhafez, Mohamed; Koch, Jens; Schuster, David

    2017-04-01

    We implement a quantum optimal control algorithm based on automatic differentiation and harness the acceleration afforded by graphics processing units (GPUs). Automatic differentiation allows us to specify advanced optimization criteria and incorporate them in the optimization process with ease. We show that the use of GPUs can speedup calculations by more than an order of magnitude. Our strategy facilitates efficient numerical simulations on affordable desktop computers and exploration of a host of optimization constraints and system parameters relevant to real-life experiments. We demonstrate optimization of quantum evolution based on fine-grained evaluation of performance at each intermediate time step, thus enabling more intricate control on the evolution path, suppression of departures from the truncated model subspace, as well as minimization of the physical time needed to perform high-fidelity state preparation and unitary gates.
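Automatic differentiation itself, the core technique the optimizer relies on, is easy to illustrate in miniature. Below is a minimal forward-mode (dual-number) sketch; it is unrelated to the GPU-accelerated reverse-mode machinery used in the paper and supports only addition and multiplication.

```python
class Dual:
    """Minimal forward-mode automatic-differentiation value (illustrative)."""
    def __init__(self, val, d=0.0):
        self.val, self.d = val, d

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.d + other.d)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: d(uv) = u'v + uv'
        return Dual(self.val * other.val,
                    self.d * other.val + self.val * other.d)

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate df/dx at x by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).d
```

A gradient-based optimal-control loop would call something like `derivative` on its fidelity functional with respect to each control parameter at every iteration.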

  8. Automatic pelvis segmentation from x-ray images of a mouse model

    NASA Astrophysics Data System (ADS)

    Al Okashi, Omar M.; Du, Hongbo; Al-Assam, Hisham

    2017-05-01

    The automatic detection and quantification of skeletal structures has a variety of different applications for biological research. Accurate segmentation of the pelvis from X-ray images of mice in a high-throughput project such as the Mouse Genomes Project not only saves time and cost but also helps achieving an unbiased quantitative analysis within the phenotyping pipeline. This paper proposes an automatic solution for pelvis segmentation based on structural and orientation properties of the pelvis in X-ray images. The solution consists of three stages including pre-processing image to extract pelvis area, initial pelvis mask preparation and final pelvis segmentation. Experimental results on a set of 100 X-ray images showed consistent performance of the algorithm. The automated solution overcomes the weaknesses of a manual annotation procedure where intra- and inter-observer variations cannot be avoided.

  9. A Robust Automatic Ionospheric O/X Mode Separation Technique for Vertical Incidence Sounders

    NASA Astrophysics Data System (ADS)

    Harris, T. J.; Pederick, L. H.

    2017-12-01

    The sounding of the ionosphere by a vertical incidence sounder (VIS) is the oldest and most common technique for determining the state of the ionosphere. The automatic extraction of relevant ionospheric parameters from the ionogram image, referred to as scaling, is important for the effective utilization of data from large ionospheric sounder networks. Due to the Earth's magnetic field, the ionosphere is birefringent at radio frequencies, so a VIS will typically see two distinct returns for each frequency. For the automatic scaling of ionograms, it is highly desirable to be able to separate the two modes. Defence Science and Technology Group has developed a new VIS solution which is based on direct digital receiver technology and includes an algorithm to separate the O and X modes. This algorithm can provide high-quality separation even in difficult ionospheric conditions. In this paper we describe the algorithm and demonstrate its consistency and reliability in successfully separating 99.4% of the ionograms during a 27 day experimental campaign under sometimes demanding ionospheric conditions.

  10. Automatic script identification from images using cluster-based templates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hochberg, J.; Kerns, L.; Kelly, P.

We have developed a technique for automatically identifying the script used to generate a document that is stored electronically in bit image form. Our approach differs from previous work in that the distinctions among scripts are discovered by an automatic learning procedure, without any hands-on analysis. We first develop a set of representative symbols (templates) for each script in our database (Cyrillic, Roman, etc.). We do this by identifying all textual symbols in a set of training documents, scaling each symbol to a fixed size, clustering similar symbols, pruning minor clusters, and finding each cluster's centroid. To identify a new document's script, we identify and scale a subset of symbols from the document and compare them to the templates for each script. We choose the script whose templates provide the best match. Our current system distinguishes among the Armenian, Burmese, Chinese, Cyrillic, Ethiopic, Greek, Hebrew, Japanese, Korean, Roman, and Thai scripts with over 90% accuracy.
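The matching step (compare each scaled symbol to every script's cluster centroids, then vote) can be sketched with nearest-centroid distances. The feature vectors and the simple majority vote below are assumptions; the paper's symbol representation is not specified in the abstract.

```python
import numpy as np

def best_script(symbols, templates):
    """Vote each symbol to the script of its nearest template centroid and
    return the script with the most votes."""
    votes = {}
    for s in symbols:
        best, best_d = None, np.inf
        for script, centroids in templates.items():
            d = min(np.linalg.norm(s - c) for c in centroids)
            if d < best_d:
                best, best_d = script, d
        votes[best] = votes.get(best, 0) + 1
    return max(votes, key=votes.get)
```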

  11. Word associations contribute to machine learning in automatic scoring of degree of emotional tones in dream reports.

    PubMed

    Amini, Reza; Sabourin, Catherine; De Koninck, Joseph

    2011-12-01

Scientific study of dreams requires the most objective methods to reliably analyze dream content. In this context, artificial intelligence should prove useful for an automatic and non-subjective scoring technique. Past research has utilized word search and emotional affiliation methods to model and automatically match human judges' scoring of dream reports' negative emotional tone. The current study added word associations to improve the model's accuracy. Word associations were established using words' frequency of co-occurrence with their defining words as found in a dictionary and an encyclopedia. It was hypothesized that this addition would facilitate the machine learning model and improve its predictive power beyond that of previous models. With a sample of 458 dreams, this model demonstrated an improvement in accuracy from 59% to 63% (kappa = .485) on the negative emotional tone scale, and for the first time reached an accuracy of 77% (kappa = .520) on the positive scale. Copyright © 2011 Elsevier Inc. All rights reserved.
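The modelling idea, direct emotional-word hits supplemented by association weights for words that co-occur with emotion-defining words, can be sketched as a toy scorer. The lexicon, weights, and additive scoring rule here are invented for illustration; they are not the features of the published model.

```python
def emotional_tone_score(report_words, emotion_lexicon, associations):
    """Score a dream report: count direct emotion-lexicon hits and add the
    association strength of every other word (0 if unassociated)."""
    score = 0.0
    for word in report_words:
        if word in emotion_lexicon:
            score += 1.0
        else:
            score += associations.get(word, 0.0)
    return score
```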

  12. [The mediating role of anger in the relationship between automatic thoughts and physical aggression in adolescents].

    PubMed

    Yavuzer, Yasemin; Karataş, Zeynep

    2013-01-01

This study aimed to examine the mediating role of anger in the relationship between automatic thoughts and physical aggression in adolescents. The study included 224 adolescents in the 9th grade of 3 different high schools in central Burdur during the 2011-2012 academic year. Participants completed the Aggression Questionnaire and Automatic Thoughts Scale in their classrooms during counseling sessions. Data were analyzed using simple and multiple linear regression analysis. There were positive correlations between the adolescents' automatic thoughts and both physical aggression and anger. According to regression analysis, automatic thoughts effectively predicted the level of physical aggression (b = 0.233, P < 0.001) and anger (b = 0.325, P < 0.001). Analysis of the mediating role of anger showed that anger fully mediated the relationship between automatic thoughts and physical aggression (Sobel z = 5.646, P < 0.001). Anger fully mediated the relationship between automatic thoughts and physical aggression. Providing adolescents with anger management skills training is very important for the prevention of physical aggression. Such training programs should include components related to the development of an awareness of dysfunctional and anger-triggering automatic thoughts, and how to change them. As the study group included adolescents from Burdur, the findings can only be generalized to groups with similar characteristics.
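The Sobel z statistic cited above tests whether the indirect (mediated) effect a·b differs from zero, where a is the predictor-to-mediator coefficient and b the mediator-to-outcome coefficient. A minimal sketch of the standard formula; the example inputs are invented, not the study's data.

```python
from math import sqrt

def sobel_z(a, se_a, b, se_b):
    """Sobel z for the indirect effect a*b, given each path coefficient
    and its standard error: z = ab / sqrt(b^2*se_a^2 + a^2*se_b^2)."""
    return (a * b) / sqrt(b**2 * se_a**2 + a**2 * se_b**2)
```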

  13. Automatic location of L/H transition times for physical studies with a large statistical basis

    NASA Astrophysics Data System (ADS)

González, S.; Vega, J.; Murari, A.; Pereira, A.; Dormido-Canto, S.; Ramírez, J. M.; JET-EFDA contributors

    2012-06-01

    Completely automatic techniques to estimate and validate L/H transition times can be essential in L/H transition analyses. The generation of databases with hundreds of transition times and without human intervention is an important step to accomplish (a) L/H transition physics analysis, (b) validation of L/H theoretical models and (c) creation of L/H scaling laws. An entirely unattended methodology is presented in this paper to build large databases of transition times in JET using time series. The proposed technique has been applied to a dataset of 551 JET discharges between campaigns C21 and C26. A prediction with discharges that show a clear signature in time series is made through the locating properties of the wavelet transform. It is an accurate prediction and the uncertainty interval is ±3.2 ms. The discharges with a non-clear pattern in the time series use an L/H mode classifier based on discharges with a clear signature. In this case, the estimation error shows a distribution with mean and standard deviation of 27.9 ms and 37.62 ms, respectively. Two different regression methods have been applied to the measurements acquired at the transition times identified by the automatic system. The obtained scaling laws for the threshold power are not significantly different from those obtained using the data at the transition times determined manually by the experts. The automatic methods allow performing physical studies with a large number of discharges, showing, for example, that there are statistically different types of transitions characterized by different scaling laws.
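The localisation step for discharges with a clear signature can be illustrated with a Haar-like step detector on the time series: the maximum response of a sliding difference marks the sharpest transition. The window width and the absence of the wavelet multi-scale machinery and the L/H classifier stage are simplifications, not the authors' code.

```python
import numpy as np

def locate_transition(signal, width=5):
    """Index of the sharpest step in `signal`, via a Haar-like sliding
    difference (mean of the trailing window minus mean of the leading one)."""
    s = np.asarray(signal, dtype=float)
    kernel = np.concatenate([np.ones(width), -np.ones(width)]) / width
    response = np.convolve(s, kernel, mode="valid")
    return int(np.argmax(np.abs(response))) + width  # centre of the window
```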

  14. National Earthquake Information Center Seismic Event Detections on Multiple Scales

    NASA Astrophysics Data System (ADS)

    Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real-time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local-scale), a large aftershock sequence (regional-scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P- and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early-warning. Seismological Research Letters, 83, 531-540, doi: 10.1785/gssrl.83.3.531.
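
    The kurtosis-picking idea cited above (Baillard et al., 2014) can be sketched in a few lines: a sliding-window kurtosis rises sharply when impulsive seismic energy enters the window, and the steepest rise marks a candidate onset. The window length and the synthetic trace below are illustrative assumptions, not NEIC's production configuration.

    ```python
    import numpy as np

    def sliding_kurtosis(x, win):
        """Excess kurtosis over a trailing window (simplified illustration)."""
        k = np.zeros(x.size)
        for i in range(win, x.size):
            w = x[i - win:i]
            s = w.std()
            if s > 0:
                k[i] = np.mean(((w - w.mean()) / s) ** 4) - 3.0
        return k

    def pick_onset(x, win=50):
        """Sample index where the kurtosis rises fastest: a crude phase pick."""
        k = sliding_kurtosis(x, win)
        return int(np.argmax(np.diff(k))) + 1
    ```

    On a synthetic trace with a sudden amplitude jump, the pick lands near the true arrival; real pickers add band-pass filtering and declustering on top of this core.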

  15. Investigating the Potential of Deep Neural Networks for Large-Scale Classification of Very High Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Postadjian, T.; Le Bris, A.; Sahbi, H.; Mallet, C.

    2017-05-01

    Semantic classification is a core remote sensing task as it provides the fundamental input for land-cover map generation. The very recent literature has shown the superior performance of deep convolutional neural networks (DCNN) for many classification tasks including the automatic analysis of Very High Spatial Resolution (VHR) geospatial images. Most of the recent initiatives have focused on very high discrimination capacity combined with accurate object boundary retrieval. Current architectures are therefore well tailored to urban areas over restricted extents, but they are not designed for large-scale use. This paper presents an end-to-end automatic processing chain, based on DCNNs, that aims at performing large-scale classification of VHR satellite images (here SPOT 6/7). Since this work assesses, through various experiments, the potential of DCNNs for country-scale VHR land-cover map generation, a simple yet effective architecture is proposed, efficiently discriminating the main classes of interest (namely buildings, roads, water, crops, vegetated areas) by exploiting existing VHR land-cover maps for training.

  16. Treatment of duck house wastewater by a pilot-scale sequencing batch reactor system for sustainable duck production.

    PubMed

    Su, Jung-Jeng; Huang, Jeng-Fang; Wang, Yi-Lei; Hong, Yu-Ya

    2018-06-15

    The objective of this study was to solve water pollution problems related to duck house wastewater by developing a novel duck house wastewater treatment technology. A pilot-scale sequencing batch reactor (SBR) system using different hydraulic retention times (HRTs) for treating duck house wastewater was developed and applied in this study. Experimental results showed that the removal efficiency of chemical oxygen demand from duck house wastewater was 98.4, 98.4, 87.8, and 72.5% for HRTs of 5, 3, 1, and 0.5 d, respectively. In addition, the removal efficiency of biochemical oxygen demand was 99.6, 99.3, 90.4, and 58.0%, respectively. The pilot-scale SBR system was effective and deemed capable of treating duck house wastewater. It is feasible to apply an automatic SBR system on site, based on a previous case study of farm-scale automatic SBR systems for piggery wastewater treatment.
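
    The removal efficiencies quoted above follow from the standard influent/effluent formula; a minimal sketch (the concentrations in the comment are hypothetical, chosen only to reproduce one of the reported percentages):

    ```python
    def removal_efficiency(c_in, c_out):
        """Percent removal of a pollutant: (influent - effluent) / influent * 100."""
        return (c_in - c_out) / c_in * 100.0

    # e.g. a hypothetical influent COD of 2000 mg/L reduced to 32 mg/L
    # corresponds to 98.4 % removal
    ```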

  17. Automatic Selection of Order Parameters in the Analysis of Large Scale Molecular Dynamics Simulations.

    PubMed

    Sultan, Mohammad M; Kiss, Gert; Shukla, Diwakar; Pande, Vijay S

    2014-12-09

    Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states.

  18. Large-scale Rectangular Ruler Automated Verification Device

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces an automated verification device for large-scale rectangular rulers, which consists of a photoelectric autocollimator, a self-designed mechanical drive carriage, and an automatic data-acquisition system. The mechanical design covers the optical axis, the drive unit, the fixture, and the wheels. The control system comprises hardware and software: the hardware is built around a single-chip microcontroller, and the software manages the photoelectric autocollimator and the automatic data-acquisition process. The device acquires perpendicularity measurement data automatically. Its reliability is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  19. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. Despite the available approaches allowing a quantitative analysis of the performance of the human movement control system, the clinical assessment and diagnostic approach to fall risk still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method for assessing balance control abilities through a system that automatically evaluates exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, and to obtain objective scores and more detailed information for predicting fall risk. We used Microsoft Kinect to record subjects' movements while they performed challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and, especially, high sensitivity (~83%).
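
    The reported accuracy and sensitivity are the usual classification metrics computed against the clinical scores; a minimal sketch with hypothetical binary labels:

    ```python
    def accuracy_sensitivity(y_true, y_pred, positive=1):
        """Overall accuracy and sensitivity (true-positive rate) for binary labels."""
        correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
        pos = sum(1 for t in y_true if t == positive)
        return correct / len(y_true), tp / pos
    ```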

  20. 34 CFR 395.1 - Terms.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... subtends an angle of no greater than 20°. (d) Cafeteria means a food dispensing facility capable of providing a broad variety of prepared foods and beverages (including hot meals) primarily through the use of..., confections, tobacco products, foods, beverages, and other articles or services dispensed automatically or...

  1. Operation and control software for APNEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClelland, J.H.; Storm, B.H. Jr.; Ahearn, J.

    1997-11-01

    The human interface software for the Lockheed Martin Specialty Components (LMSC) Active/Passive Neutron Examination & Analysis System (APNEA) provides a user-friendly operating environment for the movement and analysis of waste drums. It is written in Microsoft Visual C++ on a Windows NT platform. Object-oriented and multitasking techniques are used extensively to maximize the capability of the system. A waste drum is placed on a loading platform with a fork lift and then automatically moved into the APNEA chamber in preparation for analysis. A series of measurements is performed, controlled by menu commands to hardware components attached as peripheral devices, in order to create data files for analysis. The analysis routines use the files to identify the pertinent radioactive characteristics of the drum, including the type, location, and quantity of fissionable material. At the completion of the measurement process, the drum is automatically unloaded and the data are archived in preparation for storage as part of the drum's data signature. 3 figs.

  2. The neural basis of the bystander effect--the influence of group size on neural activity when witnessing an emergency.

    PubMed

    Hortensius, Ruud; de Gelder, Beatrice

    2014-06-01

    Naturalistic observation and experimental studies in humans and other primates show that observing an individual in need automatically triggers helping behavior. The aim of the present study is to clarify the neurofunctional basis of social influences on individual helping behavior. We investigate whether when participants witness an emergency, while performing an unrelated color-naming task in an fMRI scanner, the number of bystanders present at the emergency influences neural activity in regions related to action preparation. The results show a decrease in activity with the increase in group size in the left pre- and postcentral gyri and left medial frontal gyrus. In contrast, regions related to visual perception and attention show an increase in activity. These results demonstrate the neural mechanisms of social influence on automatic action preparation that is at the core of helping behavior when witnessing an emergency. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  4. Intended actions and unexpected outcomes: automatic and controlled processing in a rapid motor task

    PubMed Central

    Cheyne, Douglas O.; Ferrari, Paul; Cheyne, James A.

    2012-01-01

    Human action involves a combination of controlled and automatic behavior. These processes may interact in tasks requiring rapid response selection or inhibition, where temporal constraints preclude timely intervention by conscious, controlled processes over automatized prepotent responses. Such contexts tend to produce frequent errors, but also rapidly executed correct responses, both of which may sometimes be perceived as surprising, unintended, or “automatic”. In order to identify neural processes underlying these two aspects of cognitive control, we measured neuromagnetic brain activity in 12 right-handed subjects during manual responses to rapidly presented digits, with an infrequent target digit that required switching response hand (bimanual task) or response finger (unimanual task). Automaticity of responding was evidenced by response speeding (shorter response times) prior to both failed and fast correct switches. Consistent with this automaticity interpretation of fast correct switches, we observed bilateral motor preparation, as indexed by suppression of beta band (15–30 Hz) oscillations in motor cortex, prior to processing of the switch cue in the bimanual task. In contrast, right frontal theta activity (4–8 Hz) accompanying correct switch responses began after cue onset, suggesting that it reflected controlled inhibition of the default response. Further, this activity was reduced on fast correct switch trials, suggesting a more automatic mode of inhibitory control. We also observed post-movement ERN-like (error-related negativity) responses and theta band increases in medial and anterior frontal regions that were significantly larger on error trials, and may reflect a combination of error and delayed inhibitory signals. We conclude that both automatic and controlled processes are engaged in parallel during rapid motor tasks, and that the relative strength and timing of these processes may underlie both optimal task performance and subjective experiences of automaticity or control. PMID:22912612

  5. A semi-automatic annotation tool for cooking video

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious, and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error-free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.
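
    When correcting false positives and false negatives in the post-processing stage, detections are commonly matched to ground-truth annotations by bounding-box overlap. The intersection-over-union measure below is a standard choice, offered as an illustration rather than the authors' exact matching criterion.

    ```python
    def iou(a, b):
        """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        union = ((a[2] - a[0]) * (a[3] - a[1])
                 + (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / union if union else 0.0
    ```

    A detection with no ground-truth box above some IoU threshold would be flagged as a false positive for the user to review; an unmatched ground-truth box as a false negative.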

  6. Does gaze cueing produce automatic response activation: a lateralized readiness potential (LRP) study.

    PubMed

    Vainio, L; Heimola, M; Heino, H; Iljin, I; Laamanen, P; Seesjärvi, E; Paavilainen, P

    2014-05-01

    Previous research has shown that gaze cues facilitate responses to an upcoming target if the target location is compatible with the direction of the cue. Similar cueing effects have also been observed with central arrow cues. Both of these cueing effects have been attributed to a reflexive orienting of attention triggered by the cue. In addition, orienting of attention has been proposed to result in a partial response activation of the corresponding hand that, in turn, can be observed in the lateralized readiness potential (LRP), an electrophysiological indicator of automatic hand-motor response preparation. For instance, a central arrow cue has been observed to produce automatic hand-motor activation as indicated by the LRPs. The present study investigated whether gaze cues could also produce similar activation patterns in LRP. Although the standard gaze cueing effect was observed in the behavioural data, the LRP data did not reveal any consistent automatic hand-motor activation. The study suggests that motor processes associated with gaze cueing effect may operate exclusively at the level of oculomotor programming. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. All-automatic swimmer tracking system based on an optimized scaled composite JTC technique

    NASA Astrophysics Data System (ADS)

    Benarab, D.; Napoléon, T.; Alfalou, A.; Verney, A.; Hellard, P.

    2016-04-01

    In this paper, an all-automatic optimized JTC-based swimmer tracking system is proposed and evaluated on a real video database drawn from national and international swimming competitions (French National Championship, Limoges 2015; FINA World Championships, Barcelona 2013 and Kazan 2015). First, we calibrate the swimming pool using the Direct Linear Transformation (DLT) algorithm. DLT calculates the homography matrix given a sufficient set of correspondences between pixel and metric coordinates; i.e., it takes into account the dimensions of the swimming pool and the type of the swim. Once the swimming pool is calibrated, we extract the lane. Then we apply a motion detection approach to globally detect the swimmer in this lane. Next, we apply our optimized scaled composite JTC, which consists of creating an adapted input plane that contains the predicted region and the head reference image. The latter is generated using a composite filter of fin images chosen from the database. The dimension of this reference is scaled according to the ratio between the head's dimension and the width of the swimming lane. Finally, the proposed approach improves the performance of our previous tracking method by adding a detection module, achieving an all-automatic swimmer tracking system.
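
    The DLT calibration step described above admits a compact numerical sketch: given four or more pixel-to-metric point correspondences, the homography is the null vector of a stacked constraint matrix, recovered via SVD. This is the textbook unnormalized DLT, offered as an illustration rather than the authors' exact implementation.

    ```python
    import numpy as np

    def dlt_homography(src, dst):
        """Estimate the 3x3 homography H with dst ~ H @ src (homogeneous).

        src, dst: (N, 2) arrays of corresponding points, N >= 4,
        no three of which are collinear."""
        A = []
        for (x, y), (u, v) in zip(src, dst):
            A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        # The homography is the right singular vector of the smallest singular value.
        _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
        H = Vt[-1].reshape(3, 3)
        return H / H[2, 2]
    ```

    In practice one would normalize the point coordinates first (Hartley normalization) for numerical stability; the noise-free case below recovers the homography exactly.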

  8. Principal visual word discovery for automatic license plate detection.

    PubMed

    Zhou, Wengang; Li, Houqiang; Lu, Yijuan; Tian, Qi

    2012-09-01

    License plate detection is widely considered a solved problem, with many systems already in operation. However, the existing algorithms or systems work well only under some controlled conditions. There are still many challenges for license plate detection in an open environment, such as various observation angles, background clutter, scale changes, multiple plates, uneven illumination, and so on. In this paper, we propose a novel scheme to automatically locate license plates by principal visual word (PVW) discovery and local feature matching. Observing that characters in different license plates are duplicates of each other, we bring in the idea of using the bag-of-words (BoW) model popularly applied in partial-duplicate image search. Unlike the classic BoW model, for each plate character, we automatically discover the PVW characterized with geometric context. Given a new image, the license plates are extracted by matching local features with PVW. Besides license plate detection, our approach can also be extended to the detection of logos and trademarks. Due to the invariance of the scale-invariant feature transform (SIFT) features, our method can adaptively deal with various changes in the license plates, such as rotation, scaling, illumination, etc. Promising results of the proposed approach are demonstrated with an experimental study in license plate detection.

  9. Automatic humidification system to support the assessment of food drying processes

    NASA Astrophysics Data System (ADS)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system providing drying air that matches the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows creating and improving control strategies to supply drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build experimental curves.

  10. INITIAL APPLICATION OF THE ADAPTIVE GRID AIR POLLUTION MODEL

    EPA Science Inventory

    The paper discusses an adaptive-grid algorithm used in air pollution models. The algorithm reduces errors related to insufficient grid resolution by automatically refining the grid scales in regions of high interest. Meanwhile the grid scales are coarsened in other parts of the d...

  11. Digital Automation and Real-Time Monitoring of an Original Installation for "Wet Combustion" of Organic Wastes

    NASA Astrophysics Data System (ADS)

    Morozov, Yegor; Tikhomirov, Alexander A.; Saltykov, Mikhail; Trifonov, Sergey V.; Kudenko, Yurii A.

    2016-07-01

    An original method for "wet combustion" of organic wastes, which is being developed at the IBP SB RAS, is a very promising approach for regeneration of nutrient solutions for plants in future spacecraft closed Bioregenerative Life Support Systems (BLSS). The method is quick, ecofriendly, does not require special conditions such as high pressure and temperature, and the resulting nitrogen stays in forms easy for further preparation of the fertilizer. An experimental testbed of a new-generation closed ecosystem is being currently run at the IBP SB RAS to examine compatibility of the latest technologies for accelerating the cycling. Integration of "wet combustion" of organic wastes into the information system of closed ecosystem experimental testbed has been studied as part of preparatory work. Digital automation and real-time monitoring of original "wet combustion" installation operation parameters have been implemented. The new system enabled remotely controlled or automatic work of the installation. Data are stored in standard easily processed formats, allowing further mathematical processing where necessary. During ongoing experiments on improving "wet combustion" of organic wastes, automatic monitoring can notice slight changes in process parameters and record them in more detail. The ultimate goal of the study is to include the "wet combustion" installation into future full-scale experiment with humans, thus reducing the time spent by the crew on life support issues while living in the BLSS. The work was carried out with the financial support of the Russian Scientific Foundation (project 14-14-00599).

  12. Automated target recognition and tracking using an optical pattern recognition neural network

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin

    1991-01-01

    The ongoing development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that integrates an innovative optical parallel processor and a feature-extraction-based neural net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets in spite of their scales, rotations, perspectives, and various deformations. This fully developed OPRNN system can be effectively utilized for the automated spacecraft recognition and tracking that will lead to success in the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature-extraction-based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator where holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10^14 analog connections/s, enabling the OPRNN to outperform its state-of-the-art electronic counterpart by at least two orders of magnitude.
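
    The shift invariance that makes the correlator attractive can be illustrated with a digital analogue: FFT-based cross-correlation, whose peak location gives the target position regardless of where it sits in the scene. This is a numerical sketch of the principle, not the optical or neocognitron-trained implementation.

    ```python
    import numpy as np

    def correlate_fft(scene, template):
        """Locate a template in a scene via FFT cross-correlation.

        Returns the (row, col) of the correlation peak, i.e. the shift at
        which the template best aligns with the scene (circular correlation)."""
        S = np.fft.fft2(scene)
        T = np.fft.fft2(template, s=scene.shape)  # zero-pad template to scene size
        corr = np.fft.ifft2(S * np.conj(T)).real
        return np.unravel_index(np.argmax(corr), corr.shape)
    ```

    Because the correlation is computed for every shift at once, multiple well-separated targets would appear as multiple peaks, mirroring the multichannel correlator's parallel recognition.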

  13. Learning-based image preprocessing for robust computer-aided detection

    NASA Astrophysics Data System (ADS)

    Raghupathi, Laks; Devarakota, Pandu R.; Wolf, Matthias

    2013-03-01

    Recent studies have shown that low-dose computed tomography (LDCT) can be an effective screening tool to reduce lung cancer mortality. Computer-aided detection (CAD) would be a beneficial second reader for radiologists in such cases. Studies demonstrate that while iterative reconstruction (IR) improves LDCT diagnostic quality, it significantly degrades CAD performance (increased false positives) when CAD is applied directly. For improving CAD performance, solutions such as retraining with newer data or applying a standard preprocessing technique may not suffice due to the wide variety of CT scanners and non-uniform acquisition protocols. Here, we present a learning-based framework that can adaptively transform a wide variety of input data to boost an existing CAD's performance. This not only enhances robustness but also applicability in clinical workflows. Our solution consists of automatically applying a suitable preprocessing filter to the given image based on its characteristics. This requires the preparation of ground truth (GT) identifying the filter choice that results in improved CAD performance. Accordingly, we propose an efficient consolidation process with a novel metric. Using key anatomical landmarks, we then derive consistent feature descriptors for a classification scheme that uses a priority mechanism to automatically choose an optimal preprocessing filter. We demonstrate CAD prototype performance improvement using hospital-scale datasets acquired from North America, Europe and Asia. Though we demonstrated our results for a lung nodule CAD, this scheme is straightforward to extend to other post-processing tools dedicated to other organs and modalities.

  14. Detecting buried explosive hazards with handheld GPR and deep learning

    NASA Astrophysics Data System (ADS)

    Besaw, Lance E.

    2016-05-01

    Buried explosive hazards (BEHs), including traditional landmines and homemade improvised explosives, have proven difficult to detect and defeat during and after conflicts around the world. Despite their various sizes, shapes and construction materials, ground penetrating radar (GPR) is an excellent phenomenology for detecting BEHs due to its ability to sense localized differences in electromagnetic properties. Handheld GPR detectors are common equipment for detecting BEHs because of their flexibility (in part due to the human operator) and effectiveness in cluttered environments. With modern digital electronics and positioning systems, handheld GPR sensors can sense and map variation in electromagnetic properties while searching for BEHs. Additionally, large-scale computers have demonstrated an insatiable appetite for ingesting massive datasets and extracting meaningful relationships. Nowhere is this more evident than in the maturation of deep learning artificial neural networks (ANNs) for image and speech recognition, now commonplace in industry and academia. This confluence of sensing, computing and pattern recognition technologies offers great potential for developing automatic target recognition techniques to assist GPR operators searching for BEHs. In this work deep learning ANNs are used to detect BEHs and discriminate them from harmless clutter. We apply these techniques to a multi-antenna handheld GPR with a centimeter-accurate positioning system that was used to collect data over prepared lanes containing a wide range of BEHs. This work demonstrates that deep learning ANNs can automatically extract meaningful information from complex GPR signatures, complementing existing GPR anomaly detection and classification techniques.

  15. Automatic polyp detection in colonoscopy videos

    NASA Astrophysics Data System (ADS)

    Yuan, Zijie; IzadyYazdanabadi, Mohammadhassan; Mokkapati, Divya; Panvalkar, Rujuta; Shin, Jae Y.; Tajbakhsh, Nima; Gurudu, Suryakanth; Liang, Jianming

    2017-02-01

    Colon cancer is the second-leading cancer killer in the US [1]. Colonoscopy is the primary method for screening and prevention of colon cancer, but during colonoscopy a significant fraction (25% [2]) of polyps (precancerous abnormal growths inside the colon) are missed; therefore, the goal of our research is to reduce the polyp miss rate of colonoscopy. This paper presents a method to detect polyps automatically in a colonoscopy video. Our system has two stages: candidate generation and candidate classification. In candidate generation (stage 1), we chose 3,463 frames (including 1,718 with-polyp frames) from a real-time colonoscopy video database. We first applied preprocessing procedures, namely intensity adjustment, edge detection, and morphology operations. We extracted each connected component (edge contour) as one candidate patch from the preprocessed image. With the help of ground truth (GT) images, 2 constraints were applied to each candidate patch, dividing and saving them into polyp and non-polyp groups. In candidate classification (stage 2), we trained and tested convolutional neural networks (CNNs) with the AlexNet architecture [3] to classify each candidate as with-polyp or non-polyp. Each with-polyp patch was processed by rotation, translation, and scaling for invariance, yielding a more robust CNN system. We applied leave-2-patients-out cross-validation to this model (4 of 6 cases were chosen as the training set and the remaining 2 as the testing set). The system accuracy and sensitivity are 91.47% and 91.76%, respectively.
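
    The rotation/translation/scaling augmentation described for the with-polyp patches can be sketched with plain NumPy; the specific rotations (multiples of 90°), shift offsets, and scale factors here are illustrative assumptions, not the authors' exact pipeline.

    ```python
    import numpy as np

    def augment(patch):
        """Yield rotated, translated, and rescaled copies of an image patch.

        A simplified stand-in: 90-degree rotations, circular shifts, and
        nearest-neighbour rescaling back to the original size."""
        out = []
        for k in range(4):                           # rotations
            out.append(np.rot90(patch, k))
        for dy, dx in [(2, 0), (0, 2), (-2, -2)]:    # translations
            out.append(np.roll(patch, (dy, dx), axis=(0, 1)))
        for s in (0.5, 2.0):                         # scalings (nearest neighbour)
            h, w = patch.shape[:2]
            ys = (np.arange(h) / s).astype(int) % h
            xs = (np.arange(w) / s).astype(int) % w
            out.append(patch[np.ix_(ys, xs)])
        return out
    ```

    Each augmented copy keeps the original patch size, so all nine variants can be fed to the same fixed-input CNN.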

  16. Small RNA Library Preparation Method for Next-Generation Sequencing Using Chemical Modifications to Prevent Adapter Dimer Formation.

    PubMed

    Shore, Sabrina; Henderson, Jordana M; Lebedev, Alexandre; Salcedo, Michelle P; Zon, Gerald; McCaffrey, Anton P; Paul, Natasha; Hogrefe, Richard I

    2016-01-01

    For most sample types, the automation of RNA and DNA sample preparation workflows enables high throughput next-generation sequencing (NGS) library preparation. Greater adoption of small RNA (sRNA) sequencing has been hindered by high sample input requirements and inherent ligation side products formed during library preparation. These side products, known as adapter dimer, are very similar in size to the tagged library. Most sRNA library preparation strategies thus employ a gel purification step to isolate tagged library from adapter dimer contaminants. At very low sample inputs, adapter dimer side products dominate the reaction and limit the sensitivity of this technique. Here we address the need for improved specificity of sRNA library preparation workflows with a novel library preparation approach that uses modified adapters to suppress adapter dimer formation. This workflow allows for lower sample inputs and elimination of the gel purification step, which in turn allows for an automatable sRNA library preparation protocol.

  17. Automatic assessment of voice quality according to the GRBAS scale.

    PubMed

    Sáenz-Lechón, Nicolás; Godino-Llorente, Juan I; Osma-Ruiz, Víctor; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando

    2006-01-01

    Nowadays, the most widespread techniques for measuring voice quality are based on perceptual evaluation by well-trained professionals. The GRBAS scale, widely used in Japan and attracting increasing interest in both Europe and the United States, is a common method for such perceptual evaluation. However, this technique requires well-trained experts, rests on the evaluator's expertise, and depends heavily on the evaluator's psycho-physical state. Furthermore, great variability is observed between the assessments of different evaluators. An objective method providing such a measurement of voice quality would therefore be very valuable. In this paper, the automatic assessment of voice quality is addressed by means of short-term Mel-frequency cepstral coefficients (MFCC) and learning vector quantization (LVQ) in a pattern recognition stage. Results show that this approach provides acceptable performance for this purpose, with accuracy around 65% at best.
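
    The pattern-recognition stage pairs MFCC features with learning vector quantization. A minimal LVQ1 sketch is shown below on toy two-class features (not GRBAS data; the prototype count, learning rate, and epochs are illustrative): prototypes are pulled toward same-class samples and pushed away from other-class samples.

    ```python
    import numpy as np

    def train_lvq1(X, y, n_proto_per_class=1, lr=0.1, epochs=30, seed=0):
        """LVQ1: per-class prototypes updated toward/away from nearest samples."""
        rng = np.random.default_rng(seed)
        protos, plabels = [], []
        for c in np.unique(y):
            Xc = X[y == c]
            idx = rng.choice(len(Xc), n_proto_per_class, replace=False)
            protos.append(Xc[idx])
            plabels += [c] * n_proto_per_class
        P, pl = np.vstack(protos).astype(float), np.array(plabels)
        for _ in range(epochs):
            for i in rng.permutation(len(X)):
                j = np.argmin(np.linalg.norm(P - X[i], axis=1))
                step = lr if pl[j] == y[i] else -lr   # attract or repel
                P[j] += step * (X[i] - P[j])
        return P, pl

    def predict_lvq(P, pl, X):
        """Label each sample by its nearest prototype."""
        d = np.linalg.norm(X[:, None, :] - P[None], axis=2)
        return pl[np.argmin(d, axis=1)]
    ```

    In the paper's setting, X would hold the short-term MFCC vectors and y the perceptual GRBAS grades.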

  18. [Use of computer technologies for studying the morphological characteristics of the iris color in anthropology].

    PubMed

    Dorofeeva, A A; Khrustalev, A V; Krylov, Iu V; Bocharov, D A; Negasheva, M A

    2010-01-01

    Digital images of the iris were acquired to study peculiarities of iris color during the anthropological examination of 578 students aged 16-24 years. Simultaneously with the acquisition of the digital images, visual assessment of eye color was carried out using the traditional Bunak scale, based on 12 ocular prostheses. Original software for automatic determination of iris color according to the 12-class Bunak scale was designed, and a computer version of that scale was developed. The proposed software allows iris color to be determined with high validity on the basis of numerical evaluation; its application may reduce the bias due to subjective assessment and methodological divergences between researchers. The software designed for automatic determination of iris color may help develop both theoretical and applied anthropology, and it may be used in forensic and emergency medicine, sports medicine, medico-genetic counseling and professional selection.
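Automatic determination of iris color against a fixed scale amounts to assigning a measured mean color to the nearest reference class. A minimal sketch, with a few hypothetical reference RGB values standing in for the calibrated 12-class Bunak scale:

```python
import numpy as np

# Hypothetical reference colors (mean RGB) for a handful of classes;
# a real system would calibrate one entry per Bunak-scale prosthesis.
REFERENCES = {
    "dark":       (60, 40, 30),
    "mixed":      (110, 90, 60),
    "gray-blue":  (130, 140, 150),
    "light-blue": (120, 160, 200),
}

def classify_iris_color(mean_rgb):
    """Assign a mean iris RGB value to the nearest reference class
    (Euclidean distance in RGB space)."""
    rgb = np.asarray(mean_rgb, dtype=float)
    return min(REFERENCES,
               key=lambda k: np.linalg.norm(rgb - np.array(REFERENCES[k])))
```

A perceptually uniform space (e.g. CIELAB) would be a better distance metric than raw RGB; the structure of the classifier is the same.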

  19. Personal Privacy in an Information Society. Final Report.

    ERIC Educational Resources Information Center

    Privacy Protection Study Commission, Washington, DC.

    This report of the Privacy Protection Study Commission was prepared in response to a Congressional mandate to study data banks, automatic data processing programs, and information systems of governmental, regional and private organizations to determine standards and procedures in force for the protection of personal information. Recommendations…

  20. Investigating the Effect of Advanced Automatic Transmissions ...

    EPA Pesticide Factsheets

    EPA used the validated ALPHA model to predict the effectiveness improvement of real-world transmissions over a baseline four-speed transmission, and to predict further improvements possible from future eight-speed transmissions, in preparation for the midterm evaluation (MTE) of the 2017-2025 light-duty GHG emissions rule.

  1. Master control data handling program uses automatic data input

    NASA Technical Reports Server (NTRS)

    Alliston, W.; Daniel, J.

    1967-01-01

    General purpose digital computer program is applicable for use with analysis programs that require basic data and calculated parameters as input. It is designed to automate input data preparation for flight control computer programs, but it is general enough to permit application in other areas.

  2. USSR Report, Consumer Goods and Domestic Trade, No. 62.

    DTIC Science & Technology

    1983-04-28

    dough preparation, automatic dough make-up and rolling machines and 3 others) is the most important task when producing equipment for the baking...candy production. It is planned to provide the production of flour confectionary items with completely mechanized lines for elongated types of cookies and

  3. Predictors of Mental Health Symptoms, Automatic Thoughts, and Self-Esteem Among University Students.

    PubMed

    Hiçdurmaz, Duygu; İnci, Figen; Karahan, Sevilay

    2017-01-01

    University youth is a risk group regarding mental health, and many mental health problems are frequent in this group. Sociodemographic factors such as level of income and familial factors such as relationship with father are reported to be associated with mental health symptoms, automatic thoughts, and self-esteem. Also, there are interrelations between mental health problems, automatic thoughts, and self-esteem. The extent of predictive effect of each of these variables on automatic thoughts, self-esteem, and mental health symptoms is not known. We aimed to determine the predictive factors of mental health symptoms, automatic thoughts, and self-esteem in university students. Participants were 530 students enrolled at a university in Turkey, during 2014-2015 academic year. Data were collected using the student information form, the Brief Symptom Inventory, the Automatic Thoughts Questionnaire, and the Rosenberg Self-Esteem Scale. Mental health symptoms, self-esteem, perception of the relationship with the father, and level of income as a student significantly predicted automatic thoughts. Automatic thoughts, mental health symptoms, participation in family decisions, and age had significant predictive effects on self-esteem. Finally, automatic thoughts, self-esteem, age, and perception of the relationship with the father had significant predictive effects on mental health symptoms. The predictive factors revealed in our study provide important information to practitioners and researchers by showing the elements that need to be screened for mental health of university students and issues that need to be included in counseling activities.

  4. Semi-automatic spray pyrolysis deposition of thin, transparent, titania films as blocking layers for dye-sensitized and perovskite solar cells.

    PubMed

    Krýsová, Hana; Krýsa, Josef; Kavan, Ladislav

    2018-01-01

    For proper function of the negative electrode of dye-sensitized and perovskite solar cells, the deposition of a nonporous blocking film on the surface of F-doped SnO2 (FTO) glass substrates is required. Such a blocking film minimises undesirable parasitic processes, for example, the back reaction of photoinjected electrons with the oxidized form of the redox mediator or with the hole-transporting medium. In the present work, thin, transparent, blocking TiO2 films are prepared by semi-automatic spray pyrolysis of precursors consisting of titanium diisopropoxide bis(acetylacetonate) as the main component. The layer thickness of the sprayed films is varied by varying the number of spray cycles. The parameters investigated in this work were deposition temperature (150, 300 and 450 °C), number of spray cycles (20-200), precursor composition (with/without deliberately added acetylacetone), concentration (0.05 and 0.2 M) and subsequent post-calcination at 500 °C. The photo-electrochemical properties were evaluated in aqueous electrolyte solution under UV irradiation. The blocking properties were tested by cyclic voltammetry with a model redox probe undergoing a simple one-electron-transfer reaction. Semi-automatic spraying resulted in transparent, homogeneous TiO2 films, and the technique allows easy upscaling to large electrode areas. A deposition temperature of 450 °C was necessary for the fabrication of highly photoactive TiO2 films. The blocking properties of the as-deposited TiO2 films (at 450 °C) were impaired by post-calcination at 500 °C, but this problem could be addressed by increasing the number of spray cycles. Modifying the precursor by adding acetylacetone yielded TiO2 films with excellent blocking properties that were not influenced by post-calcination. These results should find use in the fabrication of large-scale dye-sensitized and perovskite solar cells.

  5. Management of natural resources through automatic cartographic inventory. [France

    NASA Technical Reports Server (NTRS)

    Rey, P.; Gourinard, Y.; Cambou, F. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. (1) Accurate recognition of previously known ground features from ERTS-1 imagery has been confirmed and a probable detection range for the major signatures can be given. (2) Unidentified elements, however, must be decoded by means of the equal densitometric value zone method. (3) Determination of these zonings involves an analogical treatment of images using the color equidensity methods (pseudo-color), color composites and especially temporal color composite (repetitive superposition). (4) After this analogical preparation, the digital equidensities can be processed by computer in the four MSS bands, according to a series of transfer operations from imagery and automatic cartography.

  6. Automatic Calibration of Stereo-Cameras Using Ordinary Chess-Board Patterns

    NASA Astrophysics Data System (ADS)

    Prokos, A.; Kalisperakis, I.; Petsa, E.; Karras, G.

    2012-07-01

    Automation of camera calibration is facilitated by recording coded 2D patterns. Our toolbox for automatic camera calibration using images of simple chess-board patterns is freely available on the Internet, but it is unsuitable for stereo-cameras, whose calibration implies recovering both camera geometry and their true-to-scale relative orientation. In contrast to all reported methods, which require additional specific coding to establish an object space coordinate system, a toolbox for automatic stereo-camera calibration relying on ordinary chess-board patterns is presented here. First, the camera calibration algorithm is applied to all image pairs of the pattern to extract nodes of known spacing, order them in rows and columns, and estimate two independent camera parameter sets. The actual node correspondences on stereo-pairs remain unknown. Image pairs of a textured 3D scene are exploited for finding the fundamental matrix of the stereo-camera by applying RANSAC to point matches established with the SIFT algorithm. A node is then selected near the centre of the left image; its match on the right image is assumed to be the node closest to the corresponding epipolar line. This yields matches for all nodes (since these have already been ordered), which should also satisfy the 2D epipolar geometry. Measures for avoiding mismatching are taken. With automatically estimated initial orientation values, a bundle adjustment is performed constraining all pairs to a common (scaled) relative orientation. Ambiguities regarding the actual exterior orientations of the stereo-camera with respect to the pattern are irrelevant. Results from this automatic method show typical precisions not above 1/4 pixel for 640×480 web cameras.
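The node-matching step above (for a node in the left image, pick the right-image node closest to its epipolar line) is the standard point-to-line distance; the sketch below assumes `F` maps left-image points to right-image epipolar lines:

```python
import numpy as np

def match_by_epipolar(F, pt_left, nodes_right):
    """Return the index of the right-image node closest to the epipolar
    line of pt_left. F is the fundamental matrix mapping left-image
    points x to right-image lines l' = F x."""
    x = np.array([pt_left[0], pt_left[1], 1.0])
    l = F @ x                                   # epipolar line (a, b, c)
    nodes = np.asarray(nodes_right, dtype=float)
    n = np.hstack([nodes, np.ones((len(nodes), 1))])
    d = np.abs(n @ l) / np.hypot(l[0], l[1])    # point-to-line distances
    return int(np.argmin(d))
```

For a rectified pair, where F = [[0,0,0],[0,0,-1],[0,1,0]], this reduces to picking the node whose row coordinate is nearest to that of the left-image node.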

  7. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    PubMed

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. Variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) proved effective and applicable on a large scale and to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
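The statistical check described, comparing the coefficient of variation of each image quality index against a tolerance, is straightforward to sketch; the 5% limit follows the abstract, and the function names are illustrative:

```python
import numpy as np

def cov_percent(values):
    """Coefficient of variation of a series, in percent
    (sample standard deviation over mean)."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

def reproducibility_ok(weekly_iqi, limit_percent=5.0):
    """True if the weekly IQI series varies less than the limit."""
    return cov_percent(weekly_iqi) < limit_percent
```

In practice one such check would run per index and per system, with the out-of-tolerance cases flagged by the statistical process control step.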

  8. Validity of Scores for a Developmental Writing Scale Based on Automated Scoring

    ERIC Educational Resources Information Center

    Attali, Yigal; Powers, Donald

    2009-01-01

    A developmental writing scale for timed essay-writing performance was created on the basis of automatically computed indicators of writing fluency, word choice, and conventions of standard written English. In a large-scale data collection effort that involved a national sample of more than 12,000 students from 4th, 6th, 8th, 10th, and 12th grade,…

  9. Preparing soft-bodied arthropods for microscope examination: Soft Scales (Insecta: Hemiptera: Coccidae)

    USDA-ARS?s Scientific Manuscript database

    Proper identification of soft scales (Hemiptera:Coccidae) requires preparation of the specimen on a microscope slide. This training video provides visual instruction on how to prepare soft scale specimens on microscope slides for examination and identification. Steps ranging from collection, speci...

  10. Computer-aided design of tooth preparations for automated development of fixed prosthodontics.

    PubMed

    Yuan, Fusong; Sun, Yuchun; Wang, Yong; Lv, Peijun

    2014-01-01

    This paper introduces a method to digitally design a virtual model of a tooth preparation of the mandibular first molar, using the commercial three-dimensional (3D) computer-aided design software packages Geomagic and Imageware, and to use the model as input to an automatic tooth preparation system. The procedure included acquisition of 3D data from dentate casts and digital modeling of the shape of the tooth preparation components, such as the margin, occlusal surface, and axial surface. The completed model data were stored as stereolithography (STL) files, which were used in a tooth preparation system to help plan the trajectory. The mathematical models required in the design process are also introduced. The method was used to make an individualized tooth preparation of the mandibular first molar; the entire process took 15 min. Using the method presented, a straightforward 3D shape of a full crown can be obtained to meet clinical needs prior to tooth preparation. © 2013 Published by Elsevier Ltd.

  11. Evolutionary game dynamics of controlled and automatic decision-making

    NASA Astrophysics Data System (ADS)

    Toupo, Danielle F. P.; Strogatz, Steven H.; Cohen, Jonathan D.; Rand, David G.

    2015-07-01

    We integrate dual-process theories of human cognition with evolutionary game theory to study the evolution of automatic and controlled decision-making processes. We introduce a model in which agents who make decisions using either automatic or controlled processing compete with each other for survival. Agents using automatic processing act quickly and so are more likely to acquire resources, but agents using controlled processing are better planners and so make more effective use of the resources they have. Using the replicator equation, we characterize the conditions under which automatic or controlled agents dominate, when coexistence is possible and when bistability occurs. We then extend the replicator equation to consider feedback between the state of the population and the environment. Under conditions in which having a greater proportion of controlled agents either enriches the environment or enhances the competitive advantage of automatic agents, we find that limit cycles can occur, leading to persistent oscillations in the population dynamics. Critically, however, these limit cycles only emerge when feedback occurs on a sufficiently long time scale. Our results shed light on the connection between evolution and human cognition and suggest necessary conditions for the rise and fall of rationality.
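The replicator dynamics described above can be simulated directly. The payoff functions below are illustrative stand-ins, not the paper's model; with this choice the interior fixed point is unstable, giving the bistability the abstract mentions:

```python
def f_auto(x):
    """Payoff of automatic agents (illustrative constant: they act
    quickly and reliably grab resources)."""
    return 1.0

def f_ctrl(x):
    """Payoff of controlled agents; illustrative linear form that falls
    as the fraction x of automatic agents grows."""
    return 1.5 - 0.8 * x

def simulate_replicator(x0, dt=0.01, steps=20000):
    """Euler integration of the replicator equation
        dx/dt = x (1 - x) (f_auto(x) - f_ctrl(x))
    for the fraction x of automatic agents."""
    x = x0
    for _ in range(steps):
        x += dt * x * (1 - x) * (f_auto(x) - f_ctrl(x))
        x = min(max(x, 0.0), 1.0)   # keep x a valid fraction
    return x
```

With these payoffs the interior fixed point x* = 0.625 is unstable: populations starting above it converge to all-automatic, those starting below to all-controlled. Other payoff choices yield the dominance or coexistence regimes the abstract characterizes.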

  12. Evolutionary game dynamics of controlled and automatic decision-making.

    PubMed

    Toupo, Danielle F P; Strogatz, Steven H; Cohen, Jonathan D; Rand, David G

    2015-07-01

    We integrate dual-process theories of human cognition with evolutionary game theory to study the evolution of automatic and controlled decision-making processes. We introduce a model in which agents who make decisions using either automatic or controlled processing compete with each other for survival. Agents using automatic processing act quickly and so are more likely to acquire resources, but agents using controlled processing are better planners and so make more effective use of the resources they have. Using the replicator equation, we characterize the conditions under which automatic or controlled agents dominate, when coexistence is possible and when bistability occurs. We then extend the replicator equation to consider feedback between the state of the population and the environment. Under conditions in which having a greater proportion of controlled agents either enriches the environment or enhances the competitive advantage of automatic agents, we find that limit cycles can occur, leading to persistent oscillations in the population dynamics. Critically, however, these limit cycles only emerge when feedback occurs on a sufficiently long time scale. Our results shed light on the connection between evolution and human cognition and suggest necessary conditions for the rise and fall of rationality.

  13. Coupling a regional warning system to a semantic engine on online news for enhancing landslide prediction

    NASA Astrophysics Data System (ADS)

    Battistini, Alessandro; Rosi, Ascanio; Segoni, Samuele; Catani, Filippo; Casagli, Nicola

    2017-04-01

    Landslide inventories are basic data for large-scale landslide modelling; e.g., they are needed to calibrate and validate rainfall thresholds, physically based models and early warning systems. Setting up landslide inventories with traditional methods (e.g. remote sensing, field surveys and manual retrieval of data from technical reports and local newspapers) is time consuming. The objective of this work is to automatically set up a landslide inventory using a state-of-the-art semantic engine based on data mining of online news (Battistini et al., 2013) and to evaluate whether the automatically generated inventory can be used to validate a regional-scale landslide warning system based on rainfall thresholds. The semantic engine scanned internet news in real time over a 50-month test period. At the end of the process, an inventory of approximately 900 landslides was set up for the Tuscany region (23,000 km2, Italy). The inventory was compared with the outputs of the regional landslide early warning system based on rainfall thresholds, and a good correspondence was found: e.g., 84% of the events reported in the news are correctly identified by the model. In addition, the cases of non-correspondence were forwarded to the rainfall threshold developers, who used these inputs to update some of the thresholds. On the basis of the results obtained, we conclude that automatic validation of landslide models using geolocalized landslide event feedback is possible. The source of data for validation can be obtained directly from the internet using an appropriate semantic engine. We also automated the validation procedure, which is based on a comparison between forecasts and reported events. We verified that our approach can be used for near-real-time validation of the warning system and for a semi-automatic update of the rainfall thresholds, which could lead to an improvement of the forecasting effectiveness of the warning system. In the near future, the proposed procedure could operate continuously and could allow for a periodic update of landslide hazard models and landslide early warning systems with minimal human intervention. References: Battistini, A., Segoni, S., Manzo, G., Catani, F., Casagli, N. (2013). Web data mining for automatic inventory of geohazards at national scale. Applied Geography, 43, 147-158.
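The automated validation step reduces to comparing days (or day-municipality pairs) on which warnings were issued against days with news-reported landslides. A minimal sketch of the hit-rate computation, with hypothetical data:

```python
def hit_rate(warning_days, landslide_days):
    """Fraction of reported landslide days covered by a warning
    (the abstract reports 84% for the Tuscany test period)."""
    warnings = set(warning_days)
    hits = sum(1 for day in landslide_days if day in warnings)
    return hits / len(landslide_days)
```

The complementary statistics (false alarms, misses) would come from the same comparison, and the missed events are the cases forwarded to the threshold developers.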

  14. Automatic crack detection and classification method for subway tunnel safety monitoring.

    PubMed

    Zhang, Wenyu; Zhang, Zhenjiang; Qi, Dapeng; Liu, Yun

    2014-10-16

    Cracks are an important indicator of the safety status of infrastructures. This paper presents an automatic crack detection and classification methodology for subway tunnel safety monitoring. With the application of high-speed complementary metal-oxide-semiconductor (CMOS) industrial cameras, the tunnel surface can be captured and stored in digital images. In the next step, the local dark regions with potential crack defects are segmented from the original gray-scale images by utilizing morphological image processing techniques and thresholding operations. In the feature extraction process, we present a distance-histogram-based shape descriptor that effectively describes the spatial shape difference between cracks and other irrelevant objects. Along with other features, the classification stage successfully removes over 90% of misidentified objects. Also, compared with the original gray-scale images, over 90% of the crack length is preserved in the final output binary images. The proposed approach was tested on the safety monitoring of Beijing Subway Line 1. The experimental results revealed the rules for parameter settings and also proved that the proposed approach is effective and efficient for automatic crack detection and classification.
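One natural reading of a distance-histogram shape descriptor is a normalized histogram of pixel distances to the region centroid: an elongated crack spreads its pixels evenly across distances, while a compact blob concentrates them near its outer radius. A sketch under that reading (the paper's exact definition may differ):

```python
import numpy as np

def distance_histogram(mask, n_bins=10):
    """Normalized histogram of pixel distances to the region centroid,
    scaled by the maximum distance so the descriptor is size-invariant."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    d = np.hypot(ys - cy, xs - cx)
    d = d / (d.max() + 1e-9)
    hist, _ = np.histogram(d, bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()
```

Comparing a thin line-like region with a filled square shows the intended separation: the line puts relatively more mass in the near-centroid bins.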

  15. Automatic vehicle counting using background subtraction method on gray scale images and morphology operation

    NASA Astrophysics Data System (ADS)

    Adi, K.; Widodo, A. P.; Widodo, C. E.; Pamungkas, A.; Putranto, A. B.

    2018-05-01

    Traffic monitoring requires counting the number of vehicles passing on a road; this is especially important for highway transportation management. It is therefore necessary to develop a system able to count the number of vehicles automatically, and video processing methods make such automatic counting possible. This research developed a vehicle counting system for a toll road. The system includes video acquisition, frame extraction, and image processing of each frame. Video acquisition was conducted in the morning, at noon, in the afternoon, and in the evening. The system employs background subtraction and morphology methods on gray-scale images for vehicle counting. The best counting results were obtained in the morning, with an accuracy of 86.36%, whereas the lowest accuracy, 21.43%, occurred in the evening. The difference between the morning and evening results is caused by the different illumination, which changes the pixel values in the images.
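A hedged sketch of the core steps, background subtraction on gray-scale frames followed by a simple morphological cleanup; actual vehicle counting would additionally need connected-component labeling (e.g. `scipy.ndimage.label`), and the threshold value is an assumption:

```python
import numpy as np

def foreground_mask(frame, background, thresh=30):
    """Background subtraction: pixels whose absolute gray-level
    difference from the background exceeds `thresh` are foreground."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return diff > thresh

def erode(mask):
    """One-pixel binary erosion with a 4-neighbourhood, a minimal
    morphology operation to strip isolated noise pixels."""
    m = mask.copy()
    m[1:, :] &= mask[:-1, :]
    m[:-1, :] &= mask[1:, :]
    m[:, 1:] &= mask[:, :-1]
    m[:, :-1] &= mask[:, 1:]
    return m
```

A fixed background image is the weak point under changing illumination, which is consistent with the large morning/evening accuracy gap the abstract reports; adaptive background models are the usual remedy.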

  16. Difficulty identifying feelings and automatic activation in the fusiform gyrus in response to facial emotion.

    PubMed

    Eichmann, Mischa; Kugel, Harald; Suslow, Thomas

    2008-12-01

    Difficulties in identifying and differentiating one's emotions are a central characteristic of alexithymia. In the present study, automatic activation of the fusiform gyrus to facial emotion was investigated as a function of alexithymia as assessed by the 20-item Toronto Alexithymia Scale. During 3 Tesla fMRI scanning, pictures of faces bearing sad, happy, and neutral expressions masked by neutral faces were presented to 22 healthy adults who also responded to the Toronto Alexithymia Scale. The fusiform gyrus was selected as the region of interest, and voxel values of this region were extracted, summarized as means, and tested among the different conditions (sad, happy, and neutral faces). Masked sad facial emotions were associated with greater bilateral activation of the fusiform gyrus than masked neutral faces. The subscale, Difficulty Identifying Feelings, was negatively correlated with the neural response of the fusiform gyrus to masked sad faces. The correlation results suggest that automatic hyporesponsiveness of the fusiform gyrus to negative emotion stimuli may reflect problems in recognizing one's emotions in everyday life.

  17. Automatic Crack Detection and Classification Method for Subway Tunnel Safety Monitoring

    PubMed Central

    Zhang, Wenyu; Zhang, Zhenjiang; Qi, Dapeng; Liu, Yun

    2014-01-01

    Cracks are an important indicator of the safety status of infrastructures. This paper presents an automatic crack detection and classification methodology for subway tunnel safety monitoring. With the application of high-speed complementary metal-oxide-semiconductor (CMOS) industrial cameras, the tunnel surface can be captured and stored in digital images. In the next step, the local dark regions with potential crack defects are segmented from the original gray-scale images by utilizing morphological image processing techniques and thresholding operations. In the feature extraction process, we present a distance-histogram-based shape descriptor that effectively describes the spatial shape difference between cracks and other irrelevant objects. Along with other features, the classification stage successfully removes over 90% of misidentified objects. Also, compared with the original gray-scale images, over 90% of the crack length is preserved in the final output binary images. The proposed approach was tested on the safety monitoring of Beijing Subway Line 1. The experimental results revealed the rules for parameter settings and also proved that the proposed approach is effective and efficient for automatic crack detection and classification. PMID:25325337

  18. Preparing soft-bodied arthropods for arthropods for microscope examination: Armored Scales (Insects: Hemiptera: Diaspididae)

    USDA-ARS?s Scientific Manuscript database

    Proper identification of armored scales (Hemiptera: Diaspididae) requires preparation of the specimen on a microscope slide. This training video provides visual instruction on how to prepare armored scales specimens on microscope slides for examination and identification. Steps ranging from collect...

  19. Ionospheric Research Using Digital Ionosondes.

    DTIC Science & Technology

    1983-07-01

    [Excerpt from the report's table of contents and figure list: height analysis (ARTIST); chemical release experiments at Natal; ionospheric heating experiments at Arecibo; Digisonde. Figures include Thule 82-022 ionograms (January 1982), an integrated height characteristic for Thule 82-022, an ARTIST ionogram print, automatic profiles, cases where manual and automatic scalings fall within indicated limits, and ARTIST initialization and output.]

  20. Comparison of automatical thoughts among generalized anxiety disorder, major depressive disorder and generalized social phobia patients.

    PubMed

    Gül, A I; Simsek, G; Karaaslan, Ö; Inanir, S

    2015-08-01

    Automatic thoughts are measurable cognitive markers of the psychopathology and coping styles of individuals. This study measured and compared the automatic thoughts of patients with generalized anxiety disorder (GAD), major depressive disorder (MDD), and generalized social phobia (GSP). Fifty-two patients with GAD, 53 with MDD, and 50 with GSP and 52 healthy controls completed the validated Automatic Thoughts Questionnaire (ATQ) and a structured psychiatric interview. Patients with GAD, MDD, and GSP also completed the validated Generalized Anxiety Disorder-7 questionnaire, the Beck Depression Inventory (BDI), and the Liebowitz Social Anxiety Scale (LSAS) to determine the severity of their illnesses. All scales were completed before treatment and after diagnosis. The ATQ scores of all pairs of groups were compared. The ATQ scores of the GAD, MDD, and GSP groups were significantly higher than were those of the control group. We also found significant correlations among scores on the GAD-7, BDI, and LSAS. The mean age of patients with GSP was lower than was that of the other groups (30.90 ± 8.35). The significantly higher ATQ scores of the MDD, GAD, and GSP groups, compared with the control group, underscore the common cognitive psychopathology characterizing these three disorders. This finding confirms that similar cognitive therapy approaches should be effective for these patients. This study is the first to compare GAD, MDD, and GSP from a cognitive perspective.

  1. a Model Study of Small-Scale World Map Generalization

    NASA Astrophysics Data System (ADS)

    Cheng, Y.; Yin, Y.; Li, C. M.; Wu, W.; Guo, P. P.; Ma, X. L.; Hu, F. M.

    2018-04-01

    With globalization and rapid development, every field is taking an increasing interest in physical geography and human economics, and there is a surging demand worldwide for small-scale world maps in large formats. Further study of automated mapping technology, especially the realization of small-scale production of a large-format global map, is a key problem the cartographic field needs to solve. In light of this, this paper adopts an improved map-generalization model (with the map and data separated), which separates geographic data from mapping data and mainly comprises a cross-platform symbol library and an automatic map-making knowledge engine. In the cross-platform symbol library, the symbols and the physical symbols in the geographic information are configured at all scale levels. The automatic map-making knowledge engine comprises 97 types, 1086 subtypes, 21,845 basic algorithms and over 2,500 relevant functional modules. To evaluate the accuracy and visual effect of our model for topographic and thematic maps, we take world map generalization at small scale as an example. After the generalization process, combining and simplifying the scattered islands makes the map more explicit at the 1:2.1 billion scale, and the map features become more complete and accurate. The model not only enhances map generalization at various scales significantly, but also achieves integration among map products at various scales, suggesting that it provides a useful reference for cartographic generalization.

  2. 5 CFR 532.233 - Preparation for full-scale wage surveys.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Preparation for full-scale wage surveys... REGULATIONS PREVAILING RATE SYSTEMS Prevailing Rate Determinations § 532.233 Preparation for full-scale wage... the local wage survey committee. (e) Selection and appointment of data collectors. (1) The local wage...

  3. An Automatic Method for Generating an Unbiased Intensity Normalizing Factor in Positron Emission Tomography Image Analysis After Stroke.

    PubMed

    Nie, Binbin; Liang, Shengxiang; Jiang, Xiaofeng; Duan, Shaofeng; Huang, Qi; Zhang, Tianhao; Li, Panlong; Liu, Hua; Shan, Baoci

    2018-06-07

    Positron emission tomography (PET) imaging of functional metabolism has been widely used to investigate functional recovery and to evaluate therapeutic efficacy after stroke. The voxel intensity of a PET image is the most important indicator of cellular activity, but it is affected by other factors such as the basal metabolic ratio of each subject. In order to locate dysfunctional regions accurately, intensity normalization by a scale factor is a prerequisite in the data analysis, and the global mean value is most widely used for this purpose. However, this is unsuitable for stroke studies. Alternatively, a scale factor calculated from a reference region comprising neither hyper- nor hypo-metabolic voxels can be used, but there is no such recognized reference region for stroke studies. Therefore, we propose a fully data-driven automatic method for unbiased scale factor generation. The factor is generated iteratively until the residual deviation between two adjacent scale factors falls below 5%. Both simulated and real stroke data were used for evaluation, and the results suggest that our proposed unbiased scale factor has better sensitivity and accuracy for stroke studies.
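The iterative generation described, recomputing the scale factor until successive values agree to within 5%, can be sketched as follows. The exclusion rule used here (a z-score cut around the current factor) is an illustrative assumption, not the authors' exact criterion:

```python
import numpy as np

def unbiased_scale_factor(voxels, z=2.0, tol=0.05, max_iter=50):
    """Iterative sketch: start from the global mean, exclude voxels far
    from the current reference value, and recompute the mean until the
    factor changes by less than `tol` (5%) between iterations."""
    v = np.asarray(voxels, dtype=float)
    scale = v.mean()
    for _ in range(max_iter):
        kept = v[np.abs(v - scale) < z * v.std()]
        new_scale = kept.mean()
        if abs(new_scale - scale) / scale < tol:
            return new_scale
        scale = new_scale
    return scale
```

On data containing a hypo-metabolic lesion, the global mean is dragged down by the lesion voxels, while the iterated factor converges toward the level of the unaffected tissue, which is the bias the method aims to remove.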

  4. Study on the Preparation Process and Influential Factors of Large Area Environment-friendly Molten Carbonate Fuel Cell Matrix

    NASA Astrophysics Data System (ADS)

    Zhang, Ruiyun; Xu, Shisen; Cheng, Jian; Wang, Hongjian; Ren, Yongqiang

    2017-07-01

    Low-cost and high-performance matrix materials for mass production of molten carbonate fuel cells (MCFC) were prepared by an automatic casting machine, using α-LiAlO2 powder synthesized by a gel-solid method and distilled water as the solvent. A single cell was assembled for power-generation testing, and the good performance of the matrix was verified. The paper analyzes the factors affecting aqueous tape-casting matrix preparation, such as solvent content, dispersant content, milling time, blade height, and casting machine running speed, providing a solid basis for the mass production of large-area environment-friendly matrices for molten carbonate fuel cells.

  5. A Novel Method of Preparation of Inorganic Glasses by Microwave Irradiation

    NASA Astrophysics Data System (ADS)

    Vaidhyanathan, B.; Ganguli, Munia; Rao, K. J.

    1994-12-01

    Microwave heating is shown to provide an extremely facile and automatically temperature-controlled route to the synthesis of glasses. Glass-forming compositions of several traditional and novel glasses were melted in a kitchen microwave oven, typically within 5 min and quenched into glasses. This is only a fraction of the time required in normal glass preparation methods. The rapidity of melting minimizes undesirable features such as loss of components of the glass, variation of oxidation states of metal ions, and oxygen loss leading to reduced products in the glass such as metal particles. This novel procedure of preparation is applicable when at least one of the components of the glass-forming mixture absorbs microwaves.

  6. Automated Drug Identification for Urban Hospitals

    NASA Technical Reports Server (NTRS)

    Shirley, Donna L.

    1971-01-01

    Many urban hospitals are becoming overloaded with drug abuse cases requiring chemical analysis for identification of drugs. In this paper, the requirements for chemical analysis of body fluids for drugs are determined and a system model for automated drug analysis is selected. The system, as modeled, would perform chemical preparation of samples, gas-liquid chromatographic separation of drugs in the chemically prepared samples, and infrared spectrophotometric analysis of the drugs, and would utilize automatic data processing and control for drug identification. Requirements of cost, maintainability, reliability, flexibility, and operability are considered.

  7. Kit for the rapid preparation of 99mTc red blood cells

    DOEpatents

    Richards, Powell; Smith, Terry D.

    1976-01-01

    A method and sample kit for the preparation of 99mTc-labeled red blood cells in a closed, sterile system. A partially evacuated tube, containing a freeze-dried stannous citrate formulation with heparin as an anticoagulant, allows whole blood to be automatically drawn from the patient. The radioisotope is added at the end of the labeling sequence to minimize operator exposure. Consistent 97% yields in 20 minutes are obtained with small blood samples. Freeze-dried kits have remained stable after five months.

  8. A device for automatically measuring and supervising the critical care patient's urine output.

    PubMed

    Otero, Abraham; Palacios, Francisco; Akinfiev, Teodor; Fernández, Roemi

    2010-01-01

    Critical care units are equipped with commercial monitoring devices capable of sensing patients' physiological parameters and supervising the achievement of the established therapeutic goals. This avoids human errors in this task and considerably decreases the workload of the healthcare staff. However, at present there still is a very relevant physiological parameter that is measured and supervised manually by the critical care units' healthcare staff: urine output. This paper presents a patent-pending device capable of automatically recording and supervising the urine output of a critical care patient. A high precision scale is used to measure the weight of a commercial urine meter. On the scale's pan there is a support frame made up of Bosch profiles that isolates the scale from force transmission from the patient's bed, and guarantees that the urine flows properly through the urine meter input tube. The scale's readings are sent to a PC via Bluetooth where an application supervises the achievement of the therapeutic goals. The device is currently undergoing tests at a research unit associated with the University Hospital of Getafe in Spain.

  9. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    PubMed

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.

  10. A quality score for coronary artery tree extraction results

    NASA Astrophysics Data System (ADS)

    Cao, Qing; Broersen, Alexander; Kitslaar, Pieter H.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2018-02-01

    Coronary artery trees (CATs) are often extracted to aid the fully automatic analysis of coronary artery disease on coronary computed tomography angiography (CCTA) images. Automatically extracted CATs often miss some arteries or include wrong extractions which require manual corrections before performing successive steps. For analyzing a large number of datasets, a manual quality check of the extraction results is time-consuming. This paper presents a method to automatically calculate quality scores for extracted CATs in terms of clinical significance of the extracted arteries and the completeness of the extracted CAT. Both right dominant (RD) and left dominant (LD) anatomical statistical models are generated and exploited in developing the quality score. To automatically determine which model should be used, a dominance type detection method is also designed. Experiments are performed on the automatically extracted and manually refined CATs from 42 datasets to evaluate the proposed quality score. In 39 (92.9%) cases, the proposed method is able to measure the quality of the manually refined CATs with higher scores than the automatically extracted CATs. In a 100-point scale system, the average scores for automatically and manually refined CATs are 82.0 (+/-15.8) and 88.9 (+/-5.4) respectively. The proposed quality score will assist the automatic processing of the CAT extractions for large cohorts which contain both RD and LD cases. To the best of our knowledge, this is the first time that a general quality score for an extracted CAT is presented.
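    A quality score of this kind reduces to a clinically weighted completeness measure over the expected artery segments. The sketch below is a simplified illustration with assumed segment names and weights, not the paper's statistical-model-based score:

```python
def cat_quality_score(extracted, weights):
    """Score an extracted coronary artery tree on a 100-point scale as the
    clinically weighted fraction of expected segments that were extracted.
    `weights` maps segment name -> clinical importance (assumed values)."""
    total = sum(weights.values())
    found = sum(w for seg, w in weights.items() if seg in extracted)
    return 100.0 * found / total

# Illustrative segment weights; main vessels count more than branches.
weights = {"LAD": 5, "LCX": 4, "RCA": 5, "D1": 2, "OM1": 2}
print(round(cat_quality_score({"LAD", "LCX", "RCA", "D1"}, weights), 1))  # 88.9
```

    A manually refined tree that recovers a missed branch raises the score, mirroring the paper's observation that refined trees score higher than raw extractions.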

  11. An automatic scaling method for obtaining the trace and parameters from oblique ionogram based on hybrid genetic algorithm

    NASA Astrophysics Data System (ADS)

    Song, Huan; Hu, Yaogai; Jiang, Chunhua; Zhou, Chen; Zhao, Zhengyu; Zou, Xianjian

    2016-12-01

    Scaling oblique ionograms plays an important role in obtaining the ionospheric structure at the midpoint of an oblique sounding path. This paper proposes an automatic scaling method to extract the trace and parameters of an oblique ionogram based on a hybrid genetic algorithm (HGA). The 10 extracted parameters come from the F2 layer and Es layer, such as maximum observation frequency, critical frequency, and virtual height. The method adopts a quasi-parabolic (QP) model to describe the F2 layer's electron density profile, which is used to synthesize the trace. It utilizes the secant theorem, Martyn's equivalent path theorem, image processing technology, and the echoes' characteristics to determine best-fit values for seven parameters and initial values for the remaining three QP-model parameters, whose search spaces form the input to the HGA. The HGA then searches for the best-fit values of the three parameters within their search spaces based on the fitness between the synthesized trace and the real trace. To verify the performance of the method, 240 oblique ionograms were scaled and their results compared with manual scaling results and the inversion results of the corresponding vertical ionograms. The comparison shows that the scaling results are accurate, or at least adequate, 60-90% of the time.
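    The HGA's core loop, searching parameter values that minimize the mismatch between a synthesized and a real trace, can be sketched with a generic genetic algorithm. The fitness function and parameter bounds below are placeholders, not the paper's QP-model trace fitness:

```python
import random

def genetic_search(fitness, bounds, pop_size=40, generations=60,
                   mutation=0.1, seed=1):
    """Minimize `fitness` over the box given by `bounds` [(lo, hi), ...]
    using truncation selection, uniform crossover, and Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness)[: pop_size // 2]  # keep better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            for i, (lo, hi) in enumerate(bounds):   # mutate within bounds
                if rng.random() < mutation:
                    child[i] = min(hi, max(lo, child[i] + rng.gauss(0, (hi - lo) * 0.1)))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Placeholder fitness: squared distance to a known optimum (3.0, 7.0, 1.5).
target = (3.0, 7.0, 1.5)
best = genetic_search(lambda p: sum((x - t) ** 2 for x, t in zip(p, target)),
                      bounds=[(0, 10), (0, 10), (0, 5)])
print([round(x, 1) for x in best])
```

    In the paper's setting, the three genes would be the QP-model initial values and the fitness would measure trace mismatch rather than distance to a known point.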

  12. Study of the cerrado vegetation in the Federal District area from orbital data. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Aoki, H.; Dossantos, J. R.

    1980-01-01

    The physiognomic units of cerrado in the area of the Distrito Federal (DF) were studied through visual and automatic analysis of products provided by the Multispectral Scanning System (MSS) of LANDSAT. The visual analysis of the multispectral images in black and white, at the 1:250,000 scale, was based on texture and tonal patterns. The automatic analysis of the computer-compatible tapes (CCT) was carried out with the IMAGE-100 system. The following conclusions were obtained: (1) the delimitation of cerrado vegetation forms can be made by both visual and automatic analysis; (2) in the visual analysis, the principal parameter used to discriminate the cerrado forms was the tonal pattern, independent of the season, and channel 5 gave the best information; (3) in the automatic analysis, the data of the four MSS channels can be used to discriminate the cerrado forms; and (4) in the automatic analysis, combinations of the four channels gave more information for separating cerrado units when soil types were considered.

  13. Recent advances in automatic alignment system for the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Wilhelmsen, Karl; Awwal, Abdul A. S.; Kalantar, Dan; Leach, Richard; Lowe-Webb, Roger; McGuigan, David; Miller Kamm, Vicki

    2011-03-01

    The automatic alignment system for the National Ignition Facility (NIF) is a large-scale parallel system that directs all 192 laser beams along the 300-m optical path to a 50-micron focus at target chamber in less than 50 minutes. The system automatically commands 9,000 stepping motors to adjust mirrors and other optics based upon images acquired from high-resolution digital cameras viewing beams at various locations. Forty-five control loops per beamline request image processing services running on a LINUX cluster to analyze these images of the beams and references, and automatically steer the beams toward the target. This paper discusses the upgrades to the NIF automatic alignment system to handle new alignment needs and evolving requirements as related to various types of experiments performed. As NIF becomes a continuously-operated system and more experiments are performed, performance monitoring is increasingly important for maintenance and commissioning work. Data, collected during operations, is analyzed for tuning of the laser and targeting maintenance work. Handling evolving alignment and maintenance needs is expected for the planned 30-year operational life of NIF.

  14. A New Automatic Method of Urban Areas Mapping in East Asia from LANDSAT Data

    NASA Astrophysics Data System (ADS)

    XU, R.; Jia, G.

    2012-12-01

    Cities, as places where human activities are concentrated, account for a small percentage of global land cover but are frequently cited as chief causes of, and solutions to, climate, biogeochemical, and hydrological processes at local, regional, and global scales. Accompanying uncontrolled economic growth, urban sprawl has been attributed to the accelerating integration of East Asia into the world economy and has involved dramatic changes in urban form and land use. To understand the impact of urban extent on biogeophysical processes, reliable mapping of built-up areas is particularly essential in eastern cities, which tend to have smaller patches, greater fragmentation, and a lower fraction of natural land cover within the urban landscape than western cities. Segmentation of urban land from other land-cover types in remote sensing imagery can be done by standard classification processes as well as by a logic rule calculated from spectral indices and their derivatives. Efforts to establish such a logic rule that requires no threshold for automatic mapping are highly worthwhile. Existing automatic methods are reviewed, and a proposed approach is introduced, including the calculation of a new index and an improved logic rule. The existing automatic methods and the proposed approach are then compared in a common context. Afterwards, the proposed approach is tested separately on cities of large, medium, and small scale in East Asia selected from different LANDSAT images. The results are promising, as the approach can efficiently segment urban areas even in the more complex eastern cities. Key words: urban extraction; automatic method; logic rule; LANDSAT images; East Asia. (Figure: the proposed approach applied to extraction of urban built-up areas in Guangzhou, China.)
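    Logic rules built on spectral indices of the kind this abstract describes are commonly based on built-up and vegetation indices. The sketch below uses the standard NDBI/NDVI formulation as an assumed example, since the paper's new index is not reproduced here:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def ndbi(swir, nir):
    """Normalized Difference Built-up Index."""
    return (swir - nir) / (swir + nir)

def is_built_up(red, nir, swir):
    """Threshold-free logic rule: flag built-up where NDBI exceeds NDVI.
    No tuned cutoff is needed, only a comparison between the two indices."""
    return ndbi(swir, nir) > ndvi(nir, red)

# Typical LANDSAT surface reflectances for two pixels:
print(is_built_up(red=0.18, nir=0.22, swir=0.30))  # built-up pixel -> True
print(is_built_up(red=0.05, nir=0.45, swir=0.20))  # vegetated pixel -> False
```

    The comparison form is what makes the rule "threshold-free": it avoids picking a scene-specific cutoff value, which is the property the abstract highlights.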

  15. Act Now to Transform School Systems. 2011 PIE Network Summit Policy Briefs

    ERIC Educational Resources Information Center

    Miles, Karen Hawley; Baroody, Karen

    2011-01-01

    The U.S.'s educational system is at a crossroads. Preparing every student for college and careers in the information age requires that school districts invest more and differently in teaching effectiveness, time, individual attention, and information systems. But even before a decline in revenue, district leaders face automatic increases in…

  16. 20 CFR 416.968 - Skill requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... which needs little or no judgment to do simple duties that can be learned on the job in a short period... materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30 days, and little specific vocational preparation and judgment are needed...

  17. 20 CFR 416.968 - Skill requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... which needs little or no judgment to do simple duties that can be learned on the job in a short period... materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30 days, and little specific vocational preparation and judgment are needed...

  18. US Army Proposed Automatic Test Equipment Software Development and Support Facility.

    DTIC Science & Technology

    1982-10-29

    programs would be prepared as weapon and prime system operating software. The ATE Software Development and Support Facility will help prevent the TPS...

  19. Effect of mixing time and speed on experimental baking and dough testing with a 200g pin-mixer

    USDA-ARS?s Scientific Manuscript database

    Under-mixing or over-mixing the dough results in varied experimental loaf volumes. Bread preparation requires a trained baker to evaluate dough development and determine the stop points of the mixer. Instrumentation and electronic control of the dough mixer would allow for automatic mixing. This study us...

  20. Wildland resource information system: user's guide

    Treesearch

    Robert M. Russell; David A. Sharpnack; Elliot L. Amidon

    1975-01-01

    This user's guide provides detailed information about how to use the computer programs of WRIS, a computer system for storing and manipulating data about land areas. Instructions explain how to prepare maps, digitize by automatic scanners or by hand, produce polygon maps, and combine map layers. Support programs plot maps, store them on tapes, produce summaries,...

  1. NASA/ASEE Summer Faculty Fellowship Program

    NASA Technical Reports Server (NTRS)

    Hosler, E. Ramon (Editor); Armstrong, Dennis W. (Editor)

    1989-01-01

    The contractor's report contains all sixteen final reports prepared by the participants in the 1989 Summer Faculty Fellowship Program. Reports describe research projects on a number of different topics. Interface software, metal corrosion, rocket triggering lightning, automatic drawing, 60-Hertz power, carotid-cardiac baroreflex, acoustic fields, robotics, AI, CAD/CAE, cryogenics, titanium, and flow measurement are discussed.

  2. 7 CFR 1717.156 - Transitional assistance affecting preexisting loans.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... lengthened by 2 years. On the borrower's request RUS will prepare documents necessary for the advance of loan... affecting preexisting loans. The fund advance period for an insured loan, which is the period during which RUS may advance loan funds to a borrower, terminates automatically after a specific period of time...

  3. 20 CFR 416.968 - Skill requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... which needs little or no judgment to do simple duties that can be learned on the job in a short period... materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30 days, and little specific vocational preparation and judgment are needed...

  4. 20 CFR 416.968 - Skill requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... which needs little or no judgment to do simple duties that can be learned on the job in a short period... materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30 days, and little specific vocational preparation and judgment are needed...

  5. 20 CFR 416.968 - Skill requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... which needs little or no judgment to do simple duties that can be learned on the job in a short period... materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30 days, and little specific vocational preparation and judgment are needed...

  6. Enhanced performance of polybenzimidazole-based high temperature proton exchange membrane fuel cell with gas diffusion electrodes prepared by automatic catalyst spraying under irradiation technique

    NASA Astrophysics Data System (ADS)

    Su, Huaneng; Pasupathi, Sivakumar; Bladergroen, Bernard Jan; Linkov, Vladimir; Pollet, Bruno G.

    2013-11-01

    Gas diffusion electrodes (GDEs) prepared by a novel automatic catalyst spraying under irradiation (ACSUI) technique are investigated for improving the performance of phosphoric acid (PA)-doped polybenzimidazole (PBI) high temperature proton exchange membrane fuel cells (PEMFC). The physical properties of the GDEs are characterized by pore size distribution and scanning electron microscopy (SEM). The electrochemical properties of the membrane electrode assembly (MEA) with the GDEs are evaluated and analyzed by polarization curves, cyclic voltammetry (CV), and electrochemical impedance spectroscopy (EIS). The effects of PTFE binder content, PA impregnation, and heat treatment on the GDEs are investigated to determine the optimum performance of the single cell. At ambient pressure and 160 °C, the maximum power density reaches 0.61 W cm-2, and the current density at 0.6 V is up to 0.38 A cm-2, with H2/air and a platinum loading of 0.5 mg cm-2 on both electrodes. The MEA with the GDEs shows good stability in a short-term durability test of fuel cell operation.

  7. Automatic testing and assessment of neuroanatomy using a digital brain atlas: method and development of computer- and mobile-based applications.

    PubMed

    Nowinski, Wieslaw L; Thirunavuukarasuu, Arumugam; Ananthasubramaniam, Anand; Chua, Beng Choon; Qian, Guoyu; Nowinska, Natalia G; Marchenko, Yevgen; Volkau, Ihar

    2009-10-01

    Preparation of tests and student's assessment by the instructor are time consuming. We address these two tasks in neuroanatomy education by employing a digital media application with a three-dimensional (3D), interactive, fully segmented, and labeled brain atlas. The anatomical and vascular models in the atlas are linked to Terminologia Anatomica. Because the cerebral models are fully segmented and labeled, our approach enables automatic and random atlas-derived generation of questions to test location and naming of cerebral structures. This is done in four steps: test individualization by the instructor, test taking by the students at their convenience, automatic student assessment by the application, and communication of the individual assessment to the instructor. A computer-based application with an interactive 3D atlas and a preliminary mobile-based application were developed to realize this approach. The application works in two test modes: instructor and student. In the instructor mode, the instructor customizes the test by setting the scope of testing and student performance criteria, which takes a few seconds. In the student mode, the student is tested and automatically assessed. Self-testing is also feasible at any time and pace. Our approach is automatic both with respect to test generation and student assessment. It is also objective, rapid, and customizable. We believe that this approach is novel from computer-based, mobile-based, and atlas-assisted standpoints.

  8. OpenSim: A Flexible Distributed Neural Network Simulator with Automatic Interactive Graphics.

    PubMed

    Jarosch, Andreas; Leber, Jean Francois

    1997-06-01

    An object-oriented simulator called OpenSim is presented that achieves a high degree of flexibility by relying on a small set of building blocks. The state variables and algorithms put in this framework can easily be accessed through a command shell. This allows one to distribute a large-scale simulation over several workstations and to generate the interactive graphics automatically. OpenSim opens new possibilities for cooperation among Neural Network researchers. Copyright 1997 Elsevier Science Ltd.

  9. Modeling and Performance Optimization of Large-Scale Data-Communication Networks.

    DTIC Science & Technology

    1981-06-01

    IT-17, no. 1, pp. 71-76, 1971. 12. Y. Ho, M. Kastner, and E. Wong, "Teams, market signalling, and information theory," IEEE Trans. Automat. Contr. ... modifies the flow assignment to satisfy end-to-end delay constraints. 3.2.1 Rationale for Min-Hop Strategy. The Min-Hop algorithm proposed in this... Prentice-Hall, 1980. Ho, Y., M. Kastner and E. Wong, "Teams, market signalling, and information theory," IEEE Trans. Automat. Contr., vol. AC-23, pp

  10. Automatic item generation implemented for measuring artistic judgment aptitude.

    PubMed

    Bezruczko, Nikolaus

    2014-01-01

    Automatic item generation (AIG) is a broad class of methods that are being developed to address psychometric issues arising from internet and computer-based testing. In general, issues emphasize efficiency, validity, and diagnostic usefulness of large scale mental testing. Rapid prominence of AIG methods and their implicit perspective on mental testing is bringing painful scrutiny to many sacred psychometric assumptions. This report reviews basic AIG ideas, then presents conceptual foundations, image model development, and operational application to artistic judgment aptitude testing.

  11. Bellman Ford algorithm - in Routing Information Protocol (RIP)

    NASA Astrophysics Data System (ADS)

    Krianto Sulaiman, Oris; Mahmud Siregar, Amir; Nasution, Khairuddin; Haramaini, Tasliyah

    2018-04-01

    A large-scale network needs routing that can handle a large number of users, and one solution is to use a routing protocol. Routing protocols are of two types, static and dynamic: static routes are entered manually by the network administrator, while dynamic routes are formed automatically from the existing network. Dynamic routing is efficient for extensive networks because routes are formed automatically. Routing Information Protocol (RIP) is a dynamic routing protocol that uses the Bellman-Ford algorithm, which searches for the best path through the network by leveraging the cost of each link; with the Bellman-Ford algorithm, RIP can optimize existing networks.
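    The distance-vector computation at the heart of RIP can be sketched as a plain Bellman-Ford relaxation over weighted links (a generic illustration of the algorithm, not RIP's message format or hop-count limit):

```python
def bellman_ford(nodes, edges, source):
    """Compute least-cost routes from `source`.
    `edges` is a list of (u, v, cost) links; relaxation runs |V|-1 rounds,
    mirroring how RIP's distance vectors converge hop by hop."""
    INF = float("inf")
    dist = {n: INF for n in nodes}
    dist[source] = 0
    for _ in range(len(nodes) - 1):
        for u, v, cost in edges:
            if dist[u] + cost < dist[v]:   # relax the link u -> v
                dist[v] = dist[u] + cost
    return dist

# Small network: A-B (1), B-C (2), A-C (5); the best A->C path costs 3.
nodes = ["A", "B", "C"]
edges = [("A", "B", 1), ("B", "A", 1),
         ("B", "C", 2), ("C", "B", 2),
         ("A", "C", 5), ("C", "A", 5)]
print(bellman_ford(nodes, edges, "A"))  # {'A': 0, 'B': 1, 'C': 3}
```

    In RIP the same relaxation happens in a distributed fashion: each router advertises its current distance vector to its neighbours until no costs change.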

  12. Response variability in rapid automatized naming predicts reading comprehension

    PubMed Central

    Li, James J.; Cutting, Laurie E.; Ryan, Matthew; Zilioli, Monica; Denckla, Martha B.; Mahone, E. Mark

    2009-01-01

    A total of 37 children ages 8 to 14 years, screened for word-reading difficulties (23 with attention-deficit/hyperactivity disorder, ADHD; 14 controls) completed oral reading and rapid automatized naming (RAN) tests. RAN trials were segmented into pause and articulation time and intraindividual variability. There were no group differences on reading or RAN variables. Color- and letter-naming pause times and number-naming articulation time were significant predictors of reading fluency. In contrast, number and letter pause variability were predictors of comprehension. Results support analysis of subcomponents of RAN and add to literature emphasizing intraindividual variability as a marker for response preparation, which has relevance to reading comprehension. PMID:19221923

  13. Automated Meteor Detection by All-Sky Digital Camera Systems

    NASA Astrophysics Data System (ADS)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.

  14. Musculoskeletal Simulation Model Generation from MRI Data Sets and Motion Capture Data

    NASA Astrophysics Data System (ADS)

    Schmid, Jérôme; Sandholm, Anders; Chung, François; Thalmann, Daniel; Delingette, Hervé; Magnenat-Thalmann, Nadia

    Today computer models and computer simulations of the musculoskeletal system are widely used to study the mechanisms behind human gait and its disorders. The common way of creating musculoskeletal models is to use a generic musculoskeletal model based on data derived from anatomical and biomechanical studies of cadaverous specimens. To adapt this generic model to a specific subject, the usual approach is to scale it. This scaling has been reported to introduce several errors because it does not always account for subject-specific anatomical differences. As a result, a novel semi-automatic workflow is proposed that creates subject-specific musculoskeletal models from magnetic resonance imaging (MRI) data sets and motion capture data. Based on subject-specific medical data and a model-based automatic segmentation approach, an accurate modeling of the anatomy can be produced while avoiding the scaling operation. This anatomical model coupled with motion capture data, joint kinematics information, and muscle-tendon actuators is finally used to create a subject-specific musculoskeletal model.

  15. Automatic co-registration of 3D multi-sensor point clouds

    NASA Astrophysics Data System (ADS)

    Persad, Ravi Ancil; Armenakis, Costas

    2017-08-01

    We propose an approach for the automatic coarse alignment of 3D point clouds which have been acquired from various platforms. The method is based on 2D keypoint matching performed on height map images of the point clouds. Initially, a multi-scale wavelet keypoint detector is applied, followed by adaptive non-maxima suppression. A scale, rotation and translation-invariant descriptor is then computed for all keypoints. The descriptor is built using the log-polar mapping of Gabor filter derivatives in combination with the so-called Rapid Transform. In the final step, source and target height map keypoint correspondences are determined using a bi-directional nearest neighbour similarity check, together with a threshold-free modified-RANSAC. Experiments with urban and non-urban scenes are presented and results show scale errors ranging from 0.01 to 0.03, 3D rotation errors in the order of 0.2° to 0.3° and 3D translation errors from 0.09 m to 1.1 m.
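    The bi-directional nearest-neighbour similarity check used to accept keypoint correspondences can be sketched independently of the descriptor (plain Euclidean distance on toy 2-D descriptors here; the paper's Gabor/Rapid-Transform descriptor is not reproduced):

```python
import math

def nearest(idx_from, descs_from, descs_to):
    """Index of the descriptor in `descs_to` closest to descs_from[idx_from]."""
    d = descs_from[idx_from]
    return min(range(len(descs_to)),
               key=lambda j: math.dist(d, descs_to[j]))

def mutual_matches(source, target):
    """Keep only correspondences where each keypoint is the other's nearest
    neighbour (the bi-directional check): asymmetric matches are discarded."""
    matches = []
    for i in range(len(source)):
        j = nearest(i, source, target)
        if nearest(j, target, source) == i:   # must match back
            matches.append((i, j))
    return matches

src = [(0.0, 0.0), (5.0, 5.0), (9.0, 1.0)]
tgt = [(5.1, 4.9), (0.2, 0.1), (4.0, 4.0)]
print(mutual_matches(src, tgt))  # [(0, 1), (1, 0)] - third point rejected
```

    In the full method the surviving matches would then be passed to the modified-RANSAC step to estimate the coarse 3D alignment.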

  16. An Automatic Instrument to Study the Spatial Scaling Behavior of Emissivity

    PubMed Central

    Tian, Jing; Zhang, Renhua; Su, Hongbo; Sun, Xiaomin; Chen, Shaohui; Xia, Jun

    2008-01-01

    In this paper, the design of an automatic instrument for measuring the spatial distribution of land surface emissivity is presented, which makes direct in situ measurement of the spatial distribution of emissivity possible. The significance of this new instrument lies in two aspects: first, it helps to investigate the spatial scaling behavior of emissivity and temperature; second, the design of the instrument provides theoretical and practical foundations for implementing measurements of the distribution of surface emissivity from airborne or spaceborne platforms. To improve the accuracy of the measurements, the emissivity measurement and its uncertainty are examined in a series of carefully designed experiments. The impact of the variation of target temperature and of the environmental irradiance on the measurement of emissivity is analyzed as well. In addition, the ideal temperature difference between the hot environment and the cool environment is obtained from numerical simulations. Finally, the scaling behavior of surface emissivity caused by the heterogeneity of the target is discussed. PMID:27879735

  17. A flow-batch analyzer with piston propulsion applied to automatic preparation of calibration solutions for Mn determination in mineral waters by ET AAS.

    PubMed

    Almeida, Luciano F; Vale, Maria G R; Dessuy, Morgana B; Silva, Márcia M; Lima, Renato S; Santos, Vagner B; Diniz, Paulo H D; Araújo, Mário C U

    2007-10-31

    The increasing development of miniaturized flow systems and the continuous monitoring of chemical processes require dramatically simplified and cheap flow schemes and instrumentation with large potential for miniaturization and consequent portability. For these purposes, the development of systems based on flow and batch technologies may be a good alternative. Flow-batch analyzers (FBA) have been successfully applied to implement analytical procedures such as titrations, sample pre-treatment, analyte addition and screening analysis. In spite of its favourable characteristics, the previously proposed FBA uses peristaltic pumps to propel the fluids; this kind of propulsion has a high cost and large dimensions, making its miniaturization and portability unfeasible. To overcome these drawbacks, a low-cost, robust and compact FBA that does not rely on peristaltic-pump propulsion is proposed. It makes use of a lab-made piston coupled to a mixing chamber and a stepper motor controlled by a microcomputer. The piston-propelled FBA (PFBA) was applied to the automatic preparation of calibration solutions for manganese determination in mineral waters by electrothermal atomic absorption spectrometry (ET AAS). Comparing the results obtained with two sets of calibration curves (five by manual and five by PFBA preparation), no significant statistical differences at a 95% confidence level were observed by applying the paired t-test. The standard deviations of the manual and PFBA procedures were always smaller than 0.2 and 0.1 μg L(-1), respectively. By using the PFBA it was possible to prepare about 80 calibration solutions per hour.
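The paired t-test comparison reported above can be illustrated with a minimal sketch. The absorbance values below are hypothetical placeholders, not the paper's data; 2.776 is the standard two-tailed critical value at the 95% confidence level for four degrees of freedom (five paired solutions).

```python
import math

def paired_t_statistic(a, b):
    """Paired t-test statistic for two matched measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# hypothetical Mn calibration absorbances: manual vs. PFBA preparation
manual = [0.101, 0.198, 0.305, 0.399, 0.502]
pfba   = [0.100, 0.200, 0.303, 0.401, 0.500]
t = paired_t_statistic(manual, pfba)
# two-tailed critical value for df = 4 at the 95% confidence level is 2.776;
# |t| below that value means no significant difference between preparations
```

With such closely matching series the statistic stays far below the critical value, mirroring the "no significant statistical differences" conclusion.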

  18. A startling acoustic stimulus facilitates voluntary lower extremity movements and automatic postural responses in people with chronic stroke.

    PubMed

    Coppens, Milou J M; Roelofs, Jolanda M B; Donkers, Nicole A J; Nonnekes, Jorik; Geurts, Alexander C H; Weerdesteyn, Vivian

    2018-05-14

    A startling acoustic stimulus (SAS) involuntarily releases prepared movements at accelerated latencies, known as the StartReact effect. Previous work has demonstrated intact StartReact in paretic upper extremity movements in people after stroke, suggesting preserved motor preparation. The question remains whether motor preparation of lower extremity movements is also unaffected after stroke. Here, we investigated StartReact effects on ballistic lower extremity movements and on automatic postural responses (APRs) following perturbations to standing balance. These APRs are particularly interesting as they are critical to prevent a fall following balance perturbations, but show substantial delays and poor muscle coordination after stroke. Twelve chronic stroke patients and 12 healthy controls performed voluntary ankle dorsiflexion movements in response to a visual stimulus, and responded to backward balance perturbations evoking APRs. Twenty-five percent of all trials contained a SAS (120 dB) delivered simultaneously with the visual stimulus or balance perturbation. As expected, in the absence of a SAS, muscle and movement onset latencies at the paretic side were delayed compared to the non-paretic leg and to controls. The SAS accelerated ankle dorsiflexion onsets in both legs of the stroke subjects and in controls. Following perturbations, the SAS accelerated bilateral APR onsets not only in controls, but for the first time, we also demonstrated this effect in people after stroke. Moreover, APR inter- and intra-limb muscle coordination was rather weak in our stroke subjects, but substantially improved when the SAS was applied. These findings show preserved movement preparation, suggesting that there is residual (subcortical) capacity for motor recovery.

  19. Rhythms can overcome temporal orienting deficit after right frontal damage.

    PubMed

    Triviño, Mónica; Arnedo, Marisa; Lupiáñez, Juan; Chirivella, Javier; Correa, Angel

    2011-12-01

    The main aim of this study was to test whether the use of rhythmic information to induce temporal expectations can overcome the deficit in controlled temporal preparation shown by patients with frontal damage (i.e. temporal orienting and foreperiod effects). Two tasks were administered to a group of 15 patients with a frontal brain lesion and a group of 15 matched control subjects: a Symbolic Cued Task where the predictive information regarding the time of target appearance was provided by a symbolic cue (short line-early vs. long line-late interval) and a Rhythm Cued Task where the predictive temporal information was provided by a rhythm (fast rhythm-early vs. slow rhythm-late interval). The results of the Symbolic Cued Task replicated both the temporal orienting deficit in right frontal patients and the absence of foreperiod effects in both right and left frontal patients, reported in our previous study (Triviño, Correa, Arnedo, & Lupiañez, 2010). However, in the Rhythm Cued Task, the right frontal group showed normal temporal orienting and foreperiod effects, while the left frontal group showed a significant deficit of both effects. These findings show that automatic temporal preparation, as induced by a rhythm, can help frontal patients to make effective use of implicit temporal information to respond at the optimum time. Our neuropsychological findings also provide a novel suggestion for a neural model, in which automatic temporal preparation is left-lateralized and controlled temporal preparation is right-lateralized in the frontal lobes. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Increased Automaticity and Altered Temporal Preparation Following Sleep Deprivation

    PubMed Central

    Kong, Danyang; Asplund, Christopher L.; Ling, Aiqing; Chee, Michael W.L.

    2015-01-01

    Study Objectives: Temporal expectation enables us to focus limited processing resources, thereby optimizing perceptual and motor processing for critical upcoming events. We investigated the effects of total sleep deprivation (TSD) on temporal expectation by evaluating the foreperiod and sequential effects during a psychomotor vigilance task (PVT). We also examined how these two measures were modulated by vulnerability to TSD. Design: Three 10-min visual PVT sessions using uniformly distributed foreperiods were conducted in the wake-maintenance zone the evening before sleep deprivation (ESD) and three more in the morning following approximately 22 h of TSD. TSD vulnerable and nonvulnerable groups were determined by a tertile split of participants based on the change in the number of behavioral lapses recorded during ESD and TSD. A subset of participants performed six additional 10-min modified auditory PVTs with exponentially distributed foreperiods during rested wakefulness (RW) and TSD to test the effect of temporal distribution on foreperiod and sequential effects. Setting: Sleep laboratory. Participants: There were 172 young healthy participants (90 males) with regular sleep patterns. Nineteen of these participants performed the modified auditory PVT. Measurements and Results: Despite behavioral lapses and slower response times, sleep deprived participants could still perceive the conditional probability of temporal events and modify their level of preparation accordingly. Both foreperiod and sequential effects were magnified following sleep deprivation in vulnerable individuals. Only the foreperiod effect increased in nonvulnerable individuals. Conclusions: The preservation of foreperiod and sequential effects suggests that implicit time perception and temporal preparedness are intact during total sleep deprivation. 
Individuals appear to reallocate their depleted preparatory resources to more probable event timings in ongoing trials, whereas vulnerable participants also rely more on automatic processes. Citation: Kong D, Asplund CL, Ling A, Chee MWL. Increased automaticity and altered temporal preparation following sleep deprivation. SLEEP 2015;38(8):1219–1227. PMID:25845689

  1. Characterization of Instructor and Student Use of Ubiquitous Presenter, a Presentation System Enabling Spontaneity and Digital Archiving

    NASA Astrophysics Data System (ADS)

    Price, Edward; Malani, Roshni; Simon, Beth

    2007-01-01

    Ubiquitous Presenter (UP) is a digital presentation system that allows an instructor with a Tablet PC to spontaneously modify prepared slides, while automatically archiving the inked slides on the web. For two introductory physics classes, we examine the types of slides instructors prepare and the ways in which they add ink to the slides. Modes of usage include: using ink to explicitly link multiple representations; making prepared figures dynamic by animating them with ink; and preparing slides with sparse text or figures, then adding extensive annotations during class. In addition, through an analysis of surveys and of web server logs, we examine student reaction to the system, as well as how often and in what ways students utilize archived material. In general, students find the system valuable and frequently review the presentations online.

  2. Automatic dispersion, long-term stability of multi-walled carbon nanotubes in high concentration electrolytes

    NASA Astrophysics Data System (ADS)

    Ma, Lan; He, Yi; Luo, Pingya; Zhang, Liyun; Yu, Yalu

    2018-02-01

    Nanoparticles are known as useful materials in working fluids for the petroleum industry, but the stabilization of nano-scaled materials in water-based working fluids at high salinities is still a big challenge. In this study, we successfully prepared anionic polymer/multi-walled carbon nanotube (MWNT) composites by covalently wrapping MWNTs with poly(sodium 4-styrenesulfonate) (PSS) to improve the stability of MWNTs in high-concentration electrolytes. The PSS/MWNTs composites can automatically disperse in salinities up to 15 wt% NaCl and in API brines (8 wt% NaCl + 2 wt% CaCl2). Hydrodynamic diameters of the composites were measured as a function of ionic strength and API brines by dynamic light scattering (DLS). By varying the concentration of the brines, the hydrodynamic diameter of the PSS/MWNTs composites fluctuated between 545 ± 110 nm over 14 days and 673 ± 171 nm over 30 days. These results showed that PSS/MWNTs remain well dispersed in high-salt solutions for a long period of time. After wrapping with PSS, the diameters of the nanotubes changed from 30-40 nm to 430 nm; morphological analysis shows the thickness of the wrapped polymer is about 400 nm. The zeta potentials of the PSS/MWNTs composites at various brine salinities stayed at approximately -41 to -52 mV. Therefore, the good dispersion of PSS/MWNTs at high salinity is due to the large negative charge of poly(sodium 4-styrenesulfonate), which provides enough electrostatic repulsion and steric repulsion to hinder the compression of the electric double layer caused by high-concentration electrolytes.

  3. On the possibility of producing definitive magnetic observatory data within less than one year

    NASA Astrophysics Data System (ADS)

    Mandić, Igor; Korte, Monika

    2017-04-01

    Geomagnetic observatory data are fundamental in geomagnetic field studies and are widely used in other applications. Often they are combined with satellite and ground survey data. Unfortunately, the observatory definitive data are only available with a time lag ranging from several months up to more than a year. The reason for this lag is the annual production of the final calibration values, i.e. baselines that are used to correct preliminary data from continuously recording magnetometers. In this paper, we show that the preparation of definitive geomagnetic data is possible within a calendar year and present an original method for prompt and automatic estimation of the observatory baselines. The new baselines, obtained in a mostly automatic manner, are compared with the baselines reported on INTERMAGNET DVDs for the 2009-2011 period. The high quality of the baselines obtained by the proposed method indicates its suitability for data processing in fully automatic observatories when automated absolute instruments will be deployed at remote sites.

  4. Examination of a cognitive model of stress, burnout, and intention to resign for Japanese nurses.

    PubMed

    Ohue, Takashi; Moriyama, Michiko; Nakaya, Takashi

    2011-06-01

    A reduction in burnout is required to decrease the voluntary turnover of nurses. This study was carried out with the aim of establishing a cognitive model of stress, burnout, and intention to resign for nurses. A questionnaire survey was administered to 336 nurses (27 male and 309 female) who had worked for ≤5 years at a hospital with multiple departments. The survey included an evaluation of burnout (Maslach Burnout Inventory), stress (Nursing Job Stressor Scale), automatic thoughts (Automatic Thoughts Questionnaire-Revised), and irrational beliefs (Japanese Irrational Belief Test), in addition to the intention to resign. The stressors that affected burnout in the nurses included conflict with other nursing staff, nursing role conflict, qualitative workload, quantitative workload, and conflict with patients. The irrational beliefs that were related to burnout included dependence, problem avoidance, and helplessness. In order to examine the automatic thoughts affecting burnout, groups with low and high negative automatic thoughts and low and high positive automatic thoughts were established. A two-way ANOVA showed a significant interaction of these factors with emotional exhaustion, but no significant interaction with depersonalization or a personal sense of accomplishment, for which only the main effects were significant. The final model showed a process of "stressor → irrational beliefs → negative automatic thoughts/positive automatic thoughts → burnout". In addition, a relationship between burnout and an intention to resign was shown. These results suggest that stress and burnout in nurses might be prevented, and that the number of nurses who leave their position could be decreased, by changing irrational beliefs to rational beliefs, decreasing negative automatic thoughts, and facilitating positive automatic thoughts. © 2010 The Authors. Japan Journal of Nursing Science © 2010 Japan Academy of Nursing Science.

  5. A fast and fully automatic registration approach based on point features for multi-source remote-sensing images

    NASA Astrophysics Data System (ADS)

    Yu, Le; Zhang, Dengrong; Holden, Eun-Jung

    2008-07-01

    Automatic registration of multi-source remote-sensing images is a difficult task as it must deal with the varying illuminations and resolutions of the images, different perspectives and the local deformations within the images. This paper proposes a fully automatic and fast non-rigid image registration technique that addresses those issues. The proposed technique performs a pre-registration process that coarsely aligns the input image to the reference image by automatically detecting their matching points by using the scale invariant feature transform (SIFT) method and an affine transformation model. Once the coarse registration is completed, it performs a fine-scale registration process based on a piecewise linear transformation technique using feature points that are detected by the Harris corner detector. The fine registration process first finds tie-point pairs between the input and the reference image by detecting Harris corners and applying a cross-matching strategy based on a wavelet pyramid for fast search speed. Tie-point pairs with large errors are pruned by an error-checking step. The input image is then rectified by using triangulated irregular networks (TINs) to deal with irregular local deformations caused by the fluctuation of the terrain. For each triangular facet of the TIN, affine transformations are estimated and applied for rectification. Experiments with Quickbird, SPOT5, SPOT4 and TM remote-sensing images of the Hangzhou area in China demonstrate the efficiency and the accuracy of the proposed technique for multi-source remote-sensing image registration.
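The affine stage of such a coarse pre-registration, fitting a global transform to matched keypoint pairs, can be sketched as a least-squares problem. SIFT detection and the later TIN-based rectification are omitted; this is a generic illustration, not the authors' implementation.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine transform mapping src points onto dst points.
    Returns a 2x3 matrix A such that dst ≈ A @ [x, y, 1]."""
    src = np.asarray(src, float)
    ones = np.ones((len(src), 1))
    X = np.hstack([src, ones])                 # n x 3 design matrix [x y 1]
    B, *_ = np.linalg.lstsq(X, np.asarray(dst, float), rcond=None)
    return B.T                                  # 2 x 3 affine matrix

# toy correspondences: a pure translation by (5, -2)
src = [(0, 0), (1, 0), (0, 1), (2, 3)]
dst = [(5, -2), (6, -2), (5, -1), (7, 1)]
A = fit_affine(src, dst)
```

With at least three non-collinear correspondences the six affine parameters are determined; extra matches are absorbed in the least-squares fit, which is what makes the subsequent error-checking/pruning step effective.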

  6. Development and Operation of an Automatic Rotor Trim Control System for use During the UH-60 Individual Blade Control Wind Tunnel Test

    NASA Technical Reports Server (NTRS)

    Theodore, Colin R.

    2010-01-01

    A full-scale wind tunnel test to evaluate the effects of Individual Blade Control (IBC) on the performance, vibration, noise and loads of a UH-60A rotor was recently completed in the National Full-Scale Aerodynamics Complex (NFAC) 40- by 80-Foot Wind Tunnel [1]. A key component of this wind tunnel test was an automatic rotor trim control system that allowed the rotor trim state to be set more precisely, quickly and repeatably than was possible with the rotor operator setting the trim condition manually. The trim control system was also able to maintain the desired trim condition through changes in IBC actuation both in open- and closed-loop IBC modes, and through long-period transients in wind tunnel flow. This ability of the trim control system to automatically set and maintain a steady rotor trim enabled the effects of different IBC inputs to be compared at common trim conditions and allowed these tests to be performed quickly without requiring the rotor operator to re-trim the rotor. The trim control system described in this paper was developed specifically for use during the IBC wind tunnel test.

  7. Polarization transformation as an algorithm for automatic generalization and quality assessment

    NASA Astrophysics Data System (ADS)

    Qian, Haizhong; Meng, Liqiu

    2007-06-01

    For decades it has been a dream of cartographers to computationally mimic the generalization processes in human brains for the derivation of various small-scale target maps or databases from a large-scale source map or database. This paper addresses in a systematic way the polarization transformation (PT) - a new algorithm that serves both the purpose of automatic generalization of discrete features and that of quality assurance. By means of PT, two-dimensional point clusters or line networks in the Cartesian system can be transformed into a polar coordinate system, which can then be unfolded as a single spectrum line r = f(α), where r and α stand for the polar radius and the polar angle respectively. After the transformation, the original features correspond to nodes on the spectrum line delimited between 0° and 360° along the horizontal axis, and between the minimum and maximum polar radius along the vertical axis. Since PT is a lossless transformation, it allows a straightforward analysis and comparison of the original and generalized distributions, so automatic generalization and quality assurance can be done in this way. Examples illustrate that the PT algorithm meets the requirements of generalization of discrete spatial features and is more scientific.
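The core PT mapping described above - Cartesian features unfolded into a spectrum line r = f(α) - can be sketched minimally as follows. The choice of centre and the generalization and quality-assessment operations performed on the spectrum are beyond this illustration.

```python
import math

def polarization_transform(points, center=(0.0, 0.0)):
    """Map 2D Cartesian points to (angle, radius) pairs and unfold them as a
    spectrum line r = f(alpha), sorted by polar angle in [0, 360) degrees."""
    cx, cy = center
    spectrum = []
    for x, y in points:
        r = math.hypot(x - cx, y - cy)                        # polar radius
        alpha = math.degrees(math.atan2(y - cy, x - cx)) % 360.0  # polar angle
        spectrum.append((alpha, r))
    return sorted(spectrum)

# a toy point cluster: each point becomes one node on the spectrum line
cluster = [(1, 0), (0, 2), (-3, 0), (0, -4)]
spectrum = polarization_transform(cluster)
```

Because every (α, r) node retains both original coordinates, the mapping is invertible (lossless), which is what permits comparing a generalized distribution against the source distribution directly on the spectrum line.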

  8. New Data Source for Studying and Modelling the Topside Ionosphere

    NASA Technical Reports Server (NTRS)

    Huang, Xue-Qin; Reinisch, Bodo; Bilitza, Dieter; Benson, Robert

    2001-01-01

    The existing uncertainties about density profiles in the topside ionosphere, i.e., in the height regime from hmF2 to approx. 2000 km, require the search for new data sources. Millions of ionograms were recorded by the ISIS and Alouette satellites in the sixties and seventies that were never analyzed in terms of electron density profiles. In recent years an effort started to digitize the analog recordings to prepare the ionograms for computerized analysis. This paper shows how the digital ionograms are processed and the electron density profiles (from satellite orbit altitude, 1400 km for ISIS-2, down to the F peak) are calculated. The most difficult part of the task is the automatic scaling of the echo traces in the ISIS ionograms. Unlike the ionograms from modern ionosondes, the ISIS ionograms do not identify the wave polarization of the different echo traces, so physical logic must be applied to identify the ordinary (O) and extraordinary (X) traces, and this is not always successful. Characteristic resonance features seen in the topside ionograms occur at the gyro and plasma frequencies. An elaborate scheme was developed to identify these resonance frequencies in order to determine the local plasma and gyro frequencies. This information helps in the identification of the O and X traces, and it provides the starting density of the electron density profile. The inversion of the echo traces into electron density profiles uses the same modified Chebyshev polynomial fitting technique that is successfully applied in the ground-based Digisonde network. The automatic topside ionogram scaler with true height algorithm (TOPIST) successfully scales approx. 70% of the ionograms. An 'editing process' is available to manually scale the more difficult ionograms. The home page for the ISIS project is at http://nssdc.gsfc.nasa.gov/space/isis/isis-status.html. As of January 2001, it provides access to 3,000,000 digitized ISIS ionogram records and to related software. A search page lets users select data location, time, and a host of other search criteria. The automated processing of the ISIS ionograms will begin later this year and the electron density profiles will be made available from the project home page. The ISIS data restoration efforts are supported through NASA's Applied Systems and Information Research Program.

  9. Linking the runoff response at micro-plot and catchment scale following wildfire and terracing, north-central Portugal

    NASA Astrophysics Data System (ADS)

    Martins, Martinho A. S.; Rial-Rivas, María E.; Machado, Ana I.; Serpa, Dalila; Prats, Sergio A.; Faria, Sílvia R.; Varela, María E. T.; González-Pelayo, Óscar; Keizer, J. Jacob

    2015-04-01

    Wildfires are known as one of the principal natural hazards affecting the Mediterranean region. This includes Portugal, where wildfires have affected some 100,000 ha of rural land each year. The effects of wildfires on runoff generation and/or the associated soil (fertility) losses have been studied in Portugal for more than two decades. Some of these studies have reported strong and sometimes extreme hydrological responses in recently burnt areas. Forestry operations in such areas have increasingly come to include bench terracing in preparation for new eucalypt plantations. The hydrological impacts of bench terracing, however, have received little research attention so far and the few existing publications are limited to small spatial scales. The construction of terraces is commonly considered an effective practice for soil conservation on steep slopes, having been applied by mankind since early history. Nonetheless, the present authors have measured high rates of splash as well as inter-rill erosion on recently constructed terraces, and have regularly observed rill formation, including on forest tracks, which typically constitute an extensive network in such bench-terraced plantations. The present study was carried out in a 29-ha forest catchment in north-central Portugal that was burnt by a wildfire during the summer of 2010, logged during early winter 2010/11, and then bench terraced with bulldozers during late winter 2011, some 6 months after the wildfire. The catchment outlet was instrumented immediately after the fire with an automatic hydrometric station comprising two subsequent flumes with maximum discharge capacities of 120 and 1700 l sec-1.
    Within the catchment, rainfall was measured using several automatic and storage gauges, and overland flow was monitored on two contrasting slopes using 3 micro-plots of approximately 0.25 m2 on each slope. Overland flow was measured at 1- to 2-weekly intervals during the hydrological years of 2010/11 and 2011/12, i.e. during the first six months after the wildfire but before the bench terracing and during the subsequent 18 months. While data analysis is still ongoing, preliminary results suggest that bench terracing had a greater impact on runoff generation than the wildfire itself, especially at the micro-plot scale.

  10. Magsat investigation. [Canadian shield

    NASA Technical Reports Server (NTRS)

    Hall, D. H. (Principal Investigator)

    1980-01-01

    A computer program was prepared for modeling segments of the Earth's crust allowing for heterogeneity in magnetization in calculating the Earth's field at Magsat heights. This permits investigation of a large number of possible models in assessing the magnetic signatures of subprovinces of the Canadian shield. The fit between the model field and observed fields is optimized in a semi-automatic procedure.

  11. Proceedings of the Lake Wilderness Attention Conference. Interim Technical Report, August 1, 1980 through September 30, 1980.

    ERIC Educational Resources Information Center

    Lansman, Marcy, Ed.; Hunt, Earl, Ed.

    This technical report contains papers prepared by the 11 speakers at the 1980 Lake Wilderness (Seattle, Washington) Conference on Attention. The papers are divided into general models, physiological evidence, and visual attention categories. Topics of the papers include the following: (1) willed versus automatic control of behavior; (2) multiple…

  12. Towards Automatically Detecting Whether Student Learning Is Shallow

    ERIC Educational Resources Information Center

    Gowda, Sujith M.; Baker, Ryan S.; Corbett, Albert T.; Rossi, Lisa M.

    2013-01-01

    Recent research has extended student modeling to infer not just whether a student knows a skill or set of skills, but also whether the student has achieved robust learning--learning that enables the student to transfer their knowledge and prepares them for future learning (PFL). However, a student may fail to have robust learning in two fashions:…

  13. 10 CFR Appendix O to Part 110 - Illustrative List of Fuel Element Fabrication Plant Equipment and Components Under NRC's Export...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... performance and safety during reactor operation. Also, in all cases precise control of processes, procedures... elements include equipment that: (1) Normally comes in direct contact with, or directly processes or... pellets; (2) Automatic welding machines especially designed or prepared for welding end caps onto the fuel...

  14. "UML Quiz": Automatic Conversion of Web-Based E-Learning Content in Mobile Applications

    ERIC Educational Resources Information Center

    von Franqué, Alexander; Tellioglu, Hilda

    2014-01-01

    Many educational institutions use Learning Management Systems to provide e-learning content to their students. This often includes quizzes that can help students to prepare for exams. However, the content is usually web-optimized and not very usable on mobile devices. In this work a native mobile application ("UML Quiz") that imports…

  15. Vehicle-to-Grid Automatic Load Sharing with Driver Preference in Micro-Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yubo; Nazaripouya, Hamidreza; Chu, Chi-Cheng

    Integration of Electrical Vehicles (EVs) with the power grid not only brings new challenges for load management, but also opportunities for distributed storage and generation. This paper comprehensively models and analyzes distributed Vehicle-to-Grid (V2G) for automatic load sharing with driver preference. In a micro-grid with limited communications, V2G EVs need to decide load sharing based on their own power and voltage profiles. A droop-based controller taking into account driver preference is proposed in this paper to address the distributed control of EVs. Simulations are designed for three fundamental V2G automatic load sharing scenarios that include all system dynamics of such applications. Simulation results demonstrate that active power sharing is achieved proportionally among V2G EVs with consideration of driver preference. In addition, the results also verify the system stability and reactive power sharing analysis in the system modelling, which sheds light on large-scale V2G automatic load sharing in more complicated cases.
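A droop law with a driver-preference weight can be sketched as follows. The gain, nominal voltage and the way preference scales the response are illustrative assumptions for this sketch, not the paper's controller design.

```python
def droop_power(v_meas, v_nom=1.0, droop_gain=10.0, preference=1.0):
    """P-V droop law: inject (or absorb) active power in proportion to the
    local voltage deviation, scaled by a driver-preference weight in [0, 1].
    A preference of 0 opts the EV out of V2G support entirely."""
    return preference * droop_gain * (v_nom - v_meas)

# three EVs at the same bus respond to a voltage sag (v = 0.98 pu);
# each contributes in proportion to its driver's preference setting,
# with no communication between vehicles required
preferences = [1.0, 0.5, 0.0]
shares = [droop_power(0.98, preference=p) for p in preferences]
```

Because each EV computes its share purely from its local voltage measurement, the scheme needs no central coordinator, which matches the limited-communications micro-grid setting described above.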

  16. A cloud-based system for automatic glaucoma screening.

    PubMed

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resultant medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling early intervention for more efficient disease management.

  17. An automatic multi-atlas prostate segmentation in MRI using a multiscale representation and a label fusion strategy

    NASA Astrophysics Data System (ADS)

    Álvarez, Charlens; Martínez, Fabio; Romero, Eduardo

    2015-01-01

    Pelvic magnetic resonance images (MRI) are used in prostate cancer radiotherapy (RT) as part of radiation planning. Modern protocols require a manual delineation, a tedious and variable activity that may take about 20 minutes per patient, even for trained experts. That considerable time is an important workflow burden in most radiological services. Automatic or semi-automatic methods might improve efficiency by decreasing the measurement times while preserving the required accuracy. This work presents a fully automatic atlas-based segmentation strategy that selects the most similar templates for a new MRI using a robust multi-scale SURF analysis. A new segmentation is then obtained as a linear combination of the selected templates, which are previously non-rigidly registered towards the new image. The proposed method shows reliable segmentations, obtaining an average Dice coefficient of 79% when compared with the expert manual segmentation, under a leave-one-out scheme with the training database.

  18. Dual current readout for precision plating

    NASA Technical Reports Server (NTRS)

    Iceland, W. F.

    1970-01-01

    Bistable amplifier prevents damage in the low range circuitry of a dual scale ammeter. It senses the current and switches automatically to the high range circuitry as the current rises above a preset level.

  19. Automatic Brain Portion Segmentation From Magnetic Resonance Images of Head Scans Using Gray Scale Transformation and Morphological Operations.

    PubMed

    Somasundaram, Karuppanagounder; Ezhilarasan, Kamalanathan

    2015-01-01

    To develop an automatic skull stripping method for magnetic resonance imaging (MRI) of human head scans. The proposed method is based on gray scale transformation and morphological operations. The proposed method has been tested with 20 volumes of normal T1-weighted images taken from the Internet Brain Segmentation Repository. Experimental results show that the proposed method gives better results than the popular skull stripping methods Brain Extraction Tool and Brain Surface Extractor. The average values of the Jaccard and Dice coefficients are 0.93 and 0.962, respectively. In this article, we have proposed a novel skull stripping method using intensity transformation and morphological operations. This is a low-computational-complexity method, but it gives competitive or better results than the popular skull stripping methods Brain Surface Extractor and Brain Extraction Tool.
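The Jaccard and Dice overlap coefficients quoted above are straightforward to compute from binary masks; a minimal sketch follows, using toy masks rather than repository data.

```python
def dice_jaccard(mask_a, mask_b):
    """Dice and Jaccard overlap coefficients for two binary masks given as
    flat sequences of 0/1 labels (e.g. brain vs. non-brain voxels)."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))  # |A ∩ B|
    size_a = sum(mask_a)
    size_b = sum(mask_b)
    union = size_a + size_b - inter                        # |A ∪ B|
    dice = 2.0 * inter / (size_a + size_b)
    jaccard = inter / union
    return dice, jaccard

# toy automatic vs. manual segmentations over six voxels
auto   = [1, 1, 1, 0, 0, 1]
manual = [1, 1, 0, 0, 1, 1]
d, j = dice_jaccard(auto, manual)
```

Dice weights the intersection twice, so it is always at least as large as Jaccard for the same pair of masks, consistent with the 0.962 vs. 0.93 values reported.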

  20. Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.

    PubMed

    Echinaka, Yuki; Ozeki, Yukiyasu

    2016-10-01

    The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. Building on this method, the bootstrap method is introduced and a numerical discrimination of the transition type is proposed.

  1. A semi-automatic traffic sign detection, classification, and positioning system

    NASA Astrophysics Data System (ADS)

    Creusen, I. M.; Hazelhoff, L.; de With, P. H. N.

    2012-01-01

    The availability of large-scale databases containing street-level panoramic images offers the possibility to perform semi-automatic surveying of real-world objects such as traffic signs. These inventories can be performed significantly more efficiently than using conventional methods. Governmental agencies are interested in these inventories for maintenance and safety reasons. This paper introduces a complete semi-automatic traffic sign inventory system. The system consists of several components. First, a detection algorithm locates the 2D position of the traffic signs in the panoramic images. Second, a classification algorithm is used to identify the traffic sign. Third, the 3D position of the traffic sign is calculated using the GPS position of the photographs. Finally, the results are listed in a table for quick inspection and are also visualized in a web browser.

  2. Application of an automatic cloud tracking technique to Meteosat water vapor and infrared observations

    NASA Technical Reports Server (NTRS)

    Endlich, R. M.; Wolf, D. E.

    1980-01-01

    The automatic cloud tracking system was applied to METEOSAT 6.7-micrometer water vapor measurements to learn whether the system can track the motions of water vapor patterns. Data for the midlatitudes, subtropics, and tropics were selected from a sequence of METEOSAT pictures for 25 April 1978. Trackable features in the water vapor patterns were identified using a clustering technique, and the features were tracked by two different methods. In flat (low-contrast) water vapor fields, the automatic motion computations were not reliable, but in areas where the water vapor fields contained small-scale structure (such as in the vicinity of active weather phenomena) the computations were successful. Cloud motions were computed using METEOSAT infrared observations (including tropical convective systems and midlatitude jet stream cirrus).

  3. A fast and automatic mosaic method for high-resolution satellite images

    NASA Astrophysics Data System (ADS)

    Chen, Hongshun; He, Hui; Xiao, Hongyu; Huang, Jing

    2015-12-01

    We propose a fast and fully automatic mosaic method for high-resolution satellite images. First, the overlap rectangle is computed from the geographical locations of the reference and mosaic images, and feature points are extracted from the overlapped region only, on both images, by a scale-invariant feature transform (SIFT) algorithm. Then, the RANSAC method is used to match the feature points of the two images. Finally, the two images are fused into a seamless panoramic image by simple linear weighted fusion or another method. The proposed method is implemented in C++ based on OpenCV and GDAL, and tested on WorldView-2 multispectral images with a spatial resolution of 2 m. Results show that the proposed method can detect feature points efficiently and mosaic images automatically.
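
    The RANSAC matching step can be sketched for the simplest motion model, a pure 2-D translation: a single sampled match proposes a shift, and the remaining matches vote as inliers. This is an illustrative reduction, not the paper's implementation (which builds on OpenCV/GDAL and would normally estimate a richer transform):

```python
import random

def ransac_translation(matches, iters=200, tol=2.0, seed=0):
    """Estimate a 2-D translation from noisy point matches
    [((xa, ya), (xb, yb)), ...] by sampling one match per iteration
    and keeping the hypothesis with the most inliers."""
    rng = random.Random(seed)
    best, best_inliers = (0.0, 0.0), 0
    for _ in range(iters):
        (xa, ya), (xb, yb) = rng.choice(matches)
        dx, dy = xb - xa, yb - ya
        inliers = sum(
            1 for (pa, pb) in matches
            if abs((pb[0] - pa[0]) - dx) <= tol
            and abs((pb[1] - pa[1]) - dy) <= tol
        )
        if inliers > best_inliers:
            best, best_inliers = (dx, dy), inliers
    return best, best_inliers
```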

  4. Biosignal Analysis to Assess Mental Stress in Automatic Driving of Trucks: Palmar Perspiration and Masseter Electromyography

    PubMed Central

    Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro

    2015-01-01

    Nowadays, insight into human-machine interaction is a critical topic given the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rational, practical use of automatic technology. Therefore, this study concentrates on biosignal analysis to quantitatively evaluate the mental stress of drivers during automatic driving of trucks, with vehicles set a close gap distance apart to reduce air resistance and save energy. By applying two wearable sensor systems, continuous measurement was realized for palmar perspiration and masseter electromyography, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with about a 25 m gap distance as a reference. It was found that mental stress significantly increased as the gap distance decreased, and an abrupt increase in driver mental stress was also observed accompanying a sudden change of the gap distance during automatic driving, which corresponded to significantly higher ride discomfort according to subjective reports. PMID:25738768

  5. Automatic thermographic image defect detection of composites

    NASA Astrophysics Data System (ADS)

    Luo, Bin; Liebenberg, Bjorn; Raymont, Jeff; Santospirito, SP

    2011-05-01

    Detecting defects, and especially reliably measuring defect sizes, are critical objectives in automatic NDT defect detection applications. In this work, the Sentence software is proposed for the analysis of pulsed thermography and near IR images of composite materials. Furthermore, the Sentence software delivers an end-to-end, user friendly platform for engineers to perform complete manual inspections, as well as tools that allow senior engineers to develop inspection templates and profiles, reducing the requisite thermographic skill level of the operating engineer. Finally, the Sentence software can also offer complete independence of operator decisions by the fully automated "Beep on Defect" detection functionality. The end-to-end automatic inspection system includes sub-systems for defining a panel profile, generating an inspection plan, controlling a robot-arm and capturing thermographic images to detect defects. A statistical model has been built to analyze the entire image, evaluate grey-scale ranges, import sentencing criteria and automatically detect impact damage defects. A full width half maximum algorithm has been used to quantify the flaw sizes. The identified defects are imported into the sentencing engine which then sentences (automatically compares analysis results against acceptance criteria) the inspection by comparing the most significant defect or group of defects against the inspection standards.
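
    The full width half maximum (FWHM) sizing mentioned above can be sketched on a 1-D intensity profile through a defect: find the peak and measure the contiguous run of samples at or above half the peak value. A minimal sketch (sample-resolution only, without sub-pixel interpolation; not the Sentence software's implementation):

```python
def fwhm(profile):
    """Width, in samples, of the contiguous run around the peak whose
    values stay at or above half of the peak intensity."""
    peak = max(profile)
    half = peak / 2.0
    i = profile.index(peak)
    left = i
    while left > 0 and profile[left - 1] >= half:
        left -= 1
    right = i
    while right < len(profile) - 1 and profile[right + 1] >= half:
        right += 1
    return right - left + 1
```

    Multiplying the sample width by the pixel pitch would convert it to a physical flaw size.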

  6. The QT Scale: A Weight Scale Measuring the QTc Interval.

    PubMed

    Couderc, Jean-Philippe; Beshaw, Connor; Niu, Xiaodan; Serrano-Finetti, Ernesto; Casas, Oscar; Pallas-Areny, Ramon; Rosero, Spencer; Zareba, Wojciech

    2017-01-01

    Despite the strong evidence of the clinical utility of QTc prolongation as a surrogate marker of cardiac risk, QTc measurement is not part of clinical routine either in hospital or in physician offices. We evaluated a novel device ("the QT scale") to measure heart rate (HR) and QTc interval. The QT scale is a weight scale embedding an ECG acquisition system with four limb sensors (feet and hands: leads I, II, and III). We evaluated the reliability of the QT scale in healthy subjects (cohort 1) and cardiac patients (cohorts 2 and 3), considering a learning cohort (cohort 2) and two validation cohorts. The QT scale and the standard 12-lead recorder were compared using the intraclass correlation coefficient (ICC) in cohorts 2 and 3. Absolute values of heart rate and QTc intervals between manual and automatic measurements using ECGs from the QT scale and a clinical device were compared in cohort 1. We enrolled 16 subjects in cohort 1 (8 w, 8 m; 32 ± 8 vs 34 ± 10 years, P = 0.7), 51 patients in cohort 2 (13 w, 38 m; 61 ± 16 vs 58 ± 18 years, P = 0.6), and 13 AF patients in cohort 3 (4 w, 9 m; 63 ± 10 vs 64 ± 10 years, P = 0.9). Similar automatic heart rate and QTc were delivered by the scale and the clinical device in cohort 1: paired differences in RR and QTc were -7 ± 34 milliseconds (P = 0.37) and 3.4 ± 28.6 milliseconds (P = 0.64), respectively. Measurement stability was slightly lower in ECGs from the QT scale than in those from the clinical device (ICC: 91% vs 80%) in cohort 3. The "QT scale device" delivers valid heart rate and QTc interval measurements. © 2016 Wiley Periodicals, Inc.

  7. Automatic and deliberate affective associations with sexual stimuli in women with lifelong vaginismus before and after therapist-aided exposure treatment.

    PubMed

    Melles, Reinhilde J; ter Kuile, Moniek M; Dewitte, Marieke; van Lankveld, Jacques J D M; Brauer, Marieke; de Jong, Peter J

    2014-03-01

    The intense fear response to vaginal penetration in women with lifelong vaginismus, who have never been able to experience coitus, may reflect negative automatic and deliberate appraisals of vaginal penetration stimuli, which might be modified by exposure treatment. The aim of this study is to examine whether (i) sexual stimuli elicit relatively strong automatic and deliberate threat associations in women with vaginismus, as well as relatively negative automatic and deliberate global affective associations, compared with symptom-free women; and (ii) these automatic and more deliberate attitudes can be modified by therapist-aided exposure treatment. A single target Implicit Association Test (st-IAT) was used to index automatic threat associations, and an Affective Simon Task (AST) to index global automatic affective associations. Participants were women with lifelong vaginismus (N = 68) and women without sexual problems (N = 70). The vaginismus group was randomly allocated to treatment (n = 34) and a waiting-list control condition (n = 34). Indices of automatic threat were obtained by the st-IAT and automatic global affective associations by the AST; visual analogue scales (VAS) were used to assess deliberate appraisals of the sexual pictures (fear and global positive affect). Stronger deliberate fear associations and weaker global positive affective associations with sexual stimuli were found in women with vaginismus. Following therapist-aided exposure treatment, deliberate fear was strongly reduced, whereas global positive affective associations were strengthened. Automatic associations did not differ between women with and without vaginismus and did not change following treatment. Relatively stronger negative (threat or global affect) associations with sexual stimuli in vaginismus appeared restricted to the deliberate level. 
Therapist-aided exposure treatment was effective in reducing subjective fear of sexual penetration stimuli and led to more global positive affective associations with sexual stimuli. The impact of exposure might be further improved by strengthening the association between vaginal penetration and positive affect (e.g., by using counter-conditioning techniques). © 2013 International Society for Sexual Medicine.

  8. MeSH Now: automatic MeSH indexing at PubMed scale via learning to rank.

    PubMed

    Mao, Yuqing; Lu, Zhiyong

    2017-04-17

    MeSH indexing is the task of assigning relevant MeSH terms based on a manual reading of scholarly publications by human indexers. The task is highly important for improving literature retrieval and many other scientific investigations in biomedical research. Unfortunately, given its manual nature, the process of MeSH indexing is both time-consuming (new articles are not indexed until 2 or 3 months later) and costly (approximately ten dollars per article). In response, automatic indexing by computers has been previously proposed and attempted but remains challenging. In order to advance the state of the art in automatic MeSH indexing, a community-wide shared task called BioASQ was recently organized. We propose MeSH Now, an integrated approach that first uses multiple strategies to generate a combined list of candidate MeSH terms for a target article. Through a novel learning-to-rank framework, MeSH Now then ranks the list of candidate terms based on their relevance to the target article. Finally, MeSH Now selects the highest-ranked MeSH terms via a post-processing module. We assessed MeSH Now on two separate benchmarking datasets using traditional precision, recall and F1-score metrics. In both evaluations, MeSH Now consistently achieved over 0.60 in F1-score, ranging from 0.610 to 0.612. Furthermore, additional experiments show that MeSH Now can be optimized by parallel computing in order to process MEDLINE documents on a large scale. We conclude that MeSH Now is a robust approach with state-of-the-art performance for automatic MeSH indexing and that MeSH Now is capable of processing PubMed-scale document collections within a reasonable time frame. Demo: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/MeSHNow/
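
    The precision, recall and F1-score used in the evaluation reduce to set overlaps between the predicted terms and the indexer-assigned gold terms. A minimal sketch (the example terms below are illustrative, not from the benchmark):

```python
def prf1(predicted, gold):
    """Precision, recall and F1-score for a set of predicted MeSH
    terms against the indexer-assigned gold-standard terms."""
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)                      # true positives
    p = tp / len(predicted) if predicted else 0.0   # precision
    r = tp / len(gold) if gold else 0.0             # recall
    f1 = 2 * p * r / (p + r) if p + r else 0.0      # harmonic mean
    return p, r, f1
```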

  9. A procedural method for the efficient implementation of full-custom VLSI designs

    NASA Technical Reports Server (NTRS)

    Belk, P.; Hickey, N.

    1987-01-01

    An embedded language system for the layout of very large scale integration (VLSI) circuits is examined. It is shown that through the judicious use of this system, a large variety of circuits can be designed with circuit density and performance comparable to traditional full-custom design methods, but with design costs more comparable to semi-custom design methods. The high performance of this methodology is attributable to the flexibility of procedural descriptions of VLSI layouts and to a number of automatic and semi-automatic tools within the system.

  10. Sexual Modes Questionnaire (SMQ): Translation and Psychometric Properties of the Italian Version of the Automatic Thought Scale.

    PubMed

    Nimbi, Filippo Maria; Tripodi, Francesca; Simonelli, Chiara; Nobre, Pedro

    2018-03-01

    The Sexual Modes Questionnaire (SMQ) is a validated and widely used tool to assess the association among negative automatic thoughts, emotions, and sexual response during sexual activity in men and women. To test the psychometric characteristics of the Italian version of the SMQ focusing on the Automatic Thoughts subscale (SMQ-AT). After linguistic translation, the psychometric properties (internal consistency, construct, and discriminant validity) were evaluated. 1,051 participants (425 men and 626 women; 776 healthy and 275 in clinical groups reporting sexual problems) took part in the present study. Two confirmatory factor analyses were conducted to test the fit of the original factor structures of the SMQ versions. In addition, two principal component analyses were performed to highlight two new factorial structures that were further validated with confirmatory factor analyses. Cronbach α and composite reliability were used as internal consistency measures, and comparisons between clinical and control groups were run to test the discriminant validity for the male and female versions. The associations with emotions and sexual functioning measures are also reported. Principal component analyses identified 5 factors in the male version: erection concerns thoughts, lack of erotic thoughts, age- and body-related thoughts, negative thoughts toward sex, and worries about partner's evaluation and failure anticipation thoughts. In the female version 6 factors were found: sexual abuse thoughts, lack of erotic thoughts, low self-body image thoughts, failure and disengagement thoughts, sexual passivity and control, and partner's lack of affection. Confirmatory factor analysis supported the adequacy of the factor structure for men and women. Moreover, the SMQ showed a strong association with emotional response and sexual functioning, differentiating between clinical and control groups. 
This measure is useful to evaluate patients and design interventions focused on negative automatic thoughts during sexual activity and to develop multicultural research. This study reports on the translation and validation of the Italian version of a clinically useful and widely used measure (assessing automatic thoughts during sexual activity). Limits regarding sampling technique and use of the Automatic Thoughts subscale are discussed in the article. The present findings support the validity and the internal consistency of the Italian version of the SMQ-AT and allow the assessment of negative automatic thoughts during sexual activity for clinical and research purposes. Nimbi FM, Tripodi F, Simonelli C, Nobre P. Sexual Modes Questionnaire (SMQ): Translation and Psychometric Properties of the Italian Version of the Automatic Thought Scale. J Sex Med 2018;15:396-409. Copyright © 2018 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  11. Automation of preparation of nonmetallic samples for analysis by atomic absorption and inductively coupled plasma spectrometry

    NASA Technical Reports Server (NTRS)

    Wittmann, A.; Willay, G.

    1986-01-01

    For the rapid preparation of solutions intended for analysis by inductively coupled plasma emission spectrometry or atomic absorption spectrometry, an automatic device called Plasmasol was developed. This apparatus uses the nonwettability of glassy carbon to fuse the sample in an appropriate flux. The sample-flux mixture is placed in a composite crucible, then heated at high temperature, swirled until full dissolution is achieved, and then poured into a water-filled beaker. After acid addition, dissolution of the melt, and filling to the mark, the solution is ready for analysis. The analytical results obtained, either for oxide samples or for prereduced iron ores, show that the solutions prepared with this device are indistinguishable from those obtained by manual dissolutions done by acid digestion or by high-temperature fusion. Preparation reproducibility and analytical tests illustrate the performance of Plasmasol.

  12. A word processor optimized for preparing journal articles and student papers.

    PubMed

    Wolach, A H; McHale, M A

    2001-11-01

    A new Windows-based word processor for preparing journal articles and student papers is described. In addition to standard features found in word processors, the present word processor provides specific help in preparing manuscripts. Clicking on "Reference Help (APA Form)" in the "File" menu provides a detailed help system for entering the references in a journal article. Clicking on "Examples and Explanations of APA Form" provides a help system with examples of the various sections of a review article, journal article that has one experiment, or journal article that has two or more experiments. The word processor can automatically place the manuscript page header and page number at the top of each page using the form required by APA and Psychonomic Society journals. The "APA Form" submenu of the "Help" menu provides detailed information about how the word processor is optimized for preparing articles and papers.

  13. Imbalance in Multiple Sclerosis: A Result of Slowed Spinal Somatosensory Conduction

    PubMed Central

    Cameron, Michelle H.; Horak, Fay B.; Herndon, Robert R.; Bourdette, Dennis

    2009-01-01

    Balance problems and falls are common in people with multiple sclerosis (MS), but their cause and nature are not well understood. It is known that MS affects many areas of the central nervous system that can impact postural responses to maintain balance, including the cerebellum and the spinal cord. Cerebellar balance disorders are associated with normal latencies but reduced scaling of postural responses. We therefore examined the latency and scaling of automatic postural responses, and their relationship to somatosensory evoked potentials (SSEPs), in 10 people with MS and imbalance and 10 age- and sex-matched healthy controls. The latency and scaling of postural responses to backward surface translations of 5 different velocities and amplitudes, and the latency of spinal and supraspinal somatosensory conduction, were examined. Subjects with MS had large but markedly delayed automatic postural responses compared with controls (latencies 161 ± 31 vs 102 ± 21 ms, p < 0.01), and these postural response latencies correlated with the latencies of their spinal SSEPs (r = 0.73, p < 0.01). Subjects with MS also had normal or excessive scaling of postural response amplitude to perturbation velocity and amplitude. Longer-latency postural responses were associated with less velocity scaling and more amplitude scaling. Balance deficits in people with MS appear to be caused by slowed spinal somatosensory conduction and not by cerebellar involvement. People with MS appear to compensate for their slowed spinal somatosensory conduction by increasing the amplitude scaling and the magnitude of their postural responses. PMID:18570015

  14. Automatically Determining Scale Within Unstructured Point Clouds

    NASA Astrophysics Data System (ADS)

    Kadamen, Jayren; Sithole, George

    2016-06-01

    Three-dimensional models obtained from imagery have an arbitrary scale and therefore have to be scaled. Automatically scaling these models requires the detection of objects in these models, which can be computationally intensive. Real-time object detection may pose problems for applications such as indoor navigation. This investigation poses the idea that relational cues, specifically height ratios, within indoor environments may offer an easier means to obtain scales for models created using imagery. The investigation aimed to show two things: (a) that the size of objects, especially their height off the ground, is consistent within an environment, and (b) that, based on this consistency, objects can be identified and their general size used to scale a model. To test the idea, a hypothesis is first tested on a terrestrial lidar scan of an indoor environment. Later, as a proof of concept, the same test is applied to a model created using imagery. The most notable finding was that the detection of objects can be more readily done by studying the ratio between the dimensions of objects that have their dimensions defined by human physiology. For example, the dimensions of desks and chairs are related to the height of an average person. In the test, the differences between generalised and actual dimensions of objects were assessed. A maximum difference of 3.96% (2.93 cm) was observed from automated scaling. By analysing the ratio between the heights (distance from the floor) of the tops of objects in a room, identification was also achieved.
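
    The scaling idea reduces to a single ratio: once an object with a physiologically constrained height is recognised in the model, the metric scale factor is its known real-world height divided by its model-space height, and that factor is applied to every point. A minimal sketch (the desk height and model units below are illustrative values, not the paper's data):

```python
def scale_factor(model_height, known_height):
    """Scale factor mapping an arbitrary-scale model to metric units,
    given the model-space height of a recognised object (e.g. a desk)
    and its typical real-world height in metres."""
    return known_height / model_height

def apply_scale(points, s):
    """Scale model-space (x, y, z) points to metric coordinates."""
    return [(x * s, y * s, z * s) for (x, y, z) in points]
```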

  15. Automatic mobile device synchronization and remote control system for high-performance medical applications.

    PubMed

    Constantinescu, L; Kim, J; Chan, C; Feng, D

    2007-01-01

    The field of telemedicine is in need of generic solutions that harness the power of small, easily carried computing devices to increase efficiency and decrease the likelihood of medical errors. Our study resolved to build a framework to bridge the gap between handheld and desktop solutions by developing an automated network protocol that wirelessly propagates application data and images prepared by a powerful workstation to handheld clients for storage, display and collaborative manipulation. To this end, we present the Mobile Active Medical Protocol (MAMP), a framework capable of near-effortlessly linking medical workstation solutions to corresponding control interfaces on handheld devices for remote storage, control and display. The ease of use, encapsulation and applicability of this automated solution are designed to provide significant benefits to the rapid development of telemedical solutions. Our results demonstrate that the design of this system allows an acceptable data transfer rate, a usable frame rate for diagnostic solutions and enough flexibility to enable its use in a wide variety of cases. As an example application based on the MAMP, we also present a large-scale multi-modality image viewer.

  16. [An expedient semi-automatic procedure for the preparation of large quantities of bioindicators especially for use in gas sterilization processes].

    PubMed

    Spicher, G; Borchers, U

    1985-06-01

    Bioindicators serve to test the efficacy of disinfection and sterilization procedures. Such indicators mostly consist of a support (filter paper, as a rule) to which micro-organisms have been fixed by drying. The authors have used a thread as support and a special apparatus for semi-automatic preparation of the bioindicators. The components of the device are either commercially available or may be prepared from commercially available material without difficulty. The principle of the method is as follows: The thread serving as the support is drawn slowly, at constant speed, through the suspension of test organisms and dried in an air stream immediately afterwards. The apparatus consists of a cylindrical glass tube of a few centimeters in diameter, an electric motor slowly rotating the cylinder, a fan, a magnetic stirrer, and an ice-water bath. A small vial containing the germ suspension is immersed in the ice-water bath. The vial is sealed by a screw cap with two glass tubes of about 3 mm inner diameter passing through it. One of the glass tubes being bent in its upper part reaches far down into the vial to leave just enough play for free rotation of a magnetic stirring rod. This tube serves to introduce the thread into the germ suspension. The second straight tube does not reach as far down as the first one. Its lower opening should not be immersed in the germ suspension. This tube serves as a guide for the returning thread. Preparation begins by winding the thread to be soaked with the suspension around the cylinder.(ABSTRACT TRUNCATED AT 250 WORDS)

  17. Brief scale measuring patient preparedness for hospital discharge to home: Psychometric properties.

    PubMed

    Graumlich, James F; Novotny, Nancy L; Aldag, Jean C

    2008-01-01

    Adverse events occur when patients transition from the hospital to outpatient care. For quality improvement and research purposes, clinicians need appropriate, reliable, and valid survey instruments to measure and improve the discharge processes. The objective was to describe psychometric properties of the Brief PREPARED (B-PREPARED) instrument to measure preparedness for hospital discharge from the patient's perspective. The study was a prospective cohort of 460 patient or proxy telephone interviews following hospital discharge home. We administered the Satisfaction with Information about Medicines Scale and the PREPARED instrument 1 week after discharge. PREPARED measured patients' perceptions of quality and outcome of the discharge-planning processes. Four weeks after discharge, interviewers elicited emergency department visits. The main outcome was the B-PREPARED scale value: the sum of scores from 11 items. Internal consistency, construct, and predictive validity were assessed. The mean B-PREPARED scale value was 17.3 ± 4.2 (SD) with a range of 3 to 22. High scores reflected high preparedness. Principal component analysis identified 3 domains: self-care information, equipment/services, and confidence. The B-PREPARED had acceptable internal consistency (Cronbach's alpha 0.76) and construct validity. The B-PREPARED correlated with medication information satisfaction (P < 0.001). Higher median B-PREPARED scores appropriately discriminated patients with no worry about managing at home from worriers (P < 0.001) and predicted patients without emergency department visits after discharge from those who had visits (P = 0.011). The B-PREPARED scale measured patients' perceptions of their preparedness for hospital discharge home with acceptable internal consistency and construct and predictive validity. Brevity may potentiate use by patients and proxies. Clinicians and researchers may use B-PREPARED to evaluate discharge interventions. 
    © 2008 Society of Hospital Medicine.

  18. Automatic rock detection for in situ spectroscopy applications on Mars

    NASA Astrophysics Data System (ADS)

    Mahapatra, Pooja; Foing, Bernard H.

    A novel algorithm for rock detection has been developed for effectively utilising Mars rovers, and enabling autonomous selection of target rocks that require close-contact spectroscopic measurements. The algorithm demarcates small rocks in terrain images as seen by cameras on a Mars rover during traverse. This information may be used by the rover for selection of geologically relevant sample rocks, and (in conjunction with a rangefinder) to pick up target samples using a robotic arm for automatic in situ determination of rock composition and mineralogy using, for example, a Raman spectrometer. Determining rock samples within the region that are of specific interest without physically approaching them significantly reduces time, power and risk. Input images in colour are converted to greyscale for intensity analysis. Bilateral filtering is used for texture removal while preserving rock boundaries. Unsharp masking is used for contrast enhancement. Sharp contrasts in intensities are detected using Canny edge detection, with thresholds that are calculated from the image obtained after contrast-limited adaptive histogram equalisation of the unsharp masked image. Scale-space representations are then generated by convolving this image with a Gaussian kernel. A scale-invariant blob detector (Laplacian of the Gaussian, LoG) detects blobs independently of their sizes, and therefore requires a multi-scale approach with automatic scale selection. The scale-space blob detector consists of convolution of the Canny edge-detected image with a scale-normalised LoG at several scales, and finding the maxima of squared LoG response in scale-space. After the extraction of local intensity extrema, the intensity profiles along rays going out of the local extremum are investigated. An ellipse is fitted to the region determined by significant changes in the intensity profiles. 
The fitted ellipses are overlaid on the original Mars terrain image for a visual estimation of the rock detection accuracy, and the number of ellipses is counted. Since geometry and illumination have the least effect on small rocks, the proposed algorithm is effective in detecting small rocks (or bigger rocks at larger distances from the camera) that consist of a small fraction of image pixels. Acknowledgements: The first author would like to express her gratitude to the European Space Agency (ESA/ESTEC) and the International Lunar Exploration Working Group (ILEWG) for their support of this work.
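
    The automatic scale-selection principle behind the blob detector can be sketched in 1-D: smooth the signal with Gaussians of increasing sigma, take the scale-normalised second difference (a discrete Laplacian), and report the sigma giving the strongest response, which grows with blob size. This is an illustrative reduction of the 2-D scale-normalised LoG detector, not the authors' implementation:

```python
import math

def gaussian_kernel(sigma, radius=None):
    """Normalised 1-D Gaussian kernel."""
    radius = radius or max(1, int(3 * sigma))
    k = [math.exp(-x * x / (2.0 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """1-D convolution with replicated (clamped) borders."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, kv in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += signal[idx] * kv
        out.append(acc)
    return out

def log_response(signal, sigma):
    """Scale-normalised Laplacian (sigma^2 times the second difference)
    of the Gaussian-smoothed signal."""
    sm = convolve(signal, gaussian_kernel(sigma))
    return [sigma * sigma * (sm[i - 1] - 2.0 * sm[i] + sm[i + 1])
            for i in range(1, len(sm) - 1)]

def characteristic_scale(signal, sigmas):
    """Sigma with the strongest absolute scale-normalised response."""
    return max(sigmas,
               key=lambda s: max(abs(v) for v in log_response(signal, s)))
```

    For an ideal 1-D box of width w, the normalised response peaks near sigma = w/2, so wider blobs select larger scales.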

  19. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method

    PubMed Central

    Veta, Mitko; van Diest, Paul J.; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P. W.

    2016-01-01

    Background Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. Methods The development of automatic mitosis detection methods has received large interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper, we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an “external” dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. Results The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility. 
The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial agreement with human experts. PMID:27529701

  20. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method.

    PubMed

    Veta, Mitko; van Diest, Paul J; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P W

    2016-01-01

    Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an "external" dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint to the mitosis counting protocol can improve reproducibility. The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial agreement with human experts.
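    A sketch of what "object-level agreement" amounts to computationally (coordinates and the 7.5-unit matching radius are hypothetical; the paper's own matching criterion may differ): pair one observer's detections with another's by greedy nearest-neighbour matching within a distance threshold, then score precision, recall and F1.

```python
from math import hypot

def match_objects(a, b, max_dist=7.5):
    """Greedily pair detections from two observers that lie within
    max_dist of each other; returns the list of matched index pairs."""
    pairs = []
    used_b = set()
    for i, (xa, ya) in enumerate(a):
        # candidate partners in b: not yet matched, close enough
        cand = [(hypot(xa - xb, ya - yb), j)
                for j, (xb, yb) in enumerate(b) if j not in used_b]
        cand = [(d, j) for d, j in cand if d <= max_dist]
        if cand:
            d, j = min(cand)
            used_b.add(j)
            pairs.append((i, j))
    return pairs

def object_level_agreement(a, b, max_dist=7.5):
    """Precision/recall/F1 of observer a's objects against observer b's."""
    m = len(match_objects(a, b, max_dist))
    precision = m / len(a) if a else 1.0
    recall = m / len(b) if b else 1.0
    denom = precision + recall
    f1 = 2 * precision * recall / denom if denom else 0.0
    return precision, recall, f1
```

    Two observers can produce the same mitotic count (here, three objects each) yet only partially agree on which objects are mitoses, which is exactly the situation the study reports.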

  1. Automatic Testing and Assessment of Neuroanatomy Using a Digital Brain Atlas: Method and Development of Computer- and Mobile-Based Applications

    ERIC Educational Resources Information Center

    Nowinski, Wieslaw L.; Thirunavuukarasuu, Arumugam; Ananthasubramaniam, Anand; Chua, Beng Choon; Qian, Guoyu; Nowinska, Natalia G.; Marchenko, Yevgen; Volkau, Ihar

    2009-01-01

    Preparing tests and assessing students are time-consuming tasks for the instructor. We address these two tasks in neuroanatomy education by employing a digital media application with a three-dimensional (3D), interactive, fully segmented, and labeled brain atlas. The anatomical and vascular models in the atlas are linked to "Terminologia…

  2. A Verbal-Instruction System to Help Persons with Multiple Disabilities Perform Complex Food- and Drink-Preparation Tasks Independently

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Oliva, Doretta; Smaldone, Angela; La Martire, Maria L.; Alberti, Gloria; Scigliuzzo, Francesca

    2011-01-01

    In a recent single-case study, we showed that a new verbal-instruction system, ensuring the automatic presentation of step instructions, was beneficial for promoting the task performance of a woman with multiple disabilities (including blindness). The present study was aimed at replicating and extending the aforementioned investigation with three…

  3. Connecting Lines of Research on Task Model Variables, Automatic Item Generation, and Learning Progressions in Game-Based Assessment

    ERIC Educational Resources Information Center

    Graf, Edith Aurora

    2014-01-01

    In "How Task Features Impact Evidence from Assessments Embedded in Simulations and Games," Almond, Kim, Velasquez, and Shute have prepared a thought-provoking piece contrasting the roles of task model variables in a traditional assessment of mathematics word problems to their roles in "Newton's Playground," a game designed…

  4. The Effects of Preconscious Cues upon the Automatic Activation of Self-Esteem of Selected Middle School Students.

    ERIC Educational Resources Information Center

    Ledford, Bruce R.; Ledford, Suzanne Y.

    This study investigated whether grade six students' self-esteem could be affected by the presentation of a selected stimulus below the threshold of conscious awareness via the medium of a specially prepared paper. It also investigated whether any statistically significant differences existed between the effects on self-esteem of a selected…

  5. Tales of the Expected: The Influence of Students' Expectations on Question Validity and Implications for Writing Exam Questions

    ERIC Educational Resources Information Center

    Crisp, Victoria; Sweiry, Ezekiel; Ahmed, Ayesha; Pollitt, Alastair

    2008-01-01

    Background: Through classroom preparation and exposure to past papers, textbooks and practice tests students develop expectations about examinations: what will be asked, how it will be asked and how they will be judged. Expectations are also involved in the automatic process of understanding questions. Where a question and a student's expectations…

  6. SUMMARY OF ELECTRIC SERVICE COSTS FOR TOTALLY AIR CONDITIONED SCHOOLS PREPARED FOR HOUSTON INDEPENDENT SCHOOL DISTRICT, MAY 31, 1967.

    ERIC Educational Resources Information Center

    Whitesides, M.M.

    This report is a compilation of data on electric air conditioning costs, operations and maintenance. Air conditioning units are compared in terms of electric versus non-electric, automatic versus operated, air cooled versus water cooled, reciprocating versus centrifugal compressors, space and noise, reheat, maintenance and original cost. Data are…

  7. International Congress on Glass XII (in several languages)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doremus, R H; LaCourse, W C; Mackenzie, J D

    1980-01-01

    A total of 158 papers are included under nine headings: structure and glass formation; optical properties; electrical and magnetic properties; mechanical properties and relaxation; mass transport; chemical durability and surfaces; nucleation, crystallization, and glass ceramics; processing; and automatic controls. Separate abstracts were prepared for eight papers; four of the remaining papers had been processed previously for the data base. (DLC)

  8. All-Union Conference on Information Retrieval Systems and Automatic Processing of Scientific and Technical Information, 3rd, Moscow, 1967, Transactions. (Selected Articles).

    ERIC Educational Resources Information Center

    Air Force Systems Command, Wright-Patterson AFB, OH. Foreign Technology Div.

    The role and place of the machine in scientific and technical information is explored including: basic trends in the development of information retrieval systems; preparation of engineering and scientific cadres with respect to mechanization and automation of information works; the logic of descriptor retrieval systems; the 'SETKA-3' automated…

  9. Automatic reactor for solid-phase synthesis of molecularly imprinted polymeric nanoparticles (MIP NPs) in water.

    PubMed

    Poma, Alessandro; Guerreiro, Antonio; Caygill, Sarah; Moczko, Ewa; Piletsky, Sergey

    We report the development of an automated chemical reactor for solid-phase synthesis of MIP NPs in water. Operational parameters are under computer control, requiring minimal operator intervention. In this study, "ready for use" MIP NPs with sub-nanomolar affinity are prepared against pepsin A, trypsin and α-amylase in only 4 hours.

  10. Automatic reactor for solid-phase synthesis of molecularly imprinted polymeric nanoparticles (MIP NPs) in water

    PubMed Central

    Poma, Alessandro; Guerreiro, Antonio; Caygill, Sarah; Moczko, Ewa; Piletsky, Sergey

    2015-01-01

    We report the development of an automated chemical reactor for solid-phase synthesis of MIP NPs in water. Operational parameters are under computer control, requiring minimal operator intervention. In this study, “ready for use” MIP NPs with sub-nanomolar affinity are prepared against pepsin A, trypsin and α-amylase in only 4 hours. PMID:26722622

  11. Identifying images of handwritten digits using deep learning in H2O

    NASA Astrophysics Data System (ADS)

    Sadhasivam, Jayakumar; Charanya, R.; Kumar, S. Harish; Srinivasan, A.

    2017-11-01

    Automatic digit recognition is of popular interest today, and deep learning techniques make object recognition in image data possible. Recognizing digits has become a fundamental component of many real-world applications. Since digits are written in many different styles, machine learning methods are needed to recognize and classify them. This work is based on a supervised learning vector quantization neural network, a form of artificial neural network. Images of digits are recognized, trained and tested: after the network is created, the digits are trained using training dataset vectors, and testing is applied to digit images that are separated from each other by segmenting the image and resizing each digit image accordingly for better accuracy.
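    The method named in the abstract is learning vector quantization (LVQ). A minimal LVQ1 sketch in numpy, on toy two-dimensional data (real digit inputs would be flattened pixel vectors; all names and hyperparameters here are illustrative):

```python
import numpy as np

def train_lvq1(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """LVQ1 rule: move the nearest prototype toward a sample with the
    same label, and away from a sample with a different label."""
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            d = np.linalg.norm(P - x, axis=1)
            k = int(np.argmin(d))          # winning prototype
            step = lr * (x - P[k])
            P[k] += step if proto_labels[k] == label else -step
    return P

def predict_lvq(X, prototypes, proto_labels):
    """Classify each sample by the label of its nearest prototype."""
    d = np.linalg.norm(prototypes[None, :, :] - X[:, None, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]
```

    With one prototype per class seeded near each cluster, the prototypes drift onto the clusters and nearest-prototype classification recovers the labels.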

  12. KARMA: the observation preparation tool for KMOS

    NASA Astrophysics Data System (ADS)

    Wegner, Michael; Muschielok, Bernard

    2008-08-01

    KMOS is a multi-object integral field spectrometer working in the near infrared which is currently being built for the ESO VLT by a consortium of UK and German institutes. It is capable of selecting up to 24 target fields for integral field spectroscopy simultaneously by means of 24 robotic pick-off arms. For the preparation of observations with KMOS, a dedicated preparation tool, KARMA ("KMOS Arm Allocator"), will be provided, which optimizes the assignment of targets to these arms automatically, taking target priorities and several mechanical and optical constraints into account. For this purpose two efficient algorithms, each coping with the underlying optimization problem in a different way, were developed. We present the concept and architecture of KARMA in general and the optimization algorithms in detail.
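    KARMA's actual allocation algorithms are not spelled out in the abstract; the sketch below only illustrates the shape of the problem with a naive priority-first greedy strategy (all names, the data layout and the greedy rule are our assumptions, not KARMA's design): assign the highest-priority targets first, each to any still-free arm that can reach it.

```python
def assign_targets(targets, reachable):
    """Greedy sketch of target-to-arm allocation.
    targets:   {target_id: priority} (higher = more important)
    reachable: {target_id: [arm ids that can physically reach it]}
               -- a stand-in for KMOS's mechanical/optical constraints.
    Returns {target_id: arm_id} for the targets that got an arm."""
    assignment = {}
    free_arms = {a for arms in reachable.values() for a in arms}
    for tid, _prio in sorted(targets.items(), key=lambda t: -t[1]):
        for arm in sorted(reachable.get(tid, [])):
            if arm in free_arms:
                assignment[tid] = arm
                free_arms.discard(arm)
                break
    return assignment
```

    Greedy allocation is not optimal in general (a low-priority target can block a better global assignment), which is presumably why KARMA needs the dedicated optimization algorithms the paper describes.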

  13. Effects of specimen preparation on the electromagnetic property measurements of solid materials with an automatic network analyzer

    NASA Technical Reports Server (NTRS)

    Long, E. R., Jr.

    1986-01-01

    Effects of specimen preparation on measured values of an acrylic's electromagnetic properties at X-band microwave frequencies, TE sub 1,0 mode, utilizing an automatic network analyzer have been studied. For 1 percent or less error, a gap between the specimen edge and the 0.901-in. wall of the specimen holder was the most significant parameter; the gap had to be less than 0.002 in. The thickness variation and alignment errors in the direction parallel to the 0.901-in. wall were equally second most significant and had to be less than 1 degree. Errors in the measurement of the thickness were third most significant; they had to be less than 3 percent. The following parameters caused errors of 1 percent or less: ratios of specimen-holder thicknesses of more than 15 percent, gaps between the specimen edge and the 0.401-in. wall less than 0.045 in., position errors less than 15 percent, surface roughness, thickness variation in the direction parallel to the 0.401-in. wall less than 35 percent, and specimen alignment in the direction parallel to the 0.401-in. wall less than 5 degrees.

  14. Development of mass measurement equipment using an electronic mass-comparator for gravimetric preparation of reference gas mixtures

    NASA Astrophysics Data System (ADS)

    Matsumoto, Nobuhiro; Watanabe, Takuro; Maruyama, Masaaki; Horimoto, Yoshiyuki; Maeda, Tsuneaki; Kato, Kenji

    2004-06-01

    The gravimetric method is the most popular method for preparing reference gas mixtures with high accuracy. We have designed and manufactured novel mass measurement equipment for gravimetric preparation of reference gas mixtures. This equipment consists of an electronic mass-comparator with a maximum capacity of 15 kg and readability of 1 mg, and an automatic cylinder exchanger. The structure of this equipment is simpler, and its cost much lower, than the conventional mechanical knife-edge type large balance used for gravimetric preparation of primary gas mixtures in Japan. The cylinder exchanger can mount two cylinders alternately on the weighing pan of the comparator. In this study, the performance of the equipment has been evaluated. First, the linearity and repeatability of the mass measurement were evaluated using standard mass pieces. Then, binary gas mixtures of propane and nitrogen were prepared and compared with those prepared with the conventional knife-edge type balance. The comparison showed good consistency within the compatibility criterion described in ISO 6143:2001.
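    The gravimetric principle the equipment supports reduces to simple bookkeeping between weighed masses and mole fractions. A sketch for a binary propane-in-nitrogen mixture (molar masses are standard values; the function names are ours, not from the paper):

```python
# Molar masses in g/mol (standard values)
M_PROPANE = 44.0956
M_NITROGEN = 28.0134

def fill_masses(x_propane, n_total_mol):
    """Masses (g) of each component to weigh into the cylinder for a
    propane-in-nitrogen mixture with target mole fraction x_propane."""
    n_p = x_propane * n_total_mol
    n_n2 = n_total_mol - n_p
    return n_p * M_PROPANE, n_n2 * M_NITROGEN

def mole_fraction_from_masses(m_propane, m_n2):
    """Back out the gravimetric mole fraction from the weighed masses."""
    n_p = m_propane / M_PROPANE
    n_n2 = m_n2 / M_NITROGEN
    return n_p / (n_p + n_n2)
```

    The accuracy of the resulting mole fraction is limited only by the weighing uncertainty, which is why the comparator's 1 mg readability at 15 kg capacity matters.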

  15. Semi-automatic spray pyrolysis deposition of thin, transparent, titania films as blocking layers for dye-sensitized and perovskite solar cells

    PubMed Central

    Krýsová, Hana; Kavan, Ladislav

    2018-01-01

    For proper function of the negative electrode of dye-sensitized and perovskite solar cells, deposition of a nonporous blocking film on the surface of F-doped SnO2 (FTO) glass substrates is required. Such a blocking film minimises undesirable parasitic processes, for example the back reaction of photoinjected electrons with the oxidized form of the redox mediator or with the hole-transporting medium. In the present work, thin, transparent, blocking TiO2 films are prepared by semi-automatic spray pyrolysis of precursors consisting of titanium diisopropoxide bis(acetylacetonate) as the main component. The layer thickness of the sprayed films is varied through the number of spray cycles. The parameters investigated in this work were deposition temperature (150, 300 and 450 °C), number of spray cycles (20–200), precursor composition (with/without deliberately added acetylacetone), concentration (0.05 and 0.2 M) and subsequent post-calcination at 500 °C. The photo-electrochemical properties were evaluated in aqueous electrolyte solution under UV irradiation. The blocking properties were tested by cyclic voltammetry with a model redox probe undergoing a simple one-electron-transfer reaction. Semi-automatic spraying resulted in the formation of transparent, homogeneous TiO2 films, and the technique allows for easy upscaling to large electrode areas. A deposition temperature of 450 °C was necessary for the fabrication of highly photoactive TiO2 films. The blocking properties of the as-deposited TiO2 films (at 450 °C) were impaired by post-calcination at 500 °C, but this problem could be addressed by increasing the number of spray cycles. Modification of the precursor by adding acetylacetone resulted in TiO2 films exhibiting perfect blocking properties that were not influenced by post-calcination. These results will surely find use in the fabrication of large-scale dye-sensitized and perovskite solar cells. PMID:29719764

  16. Development and Testing of Geo-Processing Models for the Automatic Generation of Remediation Plan and Navigation Data to Use in Industrial Disaster Remediation

    NASA Astrophysics Data System (ADS)

    Lucas, G.; Lénárt, C.; Solymosi, J.

    2015-08-01

    This paper introduces research on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution-extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts written in Python. The first model draws a parcel clean-up plan: it tests four parcel orientations (0, 90, 45 and 135 degrees) and keeps the plan with the fewest clean-up parcels, considering it the optimal spatial configuration. The second model shifts the clean-up parcels of a work plan both vertically and horizontally on a grid pattern with a sampling distance of one fifth of a parcel width, and keeps the most favourable shifted version, again with the aim of reducing the final number of parcel features. The last model draws a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic optimized plan generation (parcels and navigation lines). Applying the first model, we demonstrated that depending on the size and geometry of the features of the contaminated-area layer, the number of clean-up parcels generated by the model varies by 4% to 38% from plan to plan. Such significant variation in the resulting feature numbers shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when (1) the individual features of the contaminated area have a pronounced orientation (the features are long), and (2) the size of the pollution-extent features approaches the size of the parcels (a scale effect). The second model yields only a 1% difference in feature number, so it is less interesting for planning optimization. The last model simply fulfils the task it was designed for by drawing navigation lines.
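    The first model's orientation test can be illustrated in miniature: rotate the contaminated sample points, bin them into an axis-aligned parcel grid, and keep the angle that leaves the fewest occupied parcels. This is a toy stand-in for the authors' shapefile-based Python scripts, not their code:

```python
import math

def parcel_count(points, width, height, angle_deg):
    """Number of width x height parcels (axis-aligned after rotating
    the pollution outline by angle_deg) needed to cover a set of
    contaminated sample points."""
    a = math.radians(angle_deg)
    cells = set()
    for x, y in points:
        xr = x * math.cos(a) - y * math.sin(a)
        yr = x * math.sin(a) + y * math.cos(a)
        cells.add((math.floor(xr / width), math.floor(yr / height)))
    return len(cells)

def best_orientation(points, width, height, angles=(0, 45, 90, 135)):
    """Keep the tested orientation producing the fewest parcels."""
    return min(angles, key=lambda a: parcel_count(points, width, height, a))
```

    For a diagonal line of sample points and 3x3 parcels, the 0-degree plan already covers the points with four parcels, while the other tested orientations need more, so the model keeps 0 degrees.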

  17. Automatic analysis of diabetic peripheral neuropathy using multi-scale quantitative morphology of nerve fibres in corneal confocal microscopy imaging.

    PubMed

    Dabbah, M A; Graham, J; Petropoulos, I N; Tavakoli, M; Malik, R A

    2011-10-01

    Diabetic peripheral neuropathy (DPN) is one of the most common long-term complications of diabetes. Corneal confocal microscopy (CCM) image analysis is a novel non-invasive technique which quantifies corneal nerve fibre damage and enables diagnosis of DPN. This paper presents an automatic analysis and classification system for detecting nerve fibres in CCM images based on a multi-scale adaptive dual-model detection algorithm. The algorithm exploits the curvilinear structure of the nerve fibres and adapts itself to the local image information. Detected nerve fibres are then quantified and used as feature vectors for classification using random forest (RF) and neural network (NNT) classifiers. We show, in a comparative study with other well-known curvilinear detectors, that the best performance is achieved by the multi-scale dual model in conjunction with the NNT classifier. An evaluation of clinical effectiveness shows that the performance of the automated system matches that of the ground truth defined by expert manual annotation. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Automatic Matching of Large Scale Images and Terrestrial LIDAR Based on App Synergy of Mobile Phone

    NASA Astrophysics Data System (ADS)

    Xia, G.; Hu, C.

    2018-04-01

    The digitalization of cultural heritage based on ground laser scanning technology has been widely applied, with high-precision scanning and high-resolution photography of cultural relics as the main methods of data acquisition. Reconstruction from a complete point cloud and high-resolution images requires matching the images to the point cloud, acquiring homonymous feature points, registering the data, etc. However, the one-to-one correspondence between an image and its corresponding point cloud currently depends on inefficient manual search. The effective classification and management of large numbers of images, and the matching of large-scale images to their corresponding point clouds, are therefore the focus of this research. In this paper, we propose automatic matching of large-scale images and terrestrial LiDAR based on the app synergy of a mobile phone. First, we develop an Android app that takes pictures and records the related classification information. Second, all images are automatically grouped using the recorded information. Third, a matching algorithm matches the global and local images. From the one-to-one correspondence between the global image and the point-cloud reflection-intensity image, the automatic matching of each image and its corresponding laser point cloud is realized. Finally, the mapping relationships between the global image, the local images within it, and the intensity image are established from homonymous feature points, so that the images, and the point clouds corresponding to the local images, can be visually managed and queried.

  19. Perceiving pain in others: validation of a dual processing model.

    PubMed

    McCrystal, Kalie N; Craig, Kenneth D; Versloot, Judith; Fashler, Samantha R; Jones, Daniel N

    2011-05-01

    Accurate perception of another person's painful distress would appear to be accomplished through sensitivity to both automatic (unintentional, reflexive) and controlled (intentional, purposive) behavioural expression. We examined whether observers would construe diverse behavioural cues as falling within these domains, consistent with cognitive neuroscience findings describing activation of both automatic and controlled neuroregulatory processes. Using online survey methodology, 308 research participants rated behavioural cues as "goal directed vs. non-goal directed," "conscious vs. unconscious," "uncontrolled vs. controlled," "fast vs. slow," "intentional (deliberate) vs. unintentional," "stimulus driven (obligatory) vs. self driven," and "requiring contemplation vs. not requiring contemplation." The behavioural cues were the 39 items provided by the PROMIS pain behaviour bank, constructed to be representative of the diverse possibilities for pain expression. Inter-item correlations among rating scales provided evidence of sufficient internal consistency justifying a single score on an automatic/controlled dimension (excluding the inconsistent fast vs. slow scale). An initial exploratory factor analysis on 151 participant data sets yielded factors consistent with "controlled" and "automatic" actions, as well as behaviours characterized as "ambiguous." A confirmatory factor analysis using the remaining 151 data sets replicated EFA findings, supporting theoretical predictions that observers would distinguish immediate, reflexive, and spontaneous reactions (primarily facial expression and paralinguistic features of speech) from purposeful and controlled expression (verbal behaviour, instrumental behaviour requiring ongoing, integrated responses). There are implicit dispositions to organize cues signaling pain in others into the well-defined categories predicted by dual process theory. Copyright © 2011 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  20. Shall Organized Medicine be Unified, or Separate?

    PubMed Central

    1971-01-01

    At present, physicians in California who choose to join organized medicine do so through their county medical societies, and membership in the California Medical Association and the American Medical Association is then automatic. At the March meeting of the CMA House of Delegates, question was raised whether membership in CMA, and the AMA, or both should remain automatic. The House requested an ad hoc committee to cause a “poll and its attendant statements to be developed by May 21 for copy distribution to component medical societies and printing in the CMA membership news media—with mailing of the official questionnaires to the society members on September 1, 1971.” Members will be asked to express their opinions by ballot in September. The Speaker of the House appointed an ad hoc committee of the House to conduct this informed opinion poll of the membership. The committee has met to set ground rules, prepare accurate pro and con statements and write the poll questions—in accord with the directions of the House action. The Informed Membership Opinion Poll Committee, with the advice of Decision Making Information, Inc., an independent consultant, prepared statements regarding unified and separate membership in CMA and AMA from comments which were solicited from every county medical society. A statement by legal counsel for the California Medical Association on the structural relationship of AMA, CMA and component societies, and the statements on unified or separate membership prepared by the committee appear on the following two pages. PMID:18730549

  1. An overview of the BIOASQ large-scale biomedical semantic indexing and question answering competition.

    PubMed

    Tsatsaronis, George; Balikas, Georgios; Malakasiotis, Prodromos; Partalas, Ioannis; Zschunke, Matthias; Alvers, Michael R; Weissenborn, Dirk; Krithara, Anastasia; Petridis, Sergios; Polychronopoulos, Dimitris; Almirantis, Yannis; Pavlopoulos, John; Baskiotis, Nicolas; Gallinari, Patrick; Artiéres, Thierry; Ngomo, Axel-Cyrille Ngonga; Heino, Norman; Gaussier, Eric; Barrio-Alvers, Liliana; Schroeder, Michael; Androutsopoulos, Ion; Paliouras, Georgios

    2015-04-30

    This article provides an overview of the first BIOASQ challenge, a competition on large-scale biomedical semantic indexing and question answering (QA), which took place between March and September 2013. BIOASQ assesses the ability of systems to semantically index very large numbers of biomedical scientific articles, and to return concise and user-understandable answers to given natural language questions by combining information from biomedical articles and ontologies. The 2013 BIOASQ competition comprised two tasks, Task 1a and Task 1b. In Task 1a participants were asked to automatically annotate new PUBMED documents with MESH headings. Twelve teams participated in Task 1a, with a total of 46 system runs submitted, and one of the teams performing consistently better than the MTI indexer used by NLM to suggest MESH headings to curators. Task 1b used benchmark datasets containing 29 development and 282 test English questions, along with gold standard (reference) answers prepared by a team of biomedical experts from around Europe, and participants had to automatically produce answers. Three teams participated in Task 1b, with 11 system runs. The BIOASQ infrastructure, including benchmark datasets, evaluation mechanisms, and the results of the participants and baseline methods, is publicly available. A publicly available evaluation infrastructure for biomedical semantic indexing and QA has been developed, which includes benchmark datasets, and can be used to evaluate systems that: assign MESH headings to published articles or to English questions; retrieve relevant RDF triples from ontologies, relevant articles and snippets from PUBMED Central; produce "exact" and paragraph-sized "ideal" answers (summaries). The results of the systems that participated in the 2013 BIOASQ competition are promising. In Task 1a one of the systems performed consistently better than NLM's MTI indexer. In Task 1b the systems received high scores in the manual evaluation of the "ideal" answers; hence, they produced high quality summaries as answers. Overall, BIOASQ helped obtain a unified view of how techniques from text classification, semantic indexing, document and passage retrieval, question answering, and text summarization can be combined to allow biomedical experts to obtain concise, user-understandable answers to questions reflecting their real information needs.

  2. Image smoothing and enhancement via min/max curvature flow

    NASA Astrophysics Data System (ADS)

    Malladi, Ravikanth; Sethian, James A.

    1996-03-01

    We present a class of PDE-based algorithms suitable for a wide range of image processing applications. The techniques are applicable both to salt-and-pepper gray-scale noise and to full-image continuous noise present in black and white images, gray-scale images, texture images and color images. At their core, the techniques rely on a level set formulation of evolving curves and surfaces and on viscosity solutions of the evolution equations. Essentially, the method consists of moving the iso-intensity contours in an image under curvature-dependent speed laws to achieve enhancement. Compared to existing techniques, our approach has several distinct advantages. First, it contains only one enhancement parameter, which in most cases is chosen automatically. Second, the scheme automatically stops smoothing at some optimal point; continued application of the scheme produces no further change. Third, the method is one of the fastest possible schemes based on a curvature-controlled approach.
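    A numpy sketch of the min/max idea (a simplified reading of the Malladi-Sethian scheme; the 3x3 local mean and the global-mean switch are our simplifications, not the paper's exact construction): compute the curvature of the iso-intensity contours and apply only a one-sided speed, min(kappa, 0) or max(kappa, 0), chosen from the local average intensity.

```python
import numpy as np

def curvature(I):
    """Curvature of the iso-intensity contours of image I,
    kappa = div(grad I / |grad I|), via central differences."""
    Iy, Ix = np.gradient(I)
    Iyy, _ = np.gradient(Iy)
    Ixy, Ixx = np.gradient(Ix)
    grad2 = Ix**2 + Iy**2
    num = Ixx * Iy**2 - 2 * Ix * Iy * Ixy + Iyy * Ix**2
    return num / np.maximum(grad2, 1e-12)**1.5

def minmax_flow_step(I, dt=0.1):
    """One explicit step of a min/max curvature flow: bright specks see
    speed min(kappa, 0) and erode, dark specks see max(kappa, 0) and
    fill in, while large coherent structures stall (the 'automatic
    stopping' property claimed in the abstract)."""
    kappa = curvature(I)
    Iy, Ix = np.gradient(I)
    # 3x3 local mean decides which one-sided speed applies
    pad = np.pad(I, 1, mode='edge')
    local = sum(pad[r:r + I.shape[0], c:c + I.shape[1]]
                for r in range(3) for c in range(3)) / 9.0
    F = np.where(local < I.mean(),
                 np.maximum(kappa, 0), np.minimum(kappa, 0))
    return I + dt * F * np.sqrt(Ix**2 + Iy**2)
```

    As a sanity check, for the radial paraboloid I = -(x^2 + y^2) the level sets are circles of radius r and the discrete curvature reproduces -1/r at interior grid points.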

  3. High-throughput automatic defect review for 300mm blank wafers with atomic force microscope

    NASA Astrophysics Data System (ADS)

    Zandiatashbar, Ardavan; Kim, Byong; Yoo, Young-kook; Lee, Keibock; Jo, Ahjin; Lee, Ju Suk; Cho, Sang-Joon; Park, Sang-il

    2015-03-01

    While feature sizes in the lithography process continuously become smaller, defect sizes on blank wafers become more comparable to device sizes. Defects with nm-scale characteristic size can be misclassified by automated optical inspection (AOI) and require post-processing for proper classification. The atomic force microscope (AFM) is known to provide, by mechanical probing, high lateral resolution and the highest vertical resolution among all techniques. However, its low throughput and limited tip life, in addition to the laborious effort of finding the defects, have been the major limitations of this technique. In this paper we introduce automatic defect review (ADR) AFM as a post-inspection metrology tool for defect study and classification on 300 mm blank wafers, overcoming the limitations stated above. The ADR AFM provides a high-throughput, high-resolution, and non-destructive means of obtaining 3D information for nm-scale defect review and classification.

  4. Automatic segmentation and classification of mycobacterium tuberculosis with conventional light microscopy

    NASA Astrophysics Data System (ADS)

    Xu, Chao; Zhou, Dongxiang; Zhai, Yongping; Liu, Yunhui

    2015-12-01

    This paper realizes automatic segmentation and classification of Mycobacterium tuberculosis in conventional light microscopy images. First, candidate bacillus objects are segmented by the marker-based watershed transform. The markers are obtained by adaptive threshold segmentation based on an adaptive-scale Gaussian filter, whose scale is determined according to the color model of the bacillus objects. The candidate objects are then extracted integrally after region merging and elimination of contamination. Second, the shape of each bacillus object is characterized by the Hu moments, compactness, eccentricity, and roughness, which are used to classify single, touching and non-bacillus objects. We evaluated logistic regression, random forest, and intersection-kernel support vector machine classifiers for classifying the bacillus objects. Experimental results demonstrate that the proposed method yields high robustness and accuracy. The logistic regression classifier performs best, with an accuracy of 91.68%.
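    The shape features named above can be computed from a binary object mask with elementary moment arithmetic. A from-scratch sketch (roughness is omitted, and the perimeter estimate is a crude 4-neighbourhood boundary count, not necessarily what the paper uses):

```python
import numpy as np

def shape_features(mask):
    """Compactness, eccentricity and the first Hu moment of a binary
    object mask (mask must contain at least one True pixel)."""
    ys, xs = np.nonzero(mask)
    area = len(xs)
    xc, yc = xs.mean(), ys.mean()
    # normalised central second-order moments (eta_pq = mu_pq / mu00^2)
    mu20 = ((xs - xc) ** 2).sum() / area ** 2
    mu02 = ((ys - yc) ** 2).sum() / area ** 2
    mu11 = ((xs - xc) * (ys - yc)).sum() / area ** 2
    hu1 = mu20 + mu02                      # first Hu invariant
    # eccentricity from the eigenvalues of the covariance matrix
    common = np.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    lam1 = (mu20 + mu02 + common) / 2
    lam2 = (mu20 + mu02 - common) / 2
    ecc = np.sqrt(1 - lam2 / lam1) if lam1 > 0 else 0.0
    # perimeter estimate: pixels with a missing 4-neighbour
    pad = np.pad(mask.astype(bool), 1)
    interior = (pad[:-2, 1:-1] & pad[2:, 1:-1] &
                pad[1:-1, :-2] & pad[1:-1, 2:])
    perimeter = (mask.astype(bool) & ~interior).sum()
    compactness = 4 * np.pi * area / perimeter ** 2
    return compactness, ecc, hu1
```

    A thin bar scores an eccentricity near 1 and low compactness, while a disk scores an eccentricity near 0 and higher compactness, which is the separation the classifiers exploit between slender bacilli and round contaminants.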

  5. Disentangling Complexity in Bayesian Automatic Adaptive Quadrature

    NASA Astrophysics Data System (ADS)

    Adam, Gheorghe; Adam, Sanda

    2018-02-01

    The paper describes a Bayesian automatic adaptive quadrature (BAAQ) solution for numerical integration that is simultaneously robust, reliable, and efficient. Three main factors contributing to these features are discussed in detail: (1) refinement of the m-panel automatic adaptive scheme through integration-domain-length-scale-adapted quadrature sums; (2) fast early assessment of problem complexity, which enables a non-transitive choice among three execution paths: (i) immediate termination (exceptional cases); (ii) a pessimistic path involving time- and resource-consuming Bayesian inference that radically reformulates the problem to be solved; (iii) an optimistic path requiring only subrange subdivision by bisection; and (3) use of the weaker of the two possible accuracy targets (the input accuracy specification and the intrinsic integrand properties), which yields the maximum possible solution accuracy in the minimum possible computing time.
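The bisection-based subdivision of path (iii) is the classic pattern of automatic adaptive quadrature. A minimal adaptive-Simpson sketch of that pattern (illustrative only, not the BAAQ algorithm itself) is:

```python
import math

def _simpson(f, a, b):
    # Simpson's rule estimate over [a, b].
    c = (a + b) / 2
    return (b - a) / 6 * (f(a) + 4 * f(c) + f(b))

def adaptive_quad(f, a, b, eps=1e-9):
    # Subdivide by bisection until the two half-interval estimates agree
    # with the whole-interval estimate to within the accuracy target.
    c = (a + b) / 2
    whole = _simpson(f, a, b)
    left, right = _simpson(f, a, c), _simpson(f, c, b)
    if abs(left + right - whole) < 15 * eps:
        # Richardson extrapolation term sharpens the accepted estimate.
        return left + right + (left + right - whole) / 15
    return adaptive_quad(f, a, c, eps / 2) + adaptive_quad(f, c, b, eps / 2)

print(adaptive_quad(math.sin, 0.0, math.pi))  # ≈ 2.0
```

The BAAQ scheme adds the complexity assessment on top of this skeleton, so that the expensive Bayesian path is entered only when simple bisection cannot succeed.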

  6. Application of automatic threshold in dynamic target recognition with low contrast

    NASA Astrophysics Data System (ADS)

    Miao, Hua; Guo, Xiaoming; Chen, Yu

    2014-11-01

    A hybrid photoelectric joint transform correlator can achieve automatic, high-precision real-time recognition by combining optical and electronic devices. When recognizing low-contrast targets with such a correlator, differences in attitude, brightness, and grayscale between target and template mean that only four to five frames of a dynamic target sequence can be recognized without any processing. A CCD camera capturing 25 frames per second is used to acquire the dynamic target images. Automatic thresholding offers fast processing, effective suppression of noise, enhancement of the diffraction energy of useful information, and good preservation of the outlines of target and template, so it plays an important role in target recognition by optical correlation. However, a threshold obtained automatically by a standard program does not achieve the best recognition results for dynamic targets, because outline information is partially destroyed; in most cases the optimal threshold is instead found by manual intervention. Tailored to the characteristics of dynamic targets, an improved automatic thresholding procedure was implemented by multiplying the OTSU threshold of target and template by a scale coefficient of the processed image and combining the result with mathematical morphology. The optimal threshold can then be obtained automatically for dynamic low-contrast target images, and the recognition rate improves because background noise is reduced and correlation information is increased. A series of dynamic tank images moving at about 70 km/h was adopted as the target set. Without any processing, the 1st frame of this series correlates only up to the 3rd frame; with OTSU thresholding, up to the 80th frame; and with the improved automatic threshold processing of the joint images, up to the 89th frame. Experimental results show that the improved automatic threshold processing has particular application value for recognizing dynamic targets with low contrast.
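The scale-coefficient idea above builds on the classic OTSU threshold. A self-contained sketch, assuming an 8-bit grayscale histogram (the coefficient value 1.2 is purely illustrative, not taken from the paper):

```python
def otsu_threshold(pixels):
    # Classic Otsu: choose the gray level that maximizes the
    # between-class variance of the foreground/background split.
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(256))
    w_b = sum_b = 0
    best_var, best_t = -1.0, 0
    for t in range(256):
        w_b += hist[t]
        w_f = total - w_b
        if w_b == 0:
            continue
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b, m_f = sum_b / w_b, (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy image: dark background (level 10) and bright target (level 200).
pixels = [10] * 60 + [200] * 40
t = otsu_threshold(pixels)        # 10 for this toy histogram
scaled_t = round(t * 1.2)         # scale coefficient 1.2 is illustrative
```

In the paper's scheme the scaled threshold is further combined with morphological operations to preserve target outlines.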

  7. The role of automatic defensive responses in the development of posttraumatic stress symptoms in police recruits: protocol of a prospective study.

    PubMed

    Koch, Saskia B J; Klumpers, Floris; Zhang, Wei; Hashemi, Mahur M; Kaldewaij, Reinoud; van Ast, Vanessa A; Smit, Annika S; Roelofs, Karin

    2017-01-01

    Background: Control over automatic tendencies is often compromised in challenging situations, when people fall back on automatic defensive reactions such as freeze-fight-flight responses. Stress-induced lack of control over automatic defensive responses constitutes a problem endemic to high-risk professions such as the police. Difficulties controlling automatic defensive responses may not only impair split-second decisions under threat but also increase the risk for, and persistence of, posttraumatic stress disorder (PTSD) symptoms. However, the significance of these automatic defensive responses in the development and maintenance of trauma-related symptoms remains unclear due to a shortage of large-scale prospective studies. Objective: The 'Police-in-Action' study investigates the role of automatic defensive responses in the development and maintenance of PTSD symptomatology after trauma exposure. Methods: In this prospective study, 340 police recruits from the Dutch Police Academy are tested before (wave 1; pre-exposure) and after (wave 2; post-exposure) their first emergency aid experiences as police officers; the two waves of data assessment are separated by approximately 15 months. To control for unspecific time effects, a well-matched control group of civilians (n = 85) is also tested twice, approximately 15 months apart, but without frequent exposure to potentially traumatic events. Main outcomes are associations between (changes in) behavioural, psychophysiological, endocrine, and neural markers of automatic defensive responses and the development of trauma-related symptoms after trauma exposure in police recruits. Discussion: This prospective study in a large group of primary responders enables us to distinguish predisposing from acquired neurobiological abnormalities in automatic defensive responses associated with the development of trauma-related symptoms. Identifying neurobiological correlates of (vulnerability to) trauma-related psychopathology may greatly improve screening for individuals at risk of developing PTSD symptomatology and offer valuable targets for (early preventive) interventions for PTSD.

  8. Automatic affective appraisal of sexual penetration stimuli in women with vaginismus or dyspareunia.

    PubMed

    Huijding, Jorg; Borg, Charmaine; Weijmar-Schultz, Willibrord; de Jong, Peter J

    2011-03-01

    Current psychological views hold that negative appraisals of sexual stimuli lie at the core of sexual dysfunctions. It is important to differentiate between deliberate appraisals and more automatic appraisals, as research has shown that the former are most relevant to controllable behaviors and the latter to reflexive behaviors. Accordingly, it can be hypothesized that in women with vaginismus, the persistent difficulty in allowing vaginal entry is due to global negative automatic affective appraisals that trigger reflexive pelvic floor muscle contraction at the prospect of penetration. The aims were to test whether sexual penetration pictures elicit global negative automatic affective appraisals in women with vaginismus or dyspareunia, and to examine whether deliberate and automatic appraisals differ between the two patient groups. Women with persistent vaginismus (N = 24), dyspareunia (N = 23), or no sexual complaints (N = 30) completed a pictorial Extrinsic Affective Simon Task (EAST) and then made a global affective assessment of the EAST stimuli using visual analogue scales (VAS). The EAST assessed global automatic affective appraisals of sexual penetration stimuli, while the VAS assessed global deliberate affective appraisals of these stimuli. Automatic affective appraisals of sexual penetration stimuli tended to be positive, independent of the presence of sexual complaints. Deliberate appraisals of the same stimuli were significantly more negative in the women with vaginismus than in the dyspareunia and control groups, while the latter two groups did not differ in their appraisals. Unexpectedly, deliberate appraisals seemed to be most important in vaginismus, whereas dyspareunia did not seem to implicate negative deliberate or automatic affective appraisals. These findings dispute the view that global automatic affect lies at the core of vaginismus and indicate that a useful element in therapeutic interventions may be the modification of deliberate global affective appraisals of sexual penetration (e.g., via counter-conditioning). © 2010 International Society for Sexual Medicine.

  9. Relation of the activities of the IPDF/INPE project (reforestation subproject) during the year 1979. [Mato Grosso do Sul, Brazil

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Filho, P. H.; Shimabukuro, Y. E.; Demedeiros, J. S.; Desantana, C. C.; Alves, E. C. M.

    1981-01-01

    The state of Mato Grosso do Sul was selected as the study area to define the recognizable classes of Eucalyptus spp. and Pinus spp. by visual and automatic analyses. For visual analysis, a preliminary interpretation key and a legend of 6 groups were derived. Based on these six groups, three final classes were defined for analysis: (1) area prepared for reforestation; (2) area reforested with Eucalyptus spp.; and (3) area reforested with Pinus spp. For automatic interpretation, the area along the highway from Ribas do Rio Pardo to Agua Clara was classified into the following classes: eucalyptus, bare soil, plowed soil, pine, and "cerrado". The results of the visual analysis show that 67% of the reforested farms have relative differences in area estimates below 5%; 22%, between 5% and 10%; and 11%, between 10% and 20%. The reforested eucalyptus area is 17 times greater than the reforested pine area. Automatic classification of eucalyptus ranged from 73.03% to 92.30% in the training areas.

  10. The automatic back-check mechanism of mask tooling database and automatic transmission of mask tooling data

    NASA Astrophysics Data System (ADS)

    Xu, Zhe; Peng, M. G.; Tu, Lin Hsin; Lee, Cedric; Lin, J. K.; Jan, Jian Feng; Yin, Alb; Wang, Pei

    2006-10-01

    Foundries now pay ever more attention to reducing CD width. Although lithography technologies have advanced drastically, mask data accuracy is a bigger challenge than before; moreover, mask (reticle) prices have risen sharply, so data accuracy requires special treatment. We have developed a system called eFDMS to guarantee mask data accuracy. eFDMS performs automatic back-checking of the mask tooling database and automatic transmission of mask tooling data. We integrate our eFDMS system with the standard mask tooling system K2 so that the upstream and downstream processes around the K2 mask tooling main body run smoothly and correctly, as anticipated. Competition in the IC marketplace is gradually shifting from high-tech processes toward lower prices, so controlling product cost plays an increasingly significant role in foundries. Before this competition intensifies further, the cost task should be prepared ahead of time.

  11. Classification of C2C12 cells at differentiation by convolutional neural network of deep learning using phase contrast images.

    PubMed

    Niioka, Hirohiko; Asatani, Satoshi; Yoshimura, Aina; Ohigashi, Hironori; Tagawa, Seiichi; Miyake, Jun

    2018-01-01

    In the field of regenerative medicine, tremendous numbers of cells are necessary for tissue/organ regeneration, and automatic cell-culturing systems have been developed. The next step is a non-invasive method to monitor the condition of cells automatically. As an image analysis method, the convolutional neural network (CNN), a deep learning method, is approaching human recognition levels. We constructed and applied a CNN algorithm for automatic recognition of cellular differentiation in the myogenic C2C12 cell line. Phase-contrast images of cultured C2C12 cells were prepared as the input dataset. During differentiation from myoblasts to myotubes, cellular morphology changes from a round shape to an elongated tubular shape due to cell fusion. The CNN abstracts features of cell shape and classifies the cells by the number of days of culture after differentiation is induced. Changes in cellular shape at Day 0, Day 3, and Day 6 of culture are classified with 91.3% accuracy. Image analysis with CNNs has the potential to help realize the regenerative medicine industry.

  12. Adaptive Spot Detection With Optimal Scale Selection in Fluorescence Microscopy Images.

    PubMed

    Basset, Antoine; Boulanger, Jérôme; Salamero, Jean; Bouthemy, Patrick; Kervrann, Charles

    2015-11-01

    Accurately detecting subcellular particles in fluorescence microscopy is of primary interest for further quantitative analysis such as counting, tracking, or classification. Our primary goal is to segment vesicles likely to share nearly the same size in fluorescence microscopy images. Our method, termed adaptive thresholding of Laplacian of Gaussian (LoG) images with autoselected scale (ATLAS), automatically selects the optimal scale corresponding to the most frequent spot size in the image. Four criteria are proposed and compared for determining the optimal scale in a scale-space framework. The segmentation stage then amounts to thresholding the LoG of the intensity image. In contrast to other methods, the threshold is locally adapted given a probability of false alarm (PFA) specified by the user for the whole set of images to be processed; the local threshold is automatically derived from the PFA value and local image statistics estimated in a window whose size is not a critical parameter. We also propose a new data set for benchmarking, consisting of six collections of one hundred images each, which exploits backgrounds extracted from real microscopy images. An extensive comparative evaluation on several data sets with ground truth demonstrates that ATLAS outperforms existing methods. ATLAS needs no fine parameter tuning and requires very low computation time. Convincing results are also reported on real total internal reflection fluorescence microscopy images.
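The idea of deriving a locally adapted threshold from a user-specified PFA can be sketched under a Gaussian background-noise assumption (our assumption for illustration; the paper's exact statistics may differ):

```python
from statistics import NormalDist, mean, stdev

def local_threshold(window, pfa):
    # Threshold = local mean + z * local std, where z is the Gaussian
    # quantile corresponding to the requested probability of false alarm.
    z = NormalDist().inv_cdf(1.0 - pfa)
    return mean(window) + z * stdev(window)

# Toy local window of filtered intensities.
window = [8.0, 9.0, 10.0, 11.0, 12.0]
print(local_threshold(window, 0.5))    # ≈ 10.0 (z = 0: half of noise exceeds)
print(local_threshold(window, 0.01))   # ≈ mean + 2.33 * std (rare false alarms)
```

A smaller PFA pushes the threshold further above the local background, so fewer noise pixels are declared spots.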

  13. Study of Automatic Image Rectification and Registration of Scanned Historical Aerial Photographs

    NASA Astrophysics Data System (ADS)

    Chen, H. R.; Tseng, Y. H.

    2016-06-01

    Historical aerial photographs directly provide good evidence of past times. The Research Center for Humanities and Social Sciences (RCHSS) of Academia Sinica, Taiwan, has collected and scanned numerous historical maps and aerial images of Taiwan and China. Some maps and images have been geo-referenced manually, but most historical aerial images remain unregistered because no GPS or IMU data were available in the past to assist orientation. In our research, we developed an automatic process for matching historical aerial images using SIFT (Scale Invariant Feature Transform), so that large quantities of images can be handled by computer vision. SIFT is one of the most popular methods for image feature extraction and matching; it extracts extreme values in scale space as invariant image features that are robust to changes in rotation, scale, noise, and illumination. We also use RANSAC (Random Sample Consensus) to remove outliers and obtain good conjugate points between photographs. Finally, we manually add control points for registration through least-squares adjustment based on the collinearity equations. In the future, feature points from more photographs can be used to build a control-image database; every new image would then be treated as a query image, and if its feature points match features in the database, the query image probably overlaps the control images. As the database grows, more and more query images can be matched and aligned automatically, and research on environmental change over multiple periods can draw on these geo-referenced temporal-spatial data.
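In practice SIFT matching and RANSAC are usually run through a computer vision library such as OpenCV; the consensus idea behind RANSAC can be shown with a deliberately simplified, deterministic sketch that estimates a pure translation between matched keypoints (our simplification, not the study's full projective model):

```python
def consensus_translation(matches, tol=1.0):
    # RANSAC-style consensus, simplified: every match proposes a
    # translation hypothesis; keep the hypothesis with the most inliers,
    # then refine it by averaging over the inlier set.
    best_inliers = []
    for p, q in matches:
        dx, dy = q[0] - p[0], q[1] - p[1]
        inliers = [(a, b) for a, b in matches
                   if abs(b[0] - a[0] - dx) <= tol
                   and abs(b[1] - a[1] - dy) <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    n = len(best_inliers)
    dx = sum(b[0] - a[0] for a, b in best_inliers) / n
    dy = sum(b[1] - a[1] for a, b in best_inliers) / n
    return (dx, dy), best_inliers

# Five correct matches shifted by (2, 3), plus two gross outliers.
matches = [((x, y), (x + 2, y + 3)) for x, y in
           [(0, 0), (5, 1), (3, 7), (8, 2), (1, 9)]]
matches += [((0, 0), (40, -5)), ((2, 2), (-30, 60))]
shift, inliers = consensus_translation(matches)
```

The outliers never gather a consensus, so the estimated shift comes only from the five true correspondences.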

  14. Red blood cell depletion from bone marrow and peripheral blood buffy coat: a comparison of two new and three established technologies.

    PubMed

    Sorg, Nadine; Poppe, Carolin; Bunos, Milica; Wingenfeld, Eva; Hümmer, Christiane; Krämer, Ariane; Stock, Belinda; Seifried, Erhard; Bonig, Halvard

    2015-06-01

    Red blood cell (RBC) depletion is a standard technique for preparation of ABO-incompatible bone marrow transplants (BMTs). Density centrifugation or apheresis are used successfully at clinical scale. The advent of a bone marrow (BM) processing module for the Spectra Optia (Terumo BCT) provided the impetus to formally compare our standard technology, the COBE2991 (Ficoll, manual, "C"), with the Spectra Optia BMP (apheresis, semiautomatic, "O"), the Sepax II NeatCell (Ficoll, automatic, "S"), the Miltenyi CliniMACS Prodigy density gradient separation system (Ficoll, automatic, "P"), and manual Ficoll ("M"). C and O handle larger product volumes than S, P, and M. Technologies were assessed for RBC depletion, target cell (mononuclear cells [MNCs] for buffy coats [BCs], CD34+ cells for BM) recovery, and cost/labor. BC pools were simultaneously purged with C, O, S, and P; five to 18 BM samples were sequentially processed with C, O, S, and M. Mean RBC removal with C was 97% (BCs) or 92% (BM). From both products, O removed 97%, and P, S, and M removed 99% of RBCs. MNC recovery from BC (98% C, 97% O, 65% P, 74% S) and CD34+ cell recovery from BM (92% C, 90% O, 67% S, 70% M) were best with C and O. Polymorphonuclear cells (PMNs) were depleted from BCs by P, S, and C, while O recovered 50% of PMNs. Time savings compared to C or M are considerable for all tested technologies. All methods are in principle suitable and can be selected based on sample volume, available technology, and desired product specifications beyond RBC depletion and MNC and/or CD34+ cell recovery. © 2015 AABB.

  15. [Evaluation of a context sensitive system for intra-operative usage of the electronic patient record].

    PubMed

    Dressler, C R; Fischer, M; Burgert, O; Strauß, G

    2012-06-01

    This article analyzes the use of an electronic patient record (EPR) that can be accessed intra-operatively by the surgeon. The focus lies on the automatic prioritization of documents to drastically reduce the surgeon's interaction with the EPR system. An EPR system was developed that displays documents according to the current procedure; it is controlled by a foot switch, and the documents are displayed on a large screen in the operating room. Use of the system by 2 surgeons was recorded in clinical routine across 55 surgical procedures. The EPR system was used on average 2 times per procedure for middle-ear surgery and 1.3 times per procedure for paranasal sinus surgery, and it was used pre-operatively in 58% of cases. In more than half of the procedures, the surgeons did not have to interact with the EPR system to view the desired document. The mere existence of digitized documents in a clinic does not automatically lead to improved workflows. The evaluated EPR system presented patient data in a simple and comfortable way, and the extensive pre-operative usage had not been expected. Because of the low barrier to viewing patient data, higher patient safety may be assumed; on the other hand, the surgeon could be encouraged to skip the important preparation before the procedure. Owing to the current low pervasiveness of medical communication standards, an integrated connection between clinic IT and an EPR system would today be possible only with great effort. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Acquisition Of Rainfall Dataset And The Application For The Automatic Harvester In The Chesapeake Bay Region

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Piasecki, M.

    2008-12-01

    The objective of this study is the preparation and indexing of rainfall data products for ingestion into the Chesapeake Bay Environmental Observatory (CBEO) node of the CUAHSI/WATERS network. Rainfall products (obtained and processed from the WSR-88D NEXRAD network) come from the NOAA/NWS Advanced Hydrologic Prediction Service, which combines the Multi-sensor Precipitation Estimate (MPE) data generated by the regional River Forecast Centers, and from Hydro-NEXRAD rainfall data generated as a service by the University of Iowa. The former is collected on a 4 km × 4 km grid (HRAP) with daily average temporal resolution, the latter on a 1-arcminute × 1-arcminute grid with hourly values. We have generated a cut-out for the Chesapeake Bay Basin that contains about 9,300 nodes (sites) for the MPE data and about 300,000 nodes (sites) for the Hydro-NEXRAD product. Automated harvesting services have been implemented for both data products. The MPE data is harvested from its download site using ArcGIS, which is used to extract the data for the Chesapeake Bay watershed before a scripting program loads the data into the ODM. The Hydro-NEXRAD data is downloaded from a web-based system at the University of Iowa that permits downloads for large-scale watersheds organized by Hydrologic Unit Codes (HUC); the resulting ASCII output is then automatically parsed and stored alongside the MPE data. Storing the two data products side by side allows a comparison addressing the accuracy of, and agreement between, the methods used to derive rainfall data, as both use raw reflectivity data from the WSR-88D system.

  17. Effects of land preparation and artificial vegetation on soil moisture variation in a loess hilly catchment of China

    NASA Astrophysics Data System (ADS)

    Feng, Tianjiao; Wei, Wei; Chen, Liding; Yu, Yang

    2017-04-01

    In dryland regions, soil moisture is the main factor determining vegetation growth and ecosystem restoration, and land preparation and vegetation restoration are the principal means of improving soil water content (SWC). It is therefore important to analyze the coupled effect of these two measures on soil moisture. In this study, soil moisture was monitored in a semi-arid loess hilly catchment of China during the growing seasons of 2014 and 2015. Four land preparation methods (level ditches, fish-scale pits, adverse-grade tablelands, and level benches) and four vegetation types (Prunus armeniaca, Platycladus orientalis, Pinus tabulaeformis, and Caragana microphylla) were included in the experimental design. Our results showed that: (1) soil moisture content differed across land preparation types, being highest for fish-scale pits and decreasing in the order of level ditches and adverse-grade tablelands; (2) the rainwater harvesting capacity of fish-scale pits is greater than that of adverse-grade tablelands, whereas the water-holding capacity is much higher in soils prepared as adverse-grade tablelands than in those prepared as fish-scale pits; (3) when the land preparation method is the same, vegetation plays the key role in soil moisture variation; for example, mean soil moisture under a Platycladus orientalis field is 26.72% higher than under a Pinus tabulaeformis field with the same land preparation; and (4) soil moisture in deeper soil layers is more affected by changes in vegetation cover, while soil moisture in shallower layers is more affected by variation in land preparation methods. Based on the vegetation types and land preparation methods with the best water-holding capacity, we suggest that vegetation such as Platycladus orientalis, together with land preparation methods such as level ditches and fish-scale pits, is most appropriate for landscape restoration in semi-arid loess hilly areas.

  18. Impact of model complexity and multi-scale data integration on the estimation of hydrogeological parameters in a dual-porosity aquifer

    NASA Astrophysics Data System (ADS)

    Tamayo-Mas, Elena; Bianchi, Marco; Mansour, Majdi

    2018-03-01

    This study investigates the impact of model complexity and multi-scale prior hydrogeological data on the interpretation of pumping test data in a dual-porosity aquifer (the Chalk aquifer in England, UK). To characterize the hydrogeological properties, different approaches are applied, ranging from a traditional analytical solution (the Theis approach) to more sophisticated numerical models with automatically calibrated input parameters. Comparison of the results shows that neither traditional analytical solutions nor a numerical model assuming a homogeneous and isotropic aquifer can adequately explain the observed drawdowns. A better reproduction of the observed drawdowns at all seven monitoring locations is instead achieved when medium- and local-scale prior information about the vertical hydraulic conductivity (K) distribution is used to constrain the model calibration process. In particular, the integration of medium-scale vertical K variations based on flowmeter measurements led to an improvement of about 30% in the goodness-of-fit of the simulated drawdowns. Further improvements (up to 70%) were observed when a simple upscaling approach was used to integrate small-scale K data into the automatic calibration of the numerical model. Although the analysis focuses on a specific case study, these results provide insight into the representativeness of hydrogeological property estimates based on different interpretations of pumping test data, and they promote the integration of multi-scale data for the characterization of heterogeneous aquifers in complex hydrogeological settings.
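A common form of "simple upscaling" for layered K data is thickness-weighted averaging: arithmetic for flow parallel to layering, harmonic for flow across it. This is a classical assumption for illustration, not necessarily the exact scheme used in the study:

```python
def arithmetic_mean_k(k, d):
    # Effective K for flow parallel to layering (thickness-weighted mean).
    return sum(ki * di for ki, di in zip(k, d)) / sum(d)

def harmonic_mean_k(k, d):
    # Effective K for flow perpendicular to layering; dominated by the
    # least-permeable layer.
    return sum(d) / sum(di / ki for ki, di in zip(k, d))

k = [1e-4, 1e-6, 1e-5]   # layer hydraulic conductivities (m/s), illustrative
d = [2.0, 1.0, 3.0]      # layer thicknesses (m), illustrative

print(arithmetic_mean_k(k, d))  # ≈ 3.85e-5 m/s
print(harmonic_mean_k(k, d))    # ≈ 4.5e-6 m/s, pulled down by the 1e-6 layer
```

The gap between the two means is one reason small-scale K data can change a calibrated model so strongly.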

  19. Secure web-based invocation of large-scale plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Dimitrov, D. A.; Busby, R.; Exby, J.; Bruhwiler, D. L.; Cary, J. R.

    2004-12-01

    We present our design and initial implementation of a web-based system for running, both in parallel and serial, Particle-In-Cell (PIC) codes for plasma simulations with automatic post processing and generation of visual diagnostics.

  20. Application of computer vision to automatic prescription verification in pharmaceutical mail order

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.

    2005-05-01

    In large-volume pharmaceutical mail order, licensed pharmacists ensure, before prescriptions are shipped, that the drug in the bottle matches the information on the patient's prescription. Typically, the pharmacist has about 2 s to complete the verification of one prescription. Performing about 1,800 prescription verifications per hour is tedious and can generate human error through visual and mental fatigue. Available automatic drug verification systems are limited to a single pill at a time, which is unsuitable for large-volume pharmaceutical mail order, where a prescription can contain as many as 60 pills and thousands of prescriptions are filled every day. To reduce human fatigue, cost, and error, the automatic prescription verification system (APVS) was invented to meet the needs of large-scale pharmaceutical mail order. This paper deals with the design and implementation of the first prototype online automatic prescription verification machine, which performs the same task currently done by a pharmacist; the emphasis here is on the visual aspects of the machine. The system has been successfully tested on 43,000 prescriptions.

  1. Automatic cloud tracking applied to GOES and Meteosat observations

    NASA Technical Reports Server (NTRS)

    Endlich, R. M.; Wolf, D. E.

    1981-01-01

    An improved automatic processing method for tracking cloud motions revealed by satellite imagery is presented, along with applications of the method to GOES observations of Hurricane Eloise and to Meteosat water vapor and infrared data. The method involves picture smoothing, target selection, and calculation of cloud motion vectors, either by matching a group at a given time with its best likeness at a later time or by a cross-correlation computation. Cloud motion computations can be made in up to four separate layers simultaneously. For data of 4 and 8 km resolution in the eye of Hurricane Eloise, the automatic system provides results comparable in accuracy and coverage to those obtained by NASA analysts using the Atmospheric and Oceanographic Information Processing System, with the pattern recognition and cross-correlation computations differing by only fractions of a pixel. For Meteosat water vapor data from the tropics and midlatitudes, the automatic motion computations are reliable only in areas where the water vapor fields contain small-scale structure, although excellent results are obtained using Meteosat IR data in the same regions. The automatic method thus appears competitive in accuracy and coverage with motion determination by human analysts.
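The cross-correlation step above can be sketched as normalized cross-correlation block matching: slide a template from one frame over the next frame and report the offset with the highest score. This is a minimal illustrative sketch, not the operational GOES/Meteosat implementation:

```python
import math

def _ncc(a, b):
    # Normalized cross-correlation of two equal-length sample lists.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db) if da > 0 and db > 0 else 0.0

def best_match(template, image):
    # Slide the template over the image; return the offset (dx, dy)
    # with the highest correlation score.
    th, tw = len(template), len(template[0])
    flat_t = [v for row in template for v in row]
    best_score, best_off = -2.0, (0, 0)
    for dy in range(len(image) - th + 1):
        for dx in range(len(image[0]) - tw + 1):
            patch = [image[dy + i][dx + j]
                     for i in range(th) for j in range(tw)]
            score = _ncc(flat_t, patch)
            if score > best_score:
                best_score, best_off = score, (dx, dy)
    return best_off, best_score

# The 2x2 cloud "target" reappears at offset (1, 1) in the later frame.
template = [[1, 2], [3, 4]]
frame = [[0, 0, 0, 0],
         [0, 1, 2, 0],
         [0, 3, 4, 0],
         [0, 0, 0, 0]]
offset, score = best_match(template, frame)
```

Dividing the displacement by the frame interval turns the matched offset into a cloud motion vector.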

  2. Automatic methods of the processing of data from track detectors on the basis of the PAVICOM facility

    NASA Astrophysics Data System (ADS)

    Aleksandrov, A. B.; Goncharova, L. A.; Davydov, D. A.; Publichenko, P. A.; Roganova, T. M.; Polukhina, N. G.; Feinberg, E. L.

    2007-02-01

    New automatic methods essentially simplify and accelerate the processing of data from track detectors. This makes it possible to process large data arrays and considerably improves their statistical significance, which in turn motivates new experiments planning to use large-volume targets, large-area emulsion, and solid-state track detectors [1]. In this regard, training qualified physicists capable of operating modern automatic equipment is very important. Annually, about ten Moscow students master the new methods while working at the PAVICOM facility of the Lebedev Physical Institute [2-4]; most students specializing in high-energy physics are otherwise given only an idea of archaic manual methods of processing track detector data. In 2005, on the basis of the PAVICOM facility and the physics training course of Moscow State University, a new training exercise was prepared, devoted to determining the energy of neutrons passing through a nuclear emulsion. It provides basic practical skills in processing track detector data with automatic equipment and can be included in the curriculum of any physics faculty. Those who have mastered automatic data processing in the simple and pictorial example of track detectors will be able to apply their knowledge in various fields of science and technology. The formulation of training exercises for undergraduate and graduate students is a new, additional application of the PAVICOM facility described earlier in [4].

  3. There are limits to the effects of task instructions: Making the automatic effects of task instructions context-specific takes practice.

    PubMed

    Braem, Senne; Liefooghe, Baptist; De Houwer, Jan; Brass, Marcel; Abrahamse, Elger L

    2017-03-01

    Unlike other animals, humans have the unique ability to share and use verbal instructions to prepare for upcoming tasks. Recent research showed that instructions are sufficient for the automatic, reflex-like activation of responses. However, systematic studies into the limits of these automatic effects of task instructions remain relatively scarce. In this study, the authors set out to investigate whether this instruction-based automatic activation of responses can be context-dependent. Specifically, participants performed a task whose stimulus-response rules and context (location on the screen) could either coincide or not with those of an instructed to-be-performed task (whose instructions changed every run). In 2 experiments, the authors showed that the instructed task rules had an automatic impact on performance: performance was slowed when the merely instructed task rules did not coincide, but, importantly, this effect was not context-dependent. Interestingly, a third and fourth experiment suggest that context dependency can be observed, but only after practicing the task in its appropriate context for over 60 trials, or after a sufficient amount of practice in a fixed context (the same context for all instructed tasks). Together, these findings suggest that instructions can establish stimulus-response representations that have a reflexive impact on behavior but are insensitive to the context in which the task is known to be valid; context-specific task representations instead seem to require practice. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Occupational Survey Report. AFSC 4A2X1 Biomedical Equipment

    DTIC Science & Technology

    2004-05-01

    Excerpt from the surveyed equipment task list: Electrocardiograms 70; Hospital Beds, Electric 67; Surgical Lamps 67; Hospital Beds, Manual 66; Audiometers 64; Dental Curing Units 63; Dental Handpieces 63; … Pumps 78; Pulse Oximeters 78; Dental Chairs 76; Blood Pressure Monitors, Automatic 74; Examination Lamps 72; Examination Tables 72; Blood Pressure Cuffs 71; … Exercise Bicycles 63; Dental Amalgamators 62; Scales or Balances, other than Pediatric 62; Scales or Balances, Pediatric 61; First-Enlistment Personnel …

  5. Reconstruction of 24 Penicillium genome-scale metabolic models shows diversity based on their secondary metabolism.

    PubMed

    Prigent, Sylvain; Nielsen, Jens Christian; Frisvad, Jens Christian; Nielsen, Jens

    2018-06-05

    Modelling of metabolism at the genome scale has proved to be an efficient method for explaining observed phenotypic traits in living organisms. Further, it can be used to predict the effect of genetic modifications, e.g. for the development of microbial cell factories. With the increasing amount of genome sequencing data available, a need exists to accurately and efficiently generate genome-scale metabolic models (GEMs) of non-model organisms, for which data are sparse. In this study, we present an automatic reconstruction approach applied to 24 Penicillium species, which have potential for the production of pharmaceutical secondary metabolites or are used in the manufacture of food products such as cheeses. The models were based on the MetaCyc database and a previously published Penicillium GEM, and gave rise to comprehensive genome-scale metabolic descriptions. The models showed that while central carbon metabolism is highly conserved, secondary metabolic pathways represent the main diversity among the species. The automatic reconstruction approach presented in this study can be applied to generate GEMs of other understudied organisms, and the developed GEMs are a useful resource for the study of Penicillium metabolism, for example with the scope of developing novel cell factories. This article is protected by copyright. All rights reserved.
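The finding that core metabolism is conserved while secondary pathways diverge can be made concrete with a toy comparison of reaction sets between two draft GEMs. This is an illustrative sketch only: the model contents and reaction identifiers below are made-up placeholders, not data from the study.

```python
# Sketch: quantifying metabolic diversity between draft GEMs by Jaccard
# similarity of their reaction sets. All reaction names are hypothetical.

def jaccard(a, b):
    """Jaccard similarity of two sets (1.0 = identical, 0.0 = disjoint)."""
    return len(a & b) / len(a | b)

# Hypothetical reaction sets: central carbon reactions shared, a
# secondary-metabolite pathway unique to each species.
central = {"HEX1", "PFK", "PYK", "CS", "ICDH"}
gem_a = central | {"penicillin_synthase"}
gem_b = central | {"roquefortine_synthase"}

similarity = jaccard(gem_a, gem_b)
print(f"reaction-set similarity: {similarity:.2f}")  # high overlap from core metabolism
```

Repeating such pairwise comparisons across all 24 models would yield the kind of diversity matrix on which the species comparison rests.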

  6. Definite Integral Automatic Analysis Mechanism Research and Development Using the "Find the Area by Integration" Unit as an Example

    ERIC Educational Resources Information Center

    Ting, Mu Yu

    2017-01-01

    Using the capabilities of expert knowledge structures, the researcher prepared test questions on the university calculus topic of "finding the area by integration." The quiz is divided into two types of multiple choice items (one out of four and one out of many). After the calculus course was taught and tested, the results revealed that…

  7. Holding Cargo in Place With Foam

    NASA Technical Reports Server (NTRS)

    Fisher, T. T.

    1985-01-01

    Foam fills entire container to protect cargo from shock and vibration. Originally developed for stowing space debris and spent satellites in Space Shuttle for return to Earth, encapsulation concept suitable for preparing shipments carried by truck, boat, or airplane. Equipment automatically injects polyurethane foam into its interior to hold cargo securely in place. Container of rectangular or other cross section built to match shape of vehicle used.

  8. Exploiting automatic on-line renewable molecularly imprinted solid-phase extraction in lab-on-valve format as front end to liquid chromatography: application to the determination of riboflavin in foodstuffs.

    PubMed

    Oliveira, Hugo M; Segundo, Marcela A; Lima, José L F C; Miró, Manuel; Cerdà, Victor

    2010-05-01

    In the present work, an on-line automatic renewable molecularly imprinted solid-phase extraction (MISPE) protocol for sample preparation prior to liquid chromatographic analysis is proposed for the first time. The automatic microscale procedure was based on the bead injection (BI) concept in the lab-on-valve (LOV) format, using a multisyringe burette as the propulsion unit for handling solutions and suspensions. High precision in handling the suspensions containing irregularly shaped molecularly imprinted polymer (MIP) particles was attained, enabling the use of a commercial MIP as renewable sorbent. The features of the proposed BI-LOV manifold also allowed strict control of the different steps of the extraction protocol, which is essential for promoting selective interactions in the cavities of the MIP. Using this on-line method, it was possible to extract and quantify riboflavin from different foodstuff samples in the range between 0.450 and 5.00 mg L(-1) after processing 1,000 microL of sample (infant milk, pig liver extract, and energy drink) without any prior treatment. For milk samples, LOD and LOQ values were 0.05 and 0.17 mg L(-1), respectively. The method was successfully applied to the analysis of two certified reference materials (NIST 1846 and BCR 487) with high precision (RSD < 5.5%). Considering the downscaling and simplification of the sample preparation protocol and the simultaneous performance of the extraction and chromatographic assays, a cost-effective, enhanced-throughput (six determinations per hour) methodology for the determination of riboflavin in foodstuff samples is deployed here.

  9. Automatic single questionnaire intensity (SQI, EMS98 scale) estimation using ranking models built on the existing BCSF database

    NASA Astrophysics Data System (ADS)

    Schlupp, A.; Sira, C.; Schmitt, K.; Schaming, M.

    2013-12-01

    In charge of intensity estimation in France, BCSF has collected and manually analyzed more than 47,000 online individual macroseismic questionnaires since 2000, up to intensity VI. These macroseismic data allow us to estimate one SQI value (Single Questionnaire Intensity) for each form following the EMS-98 scale. The reliability of automatic intensity estimation matters because such estimates are now used for automatic shakemap communication and crisis management. Today, the automatic intensity estimation at BCSF is based on the direct use of thumbnails selected from a menu by the witnesses. Each thumbnail corresponds to an EMS-98 intensity value, allowing us to quickly issue a map of communal intensity by averaging the SQIs for each city. Afterwards, an expert manually analyzes each form to determine a definitive SQI. This work is time consuming and no longer practical given the increasing number of testimonies at BCSF; it can, however, take incoherent answers into account. We tested several automatic methods (the USGS algorithm, a correlation coefficient, and thumbnails) (Sira et al. 2013, IASPEI) and compared them with 'expert' SQIs. These methods gave moderate scores (between 50 and 60% of SQIs correctly determined, and 35 to 40% within plus or minus one intensity degree). The best fit was observed with the thumbnails. Here, we present new approaches based on 3 statistical ranking methods: 1) a multinomial logistic regression model, 2) discriminant analysis (DISQUAL) and 3) support vector machines (SVMs). The first two methods are standard, while the third is more recent. These methods could be applied because the BCSF already has more than 47,000 forms in its database and because their questions and answers are well suited to statistical analysis. The ranking models could then be used as an automatic method constrained by expert analysis. 
The performance of the automatic methods and the reliability of the estimated SQI can be evaluated thanks to the fact that each definitive BCSF SQI is determined by expert analysis. We compare the SQIs obtained by these methods on our database and discuss the coherency and variations between the automatic and manual processes. These methods lead to high scores, with up to 85% of the forms correctly classified and most of the remaining forms classified with a shift of only one intensity degree. This allows us to use the ranking methods as the best automatic methods for fast SQI estimation and fast shakemap production. The next step in improving these methods will be to identify why some forms are not classified at the correct value, and a way to select the few remaining forms that should be analyzed by an expert. Note that beyond intensity VI, on-line questionnaires are insufficient and a field survey is indispensable for estimating intensity. For such surveys in France, BCSF leads a macroseismic intervention group (GIM).
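The core idea of the ranking models is to treat SQI estimation as supervised classification over encoded questionnaire answers. As a minimal stand-in for the methods named in the abstract (multinomial logistic regression, DISQUAL, SVM), the sketch below uses a trivial nearest-centroid classifier on synthetic answer vectors; the encoding and training data are invented for illustration.

```python
# Sketch: SQI estimation as classification of encoded questionnaire
# answers. Nearest-centroid stands in for the statistical ranking models;
# answer vectors and labels below are synthetic.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def predict(x, centroids):
    """Return the intensity class whose centroid is closest to x."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda c: dist2(x, centroids[c]))

# Expert-labelled training forms: answer vectors -> EMS-98 intensity.
training = {
    3: [[0, 1, 0], [1, 1, 0]],   # weakly felt
    5: [[2, 2, 1], [2, 3, 1]],   # strongly felt, minor effects
}
centroids = {sqi: centroid(forms) for sqi, forms in training.items()}

print(predict([2, 2, 0], centroids))  # closest to the intensity-5 centroid
```

A real deployment would replace the centroid rule with a calibrated model trained on the 47,000 expert-labelled forms, which is exactly what the database makes possible.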

  10. Ball-scale based hierarchical multi-object recognition in 3D medical images

    NASA Astrophysics Data System (ADS)

    Bağci, Ulas; Udupa, Jayaram K.; Chen, Xinjian

    2010-03-01

    This paper investigates, using prior shape models and the concept of ball scale (b-scale), ways of automatically recognizing objects in 3D images without performing elaborate searches or optimization. That is, the goal is to place the model in a single shot close to the right pose (position, orientation, and scale) in a given image so that the model boundaries fall in the close vicinity of object boundaries in the image. This is achieved via the following set of key ideas: (a) A semi-automatic way of constructing a multi-object shape model assembly. (b) A novel strategy of encoding, via b-scale, the pose relationship between objects in the training images and their intensity patterns captured in b-scale images. (c) A hierarchical mechanism of positioning the model, in a one-shot way, in a given image from a knowledge of the learnt pose relationship and the b-scale image of the given image to be segmented. The evaluation results on a set of 20 routine clinical abdominal female and male CT data sets indicate the following: (1) Incorporating a large number of objects improves the recognition accuracy dramatically. (2) The recognition algorithm can be thought of as a hierarchical framework in which quick placement of the model assembly constitutes coarse recognition and delineation itself is the finest recognition. (3) Scale yields useful information about the relationship between the model assembly and any given image, such that recognition results in a placement of the model close to the actual pose without any elaborate searches or optimization. (4) Effective object recognition can make delineation more accurate.

  11. Incorporation of composite defects from ultrasonic NDE into CAD and FE models

    NASA Astrophysics Data System (ADS)

    Bingol, Onur Rauf; Schiefelbein, Bryan; Grandin, Robert J.; Holland, Stephen D.; Krishnamurthy, Adarsh

    2017-02-01

    Fiber-reinforced composites are widely used in the aerospace industry due to their combined properties of high strength and low weight. However, owing to their complex structure, it is difficult to assess the impact of manufacturing defects and service damage on their residual life. While ultrasonic testing (UT) is the preferred NDE method for identifying the presence of defects in composites, there is no practical way to model the damage and evaluate the structural integrity of composites. We have developed an automated framework to incorporate flaws and known composite damage automatically into a finite element analysis (FEA) model of composites, ultimately aiding in assessing the residual life of composites and in making informed decisions regarding repairs. The framework can be used to generate a layer-by-layer 3D structural CAD model of the composite laminates replicating their manufacturing process. Outlines of structural defects, such as delaminations, are automatically detected from UT of the laminate and are incorporated into the CAD model between the appropriate layers. In addition, the framework allows for direct structural analysis of the resulting 3D CAD models with defects by automatically applying the appropriate boundary conditions. In this paper, we show a working proof-of-concept for the composite model builder with capabilities of incorporating delaminations between laminate layers and automatically preparing the CAD model for structural analysis using FEA software.

  12. Preparing Attitude Scale to Define Students' Attitudes about Environment, Recycling, Plastic and Plastic Waste

    ERIC Educational Resources Information Center

    Avan, Cagri; Aydinli, Bahattin; Bakar, Fatma; Alboga, Yunus

    2011-01-01

    The aim of this study is to introduce an attitude scale for defining students' attitudes about the environment, recycling, plastics, and plastic waste. In this study, 80 attitude sentences on a 5-point Likert-type scale were prepared and administered to 492 sixth-grade students in the city center of Kastamonu, Turkey. The scale consists of…

  13. Automatic map generalisation from research to production

    NASA Astrophysics Data System (ADS)

    Nyberg, Rose; Johansson, Mikael; Zhang, Yang

    2018-05-01

    The manual work of map generalisation is known to be a complex and time-consuming task. With the development of technology and societies, the demand for more flexible map products of higher quality is growing. The Swedish mapping, cadastral and land registration authority Lantmäteriet maintains manual production lines for databases at five different scales: 1 : 10 000 (SE10), 1 : 50 000 (SE50), 1 : 100 000 (SE100), 1 : 250 000 (SE250) and 1 : 1 million (SE1M). To streamline this work, Lantmäteriet started a project to automatically generalise geographic information. The planned timespan for the project is 2015-2022. Below, the project background and the methods for automatic generalisation are described. The paper concludes with a description of results and conclusions.
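One classic building block of automatic generalisation is line simplification, which reduces vertex counts when deriving a smaller-scale database from a larger-scale one. The sketch below implements the well-known Douglas-Peucker operator; it illustrates the kind of operation involved and is not Lantmäteriet's production code, whose pipeline the abstract does not detail.

```python
# Sketch: Douglas-Peucker line simplification, a classic generalisation
# operator. Tolerance and coordinates are illustrative.
import math

def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / math.hypot(dx, dy)

def douglas_peucker(points, tol):
    """Drop vertices that deviate less than tol from the simplified line."""
    if len(points) < 3:
        return points
    dists = [point_line_dist(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: i + 1], tol)
    right = douglas_peucker(points[i:], tol)
    return left[:-1] + right

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(line, 1.0))  # -> [(0, 0), (2, -0.1), (3, 5), (7, 9)]
```

In a production setting such operators are combined with selection, aggregation and displacement rules, each tuned per target scale (SE50, SE100, and so on).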

  14. Automatic process control in anaerobic digestion technology: A critical review.

    PubMed

    Nguyen, Duc; Gadhamshetty, Venkataramana; Nitayavardhana, Saoharit; Khanal, Samir Kumar

    2015-10-01

    Anaerobic digestion (AD) is a mature technology that relies upon the synergistic effort of a diverse group of microbial communities to metabolize diverse organic substrates. However, AD is highly sensitive to process disturbances, so it is advantageous to use online monitoring and process control techniques to operate the AD process efficiently. A range of electrochemical, chromatographic and spectroscopic devices can be deployed for on-line monitoring and control of the AD process. While the complexity of the control strategy ranges from simple feedback control to advanced control systems, there is some debate about the implementation of advanced instrumentation and advanced control strategies. Centralized AD plants could be the answer for the application of progressive automatic process control. This article provides a critical overview of the available automatic control technologies that can be implemented in AD processes at different scales. Copyright © 2015 Elsevier Ltd. All rights reserved.
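At the simplest end of the control spectrum the review covers sits feedback control on a stability indicator. The sketch below shows a proportional feedback rule that cuts the organic loading rate (OLR) when a measured VFA/alkalinity ratio signals digester stress; the setpoint, gain and units are illustrative assumptions, not values from the article.

```python
# Sketch: proportional feedback on the VFA/alkalinity stability
# indicator. All numbers are illustrative.

def adjust_loading(current_olr, vfa_alk_ratio,
                   setpoint=0.25, gain=2.0, min_olr=0.5):
    """Reduce the organic loading rate in proportion to indicator error."""
    error = vfa_alk_ratio - setpoint
    new_olr = current_olr - gain * error   # reduce feed when ratio is high
    return max(min_olr, new_olr)

print(adjust_loading(3.0, 0.25))  # on setpoint: loading unchanged -> 3.0
print(adjust_loading(3.0, 0.75))  # stressed digester: loading cut -> 2.0
```

Advanced strategies replace this single-variable rule with multivariable or model-predictive controllers fed by the on-line sensors the article surveys.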

  15. [Alexithymia and automatic activation of emotional-evaluative information].

    PubMed

    Suslow, T; Arolt, V; Junghanns, K

    1998-05-01

    The emotional valence of stimuli appears to be stored in the associative network and to be automatically activated on the mere observation of a stimulus. A principal characteristic of alexithymia is the difficulty of symbolizing emotions verbally. The present study examines the relationship between the dimensions of the alexithymia construct and emotional priming effects in a word-word paradigm. The 20-item Toronto Alexithymia Scale was administered to 32 subjects along with two word-reading tasks as measures of emotional and semantic priming effects. The subscale "difficulty describing feelings" correlated, as expected, negatively with the negative inhibition effect. The subscale "externally oriented thinking" tended to correlate negatively with the negative facilitation effect. Thus, these dimensions of alexithymia are inversely related to the degree of automatic emotional priming. In summary, there is evidence for an impaired structural integration of emotion and language in persons with difficulties in describing feelings. Poor "symbolization" of emotions in alexithymia is discussed from a cognitive perspective.

  16. Automatic detection of small surface targets with electro-optical sensors in a harbor environment

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; de Lange, Dirk-Jan J.; van den Broek, Sebastiaan P.; Kemp, Rob A. W.; Schwering, Piet B. W.

    2008-10-01

    In modern warfare scenarios naval ships must operate in coastal environments. These complex environments, in bays and narrow straits, with cluttered littoral backgrounds and many civilian ships, may contain asymmetric threats from fast targets such as rigid-hulled inflatable boats (RHIBs), cabin boats and jet-skis. Optical sensors, in combination with image enhancement and automatic detection, assist an operator in reducing the response time, which is crucial for the protection of naval and land-based supporting forces. In this paper, we present our work on automatic detection of small surface targets, which includes multi-scale horizon detection and robust estimation of the background intensity. To evaluate the performance of our detection technology, data were recorded with both infrared and visual-light cameras in a coastal zone and in a harbor environment. During these trials multiple small targets were used. Results of this evaluation are shown in this paper.
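The horizon-detection step can be illustrated at a single scale: find the image row where the mean intensity changes most sharply, i.e. the sky/sea edge. The paper's method is multi-scale and far more robust to clutter; this toy on a synthetic "image" only shows the underlying idea.

```python
# Sketch: single-scale horizon detection -- the row boundary with the
# largest jump in mean row intensity. Synthetic data, pure illustration.

def detect_horizon(image):
    """Return the index of the row where mean intensity jumps most."""
    means = [sum(row) / len(row) for row in image]
    diffs = [abs(means[i + 1] - means[i]) for i in range(len(means) - 1)]
    return max(range(len(diffs)), key=diffs.__getitem__) + 1

# Synthetic frame: bright sky (value 200) over dark sea (value 40).
sky = [[200] * 8 for _ in range(5)]
sea = [[40] * 8 for _ in range(5)]
print(detect_horizon(sky + sea))  # -> 5 (first sea row)
```

Restricting target search to the region below the detected horizon is what keeps false alarms from clouds and shore clutter manageable.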

  17. Terminology extraction from medical texts in Polish

    PubMed Central

    2014-01-01

    Background Hospital documents contain free text describing the most important facts relating to patients and their illnesses. These documents are written in a specific language containing medical terminology related to hospital treatment. Their automatic processing can help in verifying the consistency of hospital documentation and in obtaining statistical data. To perform this task we need information on the phrases we are looking for. At the moment, clinical Polish resources are sparse. The existing terminologies, such as the Polish Medical Subject Headings (MeSH), do not provide sufficient coverage for clinical tasks. It would therefore be helpful if it were possible to automatically prepare, on the basis of a data sample, an initial set of terms which, after manual verification, could be used for the purpose of information extraction. Results Using a combination of linguistic and statistical methods for processing over 1,200 children's hospital discharge records, we obtained a list of single- and multi-word terms used in hospital discharge documents written in Polish. The phrases are ordered according to their presumed importance in domain texts, measured by the frequency of use of a phrase and the variety of its contexts. The evaluation showed that the automatically identified phrases cover about 84% of terms in domain texts. At the top of the ranked list, only 4% of 400 terms were incorrect, while of the final 200, 20% of expressions were either not domain related or syntactically incorrect. We also observed that 70% of the obtained terms are not included in the Polish MeSH. Conclusions Automatic terminology extraction can give results of a quality high enough to be taken as a starting point for building domain-related terminological dictionaries or ontologies. This approach can be useful for preparing terminological resources for very specific subdomains for which no relevant terminologies already exist. 
The evaluation performed showed that none of the tested ranking procedures were able to filter out all improperly constructed noun phrases from the top of the list. Careful choice of noun phrases is crucial to the usefulness of the created terminological resource in applications such as lexicon construction or acquisition of semantic relations from texts. PMID:24976943
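The ranking criterion described above combines two signals: how often a phrase occurs and how varied its contexts are. The sketch below implements one plausible scoring rule of this kind on a tiny invented corpus; the exact formula used by the authors may differ.

```python
# Sketch: ranking candidate terms by frequency combined with context
# variety. Scoring formula and corpus are illustrative assumptions.
import math
from collections import defaultdict

def rank_terms(occurrences):
    """occurrences: list of (phrase, context) pairs extracted from text."""
    freq = defaultdict(int)
    contexts = defaultdict(set)
    for phrase, context in occurrences:
        freq[phrase] += 1
        contexts[phrase].add(context)
    score = {p: freq[p] * math.log1p(len(contexts[p])) for p in freq}
    return sorted(score, key=score.get, reverse=True)

occs = [
    ("discharge diagnosis", "after"), ("discharge diagnosis", "before"),
    ("discharge diagnosis", "with"),
    ("the patient", "of"), ("the patient", "of"),
]
print(rank_terms(occs))  # varied contexts outrank repeated-context phrases
```

The context-variety factor is what demotes frequent but generic chunks like "the patient", which recur in the same few constructions.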

  18. Terminology extraction from medical texts in Polish.

    PubMed

    Marciniak, Małgorzata; Mykowiecka, Agnieszka

    2014-01-01

    Hospital documents contain free text describing the most important facts relating to patients and their illnesses. These documents are written in a specific language containing medical terminology related to hospital treatment. Their automatic processing can help in verifying the consistency of hospital documentation and in obtaining statistical data. To perform this task we need information on the phrases we are looking for. At the moment, clinical Polish resources are sparse. The existing terminologies, such as the Polish Medical Subject Headings (MeSH), do not provide sufficient coverage for clinical tasks. It would therefore be helpful if it were possible to automatically prepare, on the basis of a data sample, an initial set of terms which, after manual verification, could be used for the purpose of information extraction. Using a combination of linguistic and statistical methods for processing over 1,200 children's hospital discharge records, we obtained a list of single- and multi-word terms used in hospital discharge documents written in Polish. The phrases are ordered according to their presumed importance in domain texts, measured by the frequency of use of a phrase and the variety of its contexts. The evaluation showed that the automatically identified phrases cover about 84% of terms in domain texts. At the top of the ranked list, only 4% of 400 terms were incorrect, while of the final 200, 20% of expressions were either not domain related or syntactically incorrect. We also observed that 70% of the obtained terms are not included in the Polish MeSH. Automatic terminology extraction can give results of a quality high enough to be taken as a starting point for building domain-related terminological dictionaries or ontologies. This approach can be useful for preparing terminological resources for very specific subdomains for which no relevant terminologies already exist. 
The evaluation performed showed that none of the tested ranking procedures were able to filter out all improperly constructed noun phrases from the top of the list. Careful choice of noun phrases is crucial to the usefulness of the created terminological resource in applications such as lexicon construction or acquisition of semantic relations from texts.

  19. Intelligent and automatic in vivo detection and quantification of transplanted cells in MRI.

    PubMed

    Afridi, Muhammad Jamal; Ross, Arun; Liu, Xiaoming; Bennewitz, Margaret F; Shuboni, Dorela D; Shapiro, Erik M

    2017-11-01

    Magnetic resonance imaging (MRI)-based cell tracking has emerged as a useful tool for identifying the location of transplanted cells, and even their migration. Magnetically labeled cells appear as dark contrast in T2*-weighted MRI, with sensitivity down to individual cells. One key hurdle to the widespread use of MRI-based cell tracking is the inability to determine the number of transplanted cells from this contrast feature. In the case of single-cell detection, manual enumeration of spots in three-dimensional (3D) MRI is possible in principle; however, it is a tedious and time-consuming task that is prone to subjectivity and inaccuracy at large scale. This research presents the first comprehensive study of how a computer-based, intelligent, automatic, and accurate cell quantification approach can be designed for spot detection in MRI scans. Magnetically labeled mesenchymal stem cells (MSCs) were transplanted into rats using an intracardiac injection, accomplishing single-cell seeding in the brain. T2*-weighted MRI of these rat brains was performed, in which labeled MSCs appeared as spots. Using machine learning and computer vision paradigms, approaches were designed to systematically explore the possibility of automatic detection of these spots in MRI. Experiments were validated against known in vitro scenarios. Using the proposed deep convolutional neural network (CNN) architecture, an in vivo accuracy of up to 97.3% and an in vitro accuracy of up to 99.8% were achieved for automated spot detection in MRI data. The proposed approach for automatic quantification of MRI-based cell tracking will facilitate the use of MRI in large-scale cell therapy studies. Magn Reson Med 78:1991-2002, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
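To make the spot-counting task concrete, the sketch below shows a naive baseline: threshold a 2D slice for dark pixels and count connected components. The study's actual detector is a deep CNN trained on labelled MRI patches, precisely because this kind of thresholding fails on noisy in vivo data; the image here is synthetic.

```python
# Sketch: thresholding baseline for counting dark spots in a 2D slice.
# Not the paper's CNN -- an illustration of the counting problem only.
from collections import deque

def count_dark_spots(img, threshold):
    """Count 4-connected components of pixels darker than threshold."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    spots = 0
    for r in range(h):
        for c in range(w):
            if img[r][c] < threshold and not seen[r][c]:
                spots += 1
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           img[ny][nx] < threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return spots

# Synthetic slice: bright tissue (100) with two dark labelled-cell spots (10).
img = [[100] * 6 for _ in range(6)]
img[1][1] = img[1][2] = 10
img[4][4] = 10
print(count_dark_spots(img, 50))  # -> 2
```

The CNN replaces the fixed threshold with a learned patch classifier, which is what lifts accuracy to the reported 97.3% in vivo.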

  20. NC-AFM observation of atomic scale structure of rutile-type TiO2(110) surface prepared by wet chemical process.

    PubMed

    Namai, Yoshimichi; Matsuoka, Osamu

    2006-04-06

    We succeeded in observing the atomic scale structure of a rutile-type TiO2(110) single-crystal surface prepared by the wet chemical method of chemical etching in an acid solution and surface annealing in air. Ultrahigh vacuum noncontact atomic force microscopy (UHV-NC-AFM) was used for observing the atomic scale structures of the surface. The UHV-NC-AFM measurements at 450 K, which is above a desorption temperature of molecularly adsorbed water on the TiO2(110) surface, enabled us to observe the atomic scale structure of the TiO2(110) surface prepared by the wet chemical method. In the UHV-NC-AFM measurements at room temperature (RT), however, the atomic scale structure of the TiO2(110) surface was not observed. The TiO2(110) surface may be covered with molecularly adsorbed water after the surface was prepared by the wet chemical method. The structure of the TiO2(110) surface that was prepared by the wet chemical method was consistent with the (1 x 1) bulk-terminated model of the TiO2(110) surface.

  1. Techniques for automatic large scale change analysis of temporal multispectral imagery

    NASA Astrophysics Data System (ADS)

    Mercovich, Ryan A.

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
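The most basic metric underlying any such system is a per-pixel measure of spectral change between two co-registered images, thresholded into a change mask; the dissertation's contribution is the layer of metrics and change-type analysis built on top of this. A minimal sketch on invented three-band data:

```python
# Sketch: per-pixel spectral change magnitude between two co-registered
# multispectral images, thresholded into a boolean change mask.
import math

def change_mask(img_t1, img_t2, threshold):
    """True where the Euclidean distance between pixel spectra exceeds threshold."""
    return [
        [math.dist(p1, p2) > threshold for p1, p2 in zip(row1, row2)]
        for row1, row2 in zip(img_t1, img_t2)
    ]

# Two 2x2 three-band images; only the top-left pixel changes materially.
t1 = [[(50, 60, 70), (10, 10, 10)], [(30, 30, 30), (90, 90, 90)]]
t2 = [[(120, 20, 70), (12, 9, 11)], [(30, 31, 30), (88, 90, 92)]]
print(change_mask(t1, t2, 20.0))  # -> [[True, False], [False, False]]
```

At high resolution such raw masks fire on vast numbers of small changes, which is exactly why the work groups and classifies changes rather than mapping magnitude alone.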

  2. Schizophrenia with prominent catatonic features ('catatonic schizophrenia') III. Latent class analysis of the catatonic syndrome.

    PubMed

    Ungvari, Gabor S; Goggins, William; Leung, Siu-Kau; Lee, Edwin; Gerevich, Jozsef

    2009-02-01

    No reports have yet been published on catatonia using latent class analysis (LCA). This study applied LCA to a large, diagnostically homogeneous sample of patients with chronic schizophrenia who also presented with catatonic symptoms. A random sample of 225 Chinese inpatients with DSM-IV schizophrenia was selected from the long-stay wards of a psychiatric hospital. Their psychopathology, extrapyramidal motor status and level of functioning were evaluated with standardized rating scales. Catatonia was rated using a modified version of the Bush-Francis Catatonia Rating Scale. LCA was then applied to the 178 patients who presented with at least one catatonic sign. In LCA a four-class solution was found to fit the statistical model best. Classes 1, 2, 3 and 4 constituted 18%, 39.4%, 20.1% and 22.5% of the whole catatonic sample, respectively. Class 1 included patients with symptoms of 'automatic' phenomena (automatic obedience, Mitgehen, waxy flexibility). Class 2 comprised patients with 'repetitive/echo' phenomena (perseveration, stereotypy, verbigeration, mannerisms and grimacing). Class 3 contained patients with symptoms of 'withdrawal' (immobility, mutism, posturing, staring and withdrawal). Class 4 consisted of 'agitated/resistive' patients, who displayed symptoms of excitement, impulsivity, negativism and combativeness. The symptom composition of these 4 classes was nearly identical to that of the four factors identified by factor analysis in the same cohort of subjects in an earlier study. In multivariate regression analysis, the 'withdrawn' class was associated with higher scores on the Scale for the Assessment of Negative Symptoms and with lower scores on the negative items and higher scores on the positive items of the Nurses' Observation Scale for Inpatient Evaluation (NOSIE). The 'automatic' class was associated with lower values on the Simpson-Angus Extrapyramidal Side Effects Scale, and the 'repetitive/echo' class with higher scores on the NOSIE positive items. 
These results provide preliminary support for the notion that chronic schizophrenia patients with catatonic features can be classified into 4 distinct syndromal groups on the basis of their motor symptoms. Identifying distinct catatonic syndromes would help to find their biological substrates and to develop specific therapeutic measures.

  3. Evaluation of manual and automatic manually triggered ventilation performance and ergonomics using a simulation model.

    PubMed

    Marjanovic, Nicolas; Le Floch, Soizig; Jaffrelot, Morgan; L'Her, Erwan

    2014-05-01

    In the absence of endotracheal intubation, the manual bag-valve-mask (BVM) is the most frequently used ventilation technique during resuscitation. The efficiency of other devices has been poorly studied. The bench-test study described here was designed to evaluate the effectiveness of an automatic, manually triggered system, and to compare it with manual BVM ventilation. A respiratory system bench model was assembled using a lung simulator connected to a manikin to simulate a patient with unprotected airways. Fifty health-care providers from different professional groups (emergency physicians, residents, advanced paramedics, nurses, and paramedics; n = 10 per group) evaluated manual BVM ventilation, and compared it with an automatic manually triggered device (EasyCPR). Three pathological situations were simulated (restrictive, obstructive, normal). Standard ventilation parameters were recorded; the ergonomics of the system were assessed by the health-care professionals using a standard numerical scale once the recordings were completed. The tidal volume fell within the standard range (400-600 mL) for 25.6% of breaths (0.6-45 breaths) using manual BVM ventilation, and for 28.6% of breaths (0.3-80 breaths) using the automatic manually triggered device (EasyCPR) (P < .0002). Peak inspiratory airway pressure was lower using the automatic manually triggered device (EasyCPR) (10.6 ± 5 vs 15.9 ± 10 cm H2O, P < .001). The ventilation rate fell consistently within the guidelines, in the case of the automatic manually triggered device (EasyCPR) only (10.3 ± 2 vs 17.6 ± 6, P < .001). Significant pulmonary overdistention was observed when using the manual BVM device during the normal and obstructive sequences. The nurses and paramedics considered the ergonomics of the automatic manually triggered device (EasyCPR) to be better than those of the manual device. 
The use of an automatic manually triggered device may improve ventilation efficiency and decrease the risk of pulmonary overdistention, while decreasing the ventilation rate.

  4. Automatic segmentation of psoriasis lesions

    NASA Astrophysics Data System (ADS)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can handle only single erythema or deal only with scaling segmentation, whereas in practice scaling and erythema are often mixed together. In order to segment the whole lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied during imaging, exploiting the skin's Tyndall effect to eliminate reflections, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to extract textural and color features; here a feature of image roughness is defined, so that scaling can easily be separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. In the data set offered by Union Hospital, more than 90% of images can be segmented accurately.
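    The abstract does not specify the roughness feature in detail; as a hedged illustration, even a simple local-contrast score separates a flaky (scaling) texture from smooth skin. The function name and parameters below are hypothetical stand-ins, not the paper's implementation:

    ```python
    import numpy as np

    def roughness(patch: np.ndarray) -> float:
        """Mean absolute deviation from the local mean: a simple
        image-roughness score (an illustrative stand-in for the
        paper's unspecified roughness feature)."""
        return float(np.mean(np.abs(patch - patch.mean())))

    rng = np.random.default_rng(0)
    smooth = 0.5 + rng.normal(0.0, 0.01, (16, 16))   # normal skin: low contrast
    scaly = 0.5 + rng.normal(0.0, 0.20, (16, 16))    # flaky scaling: high contrast
    assert roughness(scaly) > roughness(smooth)
    ```

    In a full pipeline such a score would be one entry in the per-window feature vector fed to the random forest, alongside the Lab color statistics.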

  5. First order augmentation to tensor voting for boundary inference and multiscale analysis in 3D.

    PubMed

    Tong, Wai-Shun; Tang, Chi-Keung; Mordohai, Philippos; Medioni, Gérard

    2004-05-01

    Most computer vision applications require the reliable detection of boundaries. In the presence of outliers, missing data, orientation discontinuities, and occlusion, this problem is particularly challenging. We propose to address it by complementing the tensor voting framework, which was limited to second order properties, with first order representation and voting. First order voting fields and a mechanism to vote for 3D surface and volume boundaries and curve endpoints in 3D are defined. Boundary inference is also useful for a second difficult problem in grouping, namely, automatic scale selection. We propose an algorithm that automatically infers the smallest scale that can preserve the finest details. Our algorithm then proceeds with progressively larger scales to ensure continuity where it has not been achieved. Therefore, the proposed approach does not oversmooth features or delay the handling of boundaries and discontinuities until model misfit occurs. The interaction of smooth features, boundaries, and outliers is accommodated by the unified representation, making possible the perceptual organization of data in curves, surfaces, volumes, and their boundaries simultaneously. We present results on a variety of data sets to show the efficacy of the improved formalism.

  6. A procedure for the reliability improvement of the oblique ionograms automatic scaling algorithm

    NASA Astrophysics Data System (ADS)

    Ippolito, Alessandro; Scotto, Carlo; Sabbagh, Dario; Sgrigna, Vittorio; Maher, Phillip

    2016-05-01

    A procedure based on the combined use of the Oblique Ionogram Automatic Scaling Algorithm (OIASA) and the Autoscala program is presented. Using Martyn's equivalent path theorem, 384 oblique soundings from a high-quality data set were converted into vertical ionograms and analyzed by the Autoscala program. The ionograms pertain to the radio link between Curtin W.A. (CUR) and Alice Springs N.T. (MTE), Australia, at geographical coordinates (17.60°S, 123.82°E) and (23.52°S, 133.68°E), respectively. The critical frequency foF2 values extracted from the converted vertical ionograms by Autoscala were then compared with the foF2 values derived from the maximum usable frequencies (MUFs) provided by OIASA. A quality factor Q for the MUF values autoscaled by OIASA was identified: Q represents the difference between the foF2 value scaled by Autoscala from the converted vertical ionogram and the foF2 value obtained by applying the secant law to the MUF provided by OIASA. Using the receiver operating characteristic curve, an appropriate threshold level Qt was chosen for Q to improve the performance of OIASA.
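    The secant law relating an oblique-path MUF to the equivalent vertical-incidence critical frequency, and the resulting quality factor Q, can be sketched as follows. The incidence angle and numeric values here are illustrative only; the paper derives the conversion from Martyn's equivalent path theorem for the actual link:

    ```python
    import math

    def fof2_from_muf(muf_mhz: float, incidence_deg: float) -> float:
        """Secant law: MUF = foF2 * sec(theta), hence foF2 = MUF * cos(theta)."""
        return muf_mhz * math.cos(math.radians(incidence_deg))

    def quality_factor(fof2_autoscala: float, muf_mhz: float, incidence_deg: float) -> float:
        """Q: absolute difference between Autoscala's foF2 (from the converted
        vertical ionogram) and the secant-law foF2 from OIASA's MUF."""
        return abs(fof2_autoscala - fof2_from_muf(muf_mhz, incidence_deg))

    # at 60 deg incidence, a 20 MHz MUF corresponds to foF2 = 10 MHz
    q = quality_factor(10.4, 20.0, 60.0)
    ```

    An autoscaled MUF would then be accepted or rejected by comparing q against the ROC-derived threshold Qt.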

  7. Automatic detection of ischemic stroke based on scaling exponent electroencephalogram using extreme learning machine

    NASA Astrophysics Data System (ADS)

    Adhi, H. A.; Wijaya, S. K.; Prawito; Badri, C.; Rezal, M.

    2017-03-01

    Stroke is a cerebrovascular disease caused by the obstruction of blood flow to the brain. It is the leading cause of death in Indonesia and the second leading cause worldwide, and it is also a major cause of disability. Ischemic stroke accounts for most stroke cases. Obstruction of blood flow can cause tissue damage, which results in electrical changes in the brain that can be observed through the electroencephalogram (EEG). In this study, we present the results of automatic discrimination of ischemic stroke patients from normal subjects based on the scaling exponent of the EEG, obtained through detrended fluctuation analysis (DFA), using an extreme learning machine (ELM) as the classifier. Signal processing was performed on 18 channels of EEG in the range of 0-30 Hz, and the scaling exponents of the subjects were used as the input for the ELM to classify ischemic stroke. Detection performance was assessed by accuracy, sensitivity and specificity. The proposed method classified ischemic stroke with 84% accuracy, 82% sensitivity and 87% specificity, using 120 hidden neurons and the sine activation function in the ELM.
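    A minimal DFA implementation, returning the scaling exponent used as the classifier input, can be sketched in a few lines. This is a generic sketch on synthetic signals; the study's EEG preprocessing (18 channels, 0-30 Hz band) is not reproduced, and the window sizes are illustrative:

    ```python
    import numpy as np

    def dfa_exponent(signal, scales=(4, 8, 16, 32, 64)):
        """Detrended fluctuation analysis: slope of log F(s) versus log s."""
        y = np.cumsum(signal - np.mean(signal))        # integrated profile
        flucts = []
        for s in scales:
            t = np.arange(s)
            sq = []
            for i in range(len(y) // s):
                seg = y[i * s:(i + 1) * s]
                coef = np.polyfit(t, seg, 1)           # linear detrend per window
                sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            flucts.append(np.sqrt(np.mean(sq)))
        return float(np.polyfit(np.log(scales), np.log(flucts), 1)[0])

    rng = np.random.default_rng(1)
    white = rng.normal(size=4096)              # uncorrelated noise: alpha near 0.5
    brown = np.cumsum(rng.normal(size=4096))   # integrated noise: alpha near 1.5
    assert dfa_exponent(white) < dfa_exponent(brown)
    ```

    In the study's setting, one exponent per EEG channel would form the 18-dimensional feature vector passed to the ELM.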

  8. Patent Administration by Office Computer - A Case at Mazda Motor Corporation

    NASA Astrophysics Data System (ADS)

    Kimura, Ikuo; Nakamura, Shinji

    The needs of patent administration have diversified, reflecting R&D activities under severe competition in technical development, and the volume of business, as seen in patent applications, has increased year after year. Under these circumstances it is necessary to mechanize business processes, assisting manual operation as much as possible, in order to strengthen patent administration. Introducing an office computer (CPU 512 KB, external memory 128 MB) dedicated to this purpose, the Patent Department of Mazda Motor Corporation has been constructing a patent-administration database centered on its own company's patent applications, and uses it for the automatic preparation of business forms, the preparation of various statistical materials, and real-time reference to application procedures.

  9. Novel Repair Concept for Composite Materials by Repetitive Geometrical Interlock Elements

    PubMed Central

    Hufenbach, Werner; Adam, Frank; Heber, Thomas; Weckend, Nico; Bach, Friedrich-Wilhelm; Hassel, Thomas; Zaremba, David

    2011-01-01

    Material adapted repair technologies for fiber-reinforced polymers with thermosetting matrix systems are currently characterized by requiring major efforts for repair preparation and accomplishment in all industrial areas of application. In order to allow for a uniform distribution of material and geometrical parameters over the repair zone, a novel composite interlock repair concept is introduced, which is based on a repair zone with undercuts prepared by water-jet technology. The presented numerical and experimental sensitivity analyses make a contribution to the systematic development of the interlock repair technology with respect to material and geometrical factors of influence. The results show the ability of the novel concept for a reproducible and automatable composite repair. PMID:28824134

  10. Analysis of Energy-Absorbing Foundations.

    DTIC Science & Technology

    1978-12-15

    side rails. At the top of the rebound, air brakes are automatically activated which press against the rails and stop the table, preventing a second...for the same application to automobile bumpers , was greater than that used in an alternate design in which the tube was crushed axially, so it appears...shock mounts prepared by Burns [48]. Typi- cal non-linear, elastic, load-deflection curves are given for helical springs, pneumatic cylinders, hydraulic

  11. Mobile-Dose: A Dose-Meter Designed for Use in Automatic Machineries for Dose Manipulation in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    de Asmundis, Riccardo; Boiano, Alfonso; Ramaglia, Antonio

    2008-06-01

    Mobile-Dose has been designed for a very innovative use: integration into a robotic machine for the automatic preparation of radioactive doses to be injected into patients in nuclear medicine departments, with real-time measurement of the activity under preparation. Mobile-Dose gives a continuous measurement of the dose during the filling of vials or syringes, triggering the end of the filling process at a predefined dose limit. Mobile-Dose units have been delivered worldwide, from Italian hospitals and clinics to European and Japanese ones. The design of the instrument and its integration into robotic machinery were commissioned by an Italian company specialized in radiation-protection tools for nuclear applications in the period 2001-2003. At the time of its design, no commercial instrument with a suitable interface to the external world appeared to exist; we designed it to satisfy all the strict requirements coming from the medical side (precision within 10%, repeatability, stability, time response) and from the industrial design principles that are mandatory to ensure good reliability in such a complicated environment. Thanks to its portability and compactness, and to the intelligent operator panel programmed for this purpose, the instrument is also suitable for standalone use.

  12. Virtual Surveyor based Object Extraction from Airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Habib, Md. Ahsan

    Topographic feature detection of land cover from LiDAR data is important in various fields: city planning, disaster response and prevention, soil conservation, infrastructure and forestry. In recent years, feature classification compliant with Object-Based Image Analysis (OBIA) methodology has been gaining traction in remote sensing and geographic information science (GIS). In OBIA, the LiDAR image is first divided into meaningful segments called object candidates. This yields, in addition to spectral values, a plethora of new information such as aggregated spectral pixel values, morphology, texture, context and topology. Traditional nonparametric segmentation methods rely on segmentations at different scales to produce a hierarchy of semantically significant objects; properly tuned scale parameters are therefore imperative in these methods for successful subsequent classification. Recently, some progress has been made in the development of methods for tuning the parameters for automatic segmentation. However, researchers have found it very difficult to automatically refine the tuning with respect to each object class present in the scene. Moreover, due to the relative complexity of real-world objects, intra-class heterogeneity is very high, which leads to over-segmentation, and the method consequently fails to deliver many of the new segment features correctly. In this dissertation, a new hierarchical 3D object segmentation algorithm called Automatic Virtual Surveyor based Object Extraction (AVSOE) is presented. AVSOE segments objects based on their distinct geometric concavity/convexity. This is achieved by strategically mapping the sloping surface that connects each object to its background. Further analysis produces a hierarchical decomposition of objects into sub-objects at a single scale level. Extensive qualitative and quantitative results are presented to demonstrate the efficacy of this hierarchical segmentation approach.

  13. [Optimization of isolation of the concentrate of stem cells from the umbilical blood].

    PubMed

    Tiumina, O V; Savchenko, V G; Gusarova, G I; Pavlov, V V; Zharkov, M N; Volchkov, S E; Rossiev, V A; Gridasov, G N

    2005-01-01

    Aims: to study correlations between the body mass and height of the newborn, Apgar scores, gestation time, the volume of the obtained umbilical blood (UB) and the number of nucleated cells (NC), and to compare manual and automatic modes of UB processing. 330 procurements of UB were made, and 230 (69.7%) samples were frozen. Two techniques of UB processing were compared: double centrifugation with hydroxyethyl starch (HES) in 73 cases, and the Sepax separator (Biosafe, Switzerland) in 47 cases. Blood cell counts before and after UB processing and the number of CD34+ cells were estimated. A correlation analysis was made of the dependence of the volume of 102 samples of UB on the weight (r = 0.268, p < 0.01) and height of the fetus (r = 0.203, p < 0.05), Apgar score (r = -0.092, p < 0.1) and gestation term (r = -0.003, p > 0.1), and of the dependence of the number of NC on the volume of UB (r = 0.102, p < 0.1), mass (r = 0.073, p > 0.1) and height of the fetus (r = 0.121, p > 0.1), gestation time (r = 0.159, p > 0.1) and Apgar score (r = -0.174, p > 0.1). With manual UB processing the NC yield was 71.9 ± 6.7%; with automatic processing it was 81 ± 8.0% (p < 0.05). The percentages of erythrocyte removal were 73 ± 5.7% and 80.5 ± 6.1% (p < 0.05), respectively. A weak correlation was found between UB volume and the mass and height of the fetus; the number of NC in UB depended on none of the parameters. Automatic processing of UB provides a greater yield of NC and better elimination of erythrocytes, with minimal risk of contamination.

  14. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for analyzing software, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions, these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  15. A scale self-adapting segmentation approach and knowledge transfer for automatically updating land use/cover change databases using high spatial resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Zhihua; Yang, Xiaomei; Lu, Chen; Yang, Fengshuo

    2018-07-01

    Automatic updating of land use/cover change (LUCC) databases using high spatial resolution images (HSRI) is important for environmental monitoring and policy making, especially for coastal areas, which connect land and sea and tend to change frequently. Many object-based change detection methods have been proposed, especially those combining historical LUCC data with HSRI. However, the scale parameter(s) used to segment the serial temporal images, which directly determine the average object size, are hard to choose without expert intervention, and the samples transferred from historical LUCC also need expert intervention to avoid insufficient or wrong samples. To address scale-parameter selection, a Scale Self-Adapting Segmentation (SSAS) approach, based on exponential sampling of the scale parameter and location of the local maximum of a weighted local variance, is proposed for segmenting images constrained by LUCC when detecting changes. To address sample transfer, Knowledge Transfer (KT), in which a classifier trained on historical images with LUCC is applied to the classification of updated images, is also proposed. Comparison experiments were conducted in a coastal area of Zhujiang, China, using SPOT 5 images acquired in 2005 and 2010. The results reveal that (1) SSAS can segment images effectively without expert intervention, and (2) KT can also reach the maximum accuracy of sample transfer without expert intervention. The SSAS + KT strategy is a good choice when the historical image matches the LUCC data and the historical and updated images are obtained from the same source.
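    The selection rule described in the abstract, exponentially sampling the scale parameter and taking the local maximum of a weighted local variance curve, can be sketched as follows. Computing the weighted local variance from imagery is application-specific, so the curve values below are purely illustrative:

    ```python
    import numpy as np

    def select_scale(scales, weighted_lv):
        """Return the scale at the first local maximum of the weighted local
        variance; fall back to the largest scale if the curve is monotonic.
        (Sketch of the selection rule only.)"""
        for i in range(1, len(scales) - 1):
            if weighted_lv[i] >= weighted_lv[i - 1] and weighted_lv[i] >= weighted_lv[i + 1]:
                return scales[i]
        return scales[-1]

    # exponential sampling of the scale parameter: 10, 20, 40, 80, 160, 320
    scales = (10 * 2.0 ** np.arange(6)).astype(int)
    lv = np.array([0.2, 0.5, 0.9, 0.7, 0.4, 0.3])  # illustrative LV curve
    best = select_scale(scales, lv)                # curve peaks at scale 40
    ```

    Exponential rather than linear sampling keeps the number of trial segmentations small while still covering several orders of magnitude of object size.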

  16. Syntactic Approach To Geometric Surface Shell Determination

    NASA Astrophysics Data System (ADS)

    DeGryse, Donald G.; Panton, Dale J.

    1980-12-01

    Autonomous terminal homing of a smart missile requires a stored reference scene of the target for which the missile is destined. The reference scene is produced from stereo source imagery by deriving a three-dimensional model containing cultural structures such as buildings, towers, bridges, and tanks. This model is obtained by the precise matching of cultural features from one image of the stereo pair to the other. In the past, this stereo matching process has relied heavily on local edge operators and a gray-scale matching metric. The processing is performed line by line over the imagery, and the amount of geometric control is minimal. As a result, the gross structure of the scene is determined, but the derived three-dimensional data is noisy, oscillatory, and at times significantly inaccurate. This paper discusses new concepts that are currently being developed to stabilize this geometric reference preparation process. The new concepts involve the use of a structural syntax as a geometric constraint on automatic stereo matching. The syntax arises from the stereo configuration of the imaging platforms at the time of exposure and from knowledge of how various cultural structures are constructed. The syntax is used to parse a scene in terms of its cultural surfaces and to dictate to the matching process the allowable relative positions and orientations of surface edges in the image planes. Using the syntax, extensive searches using a gray-scale matching metric are reduced.

  17. Small-scale anomaly detection in panoramic imaging using neural models of low-level vision

    NASA Astrophysics Data System (ADS)

    Casey, Matthew C.; Hickman, Duncan L.; Pavlou, Athanasios; Sadler, James R. E.

    2011-06-01

    Our understanding of sensory processing in animals has reached the stage where we can exploit neurobiological principles in commercial systems. In human vision, one brain structure that offers insight into how we might detect anomalies in real-time imaging is the superior colliculus (SC). The SC is a small structure that rapidly orients our eyes to a movement, sound or touch that it detects, even when the stimulus is on a small scale; think of a camouflaged movement or the rustle of leaves. This automatic orientation allows us to prioritize the use of our eyes to raise awareness of a potential threat, such as a predator approaching stealthily. In this paper we describe the application of a neural network model of the SC to the detection of anomalies in panoramic imaging. The neural approach consists of a mosaic of topographic maps that are each trained using competitive Hebbian learning to rapidly detect image features of a pre-defined shape and scale. What makes this approach interesting is the ability of the competition between neurons to automatically filter noise, while retaining the capability of generalizing the desired shape and scale. We present the results of this technique applied to the real-time detection of obscured targets in visible-band panoramic CCTV images. Using background subtraction to highlight potential movement, the technique is able to correctly identify targets as little as 3 pixels wide while filtering small-scale noise.

  18. Multi-scale curvature for automated identification of glaciated mountain landscapes

    NASA Astrophysics Data System (ADS)

    Prasicek, Günther; Otto, Jan-Christoph; Montgomery, David R.; Schrott, Lothar

    2014-03-01

    Erosion by glacial and fluvial processes shapes mountain landscapes in a long-recognized and characteristic way. Upland valleys incised by fluvial processes typically have a V-shaped cross-section with uniform and moderately steep slopes, whereas glacial valleys tend to have a U-shaped profile with a changing slope gradient. We present a novel regional approach to automatically differentiate between fluvial and glacial mountain landscapes based on the relation of multi-scale curvature and drainage area. Sample catchments are delineated and multiple moving window sizes are used to calculate per-cell curvature over a variety of scales ranging from the vicinity of the flow path at the valley bottom to catchment sections fully including valley sides. Single-scale curvature can take similar values for glaciated and non-glaciated catchments but a comparison of multi-scale curvature leads to different results according to the typical cross-sectional shapes. To adapt these differences for automated classification of mountain landscapes into areas with V- and U-shaped valleys, curvature values are correlated with drainage area and a new and simple morphometric parameter, the Difference of Minimum Curvature (DMC), is developed. At three study sites in the western United States the DMC thresholds determined from catchment analysis are used to automatically identify 5 × 5 km quadrats of glaciated and non-glaciated landscapes and the distinctions are validated by field-based geological and geomorphological maps. Our results demonstrate that DMC is a good predictor of glacial imprint, allowing automated delineation of glacially and fluvially incised mountain landscapes.
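    The core idea, that curvature compared across scales separates U- from V-shaped cross-sections even when single-scale curvature cannot, can be demonstrated on synthetic 1D valley profiles. This is a deliberately simplified stand-in for the paper's DMC, which is computed from minimum curvature over DEM catchments:

    ```python
    import numpy as np

    def curvature_at_scale(z, i, s):
        """Second-difference curvature of profile z at index i, window radius s."""
        return (z[i - s] - 2.0 * z[i] + z[i + s]) / s ** 2

    x = np.arange(-100, 101)
    v_valley = np.abs(x).astype(float)   # fluvial: V-shaped cross-section
    u_valley = (x / 10.0) ** 2           # glacial: U-shaped cross-section
    mid = 100                            # index of the valley bottom

    def dmc(z, s_small=5, s_large=50):
        """Difference of curvature between a small and a large scale:
        near zero for the parabolic (U) profile, positive for the V profile,
        whose curvature estimate collapses as 2/s with growing scale."""
        return curvature_at_scale(z, mid, s_small) - curvature_at_scale(z, mid, s_large)

    assert abs(dmc(u_valley)) < 1e-9   # scale-invariant curvature
    assert dmc(v_valley) > 0.1
    ```

    Thresholding such a difference per catchment is the kind of decision the paper's DMC classifier makes when labelling 5 × 5 km quadrats as glaciated or non-glaciated.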

  19. Global small-scale lunar cartography

    NASA Technical Reports Server (NTRS)

    Lipskiy, Y. N.; Pskovskiy, Y. P.; Rodionova, Z. F.; Shevchenko, V. V.; Chikmachev, V. I.; Volchkova, L. I.

    1972-01-01

    The primary sources of information for compiling this map were photographs of the visible hemisphere obtained by earth-based observatories, the Luna 3 and Zond 3 pictures, and a small number of Lunar Orbiter pictures. The primary content of the complete lunar map is the surface relief and its tonal characteristics. In preparing the map, particular attention was devoted to the variety of lunar relief forms. The color spectrum of the map was selected not only for the natural coloring of the lunar surface, but also with the objective of achieving maximum expressiveness. A lunar globe at a scale of 1:10 million was prepared along with the map. The scale of the globe, half that of the map, required some selection and generalization of the relief forms. The globe permits maintaining simultaneously the geometric similarity of contours, exact proportions of areas, and identical scales in all directions. The globe was prepared in both the Latin and Russian languages.

  20. An Investigation of Automatic Change Detection for Topographic Map Updating

    NASA Astrophysics Data System (ADS)

    Duncan, P.; Smit, J.

    2012-08-01

    Changes to the landscape are constantly occurring, and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification as well as spatial analysis, and is focussed on urban landscapes. The major data inputs to this study are high resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large-scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, however, broad-scale generalization of techniques has provided inconsistent results. A solution may lie in a hybrid approach combining pixel-based and object-oriented techniques.

  1. Validity Evidence of the Spanish Version of the Automatic Thoughts Questionnaire-8 in Colombia.

    PubMed

    Ruiz, Francisco J; Suárez-Falcón, Juan C; Riaño-Hernández, Diana

    2017-02-13

    The Automatic Thoughts Questionnaire (ATQ) is a widely used, 30-item, 5-point Likert-type scale that measures the frequency of negative automatic thoughts as experienced by individuals suffering from depression. However, there is some controversy about the factor structure of the ATQ, and its application can be too time-consuming for survey research. Accordingly, an abbreviated, 8-item version of the ATQ has been proposed. The aim of this study was to analyze the validity evidence of the Spanish version of the ATQ-8 in Colombia. The ATQ-8 was administered to a total of 1587 participants, including a sample of undergraduates, one of general population, and a clinical sample. The internal consistency across the different samples was good (α = .89). The one-factor model found in the original scale showed a good fit to the data (RMSEA = .083, 90% CI [.074, .092]; CFI = .96; NNFI = .95). The clinical sample's mean score on the ATQ-8 was significantly higher than the scores of the nonclinical samples. The ATQ-8 was sensitive to the effects of a 1-session acceptance and commitment therapy focused on disrupting negative repetitive thinking. ATQ-8 scores were significantly related to dysfunctional schemas, emotional symptoms, mindfulness, experiential avoidance, satisfaction with life, and dysfunctional attitudes. In conclusion, the Spanish version of the ATQ-8 showed good psychometric properties in Colombia.

  2. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed on the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm, invariant under translation, scale, and 3D rotations of the target, hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and is thus potentially implementable in real time.
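    The key property behind geometric hashing, that the coordinates of a feature point expressed in a basis built from two other feature points are invariant under translation, rotation and uniform scale, so their quantized values can index a hash table, can be checked directly. This 2D sketch uses a toy point layout; the SAR application hashes dominant-scatterer locations and, per the abstract, extends the invariance to 3D rotations:

    ```python
    import numpy as np

    def basis_coords(p, origin, axis_pt):
        """Coordinates of p in the frame defined by two feature points.
        Invariant to translation, rotation and uniform scaling, so the
        quantized pair can serve as a geometric-hash key."""
        ex = axis_pt - origin
        ey = np.array([-ex[1], ex[0]])   # perpendicular axis, same length
        d = p - origin
        return np.array([d @ ex, d @ ey]) / (ex @ ex)

    pts = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.5]])   # toy scatterer layout
    th = np.radians(40.0)                                   # similarity transform
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    moved = 3.0 * pts @ R.T + np.array([5.0, -2.0])

    key_model = basis_coords(pts[2], pts[0], pts[1])
    key_scene = basis_coords(moved[2], moved[0], moved[1])
    assert np.allclose(key_model, key_scene)   # same hash key after the transform
    ```

    Because each scatterer votes under every basis pair, matching still succeeds when part of the target is obscured: the surviving pairs continue to hit the stored keys.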

  3. Dietary Assessment on a Mobile Phone Using Image Processing and Pattern Recognition Techniques: Algorithm Design and System Prototyping.

    PubMed

    Probst, Yasmine; Nguyen, Duc Thanh; Tran, Minh Khoi; Li, Wanqing

    2015-07-27

    Dietary assessment, while traditionally based on pen-and-paper, is rapidly moving towards automatic approaches. This study describes an Australian automatic food record method and its prototype for dietary assessment via the use of a mobile phone and techniques of image processing and pattern recognition. Common visual features including scale invariant feature transformation (SIFT), local binary patterns (LBP), and colour are used for describing food images. The popular bag-of-words (BoW) model is employed for recognizing the images taken by a mobile phone for dietary assessment. Technical details are provided together with discussions on the issues and future work.

  4. Frequency control of wind turbine in power system

    NASA Astrophysics Data System (ADS)

    Xu, Huawei

    2018-06-01

    In order to improve the overall frequency stability of the power system, automatic generation control and secondary frequency adjustment were applied. Automatic generation control was introduced into power generation planning, and a doubly-fed wind generator power regulation model suitable for secondary frequency regulation was established. The results showed that this method satisfies the basic requirements of frequency regulation control for power systems with large-scale wind power access and improves the stability and reliability of power system operation. The frequency control method and strategy are relatively simple, the effect is significant, and the system frequency quickly reaches a steady state, so the approach is worth applying and promoting.

  5. Natural-Annotation-based Unsupervised Construction of Korean-Chinese Domain Dictionary

    NASA Astrophysics Data System (ADS)

    Liu, Wuying; Wang, Lin

    2018-03-01

    Large-scale bilingual parallel resources are significant for statistical learning and deep learning in natural language processing. This paper addresses the automatic construction of a Korean-Chinese domain dictionary and presents a novel unsupervised construction method based on natural annotations in the raw corpus. We first extract all Korean-Chinese word pairs from Korean texts according to natural annotations, then transform the traditional Chinese characters into simplified ones, and finally distill a bilingual domain dictionary by retrieving the simplified Chinese words in an external Chinese domain dictionary. The experimental results show that our method can automatically and efficiently build multiple Korean-Chinese domain dictionaries.

  6. Automatic measurement of images on astrometric plates

    NASA Astrophysics Data System (ADS)

    Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.

    1994-04-01

    We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: determination of the scale and tilt between the charge-coupled device (CCD) and microscope coordinate systems, and estimation of the signal-to-noise ratio in each field; image identification and improvement of its position and size; final image centering; and image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization and selection. Problems related to faint images and crowded fields will be approached by special techniques (morphological filters, histogram properties and fitting models).
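    The position-refinement step in such pipelines is commonly an intensity-weighted centroid over a small cutout around each detection. This is a minimal generic sketch, not necessarily the authors' exact centering method:

    ```python
    import numpy as np

    def centroid(img):
        """Intensity-weighted centroid (y, x) of a small image cutout:
        a standard way to refine a detected image position on a plate."""
        total = img.sum()
        ys, xs = np.indices(img.shape)
        return (ys * img).sum() / total, (xs * img).sum() / total

    img = np.zeros((9, 9))
    img[3:6, 4:7] = 1.0          # a small, flat stellar image
    cy, cx = centroid(img)       # lands at the patch centre, (4.0, 5.0)
    ```

    For faint or blended images the abstract's special techniques (morphological filters, model fitting) would replace this simple moment estimate.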

  7. Automatic laser welding and milling with in situ inline coherent imaging.

    PubMed

    Webster, P J L; Wright, L G; Ji, Y; Galbraith, C M; Kinross, A W; Van Vlack, C; Fraser, J M

    2014-11-01

    Although new affordable high-power laser technologies enable many processing applications in science and industry, depth control remains a serious technical challenge. In this Letter we show that inline coherent imaging (ICI), with line rates up to 312 kHz and microsecond-duration capture times, is capable of directly measuring laser penetration depth, in a process as violent as kW-class keyhole welding. We exploit ICI's high speed, high dynamic range, and robustness to interference from other optical sources to achieve automatic, adaptive control of laser welding, as well as ablation, achieving 3D micron-scale sculpting in vastly different heterogeneous biological materials.

  8. [An automatic system for anatomophysiological correlation in three planes simultaneously during functional neurosurgery].

    PubMed

    Teijeiro, E J; Macías, R J; Morales, J M; Guerra, E; López, G; Alvarez, L M; Fernández, F; Maragoto, C; Seijo, F; Alvarez, E

    The Neurosurgical Deep Recording System (NDRS), based on a personal computer, takes the place of complex electronic equipment for recording and processing deep cerebral electrical activity as a guide in stereotaxic functional neurosurgery. It also broadens the possibilities of presenting information in direct graphic form, with automatic management and sufficient flexibility to implement different analyses. This paper describes the possibilities of automatic simultaneous graphic representation in three almost orthogonal planes, available with the new 5.1 version of NDRS, to facilitate the analysis of anatomophysiological correlation in the localization of deep brain structures during minimal access surgery. This new version can automatically show the spatial behaviour of signals recorded along the path of the electrode inside the brain, superimposed simultaneously on sagittal, coronal and axial sections of an anatomical atlas of the brain, after automatically adjusting the scale to the dimensions of each individual patient's brain. This may also be shown in a three-dimensional representation of the intersecting planes themselves. The NDRS system has been successfully used in Spain and Cuba in over 300 functional neurosurgery operations. The new version further facilitates the analysis of spatial anatomophysiological correlation for the localization of brain structures. This system has contributed to increased precision and safety in selecting surgical targets in the control of Parkinson's disease and other movement disorders.

  9. ActiveSeismoPick3D - automatic first arrival determination for large active seismic arrays

    NASA Astrophysics Data System (ADS)

    Paffrath, Marcel; Küperkoch, Ludger; Wehling-Benatelli, Sebastian; Friederich, Wolfgang

    2016-04-01

    We developed a tool for automatic determination of first arrivals in active seismic data based on an approach that utilises higher order statistics (HOS) and the Akaike information criterion (AIC), commonly used in seismology but not in active seismics. Automatic picking is highly desirable in active seismics, as the amount of data provided by large seismic arrays rapidly exceeds what an analyst can evaluate in a reasonable amount of time. To bring the functionality of automatic phase picking into the context of active data, the software package ActiveSeismoPick3D was developed in Python. It uses a modified algorithm for the determination of first arrivals which searches for the HOS maximum in unfiltered data. Additionally, it offers tools for manual quality control and postprocessing, e.g. various visualisation and repicking functionalities. For flexibility, the tool also includes methods for the preparation of geometry information of large seismic arrays and improved interfaces to the Fast Marching Tomography Package (FMTOMO), which can be used for the prediction of travel times and inversion for subsurface properties. Output files are generated in the VTK format, allowing the 3D visualisation of e.g. the inversion results. As a test case, a data set consisting of 9216 traces from 64 shots was gathered, recorded at 144 receivers deployed in a regular 2D array of 100 x 100 m. ActiveSeismoPick3D automatically checks the determined first arrivals against a dynamic signal-to-noise ratio threshold. From the data, a 3D model of the subsurface was generated using the export functionality of the package and FMTOMO.
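    The package itself searches for the HOS maximum; as a hedged illustration of the complementary AIC criterion mentioned above, the sketch below implements a minimal variance-based AIC picker on a synthetic trace. The function name and all parameters are illustrative, not from ActiveSeismoPick3D.

```python
import math, random

def aic_pick(trace):
    """Minimal AIC onset picker (illustrative sketch).

    AIC(k) = k*ln(var(x[:k])) + (N-k-1)*ln(var(x[k:])); the first arrival
    is taken at the index minimizing AIC, i.e. the best split between a
    low-variance noise segment and a high-variance signal segment.
    """
    n = len(trace)
    def var(seg):
        m = sum(seg) / len(seg)
        # floor the variance to avoid log(0) on near-constant segments
        return max(sum((x - m) ** 2 for x in seg) / len(seg), 1e-12)
    best_k, best_aic = None, float('inf')
    for k in range(2, n - 2):
        aic = k * math.log(var(trace[:k])) + (n - k - 1) * math.log(var(trace[k:]))
        if aic < best_aic:
            best_k, best_aic = k, aic
    return best_k

# Synthetic trace: weak noise, then a stronger arrival starting at sample 200.
random.seed(0)
trace = [random.gauss(0, 0.05) for _ in range(200)] + \
        [random.gauss(0, 1.0) for _ in range(200)]
print(aic_pick(trace))  # close to the true onset at sample 200
```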

  10. Semi-automatic semantic annotation of PubMed Queries: a study on quality, efficiency, satisfaction

    PubMed Central

    Névéol, Aurélie; Islamaj-Doğan, Rezarta; Lu, Zhiyong

    2010-01-01

    Information processing algorithms require significant amounts of annotated data for training and testing. The availability of such data is often hindered by the complexity and high cost of production. In this paper, we investigate the benefits of a state-of-the-art tool to help with the semantic annotation of a large set of biomedical information queries. Seven annotators were recruited to annotate a set of 10,000 PubMed® queries with 16 biomedical and bibliographic categories. About half of the queries were annotated from scratch, while the other half were automatically pre-annotated and manually corrected. The impact of the automatic pre-annotations was assessed on several aspects of the task: time, number of actions, annotator satisfaction, inter-annotator agreement, and the quality and number of the resulting annotations. The analysis of annotation results showed that the number of required hand annotations is 28.9% lower when pre-annotated results from automatic tools are used. As a result, the overall annotation time was substantially lower when pre-annotations were used, while inter-annotator agreement was significantly higher. In addition, there was no statistically significant difference in the semantic distribution or number of annotations produced when pre-annotations were used. The annotated query corpus is freely available to the research community. This study shows that automatic pre-annotations are found helpful by most annotators. Our experience suggests using an automatic tool to assist large-scale manual annotation projects. This helps speed up annotation and improve consistency while maintaining the high quality of the final annotations. PMID:21094696

  11. Estimating Mutual Information for High-to-Low Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michaud, Isaac James; Williams, Brian J.; Weaver, Brian Phillip

    Presentation shows that KSG 2 is superior to KSG 1 because it scales locally automatically; KSG estimators are limited to a maximum MI due to sample size; LNC extends the capability of KSG without onerous assumptions; iLNC allows LNC to estimate information gain.

  12. Automatic gender detection of dream reports: A promising approach.

    PubMed

    Wong, Christina; Amini, Reza; De Koninck, Joseph

    2016-08-01

    A computer program was developed in an attempt to differentiate the dreams of males from females. Hypothesized gender predictors were based on previous literature concerning both dream content and written language features. Dream reports from home-collected dream diaries of 100 male (144 dreams) and 100 female (144 dreams) adolescent Anglophones were matched for equal length. They were first scored with the Hall and Van de Castle (HVDC) scales and quantified using DreamSAT. Two male and two female undergraduate students were asked to read all dreams and predict the dreamer's gender. They averaged a pairwise correct gender prediction of 75.8% (κ = 0.516), while the computer program's automatic analysis achieved an accuracy of 74.5% (κ = 0.492); both are higher than the 50% chance level (κ = 0.00). The prediction levels were maintained when dreams containing obvious gender identifiers were eliminated, and integration of the HVDC scales did not improve prediction. Copyright © 2016 Elsevier Inc. All rights reserved.
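    The reported κ values are consistent with the standard Cohen's kappa correction for chance agreement: with balanced male/female samples the chance accuracy is 0.5, so κ = (po − 0.5)/0.5. A minimal check:

```python
def cohens_kappa(p_observed, p_chance):
    """Cohen's kappa: agreement corrected for chance."""
    return (p_observed - p_chance) / (1.0 - p_chance)

# Balanced male/female dream samples -> chance accuracy 0.5.
print(round(cohens_kappa(0.758, 0.5), 3))  # 0.516, matching the human raters
print(round(cohens_kappa(0.745, 0.5), 3))  # ≈ the reported 0.492 (accuracy rounding)
```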

  13. Inference of segmented color and texture description by tensor voting.

    PubMed

    Jia, Jiaya; Tang, Chi-Keung

    2004-06-01

    A robust synthesis method is proposed to automatically infer missing color and texture information from a damaged 2D image by (N)D tensor voting (N > 3). The same approach is generalized to range and 3D data in the presence of occlusion, missing data and noise. Our method translates texture information into an adaptive (N)D tensor, followed by a voting process that infers noniteratively the optimal color values in the (N)D texture space. A two-step method is proposed. First, we perform segmentation based on insufficient geometry, color, and texture information in the input, and extrapolate partitioning boundaries by either 2D or 3D tensor voting to generate a complete segmentation for the input. Missing colors are synthesized using (N)D tensor voting in each segment. Different feature scales in the input are automatically adapted by our tensor scale analysis. Results on a variety of difficult inputs demonstrate the effectiveness of our tensor voting approach.

  14. TU-H-CAMPUS-JeP1-02: Fully Automatic Verification of Automatically Contoured Normal Tissues in the Head and Neck

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarroll, R; UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX; Beadle, B

    Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse's Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher performing algorithm was selected as the primary contouring method, the other used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted from 0.5 to 2 cm. Using a logit model the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients and the sensitivity and specificity of the models verified. Results: Per physician rating, the multi-atlas method (4.8/5 point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5 cm (brain), 0.75 cm (mandible, cord), 1 cm (brainstem, cochlea), or 1.25 cm (parotid), with sensitivity and specificity greater than 0.95. If sensitivity and specificity constraints are reduced to 0.9, detectable shifts of mandible and brainstem were reduced by 0.25 cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified.
This fully automated process could be used to flag auto-contours for special review or used with safety margins in a fully automatic treatment planning system.
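As an illustration of how a structure-specific minimum detectable shift can be read off a fitted logit model, the sketch below inverts a logistic detection curve on the 0.25 cm grid the study used. The coefficients and function names are hypothetical, not values from the abstract.

```python
import math

def detect_prob(shift_cm, b0, b1):
    """P(contour flagged) under a logistic model with hypothetical coefficients."""
    z = b0 + b1 * shift_cm
    return 1.0 / (1.0 + math.exp(-z))

def min_detectable_shift(b0, b1, target=0.95, step=0.25, max_shift=2.0):
    """Smallest shift on the 0.25 cm grid with detection probability >= target."""
    s = step
    while s <= max_shift + 1e-9:
        if detect_prob(s, b0, b1) >= target:
            return s
        s += step
    return None

# Hypothetical fitted coefficients for one structure:
print(min_detectable_shift(b0=-4.0, b1=7.0))  # 1.0 (cm) for these coefficients
```

Relaxing `target` from 0.95 to 0.9 moves the threshold down the grid, mirroring the 0.25 cm reductions reported for mandible and brainstem.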

  15. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  16. Fabrication and characterization of millimeter-scale translucent La2O3-doped Al2O3 ceramic hollow spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Haoting; Liao, Qilong, E-mail: liaoqilong@swust.edu.cn; Dai, Yunya

    2016-04-15

    Highlights: • Millimeter-scale translucent La2O3-doped Al2O3 hollow spheres have been prepared. • The diameters of the prepared hollow spheres are 500–1300 μm. • The degree of sphericity of the prepared hollow spheres is above 98%. • The mechanisms of transparency are discussed. - Abstract: Millimeter-scale translucent La2O3-doped Al2O3 ceramic hollow spheres have been successfully prepared using oil-in-water (paraffin-in-alumina sol) droplets as precursors, made by a self-made T-shaped micro-emulsion device. The main crystalline phase of the obtained hollow spheres is alpha alumina. The prepared translucent La2O3-containing Al2O3 ceramic hollow spheres have diameters of 500–1300 μm, a wall thickness of about 23 μm and a degree of sphericity above 98%. With increasing La2O3 content, the grains and grain boundaries of the alumina spherical shell gradually become regular and clear. When the La2O3 content is 0.1 wt.%, the crystal surface of the obtained Al2O3 spherical shell shows optimal grains and few pores, and its transmittance reaches 42% at 532 nm laser light. This method provides a promising technique for preparing millimeter-scale translucent ceramic hollow spheres for laser inertial confinement fusion.

  17. SHIELD: FITGALAXY -- A Software Package for Automatic Aperture Photometry of Extended Sources

    NASA Astrophysics Data System (ADS)

    Marshall, Melissa

    2013-01-01

    Determining the parameters of extended sources, such as galaxies, is a common but time-consuming task. Finding a photometric aperture that encompasses the majority of the flux of a source, and identifying and excluding contaminating objects, is often done by hand, a lengthy and difficult-to-reproduce process. To make extracting information from large data sets both quick and repeatable, I have developed a program called FITGALAXY, written in IDL. This program uses minimal user input to automatically fit an aperture to, and perform aperture and surface photometry on, an extended source. FITGALAXY also automatically traces the outlines of surface brightness thresholds and creates surface brightness profiles, which can then be used to determine the radial properties of a source. Finally, the program performs automatic masking of contaminating sources. Masks and apertures can be applied to multiple images (regardless of the WCS solution or plate scale) in order to accurately measure the same source at different wavelengths. I present the fluxes, as measured by the program, of a selection of galaxies from the Local Volume Legacy Survey. I then compare these results with the fluxes given by Dale et al. (2009) in order to assess the accuracy of FITGALAXY.
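    As a hedged illustration of the aperture-photometry step that FITGALAXY automates (sketched here in Python rather than IDL; the function and the synthetic image are illustrative, not the program's actual interface):

```python
import numpy as np

def aperture_flux(image, cx, cy, r):
    """Sum the pixel values inside a circular aperture of radius r at (cx, cy)."""
    yy, xx = np.indices(image.shape)
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
    return float(image[mask].sum())

# Synthetic source: a uniform disc of value 2 and radius 5 on a zero background.
img = np.zeros((50, 50))
yy, xx = np.indices(img.shape)
img[(xx - 25) ** 2 + (yy - 25) ** 2 <= 25] = 2.0

print(aperture_flux(img, 25, 25, 10))  # 162.0 (81 disc pixels x 2)
```

In practice the aperture would be fitted to the source and contaminating pixels masked first; this sketch shows only the final summation.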

  18. Automatic graphene transfer system for improved material quality and efficiency

    PubMed Central

    Boscá, Alberto; Pedrós, Jorge; Martínez, Javier; Palacios, Tomás; Calle, Fernando

    2016-01-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The process is based on the all-fluidic manipulation of the graphene to avoid mechanical damage, strain and contamination, and on the combination of capillary action and electrostatic repulsion between the graphene and its container to ensure a centered sample on top of the target substrate. The improved carrier mobility and yield of the automatically transferred graphene, as compared to that manually transferred, is demonstrated by the optical and electrical characterization of field-effect transistors fabricated on both materials. In particular, 70% higher mobility values, with a 30% decrease in the unintentional doping and a 10% strain reduction are achieved. The system has been developed for lab-scale transfer and proved to be scalable for industrial applications. PMID:26860260

  19. Automation on the generation of genome-scale metabolic models.

    PubMed

    Reyes, R; Gamermann, D; Montagud, A; Fuente, D; Triana, J; Urchueguía, J F; de Córdoba, P Fernández

    2012-12-01

    Nowadays, the reconstruction of genome-scale metabolic models is a non-automated, interactive process based on decision making. This lengthy process usually requires a full year of one person's work to satisfactorily collect, analyze, and validate the list of all metabolic reactions present in a specific organism. To write this list, one has to manually go through a huge amount of genomic, metabolomic, and physiological information. Currently, there is no optimal algorithm that allows one to automatically go through all this information and generate the models taking into account the probabilistic criteria of uniqueness and completeness that a biologist would consider. This work presents the automation of a methodology for the reconstruction of genome-scale metabolic models for any organism. The methodology is the automated version of the steps implemented manually for the reconstruction of the genome-scale metabolic model of a photosynthetic organism, Synechocystis sp. PCC6803. The steps for the reconstruction are implemented in a computational platform (COPABI) that generates the models from the probabilistic algorithms that have been developed. To validate the robustness of the developed algorithm, the metabolic models of several organisms generated by the platform have been studied together with published models that have been manually curated. Network properties of the models, such as connectivity and average shortest path length, have been compared and analyzed.

  20. Automation of laboratory testing for infectious diseases using the polymerase chain reaction-- our past, our present, our future.

    PubMed

    Jungkind, D

    2001-01-01

    While it is an extremely powerful and versatile assay method, the polymerase chain reaction (PCR) can be a labor-intensive process. Since the advent of commercial test kits from Roche and the semi-automated microwell Amplicor system, PCR has become an increasingly useful and widespread clinical tool. However, more widespread acceptance of molecular testing will depend upon automation that allows molecular assays to enter the routine clinical laboratory. The forces driving the need for automated PCR are the requirements for diagnosis and treatment of chronic viral diseases, economic pressures to develop more automated and less expensive test procedures similar to those in clinical chemistry laboratories, and a shortage in many areas of qualified laboratory personnel trained in the types of manual procedures used in past decades. The automated Roche COBAS AMPLICOR system has automated the amplification and detection process. Specimen preparation remains the most labor-intensive part of the PCR testing process, accounting for the majority of the hands-on time in most of the assays. A new automated specimen preparation system, the COBAS AmpliPrep, was evaluated. The system automatically releases the target nucleic acid and captures the target with specific oligonucleotide probes, which become attached to magnetic beads via a biotin-streptavidin binding reaction. Once attached to the beads, the target is purified and concentrated automatically. Results for 298 qualitative and 57 quantitative samples, representing a wide range of virus concentrations and analyzed after both the COBAS AmpliPrep and manual specimen preparation methods, showed no significant difference in qualitative or quantitative hepatitis C virus (HCV) assay performance. The AmpliPrep instrument decreased the time required to prepare serum or plasma samples for HCV PCR to under 1 min per sample, a decrease of 76% compared to the manual specimen preparation method.
Systems that can analyze more samples with higher throughput and that can answer more questions about the nature of the microbes that we can presently only detect and quantitate will be needed in the future.

  1. Automation methodologies and large-scale validation for GW: Towards high-throughput GW calculations

    NASA Astrophysics Data System (ADS)

    van Setten, M. J.; Giantomassi, M.; Gonze, X.; Rignanese, G.-M.; Hautier, G.

    2017-10-01

    The search for new materials based on computational screening relies on methods that accurately predict, in an automatic manner, total energy, atomic-scale geometries, and other fundamental characteristics of materials. Many technologically important material properties directly stem from the electronic structure of a material, but the usual workhorse for total energies, namely density-functional theory, is plagued by fundamental shortcomings and errors from approximate exchange-correlation functionals in its prediction of the electronic structure. At variance, the GW method is currently the state-of-the-art ab initio approach for accurate electronic structure. It is mostly used to perturbatively correct density-functional theory results, but is, however, computationally demanding and also requires expert knowledge to give accurate results. Accordingly, it is not presently used in high-throughput screening: fully automatized algorithms for setting up the calculations and determining convergence are lacking. In this paper, we develop such a method and, as a first application, use it to validate the accuracy of G0W0 using the PBE starting point and the Godby-Needs plasmon-pole model (G0W0-GN@PBE) on a set of about 80 solids. The results of the automatic convergence study utilized provide valuable insights. Indeed, we find correlations between computational parameters that can be used to further improve the automatization of GW calculations. Moreover, we find that G0W0-GN@PBE shows a correlation between the PBE and the G0W0-GN@PBE gaps that is much stronger than that between GW and experimental gaps. However, the G0W0-GN@PBE gaps still describe the experimental gaps more accurately than a linear model based on the PBE gaps. With this paper, we hence show that GW can be made automatic and is more accurate than using an empirical correction of the PBE gap, but that, for accurate predictive results for a broad class of materials, an improved starting point or some type of self-consistency is necessary.

  2. SigVox - A 3D feature matching algorithm for automatic street object recognition in mobile laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Wang, Jinhu; Lindenbergh, Roderik; Menenti, Massimo

    2017-06-01

    Urban road environments contain a variety of objects, including different types of lamp poles and traffic signs. Monitoring these objects is traditionally conducted by visual inspection, which is time consuming and expensive. Mobile laser scanning (MLS) systems sample the road environment efficiently by acquiring large and accurate point clouds. This work proposes a methodology for urban road object recognition from MLS point clouds. The proposed method uses, for the first time, shape descriptors of complete objects to match repetitive objects in large point clouds. To do so, a novel 3D multi-scale shape descriptor is introduced, embedded in a workflow that efficiently and automatically identifies different types of lamp poles and traffic signs. The workflow starts by tiling the raw point clouds along the scanning trajectory and identifying non-ground points. After voxelization of the non-ground points, connected voxels are clustered to form candidate objects. For automatic recognition of lamp poles and street signs, a 3D significant eigenvector based shape descriptor using voxels (SigVox) is introduced. The 3D SigVox descriptor is constructed by first subdividing the points with an octree into several levels. Next, significant eigenvectors of the points in each voxel are determined by principal component analysis (PCA) and mapped onto the appropriate triangle of an icosahedron approximating a sphere. This step is repeated for different scales. By determining the similarity of 3D SigVox descriptors between candidate point clusters and training objects, street furniture is automatically identified. The feasibility and quality of the proposed method are verified on two point clouds acquired in opposite directions along a 4 km stretch of road. Six types of lamp pole and four types of road sign were selected as objects of interest. Ground truth validation showed that the overall accuracy of the ∼170 automatically recognized objects is approximately 95%.
The results demonstrate that the proposed method is able to recognize street furniture in a practical scenario. Remaining difficult cases are touching objects, like a lamp pole close to a tree.
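The per-voxel PCA step at the heart of the SigVox descriptor can be sketched as follows; the eigenvalue-ratio threshold used to call an eigenvector "significant" is an assumption for illustration, not a value from the paper.

```python
import numpy as np

def significant_eigenvectors(points, ratio=0.5):
    """PCA of the points in one voxel (sketch): return the eigenvectors whose
    eigenvalues exceed `ratio` times the largest eigenvalue, one per row."""
    pts = np.asarray(points, dtype=float)
    cov = np.cov(pts - pts.mean(axis=0), rowvar=False)
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    keep = vals >= ratio * vals[-1]
    return vecs[:, keep].T

# Elongated cluster along x: the dominant eigenvector should align with x.
rng = np.random.default_rng(0)
pts = rng.normal(0, [5.0, 0.2, 0.2], size=(500, 3))
v = significant_eigenvectors(pts)[-1]
print(np.round(np.abs(v), 2))
```

In the full descriptor, each such eigenvector is then mapped onto the nearest triangle of the icosahedron, and the procedure repeats per octree level.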

  3. The multiscale nature of magnetic pattern on the solar surface

    NASA Astrophysics Data System (ADS)

    Scardigli, S.; Del Moro, D.; Berrilli, F.

    Multiscale magnetic underdense regions (voids) appear in high-resolution magnetograms of the quiet solar surface. These regions may be considered a signature of the underlying convective structure. The study of the associated pattern paves the way for the study of turbulent convective scales from granular to global. In order to address the question of the magnetic pattern driven by turbulent convection, we used a novel automatic void detection method to calculate void distributions. The absence of preferred scales of organization in the calculated distributions supports the multiscale nature of flows on the solar surface and the absence of preferred convective scales.

  4. A Multi-Scale Settlement Matching Algorithm Based on ARG

    NASA Astrophysics Data System (ADS)

    Yue, Han; Zhu, Xinyan; Chen, Di; Liu, Lingjia

    2016-06-01

    Homonymous entity matching is an important part of multi-source spatial data integration, automatic updating and change detection. Considering the low accuracy of existing methods in matching multi-scale settlement data, an algorithm based on the Attributed Relational Graph (ARG) is proposed. The algorithm first divides two settlement scenes at different scales into blocks using the small-scale road network and constructs local ARGs in each block. Then, it ascertains candidate sets by merging procedures and obtains the optimal matching pairs by iteratively comparing the similarity of the ARGs. Finally, the corresponding relations between settlements at large and small scales are identified. At the end of this article, a demonstration is presented, and the results indicate that the proposed algorithm is capable of handling sophisticated cases.

  5. [Comparison of colon-cleansing methods in preparation for colonoscopy-comparative of solutions of mannitol and sodium picosulfate].

    PubMed

    de Moura, Diogo Turiani; Guedes, Hugo; Tortoretto, Verônica; Arataque, Tayrê Pádua; de Moura, Eduardo Guimarães; Román, Juan Pablo; Rodela, Gustavo Luis; Artifon, Everson L

    2016-01-01

    The purpose of the present study is to compare intestinal preparation with mannitol and sodium picosulfate, assessing patient acceptance, side effects and cleansing capacity. This is a prospective, non-randomized, blind study, in which the evaluator had no information about the preparation applied. The sample was divided into two groups according to the bowel preparation applied, with 153 patients prepared with 10% mannitol and 84 patients with sodium picosulfate. The evaluation of colon preparation was done using the Boston Bowel Preparation Scale (BBPS), a three-point scoring system for each of the three regions of the colon: right colon, left colon and transverse colon. Of the 237 patients evaluated, 146 (61.6%) were female and 91 (38.4%) were male. In the group that used mannitol, 98 were female (64.05%) and 55 were male (35.95%). Among the patients who used sodium picosulfate, 48 were female (57.14%) and 36 were male (42.86%), with no statistical difference between the groups (p > 0.32). Considering that an adequate preparation scores ≥ 6 on the Boston scale, the bowel cleansing preparation was satisfactory in both groups: 93% of the patients who used mannitol and 81% of the patients who used sodium picosulfate had adequate preparation (score ≥ 6). Moreover, the average score with mannitol was 9, while with sodium picosulfate it was 7. There were no significant differences between the two groups. There is consensus among authors that colonoscopy's safety and success are highly related to the cleansing outcome, regardless of the method used. The same can be observed in the present study, in which both preparations proved safe and effective for bowel cleansing according to the Boston scale, as well as accepted by patients and free of complications.

  6. An evaluation of automatic coronary artery calcium scoring methods with cardiac CT using the orCaScore framework.

    PubMed

    Wolterink, Jelmer M; Leiner, Tim; de Vos, Bob D; Coatrieux, Jean-Louis; Kelm, B Michael; Kondo, Satoshi; Salgado, Rodrigo A; Shahzad, Rahil; Shu, Huazhong; Snoeren, Miranda; Takx, Richard A P; van Vliet, Lucas J; van Walsum, Theo; Willems, Tineke P; Yang, Guanyu; Zheng, Yefeng; Viergever, Max A; Išgum, Ivana

    2016-05-01

    The amount of coronary artery calcification (CAC) is a strong and independent predictor of cardiovascular disease (CVD) events. In clinical practice, CAC is manually identified and automatically quantified in cardiac CT using commercially available software. This is a tedious and time-consuming process in large-scale studies. Therefore, a number of automatic methods that require no interaction and semiautomatic methods that require very limited interaction for the identification of CAC in cardiac CT have been proposed. Thus far, a comparison of their performance has been lacking. The objective of this study was to perform an independent evaluation of (semi)automatic methods for CAC scoring in cardiac CT using a publicly available standardized framework. Cardiac CT exams of 72 patients distributed over four CVD risk categories were provided for (semi)automatic CAC scoring. Each exam consisted of a noncontrast-enhanced calcium scoring CT (CSCT) and a corresponding coronary CT angiography (CCTA) scan. The exams were acquired in four different hospitals using state-of-the-art equipment from four major CT scanner vendors. The data were divided into 32 training exams and 40 test exams. A reference standard for CAC in CSCT was defined by consensus of two experts following a clinical protocol. The framework organizers evaluated the performance of (semi)automatic methods on test CSCT scans, per lesion, artery, and patient. Five (semi)automatic methods were evaluated. Four methods used both CSCT and CCTA to identify CAC, and one method used only CSCT. The evaluated methods correctly detected between 52% and 94% of CAC lesions with positive predictive values between 65% and 96%. Lesions in distal coronary arteries were most commonly missed and aortic calcifications close to the coronary ostia were the most common false positive errors. The majority (between 88% and 98%) of correctly identified CAC lesions were assigned to the correct artery. 
Linearly weighted Cohen's kappa for patient CVD risk categorization by the evaluated methods ranged from 0.80 to 1.00. A publicly available standardized framework for the evaluation of (semi)automatic methods for CAC identification in cardiac CT is described. An evaluation of five (semi)automatic methods within this framework shows that automatic per patient CVD risk categorization is feasible. CAC lesions at ambiguous locations such as the coronary ostia remain challenging, but their detection had limited impact on CVD risk determination.
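The patient-level agreement above is measured with linearly weighted Cohen's kappa, which penalizes risk-category disagreements in proportion to their distance. A minimal sketch of how that statistic is computed (the function and example labels are illustrative, not taken from the framework):

```python
import numpy as np

def linear_weighted_kappa(rater_a, rater_b, n_cat):
    """Linearly weighted Cohen's kappa between two sets of ordinal labels."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    # disagreement weights grow linearly with category distance
    w = np.abs(np.subtract.outer(np.arange(n_cat), np.arange(n_cat)))
    # observed joint distribution of the two raters' labels
    obs = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):
        obs[i, j] += 1
    obs /= obs.sum()
    # expected joint distribution if the raters were independent
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    return 1.0 - (w * obs).sum() / (w * exp).sum()
```

Perfect agreement yields 1.0 and chance-level agreement yields 0, which is the scale on which the 0.80-1.00 range above is read.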

  7. Understanding Student Language: An Unsupervised Dialogue Act Classification Approach

    ERIC Educational Resources Information Center

    Ezen-Can, Aysu; Boyer, Kristy Elizabeth

    2015-01-01

    Within the landscape of educational data, textual natural language is an increasingly vast source of learning-centered interactions. In natural language dialogue, student contributions hold important information about knowledge and goals. Automatically modeling the dialogue act of these student utterances is crucial for scaling natural language…

  8. Harmonizing Automatic Test System Assets, Drivers, and Control Methodologies

    DTIC Science & Technology

    1999-07-18

    ORGANIZATION PRINCIPAL AREAS OF INTEREST TO ATS NAME 1394 TA Firewire Trade Association Defining high speed bus protocol Active Group Accelerating ActiveX ...System Assets, Drivers, and Control Methodologies 17 JUL, 1999 component is a diagonal matrix containing scaling values such that when the three

  9. Clarification to "Examining Rater Errors in the Assessment of Written Composition with a Many-Faceted Rasch Model."

    ERIC Educational Resources Information Center

    Englehard, George, Jr.

    1996-01-01

    Data presented in figure three of the article cited may be misleading in that the automatic scaling procedure used by the computer program that generated the histogram highlighted spikes that would look different with different histogram methods. (SLD)

  10. Subroutines GEORGE and DRASTC simplify operation of automatic digital plotter

    NASA Technical Reports Server (NTRS)

    Englel, F., III; Gray, W. H.; Richard, P. J.

    1967-01-01

    FORTRAN language subroutines enable the production of a tape for a 360-30 tape unit that controls the CALCOMP 566 Digital Incremental Plotter. This provides the plotter with instructions for graphically displaying data points with the proper scaling of axes, numbering, lettering, and tic marking.

  11. Observing control and data reduction at the UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, Alan; Economou, Frossie; Wright, Gillian S.; Currie, Malcolm J.

    1998-07-01

    For the past seven years observing with the major instruments at the United Kingdom IR Telescope (UKIRT) has been semi-automated, using ASCII files to configure the instruments and then sequence a series of exposures and telescope movements to acquire the data. For one instrument automatic data reduction completes the cycle. The emergence of recent software technologies has suggested an evolution of this successful system to provide a friendlier and more powerful interface to observing at UKIRT. The Observatory Reduction and Acquisition Control (ORAC) project is now underway to construct this system. A key aim of ORAC is to allow a more complete description of the observing program, including the target sources and the recipe that will be used to provide on-line data reduction. Remote observation preparation and submission will also be supported. In parallel the observatory control system will be upgraded to use these descriptions for more automatic observing, while retaining the 'classical' interactive observing mode. The final component of the project is an improved automatic data reduction system, allowing on-line reduction of data at the telescope while retaining the flexibility to cope with changing observing techniques and instruments. The user will also automatically be provided with the scripts used for the real-time reduction to help provide post-observing data reduction support. The overall project goal is to improve the scientific productivity of the telescope, but it should also reduce the overall ongoing support requirements, and has the eventual goal of supporting the use of queue-scheduled observing.

  12. Programming Environments for High Level Scientific Problem Solving. IFIP WG 2.5 Working Conference 6 Held in Karlsruhe, Germany on September 23 - 27, 1991

    DTIC Science & Technology

    1991-09-27

    Springer Verlag (1989). [13] Hulshof, B.J.A. and van Hulzen, J.A.: "Automatic error cumulation control", Proceedings EUROSAM (J. Fitch, ed.), Springer...User's Manual", Dept. of Comp. Science, Univ. of Twente (in preparation). [15] van Hulzen, J.A., Hulshof, B.J.A., Gates, B.L. and van Heerwaarden, M.C

  13. A performance evaluation of various coatings, substrate materials, and solar collector systems

    NASA Technical Reports Server (NTRS)

    Dolan, F. J.

    1976-01-01

    An experimental apparatus was constructed and utilized in conjunction with both a solar simulator and actual sunlight to test and evaluate various solar panel coatings, panel designs, and scaled-down collector subsystems. Data were taken by an automatic digital data acquisition system and reduced and printed by a computer system. The solar collector test setup, data acquisition system, and data reduction and printout systems were considered to have operated very satisfactorily. Test data indicated that there is a practical or useful limit in scaling down beyond which scaled-down testing cannot produce results comparable to results of larger scale tests. Test data are presented as are schematics and pictures of test equipment and test hardware.

  14. Wheat cultivation: Identification and estimation of areas using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Mendonca, F. J.; Cottrell, D. A.; Tardin, A. T.; Lee, D. C. L.; Shimabukuro, Y. E.; Moreira, M. A.; Delimaefernandocelsosoaresmaia, A. M.

    1981-01-01

    The feasibility of using automatically processed multispectral data obtained from LANDSAT to identify wheat and estimate the areas planted with this grain was investigated. Three 20 km by 40 km segments in a wheat growing region of Rio Grande do Sul were aerially photographed using type 2443 Aerochrome film. Three maps corresponding to each segment were obtained from the analysis of the photographs which identified wheat, barley, fallow land, prepared soil, forests, and reforested land. Using basic information about the fields and maps made from the photographed areas, an automatic classification of wheat was made using MSS data from two different periods: July to September and July to October 1979. Results show that orbital data is not only useful in characterizing the growth of wheat, but also provides information on the intensity and extent of adverse climate which affects cultivation. The temporal and spatial characteristics of LANDSAT data are also demonstrated.

  15. Automatic finger joint synovitis localization in ultrasound images

    NASA Astrophysics Data System (ADS)

    Nurzynska, Karolina; Smolka, Bogdan

    2016-04-01

    A long-lasting inflammation of the joints results, among other outcomes, in many forms of arthritis. Left untreated, it may affect other organs and the patient's general health. Early detection and prompt medical treatment are therefore of great value. The patient's joints are scanned with high-frequency acoustic waves, which enable visualization of interior body structures in an ultrasound sonography (USG) image. Although the procedure is standardized, different projections produce a wide variety of possible data, which must be analyzed in a short period of time by a physician using medical atlases as guidance. This work introduces an efficient framework, based on a statistical approach to the finger joint USG image, that automatically localizes the skin and bone regions, which are then used to localize the finger joint synovitis area. The processing pipeline performs the task in real time and shows high accuracy when compared to annotations prepared by an expert.

  16. Signal chain for the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS)

    NASA Technical Reports Server (NTRS)

    Bunn, James S., Jr.

    1988-01-01

    The AVIRIS instrument has a separate dedicated analog signal processing chain for each of its four spectrometers. The signal chains amplify low-level focal-plane line array signals (5 to 10 mV full-scale span) in the presence of larger multiplexing signals (approx 150 mV), providing the data handling system with a ten-bit digital word (for each spectrometer) every 1.3 microseconds. This signal chain provides automatic correction for the line array dark signal nonuniformity (which can approach the full-scale signal span).
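Conceptually, the dark-signal nonuniformity correction subtracts each detector element's dark level before the result is digitized to a ten-bit word. A schematic digital analogue of that correction (the real instrument performs it in the analog chain; the names and 10-bit range here are illustrative):

```python
import numpy as np

def correct_dsnu(raw, dark, full_scale=1023):
    """Subtract a per-element dark frame from raw line-array samples and
    clip the result to the ten-bit output range."""
    corrected = raw.astype(np.int64) - dark.astype(np.int64)
    return np.clip(corrected, 0, full_scale).astype(np.uint16)
```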

  17. Signal chain for the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS)

    NASA Technical Reports Server (NTRS)

    Bunn, James S., Jr.

    1987-01-01

    The AVIRIS instrument has a separate dedicated analog signal processing chain for each of its four spectrometers. The signal chains amplify low-level focal-plane line array signals (5 to 10 mV full-scale span) in the presence of larger multiplexing signals (approx 150 mV), providing the data handling system with a ten-bit digital word (for each spectrometer) every 1.3 microseconds. This signal chain provides automatic correction for the line array dark signal nonuniformity (which can approach the full-scale signal span).

  18. Automatic Recognition of Fetal Facial Standard Plane in Ultrasound Image via Fisher Vector.

    PubMed

    Lei, Baiying; Tan, Ee-Leng; Chen, Siping; Zhuo, Liu; Li, Shengli; Ni, Dong; Wang, Tianfu

    2015-01-01

    Acquisition of the standard plane is the prerequisite of biometric measurement and diagnosis during the ultrasound (US) examination. In this paper, a new algorithm is developed for the automatic recognition of the fetal facial standard planes (FFSPs) such as the axial, coronal, and sagittal planes. Specifically, densely sampled root scale invariant feature transform (RootSIFT) features are extracted and then encoded by Fisher vector (FV). The Fisher network with multi-layer design is also developed to extract spatial information to boost the classification performance. Finally, automatic recognition of the FFSPs is implemented by support vector machine (SVM) classifier based on the stochastic dual coordinate ascent (SDCA) algorithm. Experimental results using our dataset demonstrate that the proposed method achieves an accuracy of 93.27% and a mean average precision (mAP) of 99.19% in recognizing different FFSPs. Furthermore, the comparative analyses reveal the superiority of the proposed method based on FV over the traditional methods.
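Of the pipeline's stages, the RootSIFT step is the simplest to illustrate: each SIFT descriptor is L1-normalized and square-rooted, so that Euclidean distances on the result approximate the Hellinger kernel. A sketch of just that step (the FV encoding, Fisher network, and SDCA-trained SVM are not reproduced here):

```python
import numpy as np

def root_sift(descriptors, eps=1e-7):
    """Map SIFT descriptors (rows of an n x 128 array, nonnegative by
    construction) to RootSIFT: L1-normalize each row, then take the
    elementwise square root."""
    d = np.asarray(descriptors, dtype=np.float64)
    d = d / (np.abs(d).sum(axis=1, keepdims=True) + eps)
    return np.sqrt(d)
```

After this mapping every descriptor has (near-)unit Euclidean norm, since the squared entries sum to the L1 norm of the normalized row.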

  19. Research on the Construction of Remote Sensing Automatic Interpretation Symbol Big Data

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Liu, R.; Liu, J.; Cheng, T.

    2018-04-01

    Remote sensing automatic interpretation symbols (RSAIS) are an inexpensive and fast means of providing precise in-situ information for image interpretation and accuracy assessment. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed, cloud-architecture storage method for massive data. Additionally, it introduced offline and online data update modes and a dynamic data evaluation mechanism, with the aim of creating an efficient approach to RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013-2015 based on the National Geographic Conditions Monitoring Project of China, and it has been updated annually since 2016. RSAIS big data has proven to be a good method for large-scale image interpretation and field validation. It also has the potential to support automatic image interpretation with the assistance of deep learning technology in the remote sensing big data era.

  20. Piloted simulation study of the effects of an automated trim system on flight characteristics of a light twin-engine airplane with one engine inoperative

    NASA Technical Reports Server (NTRS)

    Stewart, E. C.; Brown, P. W.; Yenni, K. R.

    1986-01-01

    A simulation study was conducted to investigate the piloting problems associated with failure of an engine on a generic light twin-engine airplane. A primary piloting problem for a light twin-engine airplane after an engine failure is maintaining precise control of the airplane in the presence of large steady control forces. To address this problem, a simulated automatic trim system which drives the trim tabs as an open-loop function of propeller slipstream measurements was developed. The simulated automatic trim system was found to greatly increase the controllability in asymmetric powered flight without having to resort to complex control laws or an irreversible control system. However, the trim-tab control rates needed to produce the dramatic increase in controllability may require special design consideration for automatic trim system failures. Limited measurements obtained in full-scale flight tests confirmed the fundamental validity of the proposed control law.

  1. A model for simulating the grinding and classification cyclic system of waste PCBs recycling production line.

    PubMed

    Yang, Deming; Xu, Zhenming

    2011-09-15

    Crushing and separating technology is widely used in the recycling of waste printed circuit boards (PCBs). An automatic production line for recycling waste PCBs without negative environmental impact was applied at industrial scale. The grinding and classification cyclic system for crushed waste PCB particles is the most important part of the automatic production line and determines the efficiency of the whole line. In this paper, a model for computing the process of the system was established using a matrix analysis method. The results showed good agreement between the simulation model and the actual production line, and that the system is robust against disturbances. This model can provide a basis for the automatic process control of a waste PCB production line. With this model, many engineering problems can be reduced, such as insufficient dissociation of metals and nonmetals, over-pulverizing of particles, incomplete comminution, material plugging and equipment overheating. Copyright © 2011 Elsevier B.V. All rights reserved.
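The paper's model equations are not given in the abstract; a common matrix formulation for such a closed grinding-classification circuit balances the recycle stream at steady state. The following sketch (size classes, grinding matrix G and classifier recycle fractions C are invented for illustration) shows the idea:

```python
import numpy as np

# three particle size classes, coarse -> fine (illustrative values)
G = np.array([[0.2, 0.0, 0.0],   # grinding matrix: column j gives how one
              [0.5, 0.4, 0.0],   # mill pass redistributes class j over
              [0.3, 0.6, 1.0]])  # the size classes (columns sum to 1)
C = np.diag([0.9, 0.5, 0.0])     # classifier: recycled fraction per class
feed = np.array([1.0, 0.0, 0.0]) # fresh feed, coarsest class only (kg/s)

# steady state: mill feed x = fresh feed + recycled share of mill product,
# i.e. x = feed + C @ G @ x  =>  (I - C @ G) x = feed
x = np.linalg.solve(np.eye(3) - C @ G, feed)
product = (np.eye(3) - C) @ (G @ x)  # stream leaving the circuit
```

Because the grinding matrix conserves mass, the product stream must carry exactly the fresh feed rate at steady state, which gives a quick sanity check on any such model.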

  2. A computer-aided diagnosis system of nuclear cataract.

    PubMed

    Li, Huiqi; Lim, Joo Hwee; Liu, Jiang; Mitchell, Paul; Tan, Ava Grace; Wang, Jie Jin; Wong, Tien Yin

    2010-07-01

    Cataracts are the leading cause of blindness worldwide, and nuclear cataract is the most common form of cataract. An algorithm for automatic diagnosis of nuclear cataract is investigated in this paper. Nuclear cataract is graded according to the severity of opacity using slit lamp lens images. Anatomical structure in the lens image is detected using a modified active shape model. On the basis of the anatomical landmark, local features are extracted according to clinical grading protocol. Support vector machine regression is employed for grade prediction. This is the first time that the nucleus region can be detected automatically in slit lamp images. The system is validated using clinical images and clinical ground truth on >5000 images. The success rate of structure detection is 95% and the average grading difference is 0.36 on a 5.0 scale. The automatic diagnosis system can improve the grading objectivity and potentially be used in clinics and population studies to save the workload of ophthalmologists.
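As a rough illustration of the final stage, grade prediction from extracted lens features by support vector regression might look like the following (the features, grades, and hyperparameters are synthetic; the study's actual features follow its clinical grading protocol):

```python
import numpy as np
from sklearn.svm import SVR

# synthetic stand-ins for the local features extracted from lens images
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
# synthetic nuclear-cataract grades on a 0-5 scale
grade = np.clip(2.5 + X[:, 0] + 0.5 * X[:, 1], 0.1, 5.0)

# train on 150 images, evaluate the mean grading difference on the rest
model = SVR(kernel="rbf", C=10.0).fit(X[:150], grade[:150])
pred = model.predict(X[150:])
mean_diff = np.abs(pred - grade[150:]).mean()  # analogue of the reported 0.36
```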

  3. Enhanced Image-Aided Navigation Algorithm with Automatic Calibration and Affine Distortion Prediction

    DTIC Science & Technology

    2012-03-01

    Lowe, David G. “Distinctive Image Features from Scale-Invariant Keypoints”. International Journal of Computer Vision, 2004. 13. Maybeck, Peter S...Fairfax Drive - 3rd Floor Arlington,VA 22203 Dr. Stefanie Tompkins ; (703)248–1540; Stefanie.Tompkins@darpa.mil DARPA Distribution A. Approved for Public

  4. Laser Altimeter for Flight Simulator

    NASA Technical Reports Server (NTRS)

    Webster, L. D.

    1986-01-01

    Height of flight-simulator probe above model of terrain measured by automatic laser triangulation system. Airplane simulated by probe that moves over model of terrain. Altitude of airplane scaled from height of probe above model. Height measured by triangulation of laser beam aimed at intersection of model surface with plumb line of probe.

  5. A Developmental Writing Scale. Research Report. ETS RR-08-19

    ERIC Educational Resources Information Center

    Attali, Yigal; Powers, Don

    2008-01-01

    This report describes the development of grade norms for timed-writing performance in two modes of writing: persuasive and descriptive. These norms are based on objective and automatically computed measures of writing quality in grammar, usage, mechanics, style, vocabulary, organization, and development. These measures are also used in the…

  6. Short Answers to Deep Questions: Supporting Teachers in Large-Class Settings

    ERIC Educational Resources Information Center

    McDonald, J.; Bird, R. J.; Zouaq, A.; Moskal, A. C. M.

    2017-01-01

    In large class settings, individualized student-teacher interaction is difficult. However, teaching interactions (e.g., formative feedback) are central to encouraging deep approaches to learning. While there has been progress in automatic short-answer grading, analysing student responses to support formative feedback at scale is arguably some way…

  7. 75 FR 35447 - Buy American Exception Under the American Recovery and Reinvestment Act of 2009; Nationwide...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-22

    ... Reinvestment and Recovery Act of 2009 (Recovery Act) to EERE-funded projects for non-residential programmable...[hyphen]residential programmable thermostats; commercial scale fully-automatic wood pellet boiler systems...) Programmable Thermostats--Includes devices that permit adjustment of heating or air-conditioning operations...

  8. 30 CFR 77.413 - Boilers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Boilers. 77.413 Section 77.413 Mineral... Mechanical Equipment § 77.413 Boilers. (a) Boilers shall be equipped with guarded, well-maintained water... the gages shall be kept clean and free of scale and rust. (b) Boilers shall be equipped with automatic...

  9. Enabling the Interoperability of Large-Scale Legacy Systems

    DTIC Science & Technology

    2008-01-01

    information retrieval systems ( Salton and McGill 1983). We use this method because, in the schema mapping task, only one instance per class is...2001). A survey of approaches to automatic schema matching. The VLDB Journal, 10, 334-350. Salton , G., & McGill, M.J. (1983). Introduction to

  10. New Technologies Extend the Reach of Many College Fund Raisers.

    ERIC Educational Resources Information Center

    Nicklin, Julie L.

    1992-01-01

    Increasingly, colleges are using new technologies, often expensive, to improve fund-raising capacity among small-scale donors. Techniques include computerized screening of prospective donors based on personal information, automatic dialing and phone-bank worker training, and sophisticated direct-mail tactics. Concern about privacy and loss of the…

  11. Espresso coffee foam delays cooling of the liquid phase.

    PubMed

    Arii, Yasuhiro; Nishizawa, Kaho

    2017-04-01

    Espresso coffee foam, called crema, is known to be a marker of the quality of espresso coffee extraction. However, the role of foam in coffee temperature has not been quantitatively clarified. In this study, we used an automatic machine for espresso coffee extraction. We evaluated whether the foam prepared using the machine was suitable for foam analysis. After extraction, the percentage and consistency of the foam were measured using various techniques, and changes in the foam volume were tracked over time. Our extraction method, therefore, allowed consistent preparation of high-quality foam. We also quantitatively determined that the foam phase slowed cooling of the liquid phase after extraction. High-quality foam plays an important role in delaying the cooling of espresso coffee.
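The cooling effect can be pictured with Newton's law of cooling, where an insulating foam layer lowers the effective heat-transfer coefficient. The numbers below are purely illustrative, not measurements from the study:

```python
import numpy as np

def coffee_temp(t, T0=70.0, T_env=20.0, k=0.02):
    """Newton cooling: temperature decays exponentially toward ambient
    at a rate k proportional to the heat-transfer coefficient."""
    return T_env + (T0 - T_env) * np.exp(-k * t)

t = np.linspace(0.0, 300.0, 301)        # seconds after extraction
with_foam = coffee_temp(t, k=0.010)     # foam lid: smaller effective k
without_foam = coffee_temp(t, k=0.020)  # free surface: larger k
```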

  12. The prepared emotional reflex: intentional preparation of automatic approach and avoidance tendencies as a means to regulate emotional responding.

    PubMed

    Eder, Andreas B; Rothermund, Klaus; Proctor, Robert W

    2010-08-01

    Advance preparation of action courses toward emotional stimuli is an effective means to regulate impulsive emotional behavior. Our experiment shows that performing intentional acts of approach and avoidance in an evaluation task influences the unintended activation of approach and avoidance tendencies in another task in which stimulus valence is irrelevant. For the evaluation-relevant blocks, participants received either congruent (positive-approach, negative-avoidance) or incongruent (positive-avoidance, negative-approach) mapping instructions. In the evaluation-irrelevant blocks, approach- and avoidance-related lever movements were selected in response to a stimulus feature other than valence (affective Simon task). Response mapping in the evaluation task influenced performance in the evaluation-irrelevant task: An enhanced affective Simon effect was observed with congruent mapping instructions; in contrast, the effect was reversed when the evaluation task required incongruent responses. Thus, action instructions toward affective stimuli received in one task determined affective response tendencies in another task where these instructions were not in effect. These findings suggest that intentionally prepared short-term links between affective valence and motor responses elicit associated responses without a deliberate act of will, operating like a "prepared reflex." Copyright 2010 APA

  13. An analysis of metropolitan land-use by machine processing of earth resources technology satellite data

    NASA Technical Reports Server (NTRS)

    Mausel, P. W.; Todd, W. J.; Baumgardner, M. F.

    1976-01-01

    A successful application of state-of-the-art remote sensing technology in classifying an urban area into its broad land use classes is reported. This research proves that numerous urban features are amenable to classification using ERTS multispectral data automatically processed by computer. Furthermore, such automatic data processing (ADP) techniques permit areal analysis on an unprecedented scale with a minimum expenditure of time. Also, classification results obtained using ADP procedures are consistent, comparable, and replicable. The results of classification are compared with the proposed U. S. G. S. land use classification system in order to determine the level of classification that is feasible to obtain through ERTS analysis of metropolitan areas.

  14. Dietary Assessment on a Mobile Phone Using Image Processing and Pattern Recognition Techniques: Algorithm Design and System Prototyping

    PubMed Central

    Probst, Yasmine; Nguyen, Duc Thanh; Tran, Minh Khoi; Li, Wanqing

    2015-01-01

    Dietary assessment, while traditionally based on pen-and-paper, is rapidly moving towards automatic approaches. This study describes an Australian automatic food record method and its prototype for dietary assessment via the use of a mobile phone and techniques of image processing and pattern recognition. Common visual features including scale invariant feature transformation (SIFT), local binary patterns (LBP), and colour are used for describing food images. The popular bag-of-words (BoW) model is employed for recognizing the images taken by a mobile phone for dietary assessment. Technical details are provided together with discussions on the issues and future work. PMID:26225994
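The BoW encoding itself is compact enough to sketch: each local descriptor (SIFT, LBP, colour) is assigned to its nearest codeword from a pre-learned codebook, and the image becomes a normalized histogram of codeword counts. A minimal version (codebook learning, e.g. by k-means, is assumed done elsewhere):

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Encode one image's local descriptors as a normalized bag-of-words
    histogram over a fixed codebook of visual words."""
    # squared Euclidean distance from every descriptor to every codeword
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    words = d2.argmin(axis=1)                 # nearest-codeword assignment
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()                  # L1-normalize across words
```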

  15. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  16. A robust and hierarchical approach for the automatic co-registration of intensity and visible images

    NASA Astrophysics Data System (ADS)

    González-Aguilera, Diego; Rodríguez-Gonzálvez, Pablo; Hernández-López, David; Luis Lerma, José

    2012-09-01

    This paper presents a new robust approach to integrate intensity and visible images which have been acquired with a terrestrial laser scanner and a calibrated digital camera, respectively. In particular, an automatic and hierarchical method for the co-registration of both sensors is developed. The approach integrates several existing solutions to improve the performance of the co-registration between range-based and visible images: the Affine Scale-Invariant Feature Transform (A-SIFT), the epipolar geometry, the collinearity equations, the Groebner basis solution and the RANdom SAmple Consensus (RANSAC), integrating a voting scheme. The approach presented herein improves the existing co-registration approaches in automation, robustness, reliability and accuracy.

  17. An Open-Source Automated Peptide Synthesizer Based on Arduino and Python.

    PubMed

    Gali, Hariprasad

    2017-10-01

    The development of the first open-source automated peptide synthesizer, PepSy, using Arduino UNO and readily available components is reported. PepSy was primarily designed to synthesize small peptides on a relatively small scale (<100 µmol). Scripts to operate PepSy in fully automatic or manual mode were written in Python. The fully automatic script includes functions to carry out resin swelling, resin washing, single coupling, double coupling, Fmoc deprotection, ivDde deprotection, on-resin oxidation, end capping, and amino acid/reagent line cleaning. Several small peptides and peptide conjugates were successfully synthesized on PepSy with reasonably good yields and purity depending on the complexity of the peptide.

  18. Automatic segmentation and quantification of the cardiac structures from non-contrast-enhanced cardiac CT scans

    NASA Astrophysics Data System (ADS)

    Shahzad, Rahil; Bos, Daniel; Budde, Ricardo P. J.; Pellikaan, Karlijn; Niessen, Wiro J.; van der Lugt, Aad; van Walsum, Theo

    2017-05-01

    Early structural changes to the heart, including the chambers and the coronary arteries, provide important information on pre-clinical heart disease like cardiac failure. Currently, contrast-enhanced cardiac computed tomography angiography (CCTA) is the preferred modality for the visualization of the cardiac chambers and the coronaries. In clinical practice not every patient undergoes a CCTA scan; many patients receive only a non-contrast-enhanced calcium scoring CT scan (CTCS), which has a lower radiation dose and does not require the administration of contrast agent. Quantifying cardiac structures in such images is challenging, as they lack the contrast present in CCTA scans. Such quantification would however be relevant, as it enables population-based studies with only a CTCS scan. The purpose of this work is therefore to investigate the feasibility of automatic segmentation and quantification of the cardiac structures, viz. the whole heart, left atrium, left ventricle, right atrium, right ventricle and aortic root, from CTCS scans. A fully automatic multi-atlas-based segmentation approach is used to segment the cardiac structures. Results show that the segmentation overlap between the automatic method and the reference standard has a Dice similarity coefficient of 0.91 on average for the cardiac chambers. The mean surface-to-surface distance error over all the cardiac structures is 1.4 ± 1.7 mm. The automatically obtained cardiac chamber volumes using the CTCS scans have an excellent correlation with the volumes in corresponding CCTA scans; a Pearson correlation coefficient (R) of 0.95 is obtained. Our fully automatic method enables large-scale assessment of cardiac structures on non-contrast-enhanced CT scans.
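The overlap metric quoted above, the Dice similarity coefficient, is twice the intersection of two segmentation masks divided by the sum of their sizes; a minimal sketch:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks:
    1.0 for identical masks, 0.0 for disjoint ones."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
```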

  19. A novel method for preparation of high dense tetragonal Li7La3Zr2O12

    NASA Astrophysics Data System (ADS)

    Zhao, Pengcheng; Wen, Yuehua; Cheng, Jie; Cao, Gaoping; Jin, Zhaoqing; Ming, Hai; Xu, Yan; Zhu, Xiayu

    2017-03-01

    For conventional preparation methods of Li7La3Zr2O12 (LLZO) solid state electrolytes, there is a stereotype that higher density always comes from higher pressure applied to the LLZO pellets. In this paper, a different route based on an auto-consolidation mechanism is provided and discussed. No pressing operations are employed during the whole preparation process. Owing to the surface tension of liquid melted Li2O at the sintering temperature, LLZO particles can aggregate together freely and automatically. The preparation process for dense LLZO is thus greatly simplified. A dense tetragonal LLZO with a high relative density of about 93% has been prepared successfully by this auto-consolidation method, and no voids are observed in the SEM images. At 30 °C, the total conductivity is about 5.67 × 10-5 S cm-1, the highest reported for tetragonal LLZO, even two times higher than that of samples prepared by the hot-pressing method. The activation energy for total conductivity is ∼0.35 eV atom-1 at 30-120 °C, slightly lower than previously reported values. This work sheds light on the understanding of the consolidation mechanism for solid electrolytes and suggests a reliable route to synthesize ceramic solid electrolytes.

  20. Multi-scale curvature for automated identification of glaciated mountain landscapes

    PubMed Central

    Prasicek, Günther; Otto, Jan-Christoph; Montgomery, David R.; Schrott, Lothar

    2014-01-01

    Erosion by glacial and fluvial processes shapes mountain landscapes in a long-recognized and characteristic way. Upland valleys incised by fluvial processes typically have a V-shaped cross-section with uniform and moderately steep slopes, whereas glacial valleys tend to have a U-shaped profile with a changing slope gradient. We present a novel regional approach to automatically differentiate between fluvial and glacial mountain landscapes based on the relation of multi-scale curvature and drainage area. Sample catchments are delineated and multiple moving window sizes are used to calculate per-cell curvature over a variety of scales ranging from the vicinity of the flow path at the valley bottom to catchment sections fully including valley sides. Single-scale curvature can take similar values for glaciated and non-glaciated catchments but a comparison of multi-scale curvature leads to different results according to the typical cross-sectional shapes. To adapt these differences for automated classification of mountain landscapes into areas with V- and U-shaped valleys, curvature values are correlated with drainage area and a new and simple morphometric parameter, the Difference of Minimum Curvature (DMC), is developed. At three study sites in the western United States the DMC thresholds determined from catchment analysis are used to automatically identify 5 × 5 km quadrats of glaciated and non-glaciated landscapes and the distinctions are validated by field-based geological and geomorphological maps. Our results demonstrate that DMC is a good predictor of glacial imprint, allowing automated delineation of glacially and fluvially incised mountain landscapes. PMID:24748703
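The paper's exact DMC formula is not reproduced in the abstract; the sketch below conveys the multi-scale idea using a box-smoothed Laplacian as the curvature proxy (the window sizes, Laplacian stencil, and function names are our simplifications, not the published parameters):

```python
import numpy as np

def smoothed(dem, k):
    """Box-smooth a DEM with a (2k+1) x (2k+1) moving window (edge-padded)."""
    n, m = dem.shape
    pad = np.pad(dem, k, mode='edge')
    win = sum(pad[i:i + n, j:j + m]
              for i in range(2 * k + 1) for j in range(2 * k + 1))
    return win / (2 * k + 1) ** 2

def curvature(dem, k):
    """Curvature proxy at one analysis scale: 5-point Laplacian of the
    scale-k smoothed elevation (interior cells only)."""
    s = smoothed(dem, k)
    return (s[2:, 1:-1] + s[:-2, 1:-1] + s[1:-1, 2:] + s[1:-1, :-2]
            - 4 * s[1:-1, 1:-1])

def dmc(dem, small=1, large=4):
    """Illustrative Difference of Minimum Curvature between two scales:
    U-shaped (glacial) and V-shaped (fluvial) cross-sections diverge in
    how their minimum curvature changes with window size."""
    return curvature(dem, large).min() - curvature(dem, small).min()
```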

  1. Pipeline Reduction of Binary Light Curves from Large-Scale Surveys

    NASA Astrophysics Data System (ADS)

    Prša, Andrej; Zwitter, Tomaž

    2007-08-01

    One of the most important changes in observational astronomy of the 21st Century is a rapid shift from classical object-by-object observations to extensive automatic surveys. As CCD detectors are getting better and their prices are getting lower, more and more small and medium-size observatories are refocusing their attention to detection of stellar variability through systematic sky-scanning missions. This trend is additionally powered by the success of pioneering surveys such as ASAS, DENIS, OGLE, TASS, their space counterpart Hipparcos and others. Such surveys produce massive amounts of data and it is not at all clear how these data are to be reduced and analysed. This is especially striking in the eclipsing binary (EB) field, where most frequently used tools are optimized for object-by-object analysis. A clear need for thorough, reliable and fully automated approaches to modeling and analysis of EB data is thus obvious. This task is very difficult because of limited data quality, non-uniform phase coverage and parameter degeneracy. The talk will review recent advancements in putting together semi-automatic and fully automatic pipelines for EB data processing. Automatic procedures have already been used to process the Hipparcos data, LMC/SMC observations, OGLE and ASAS catalogs etc. We shall discuss the advantages and shortcomings of these procedures and overview the current status of automatic EB modeling pipelines for the upcoming missions such as CoRoT, Kepler, Gaia and others.

  2. Automated object-based classification of topography from SRTM data

    PubMed Central

    Drăguţ, Lucian; Eisank, Clemens

    2012-01-01

    We introduce an object-based method to automatically classify topography from SRTM data. The new method relies on the concept of decomposing land-surface complexity into more homogeneous domains. An elevation layer is automatically segmented and classified at three scale levels that represent domains of complexity by using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance and segmentation is performed at these appropriate scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and standard deviation of elevation respectively. Results reasonably resemble the patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of the classes satisfy the regionalization requirements of maximizing internal homogeneity while minimizing external homogeneity. Most objects have boundaries matching natural discontinuities at the regional level. The method is simple and fully automated. The input data consist of only one layer, which does not need any pre-processing. Both segmentation and classification rely on only two parameters: elevation and standard deviation of elevation. The methodology is implemented as a customized process for the eCognition® software, available as an online download. The results are embedded in a web application with functionalities of visualization and download. PMID:22485060
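The scale-detection step can be sketched generically: compute mean local variance over increasing window sizes and stop where it ceases to grow quickly. This is a simplified stand-in for the self-adaptive procedure, with the window sizes and stopping threshold assumed for illustration.

```python
import numpy as np
from scipy import ndimage

def mean_local_variance(arr, size):
    """Mean over the raster of per-window variance, via E[x^2] - E[x]^2."""
    m = ndimage.uniform_filter(arr, size=size)
    m2 = ndimage.uniform_filter(arr * arr, size=size)
    return float(np.mean(m2 - m * m))

def detect_scale(arr, sizes, rel_change=0.05):
    """Pick the first window size at which local variance stops growing quickly."""
    lv = [mean_local_variance(arr, s) for s in sizes]
    for prev, cur, size in zip(lv, lv[1:], sizes[1:]):
        if (cur - prev) / max(prev, 1e-12) < rel_change:
            return size
    return sizes[-1]

rng = np.random.default_rng(0)
# Synthetic "terrain": smoothed noise with a characteristic feature width
field = ndimage.gaussian_filter(rng.standard_normal((128, 128)), 4)
scale = detect_scale(field, sizes=[3, 5, 9, 17, 33])
```

Local variance within small windows of a smooth surface is low and rises with window size until windows exceed the width of terrain features; the knee of that curve is the "appropriate scale" in the abstract's sense.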

  3. A Hessian-based methodology for automatic surface crack detection and classification from pavement images

    NASA Astrophysics Data System (ADS)

    Ghanta, Sindhu; Shahini Shamsabadi, Salar; Dy, Jennifer; Wang, Ming; Birken, Ralf

    2015-04-01

    Around 3 trillion vehicle miles are traveled annually on the US transportation system alone. In addition to road traffic safety, maintaining the road infrastructure in a sound condition promotes a more productive and competitive economy. Due to the significant amounts of financial and human resources required to detect surface cracks by visual inspection, detection of these surface defects is often delayed, resulting in deferred maintenance operations. This paper introduces an automatic system for acquisition, detection, classification, and evaluation of pavement surface cracks by unsupervised analysis of images collected from a camera mounted on the rear of a moving vehicle. A Hessian-based multi-scale filter has been utilized to detect ridges in these images at various scales. Post-processing on the extracted features has been implemented to produce statistics of length, width, and area covered by cracks, which are crucial for roadway agencies to assess pavement quality. This process has been realized on three sets of roads with different pavement conditions in the city of Brockton, MA. A manually labeled ground-truth dataset is made available to evaluate this algorithm; results show more than 90% segmentation accuracy, demonstrating the feasibility of employing this approach at a larger scale.
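A Hessian-based multi-scale ridge filter of the kind described can be sketched with Gaussian derivative filters: at each scale, the eigenvalues of the 2×2 Hessian are computed per pixel and the scale-normalized ridge response is kept as a maximum over scales. This is a generic eigenvalue-based ridge detector, not the paper's exact filter.

```python
import numpy as np
from scipy import ndimage

def ridge_response(img, sigmas=(1.0, 2.0, 4.0)):
    """Max over scales of the scale-normalized ridge strength for bright ridges."""
    best = np.zeros_like(img)
    for s in sigmas:
        # Second-order Gaussian derivatives (Hessian entries) at scale s
        hxx = ndimage.gaussian_filter(img, s, order=(0, 2))
        hyy = ndimage.gaussian_filter(img, s, order=(2, 0))
        hxy = ndimage.gaussian_filter(img, s, order=(1, 1))
        # Smaller (more negative) eigenvalue of the symmetric 2x2 Hessian
        lam = 0.5 * (hxx + hyy) - np.sqrt(0.25 * (hxx - hyy) ** 2 + hxy ** 2)
        # Bright ridges give strongly negative lam; gamma-normalize with s^2
        best = np.maximum(best, -(s ** 2) * lam)
    return best

img = np.zeros((64, 64))
img[32, 8:56] = 1.0        # synthetic bright crack-like line
resp = ridge_response(img)
```

The response peaks along the line and stays near zero on the background, which is what makes simple thresholding of the filtered image usable for crack segmentation.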

  4. Automated object-based classification of topography from SRTM data

    NASA Astrophysics Data System (ADS)

    Drăguţ, Lucian; Eisank, Clemens

    2012-03-01

    We introduce an object-based method to automatically classify topography from SRTM data. The new method relies on the concept of decomposing land-surface complexity into more homogeneous domains. An elevation layer is automatically segmented and classified at three scale levels that represent domains of complexity by using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance and segmentation is performed at these appropriate scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and standard deviation of elevation respectively. Results resemble reasonably patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of classes satisfy the regionalization requirements of maximizing internal homogeneity while minimizing external homogeneity. Most objects have boundaries matching natural discontinuities at regional level. The method is simple and fully automated. The input data consist of only one layer, which does not need any pre-processing. Both segmentation and classification rely on only two parameters: elevation and standard deviation of elevation. The methodology is implemented as a customized process for the eCognition® software, available as online download. The results are embedded in a web application with functionalities of visualization and download.

  5. Learning-based automatic detection of severe coronary stenoses in CT angiographies

    NASA Astrophysics Data System (ADS)

    Melki, Imen; Cardon, Cyril; Gogin, Nicolas; Talbot, Hugues; Najman, Laurent

    2014-03-01

    3D cardiac computed tomography angiography (CCTA) is becoming a standard routine for non-invasive heart disease diagnosis. Thanks to its high negative predictive value, CCTA is increasingly used to decide whether or not the patient should be considered for invasive angiography. However, an accurate assessment of cardiac lesions using this modality is still a time-consuming task and needs a high degree of clinical expertise. Thus, providing an automatic tool to assist clinicians during the diagnosis task is highly desirable. In this work, we propose a fully automatic approach for accurate severe cardiac stenosis detection. Our algorithm uses Random Forest classification to detect stenotic areas. First, the classifier is trained on 18 CT cardiac exams with a CTA reference standard. Then, the classification result is used to detect severe stenoses (with a narrowing degree higher than 50%) in a database of 30 cardiac CT exams. Features that best capture the different stenosis configurations are extracted along the vessel centerlines at different scales. To ensure robustness against vessel direction and scale changes, we extract features inside cylindrical patterns with variable directions and radii, making sure that the ROIs contain only the vessel walls. The algorithm is evaluated using the Rotterdam Coronary Artery Stenoses Detection and Quantification Evaluation Framework. The evaluation is performed using reference standard quantifications obtained from quantitative coronary angiography (QCA) and consensus reading of CTA. The obtained results show that we can reliably detect severe stenoses with a sensitivity of 64%.

  6. Comparison of filters for concentrating microbial indicators and pathogens in lake-water samples

    USGS Publications Warehouse

    Francy, Donna S.; Stelzer, Erin A.; Brady, Amie M.G.; Huitger, Carrie; Bushon, Rebecca N.; Ip, Hon S.; Ware, Michael W.; Villegas, Eric N.; Gallardo, Vincent; Lindquist, H.D. Alan

    2013-01-01

    Bacterial indicators are used to indicate increased health risk from pathogens and to make beach closure and advisory decisions; however, beaches are seldom monitored for the pathogens themselves. Studies of sources and types of pathogens at beaches are needed to improve estimates of swimming-associated health risks. It would be advantageous and cost-effective, especially for studies conducted on a regional scale, to use a method that can simultaneously filter and concentrate all classes of pathogens from the large volumes of water needed to detect pathogens. In seven recovery experiments, stock cultures of viruses and protozoa were seeded into 10-liter lake water samples, and concentrations of naturally occurring bacterial indicators were used to determine recoveries. For the five filtration methods tested, the highest median recoveries were as follows: glass wool for adenovirus (4.7%); NanoCeram for enterovirus (14.5%) and MS2 coliphage (84%); continuous-flow centrifugation (CFC) plus Virocap (CFC+ViroCap) for Escherichia coli (68.3%) and Cryptosporidium (54%); automatic ultrafiltration (UF) for norovirus GII (2.4%); and dead-end UF for Enterococcus faecalis (80.5%), avian influenza virus (0.02%), and Giardia (57%). In evaluating filter performance in terms of both recovery and variability, the automatic UF resulted in the highest recovery while maintaining low variability for all nine microorganisms. The automatic UF was used to demonstrate that filtration can be scaled up to field deployment and the collection of 200-liter lake water samples.
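Recovery in such seeding experiments is the recovered concentration expressed as a percentage of the seeded concentration, with the median taken across replicate experiments per filter/organism pair. A minimal sketch of that arithmetic, with invented replicate values:

```python
import statistics

def percent_recovery(recovered, seeded):
    """Recovered concentration as a percentage of what was seeded."""
    return 100.0 * recovered / seeded

# Hypothetical (recovered, seeded) concentration pairs for one filter/organism
replicates = [(4.1, 100.0), (6.2, 100.0), (3.9, 100.0), (5.0, 100.0), (4.7, 100.0)]
recoveries = [percent_recovery(r, s) for r, s in replicates]
median_recovery = statistics.median(recoveries)  # median across replicates: 4.7
```

Reporting the median rather than the mean, as the abstract does, keeps a single anomalous replicate from dominating the comparison between filters.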

  7. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
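Fourier descriptors of the kind used as shape features here can be made invariant to position, scale, rotation, and contour starting point by discarding the DC term and keeping normalized coefficient magnitudes. A generic sketch (not the authors' exact feature set):

```python
import numpy as np

def fourier_descriptors(contour_xy, n_coeffs=8):
    """Translation-, scale-, rotation- and start-point-invariant shape features."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]  # contour as a complex signal
    F = np.fft.fft(z)
    mags = np.abs(F[1:n_coeffs + 1])  # drop DC term -> translation invariance
    return mags / mags[0]             # normalize -> scale invariance

t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
ellipse = np.column_stack([2 * np.cos(t), np.sin(t)])

# Same ellipse translated, doubled in size, and rotated by 30 degrees
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
ellipse2 = 2 * ellipse @ R.T + np.array([5.0, -3.0])
```

Rotation multiplies every FFT coefficient by a unit phase and scaling multiplies them by a constant, so the normalized magnitudes of the two contours coincide, which is why such descriptors feed cleanly into a classifier.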

  8. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-21

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  9. Comparison of Filters for Concentrating Microbial Indicators and Pathogens in Lake Water Samples

    PubMed Central

    Stelzer, Erin A.; Brady, Amie M. G.; Huitger, Carrie; Bushon, Rebecca N.; Ip, Hon S.; Ware, Michael W.; Villegas, Eric N.; Gallardo, Vicente; Lindquist, H. D. Alan

    2013-01-01

    Bacterial indicators are used to indicate increased health risk from pathogens and to make beach closure and advisory decisions; however, beaches are seldom monitored for the pathogens themselves. Studies of sources and types of pathogens at beaches are needed to improve estimates of swimming-associated health risks. It would be advantageous and cost-effective, especially for studies conducted on a regional scale, to use a method that can simultaneously filter and concentrate all classes of pathogens from the large volumes of water needed to detect pathogens. In seven recovery experiments, stock cultures of viruses and protozoa were seeded into 10-liter lake water samples, and concentrations of naturally occurring bacterial indicators were used to determine recoveries. For the five filtration methods tested, the highest median recoveries were as follows: glass wool for adenovirus (4.7%); NanoCeram for enterovirus (14.5%) and MS2 coliphage (84%); continuous-flow centrifugation (CFC) plus Virocap (CFC+ViroCap) for Escherichia coli (68.3%) and Cryptosporidium (54%); automatic ultrafiltration (UF) for norovirus GII (2.4%); and dead-end UF for Enterococcus faecalis (80.5%), avian influenza virus (0.02%), and Giardia (57%). In evaluating filter performance in terms of both recovery and variability, the automatic UF resulted in the highest recovery while maintaining low variability for all nine microorganisms. The automatic UF was used to demonstrate that filtration can be scaled up to field deployment and the collection of 200-liter lake water samples. PMID:23263948

  10. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying the image based on features like the complexity of the background and the visibility of the disease (lesions). Therefore, an automatic medical background classification tool for mammograms would help for such clinical studies. This classification tool is based on a multi-content analysis (MCA) framework which was first developed to recognize the image content of computer screen shots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to automatically classify digital mammograms with satisfying accuracy. The BI-RADS (Breast Imaging Reporting Data System) density scale is used for grouping the mammograms; it standardizes the mammography reporting terminology and the assessment and recommendation categories. Selected features are input into a decision tree classification scheme in the MCA framework, the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these "weak classifiers" are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one "strong classifier" show good accuracy with high true-positive rates. For the four categories the results are: TP=90.38%, TN=67.88%, FP=32.12% and FN=9.62%.
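The AdaBoost scheme described, reweighting and combining weak classifiers into a strong one, can be sketched with one-feature threshold stumps. This is generic AdaBoost on synthetic data, not the MCA framework's implementation or its texture features:

```python
import numpy as np

def stump_predict(X, feat, thr, sign):
    """Weak classifier: +/-1 depending on one feature vs. a threshold."""
    return sign * np.where(X[:, feat] > thr, 1, -1)

def best_stump(X, y, w):
    """Exhaustively pick the stump with the lowest weighted error."""
    best, best_err = None, np.inf
    for feat in range(X.shape[1]):
        for thr in np.unique(X[:, feat]):
            for sign in (1, -1):
                err = w[stump_predict(X, feat, thr, sign) != y].sum()
                if err < best_err:
                    best, best_err = (feat, thr, sign), err
    return best, best_err

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        (feat, thr, sign), err = best_stump(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak classifier
        pred = stump_predict(X, feat, thr, sign)
        w = w * np.exp(-alpha * y * pred)       # upweight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, feat, thr, sign))
    return ensemble

def strong_predict(ensemble, X):
    votes = sum(a * stump_predict(X, f, t, s) for a, f, t, s in ensemble)
    return np.where(votes >= 0, 1, -1)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # diagonal boundary no single stump fits
model = adaboost(X, y, rounds=40)
accuracy = (strong_predict(model, X) == y).mean()
```

A single axis-aligned stump tops out near 75% on this diagonal boundary; the weighted vote of many stumps recovers it to high accuracy, which is the "weak to strong" step the abstract relies on.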

  11. Consumers' convenience orientation towards meal preparation: conceptualization and measurement.

    PubMed

    Candel, M

    2001-02-01

    Consumer researchers consider convenience orientation towards meal preparation to be a relevant construct for understanding consumer behavior towards foods. This study set out to conceptualize this construct and to develop a scale that measures it. As examined in two different samples of meal preparers, the resulting scale is reliable, satisfies a unifactorial structure and has satisfactory convergent validity. The scale's nomological validity is supported in that it conforms to expectations regarding various psychographic constructs and various food-related behaviors. Convenience orientation was found to be negatively related to cooking enjoyment, involvement with food products and variety seeking, and to be positively related to role overload. The analyses also suggest that the lack of relation between the meal preparer's working status and convenience food consumption, as found in many studies, is due to convenience food not offering enough preparation convenience. Consuming take-away meals and eating in restaurants appear to satisfy the consumer's need for convenience more adequately. Copyright 2001 Academic Press.
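Scale reliability of the kind reported here is conventionally summarized with Cronbach's alpha, k/(k−1)·(1 − Σ item variances / variance of the summed score). A minimal sketch on invented item data (the item structure below is hypothetical, not the study's questionnaire):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of scale responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(7)
trait = rng.normal(size=200)  # latent convenience orientation per respondent
# Five hypothetical items: shared trait plus item-specific noise
items = trait[:, None] + 0.5 * rng.normal(size=(200, 5))
alpha = cronbach_alpha(items)
```

Items that all track the same latent trait yield alpha near 1, which is the quantitative sense in which a unifactorial scale is "reliable".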

  12. Reconstruction of Mammary Gland Structure Using Three-Dimensional Computer-Based Microscopy

    DTIC Science & Technology

    2004-08-01

    Fragmentary publication list from the report: "…for image analysis in cytology", Ortiz de Solorzano C., Malladi R., Lockett S., in: Geometric Methods in Bio-Medical Image Processing (Ravikanth Malladi, ed.); Deschamps T., Idica A.K., Malladi R., Ortiz de Solorzano C., Journal of Biomedical Optics 9(3):445-453, 2004; manuscripts in preparation, including "Three…"; Deschamps T., Idica A.K., Malladi R., Ortiz de Solorzano C., Proceedings of Photonics West 2003, Vol. 4964, 2003; "Automatic … segmentation".

  13. Design and fabrication of a prototype for an automatic transport system for transferring human and other wastes to an incinerator unit onboard spacecraft, phase A

    NASA Technical Reports Server (NTRS)

    Labak, L. J.; Remus, G. A.; Mansnerus, R.

    1971-01-01

    Three transport system concepts were experimentally evaluated for transferring human and nonhuman wastes from a collection site to an incineration unit onboard spacecraft. The operating parameters, merits, and shortcomings of a porous-pneumatic, nozzle-pneumatic, and a mechanical screw-feed system were determined. An analysis of the test data was made and a preliminary design of two prototype systems was prepared.

  14. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  15. Automatic classification of blank substrate defects

    NASA Astrophysics Data System (ADS)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since this mask is to later act as a template for a considerable number of dies on the wafer. Defects on the initial blank substrate, and subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical defects or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and distinguishing defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment compared to the alternative of manual defect classification by trained personnel [2].
This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask Technology Center (MPMask). The Calibre ADC tool was qualified on production mask blanks against manual classification. The classification accuracy of ADC is greater than 95% for critical defects, with an overall accuracy of 90%. Sensitivity to weak defect signals and accurate localization of defects in the review images remain challenges we are resolving. The performance of the tool has been demonstrated on multiple mask types and is ready for deployment in the full-volume mask manufacturing production flow. Implementation of Calibre ADC is estimated to reduce the misclassification of critical defects by 60-80%.

  16. Flexibility of Expressive Timing in Repeated Musical Performances

    PubMed Central

    Demos, Alexander P.; Lisboa, Tânia; Chaffin, Roger

    2016-01-01

    Performances by soloists in the Western classical tradition are normally highly prepared, yet must sound fresh and spontaneous. How do musicians manage this? We tested the hypothesis that they achieve the necessary spontaneity by varying the musical gestures that express their interpretation of a piece. We examined the tempo arches produced by final slowing at the ends of phrases in performances of J. S. Bach’s Suite No. 6 (Prelude) for solo cello (12 performances) and the Italian Concerto (Presto) for solo piano (eight performances). The performances were given by two experienced concert soloists during a short time period (3½ months for the Prelude, 2 weeks for the Presto) after completing their preparations for public performance. We measured the tempo of each bar or half-bar, and the stability of tempo across performances (difference of the tempo of each bar/half bar from each of the other performances). There were phrase arches for both tempo and stability with slower, less stable tempi at beginnings and ends of phrases and faster, more stable tempi mid-phrase. The effects of practice were complex. Tempo decreased overall with practice, while stability increased in some bars and decreased in others. One effect of practice may be to imbue well-learned, automatic motor sequences with freshness and spontaneity through cognitive control at phrase boundaries where slower tempi and decreased stability provide opportunities for slower cognitive processes to modulate rapid automatic motor sequences. PMID:27757089
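The stability measure described, the difference of each bar's tempo from each of the other performances, can be written as a mean absolute pairwise difference per bar. A sketch with invented tempo values (not the study's data):

```python
import numpy as np

def per_bar_stability(tempos):
    """tempos: (n_performances, n_bars) array of per-bar tempi.
    For each bar, average |tempo difference| over all performance pairs;
    larger values mean less stable tempo across performances."""
    n = tempos.shape[0]
    diffs = np.abs(tempos[:, None, :] - tempos[None, :, :])  # (n, n, bars)
    return diffs.sum(axis=(0, 1)) / (n * (n - 1))

# Hypothetical tempi (bpm), 3 performances x 4 bars; bar 0 (phrase start) varies most
tempos = np.array([
    [60.0, 72.0, 74.0, 62.0],
    [66.0, 72.5, 73.5, 65.0],
    [54.0, 71.5, 74.5, 59.0],
])
stability = per_bar_stability(tempos)
```

The phrase-boundary bars come out with larger values than the mid-phrase bars, mirroring the "slower, less stable at phrase ends" pattern the abstract reports.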

  17. Syntax diagrams for body wave nomenclature, with generalizations for terrestrial planets

    NASA Astrophysics Data System (ADS)

    Knapmeyer, M.

    2003-04-01

    The Apollo network on the Moon constitutes the beginning of planetary seismology. In the next few decades, we may see seismometers deployed on the Moon again, on Mars, and perhaps on other terrestrial planets or satellites. Any seismological software for computation of body wave travel times on other planets should be highly versatile and be prepared for a huge variety of velocity distributions and internal structures. A suite of trial models for a planet might, for example, contain models with and without solid inner cores. It would then be useful if the software could detect physically meaningless phase names automatically without actually carrying out any computation. It would also be useful if the program were prepared to deal with features like fully solid cores, internal oceans, and varying depths of mineralogical phase changes like the olivine-spinel transition. Syntax diagrams are a standard method to describe the syntax of programming languages. They represent a graphical way to define which letter or phrase is allowed to follow a given sequence of letters. Syntax diagrams may be stored in data structures that allow automatic evaluation of a given letter sequence. Such diagrams are presented here for a generalized body wave nomenclature. Generalizations are made to overcome earth-specific notations which incorporate discontinuity depths into phase names or to distinguish olivine transitions from ice-ice transitions (as expected on the Galilean Satellites).

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
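The spike-addition calibration amounts to deriving a measurement efficiency from the count-rate increase produced by a known 99Tc spike, then converting sample counts to an activity concentration. An illustrative sketch with invented readings (the numbers and function names are assumptions, not the instrument's actual values):

```python
def measurement_efficiency(sample_cps, spiked_cps, spike_activity_bq):
    """Counts per second gained per Bq of added 99Tc spike."""
    return (spiked_cps - sample_cps) / spike_activity_bq

def activity_concentration(sample_cps, background_cps, efficiency, volume_ml):
    """Bq/mL of 99Tc in the processed sample aliquot."""
    net_cps = sample_cps - background_cps
    return net_cps / efficiency / volume_ml

# Hypothetical monitor readings
eff = measurement_efficiency(sample_cps=40.0, spiked_cps=90.0,
                             spike_activity_bq=100.0)      # 0.5 cps/Bq
conc = activity_concentration(40.0, 2.0, eff, volume_ml=0.495)
```

Because the efficiency is re-derived from each spike addition, it doubles as the self-diagnostic parameter the abstract mentions: a drifting efficiency flags a problem in the separation or detector.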

  19. Application of the Repetitions in Reserve-Based Rating of Perceived Exertion Scale for Resistance Training

    PubMed Central

    Cronin, John; Storey, Adam; Zourdos, Michael C.

    2016-01-01

    Ratings of perceived exertion are a valid method of estimating the intensity of a resistance training exercise or session. Scores are given after completion of an exercise or training session for the purposes of athlete monitoring. However, a newly developed scale based on how many repetitions are remaining at the completion of a set may be a more precise tool. This approach adjusts loads automatically to match athlete capabilities on a set-to-set basis and may more accurately gauge intensity at near-limit loads. This article outlines how to incorporate this novel scale into a training plan. PMID:27531969
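The repetitions-in-reserve (RIR) scale maps roughly as RPE = 10 − RIR, and the autoregulation idea is to nudge the next set's load toward a target RIR. A simplified sketch; the percent adjustment step is invented for illustration, not a prescription from the article:

```python
def rir_to_rpe(rir):
    """RIR-based RPE: 0 reps in reserve -> RPE 10, 1 -> RPE 9, and so on."""
    return max(0.0, 10.0 - rir)

def next_load(current_load_kg, reported_rir, target_rir, step_pct=2.5):
    """Raise the load when the lifter had more reps in reserve than targeted,
    lower it when the set was harder than intended (hypothetical step size)."""
    delta = reported_rir - target_rir
    return current_load_kg * (1 + step_pct / 100.0 * delta)

# Set felt easier than planned (3 RIR vs. a target of 1): load rises ~5%
load = next_load(100.0, reported_rir=3, target_rir=1)
```

This is the set-to-set adjustment logic the abstract alludes to: the scale reading itself, not a pre-planned percentage, drives the next load.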

  20. Oil Spill Detection and Tracking Using Lipschitz Regularity and Multiscale Techniques in Synthetic Aperture Radar Imagery

    NASA Astrophysics Data System (ADS)

    Ajadi, O. A.; Meyer, F. J.

    2014-12-01

    Automatic oil spill detection and tracking from Synthetic Aperture Radar (SAR) images is a difficult task, due in large part to the inhomogeneous properties of the sea surface, the high level of speckle inherent in SAR data, the complexity and the highly non-Gaussian nature of amplitude information, and the low temporal sampling that is often achieved with SAR systems. This research presents a promising new oil spill detection and tracking method that is based on time series of SAR images. Through the combination of a number of advanced image processing techniques, the developed approach is able to mitigate some of the previously mentioned limitations of SAR-based oil-spill detection and enables fully automatic spill detection and tracking across a wide range of spatial scales. The method combines an initial automatic texture analysis with a consecutive change detection approach based on multi-scale image decomposition. The first step of the approach, a texture transformation of the original SAR images, is performed in order to normalize the ocean background and enhance the contrast between oil-covered and oil-free ocean surfaces. The Lipschitz regularity (LR), a local texture parameter, is used here due to its proven ability to normalize the reflectivity properties of ocean water and maximize the visibility of oil in water. To calculate LR, the images are decomposed using the two-dimensional continuous wavelet transform (2D-CWT) and transformed into Hölder space to measure LR. After texture transformation, the now normalized images are inserted into our multi-temporal change detection algorithm. The multi-temporal change detection approach is a two-step procedure including (1) data enhancement and filtering and (2) multi-scale automatic change detection. The performance of the developed approach is demonstrated by an application to oil spill areas in the Gulf of Mexico.
In this example, areas affected by oil spills were identified from a series of ALOS PALSAR images acquired in 2010. The comparison showed exceptional performance of our method. This method can be applied to emergency management and decision support systems with a need for real-time data, and it shows great potential for rapid data analysis in other areas, including volcano detection, flood boundaries, forest health, and wildfires.
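The core of the texture step, estimating a per-pixel regularity exponent from how detail energy decays across scales, can be sketched in a few lines. This is an illustrative simplification, not the authors' 2D-CWT implementation: it substitutes Gaussian high-pass residuals for wavelet detail bands and fits the log-log decay slope at each pixel, and the function name and scale set are our assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_regularity(image, scales=(1.0, 2.0, 4.0, 8.0)):
    """Per-pixel regularity estimate: slope of log(detail energy) versus
    log(scale). Smooth regions (e.g. oil-damped water) and rough,
    wind-ruffled water decay differently across scales."""
    image = np.asarray(image, dtype=float)
    logs = []
    for s in scales:
        # High-pass residual at scale s stands in for a wavelet detail band.
        detail = np.abs(image - gaussian_filter(image, s)) + 1e-12
        logs.append(np.log(detail))
    logs = np.stack(logs)                      # (n_scales, H, W)
    x = np.log(np.asarray(scales))
    x_c = x - x.mean()
    # Closed-form least-squares slope at every pixel simultaneously.
    num = np.tensordot(x_c, logs - logs.mean(axis=0), axes=(0, 0))
    return num / (x_c ** 2).sum()
```

Thresholding or clustering this regularity map is then what separates candidate slick pixels from the normalized ocean background before the change detection stage.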

  1. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social-media-based event detection, and automatic association of different seismic detection data into earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking for a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced-sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. Finally, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
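As a toy illustration of the association task, arrival picks can be grouped into candidate events by time proximity. The `Pick` class, the fixed gap window, and the station-count threshold are hypothetical simplifications; an operational associator such as the NEIC's also uses predicted travel times, slowness, and locations.

```python
from dataclasses import dataclass

@dataclass
class Pick:
    station: str
    time: float  # arrival time in seconds

def associate(picks, window=10.0, min_picks=3):
    """Group time-sorted picks into candidate events: a gap larger than
    `window` seconds closes the current group, and a group is kept only
    if at least `min_picks` distinct stations contributed."""
    events, current = [], []
    for p in sorted(picks, key=lambda p: p.time):
        if current and p.time - current[-1].time > window:
            if len({q.station for q in current}) >= min_picks:
                events.append(current)
            current = []
        current.append(p)
    if current and len({q.station for q in current}) >= min_picks:
        events.append(current)
    return events
```

The station-count filter is what keeps isolated noise picks from being promoted to events, mirroring the completeness-versus-false-detection trade-off mentioned above.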

  2. FOCIH: Form-Based Ontology Creation and Information Harvesting

    NASA Astrophysics Data System (ADS)

    Tao, Cui; Embley, David W.; Liddle, Stephen W.

    Creating an ontology and populating it with data are both labor-intensive tasks requiring a high degree of expertise. Thus, scaling ontology creation and population to the size of the web in an effort to create a web of data—which some see as Web 3.0—is prohibitive. Can we find ways to streamline these tasks and lower the barrier enough to enable Web 3.0? Toward this end we offer a form-based approach to ontology creation that provides a way to create Web 3.0 ontologies without the need for specialized training. And we offer a way to semi-automatically harvest data from the current web of pages for a Web 3.0 ontology. In addition to harvesting information with respect to an ontology, the approach also annotates web pages and links facts in web pages to ontological concepts, resulting in a web of data superimposed over the web of pages. Experience with our prototype system shows that mappings between conceptual-model-based ontologies and forms are sufficient for creating the kind of ontologies needed for Web 3.0, and experiments with our prototype system show that automatic harvesting, automatic annotation, and automatic superimposition of a web of data over a web of pages work well.

  3. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Robert

    2013-04-20

    Enhancing the performance of SciDAC applications on petascale systems had high priority within DOE SC at the start of the second phase of the SciDAC program, SciDAC-2, as it continues to do so today. Achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, the University of Southern California's Information Sciences Institute organized the Performance Engineering Research Institute (PERI). PERI implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. Within PERI, USC's primary research activity was automatic tuning (autotuning) of scientific software. This activity was spurred by the strong user preference for automatic tools and was based on previous successful activities such as ATLAS, which automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our other major component was application engagement, to which we devoted approximately 30% of our effort to work directly with SciDAC-2 applications. This report is a summary of the overall results of the USC PERI effort.

  4. Automatic lumbar spine measurement in CT images

    NASA Astrophysics Data System (ADS)

    Mao, Yunxiang; Zheng, Dong; Liao, Shu; Peng, Zhigang; Yan, Ruyi; Liu, Junhua; Dong, Zhongxing; Gong, Liyan; Zhou, Xiang Sean; Zhan, Yiqiang; Fei, Jun

    2017-03-01

    Accurate lumbar spine measurement in CT images provides an essential way to quantitatively analyze spinal diseases such as spondylolisthesis and scoliosis. In today's clinical workflow, the measurements are manually performed by radiologists and surgeons, which is time-consuming and irreproducible. Therefore, an automatic and accurate lumbar spine measurement algorithm is highly desirable. In this study, we propose a method to automatically calculate five different lumbar spine measurements in CT images. There are three main stages in the proposed method: First, a learning-based spine labeling method, which integrates both image appearance and spine geometry information, is used to detect the lumbar and sacrum vertebrae in CT images. Then, a multi-atlas image segmentation method is used to segment each lumbar vertebra and the sacrum based on the detection result. Finally, measurements are derived from the segmentation result of each vertebra. Our method has been evaluated on 138 spinal CT scans to automatically calculate five widely used clinical spine measurements. Experimental results show that our method can achieve success rates of more than 90% across all the measurements. Our method also significantly improves measurement efficiency compared to manual measurements. Besides benefiting the routine clinical diagnosis of spinal diseases, our method also enables large-scale data analytics for scientific and clinical research.
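A common way to fuse the per-atlas segmentations produced in the second stage is per-voxel majority voting. The sketch below is a generic illustration under that assumption, not necessarily the paper's exact fusion rule:

```python
import numpy as np

def majority_vote(label_maps):
    """Fuse propagated atlas segmentations by per-voxel majority vote.
    `label_maps` is a list of integer label arrays of identical shape,
    one per registered atlas; returns the most frequent label per voxel."""
    stack = np.stack(label_maps)               # (n_atlases, ...)
    n_labels = int(stack.max()) + 1
    # Count votes for each label, then take the argmax over labels.
    votes = np.stack([(stack == k).sum(axis=0) for k in range(n_labels)])
    return votes.argmax(axis=0)
```

Weighted variants (e.g. weighting each atlas by local image similarity) follow the same pattern with non-uniform votes.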

  5. preAssemble: a tool for automatic sequencer trace data processing.

    PubMed

    Adzhubei, Alexei A; Laerdahl, Jon K; Vlasova, Anna V

    2006-01-17

    Trace or chromatogram files (raw data) are produced by automatic nucleic acid sequencing equipment, or sequencers. Each file contains information which can be interpreted by specialised software to reveal the sequence (base calling). This is done by the sequencer's proprietary software or by publicly available programs. Depending on the size of a sequencing project, the number of trace files can vary from just a few to thousands of files. Sequence quality assessment against various criteria is important at the stage preceding clustering and contig assembly. Two major publicly available packages, Phred and Staden, are used by preAssemble to perform sequence quality processing. The preAssemble pre-assembly sequence processing pipeline has been developed for small- to large-scale automatic processing of DNA sequencer chromatogram (trace) data. The Staden Package Pregap4 module and the base-calling program Phred are utilized in the pipeline, which produces detailed and self-explanatory output that can be displayed with a web browser. preAssemble can be used successfully with very little previous experience; however, options for parameter tuning are provided for advanced users. preAssemble runs under UNIX and LINUX operating systems. It is available for download and will run as stand-alone software. It can also be accessed on the Norwegian Salmon Genome Project web site, where preAssemble jobs can be run on the project server. preAssemble is a tool for assessing the quality of sequences generated by automatic sequencing equipment. preAssemble is flexible, since both interactive jobs on the preAssemble server and the stand-alone downloadable version are available. Virtually no previous experience is necessary to run a default preAssemble job; on the other hand, options for parameter tuning are provided. Consequently, preAssemble can be used as efficiently for just a few trace files as for large-scale sequence processing.
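The quality-screening stage can be illustrated with a minimal Phred-score clipper. This is a deliberately simplified stand-in for the trimming logic in Phred/Pregap4; the function name and default threshold are our assumptions.

```python
def trim_by_quality(quals, threshold=20):
    """Return (start, end) half-open bounds of the longest run of bases
    whose Phred quality score is >= threshold, i.e. the region worth
    passing on to clustering and contig assembly."""
    best = (0, 0)
    start = None
    for i, q in enumerate(list(quals) + [-1]):   # sentinel closes the last run
        if q >= threshold and start is None:
            start = i
        elif q < threshold and start is not None:
            if i - start > best[1] - best[0]:
                best = (start, i)
            start = None
    return best
```

Production trimmers typically use windowed averages or Mott's modified algorithm rather than a hard per-base cutoff, but the effect, clipping low-confidence read ends, is the same.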

  6. Differences between autogenous and reactive obsessions in terms of metacognitions and automatic thoughts.

    PubMed

    Keleş Altun, İlkay; Uysal, Emel; Özkorumak Karagüzel, Evrim

    2017-01-01

    Obsessive compulsive disorder (OCD) is characterized by obsessions and compulsions. Obsessions have been classified as autogenous obsessions and reactive obsessions on the basis of the cognitive theory of Lee and Kwon. The aim of this study was to investigate the differences between autogenous groups (AG) and reactive groups (RG) in terms of metacognition and automatic thoughts, in order to explore differences in cognitive appraisals. One hundred and thirty-three patients diagnosed with OCD were included in the study as the patient group. A control group was formed of 133 age-, gender- and education-matched healthy individuals. The OCD patients were separated into subgroups according to their primary obsessions. The sociodemographic data and the Yale-Brown Obsessive Compulsive Scale, Metacognition Questionnaire-30 (MCQ-30), Automatic Thoughts Questionnaire (ATQ), Beck Depression Inventory (BDI) and Beck Anxiety Inventory (BAI) scores of the AG, RG, and control groups were compared. Scores on the MCQ-30 (total), the MCQ-30 subscales and the ATQ were significantly higher in the AG than in the RG, and significantly higher in the RG than in the control group. In the reactive obsession group, the predictive variables of the ATQ scores were determined to be the MCQ-30 (total), BDI and BAI. In the autogenous obsession group, the predictive variables of the ATQ scores were determined to be the BDI and BAI. In the current study, differences were determined between the AG and the RG in respect of metacognitions and automatic thoughts. In light of these results, the recommended grouping can be considered useful in the identification of OCD sub-types. There is a need for further studies to identify more homogeneous sub-types of OCD. Future multi-centered studies of sub-typing with larger samples, using more specific instruments for sub-typing and dimensional evaluation, will be useful for detailed evaluation and better understanding of the subject.

  7. Why not the best? Social anxiety symptoms and perfectionism among Israeli Jews and Arabs: a comparative study.

    PubMed

    Iancu, I; Bodner, E; Joubran, S; Ben Zion, I; Ram, E

    2015-05-01

    Social Anxiety Disorder (SAD) has been repeatedly shown to be very prevalent in Western society and is characterized by low self-esteem, pessimism, procrastination and also perfectionism. Very few studies on SAD have been done in the Middle East or in Arab countries, and no study has tackled the relationship between social anxiety symptoms and perfectionism in non-Western samples. We examined social anxiety symptoms and perfectionism in a group of 132 Israeli Jewish (IJ) and Israeli Arab (IA) students. Subjects completed the Liebowitz Social Anxiety Scale (LSAS), the Multidimensional Perfectionism Scale (MPS), the Negative Automatic Thoughts Questionnaire (ATQ-N), the Positive Automatic Thoughts Questionnaire (ATQ-P) and a socio-demographic questionnaire. The rate of SAD in our sample, according to a LSAS score of 60 or more, was 17.2% (IJ=13.8%, IA=19%, ns). The correlation between perfectionism and the LSAS was high in both groups, particularly in the IJ group. The IA group had higher scores for social avoidance, the ATQ-P and two of the MPS subscales: parental expectations and parental criticism. Concern over mistakes and negative automatic thoughts positively predicted social fear in the IJ group, whereas in the IA group being female, religious and less educated positively predicted social fear. Negative automatic thoughts and age positively predicted social avoidance in the IJ group. In general, the IJ and IA subjects showed higher social anxiety, higher ATQ-N scores and lower parental expectations as compared with non-clinical US samples. Social anxiety symptoms and perfectionism are prevalent among Arab and Jewish students in Israel and seem to be closely related. Further studies among non-Western minority groups may detect cultural influences on social anxiety and might add to the growing body of knowledge on this intriguing condition. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models

    NASA Astrophysics Data System (ADS)

    Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.

    2012-04-01

    The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies, each of which suffers from its own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise this separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. The discretisation of the continuous adjoint is also not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high-level language UFL (Unified Form Language) and to automatically generate the model using the software of the FEniCS project. In this approach it is the high-level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However, since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy, is managed by libadjoint [1]. 
In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation process as a sequence of discrete equations which are assembled and solved. It is the coupling of the respective abstractions employed by libadjoint and the FEniCS project which produces the adjoint model automatically, without further intervention from the model developer. This presentation will demonstrate this new technology through linear and non-linear shallow water test cases. The exceptionally simple model syntax will be highlighted and the correctness of the resulting adjoint simulations will be demonstrated using rigorous convergence tests.
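The payoff of an adjoint model, that a single extra solve yields the gradient of an objective with respect to a model parameter, can be shown on a tiny discrete example. This numpy sketch is ours and is unrelated to the UFL/FEniCS/libadjoint machinery; it only illustrates the underlying identity for a linear forward model A u = m * b0 and objective J(u) = sum(u).

```python
import numpy as np

def forward(A, b0, m):
    """Forward model: solve A u = m * b0, then evaluate J(u) = sum(u)."""
    u = np.linalg.solve(A, m * b0)
    return u.sum()

def adjoint_gradient(A, b0):
    """One adjoint solve A^T lam = dJ/du gives dJ/dm directly,
    without re-running the forward model per parameter."""
    c = np.ones(A.shape[0])          # dJ/du for J = sum(u)
    lam = np.linalg.solve(A.T, c)    # adjoint equation
    return lam @ b0                  # chain rule: dJ/dm = lam . d(m*b0)/dm
```

The same structure, assemble-and-solve per discrete equation, is the abstraction libadjoint exploits, with nonlinear models handled by linearizing about the stored forward trajectory.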

  9. Electron Density Profiles of the Topside Ionosphere

    NASA Technical Reports Server (NTRS)

    Huang, Xue-Qin; Reinsch, Bodo W.; Bilitza, Dieter; Benson, Robert F.

    2002-01-01

    The existing uncertainties about the electron density profiles in the topside ionosphere, i.e., in the height region from hmF2 to ~2000 km, require the search for new data sources. The ISIS and Alouette topside sounder satellites recorded millions of ionograms from the sixties to the eighties, but most were not analyzed in terms of electron density profiles. In recent years an effort started to digitize the analog recordings to prepare the ionograms for computerized analysis. As of November 2001, about 350,000 ionograms had been digitized from the original 7-track analog tapes. These data are available in binary and CDF format from the anonymous ftp site of the National Space Science Data Center. A search site and browse capabilities on CDAWeb assist the scientific usage of these data. All information and access links can be found at http://nssdc.gsfc.nasa.gov/space/isis/isis-status.html. This paper describes the ISIS data restoration effort and shows how the digital ionograms are automatically processed into electron density profiles from satellite orbit altitude (1400 km for ISIS-2) down to the F peak. Because of the large volume of data, an automated processing algorithm is imperative. The TOPside Ionogram Scaler with True height algorithm (TOPIST) software developed for this task successfully scales ~70% of the ionograms. A manual editing mode is available for scaling the more difficult ionograms. The automated processing of the digitized ISIS ionograms is now underway, producing a much-needed database of topside electron density profiles for ionospheric modeling covering more than one solar cycle.

  10. Large-scale preparation of clove essential oil and eugenol-loaded liposomes using a membrane contactor and a pilot plant.

    PubMed

    Sebaaly, Carine; Greige-Gerges, Hélène; Agusti, Géraldine; Fessi, Hatem; Charcosset, Catherine

    2016-01-01

    Based on our previous study, where optimal conditions were defined to encapsulate clove essential oil (CEO) into liposomes at laboratory scale, we scaled up the preparation of CEO- and eugenol (Eug)-loaded liposomes using a membrane contactor (600 mL) and a pilot plant (3 L), both based on the principle of the ethanol injection method and both equipped with a Shirasu Porous Glass membrane for injection of the organic phase into the aqueous phase. Homogeneous, stable, nanometric-sized and multilamellar liposomes with high phospholipid and Eug loading rates and high encapsulation efficiency of CEO components were obtained. Saturation of phospholipids and the drug concentration in the organic phase may control liposome stability. Liposomes loaded with other hydrophobic volatile compounds could be prepared at large scale using the ethanol injection method and a membrane for injection.

  11. Data Intensive Systems (DIS) Benchmark Performance Summary

    DTIC Science & Technology

    2003-08-01

    models assumed by today’s conventional architectures. Such applications include model-based Automatic Target Recognition (ATR), synthetic aperture radar (SAR) codes, large-scale dynamic databases/battlefield integration, dynamic sensor-based processing, high-speed cryptanalysis, high-speed...distributed interactive and data-intensive simulations, data-oriented problems characterized by pointer-based and other highly irregular data structures

  12. SWAT Check: A screening tool to assist users in the identification of potential model application problems

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) is a basin scale hydrologic model developed by the US Department of Agriculture-Agricultural Research Service. SWAT's broad applicability, user friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new u...

  13. Multiple Object Retrieval in Image Databases Using Hierarchical Segmentation Tree

    ERIC Educational Resources Information Center

    Chen, Wei-Bang

    2012-01-01

    The purpose of this research is to develop a new visual information analysis, representation, and retrieval framework for automatic discovery of salient objects of user's interest in large-scale image databases. In particular, this dissertation describes a content-based image retrieval framework which supports multiple-object retrieval. The…

  14. Predicting habits of vegetable parenting practices to facilitate the design of change programmes

    USDA-ARS?s Scientific Manuscript database

    Habit has been defined as the automatic performance of a usual behaviour. The present paper reports the relationships of variables from a Model of Goal Directed Behavior to four scales in regard to parents' habits when feeding their children: habit of (i) actively involving child in selection of veg...

  15. Self-Correcting Electronically-Scanned Pressure Sensor

    NASA Technical Reports Server (NTRS)

    Gross, C.; Basta, T.

    1982-01-01

    High-data-rate sensor automatically corrects for temperature variations. Multichannel, self-correcting pressure sensor can be used in wind tunnels, aircraft, process controllers and automobiles. Offers data rates approaching 100,000 measurements per second with inaccuracies due to temperature shifts held below 0.25 percent (nominal) of full scale over a temperature span of 55 degrees C.
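A self-correcting channel of this kind can be approximated by fitting the temperature-induced error at calibration points and subtracting the prediction at run time. The polynomial form, function names, and calibration values below are illustrative assumptions, not the sensor's actual correction scheme.

```python
import numpy as np

def fit_drift(cal_temps, cal_errors, deg=2):
    """Fit one channel's calibration error (reading minus true pressure)
    as a polynomial in temperature."""
    return np.polyfit(cal_temps, cal_errors, deg)

def correct(reading, temp, coeffs):
    """Self-correct a reading by subtracting the predicted thermal error."""
    return reading - np.polyval(coeffs, temp)
```

In a multichannel sensor each channel would carry its own coefficient set, determined once over the rated temperature span (here, 55 degrees C).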

  16. Validation of Automated Scoring of Oral Reading

    ERIC Educational Resources Information Center

    Balogh, Jennifer; Bernstein, Jared; Cheng, Jian; Van Moere, Alistair; Townshend, Brent; Suzuki, Masanori

    2012-01-01

    A two-part experiment is presented that validates a new measurement tool for scoring oral reading ability. Data collected by the U.S. government in a large-scale literacy assessment of adults were analyzed by a system called VersaReader that uses automatic speech recognition and speech processing technologies to score oral reading fluency. In the…

  17. Automatic Scaling of Digisonde Ionograms Test and Evaluation Report.

    DTIC Science & Technology

    1982-09-01

    21. |Manual MUF - Auto MUF|/Manual MUF, April 1980, Goose Bay, Labrador
    22. |Manual MUF - Auto MUF|/Manual MUF, July 1980, Goose Bay, Labrador
    23. |Manual MUF - Auto MUF|/Manual MUF, September 1980, Goose Bay, Labrador
    24. |Manual M(3000) - Auto M(3000)|, January 1980, Goose Bay, Labrador

  18. Defense Horizons. Number 11, April 2002. Computer Games and the Military: Two Views

    DTIC Science & Technology

    2002-04-01

    environmental knowledge). As a design and engineering challenge, Star Wars Galaxies rivals the construction of a space station in its sheer scale and... Rangers. (Picture Mark Messier and Ken Belanger, running down the halls with automatic weapons, out for blood—it was only a matter of time.) 8 Clan

  19. Multiscale CNNs for Brain Tumor Segmentation and Diagnosis.

    PubMed

    Zhao, Liya; Jia, Kebin

    2016-01-01

    Early brain tumor detection and diagnosis are critical in the clinic. Segmentation of the focal tumor area therefore needs to be accurate, efficient, and robust. In this paper, we propose an automatic brain tumor segmentation method based on Convolutional Neural Networks (CNNs). Traditional CNNs focus only on local features and ignore global region features, both of which are important for pixel classification and recognition. Besides, a brain tumor can appear in any place in the brain and be of any size and shape. We design a three-stream framework, named multiscale CNNs, which automatically detects the optimal top three scales of the image sizes and combines information from the differently scaled regions around each pixel. Datasets provided by the Multimodal Brain Tumor Image Segmentation Benchmark (BRATS) organized by MICCAI 2013 are utilized for both training and testing. The designed multiscale CNNs framework also combines multimodal features from T1, T1-enhanced, T2, and FLAIR MRI images. By comparison with traditional CNNs and the best two methods in BRATS 2012 and 2013, our framework shows improvements in brain tumor segmentation accuracy and robustness.
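The multiscale input construction, same pixel, growing context, fixed input size per stream, can be sketched as follows. The patch size, scale set, and average-pooling choice are our assumptions for illustration, not the paper's exact preprocessing, and the pixel is assumed to lie far enough from the image border.

```python
import numpy as np

def multiscale_patches(image, row, col, patch=8, scales=(1, 2, 4)):
    """For one pixel, extract square neighbourhoods of growing extent
    (patch*scale on a side) and downsample each back to patch x patch,
    so every stream of the network sees the same input size but a
    different amount of surrounding context."""
    half = patch // 2
    out = []
    for s in scales:
        r0, c0 = row - half * s, col - half * s
        win = image[r0:r0 + patch * s, c0:c0 + patch * s]
        # Average-pool s x s blocks down to patch x patch.
        out.append(win.reshape(patch, s, patch, s).mean(axis=(1, 3)))
    return np.stack(out)
```

Each of the three arrays would feed one stream of the network; their feature maps are then combined for the per-pixel tumor/non-tumor decision.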

  20. Visual Iconicity Across Sign Languages: Large-Scale Automated Video Analysis of Iconic Articulators and Locations

    PubMed Central

    Östling, Robert; Börstell, Carl; Courtaux, Servane

    2018-01-01

    We use automatic processing of 120,000 sign videos in 31 different sign languages to show a cross-linguistic pattern for two types of iconic form–meaning relationships in the visual modality. First, we demonstrate that the degree of inherent plurality of concepts, based on individual ratings by non-signers, strongly correlates with the number of hands used in the sign forms encoding the same concepts across sign languages. Second, we show that certain concepts are iconically articulated around specific parts of the body, as predicted by the associational intuitions of non-signers. The implications of our results are both theoretical and methodological. With regard to theoretical implications, we corroborate previous research by demonstrating and quantifying, using a much larger body of material than previously available, the iconic nature of languages in the visual modality. As for the methodological implications, we show how automatic methods are, in fact, useful for performing large-scale analysis of sign language data, to a high level of accuracy, as indicated by our manual error analysis. PMID:29867684

  1. MovieMaker: a web server for rapid rendering of protein motions and interactions

    PubMed Central

    Maiti, Rajarshi; Van Domselaar, Gary H.; Wishart, David S.

    2005-01-01

    MovieMaker is a web server that allows short (∼10 s), downloadable movies of protein motions to be generated. It accepts PDB files or PDB accession numbers as input and automatically calculates, renders and merges the necessary image files to create colourful animations covering a wide range of protein motions and other dynamic processes. Users have the option of animating (i) simple rotation, (ii) morphing between two end-state conformers, (iii) short-scale, picosecond vibrations, (iv) ligand docking, (v) protein oligomerization, (vi) mid-scale nanosecond (ensemble) motions and (vii) protein folding/unfolding. MovieMaker does not perform molecular dynamics calculations. Instead it is an animation tool that uses a sophisticated superpositioning algorithm in conjunction with Cartesian coordinate interpolation to rapidly and automatically calculate the intermediate structures needed for many of its animations. Users have extensive control over the rendering style, structure colour, animation quality, background and other image features. MovieMaker is intended to be a general-purpose server that allows both experts and non-experts to easily generate useful, informative protein animations for educational and illustrative purposes. MovieMaker is accessible at . PMID:15980488
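The Cartesian coordinate interpolation used for morphing between two end-state conformers reduces to a linear blend once the structures are superposed. This numpy sketch assumes that superposition onto a common frame has already been done (MovieMaker's own superpositioning algorithm is not reproduced here), and the function name is ours.

```python
import numpy as np

def morph_frames(start, end, n_frames=10):
    """Linearly interpolate Cartesian coordinates between two aligned
    conformers, yielding the intermediate structures for an animation.
    `start` and `end` are (n_atoms, 3) arrays; returns
    (n_frames, n_atoms, 3)."""
    t = np.linspace(0.0, 1.0, n_frames)[:, None, None]
    return (1.0 - t) * start + t * end
```

Rendering each frame and concatenating the images then gives the short morphing movie; more sophisticated morph servers interpolate in internal coordinates to avoid unphysical intermediates.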

  2. Grohar: Automated Visualization of Genome-Scale Metabolic Models and Their Pathways.

    PubMed

    Moškon, Miha; Zimic, Nikolaj; Mraz, Miha

    2018-05-01

    Genome-scale metabolic models (GEMs) have become a powerful tool for investigating the entire metabolism of an organism in silico. These models are, however, often extremely hard to reconstruct and also difficult to apply to the selected problem. Visualization of a GEM allows us to comprehend the model more easily, to perform graphical analysis, to find and correct faulty relations, to identify the parts of the system with a designated function, etc. Even though several approaches for the automatic visualization of GEMs have been proposed, metabolic maps are still manually drawn, or at least require a large amount of manual curation. We present Grohar, a computational tool for automatic identification and visualization of GEM (sub)networks and their metabolic fluxes. These (sub)networks can be specified directly, by listing the metabolites of interest, or indirectly, by providing reference metabolic pathways from different sources such as KEGG, SBML files, or Matlab files. These pathways are identified within the GEM using three different pathway alignment algorithms. Grohar also supports the visualization of model adjustments (e.g., activation or inhibition of metabolic reactions) after perturbations are induced.

  3. A prototype automatic phase compensation module

    NASA Technical Reports Server (NTRS)

    Terry, John D.

    1992-01-01

    The growing demands for high-gain and accurate satellite communication systems will necessitate the utilization of large reflector systems. One area of concern in reflector-based satellite communication is large-scale surface deformation due to thermal effects. These distortions, when present, can degrade the performance of the reflector system appreciably. This performance degradation is manifested by a decrease in peak gain, an increase in sidelobe level, and pointing errors. It is essential to compensate for these distortion effects and to maintain the required system performance in the operating space environment. For this reason the development of a technique to offset the degradation effects is highly desirable. Currently, most research is directed at developing better materials for the reflector. These materials have a lower coefficient of linear expansion, thereby reducing the surface errors. Alternatively, one can minimize the distortion effects of these large-scale errors by adaptive phased array compensation. Adaptive phased array techniques have been studied extensively at NASA and elsewhere. Presented in this paper is a prototype automatic phase compensation module, designed and built at NASA Lewis Research Center, which is the first stage of development for an adaptive array compensation module.

  4. Automatic segmentation and centroid detection of skin sensors for lung interventions

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Xu, Sheng; Xue, Zhong; Wong, Stephen T.

    2012-02-01

    Electromagnetic (EM) tracking has been recognized as a valuable tool for locating interventional devices in procedures such as lung and liver biopsy or ablation. The advantage of this technology is its real-time connection to the 3D volumetric roadmap, i.e. CT, of a patient's anatomy while the intervention is performed. EM-based guidance requires tracking the tip of the interventional device, transforming the location of the device onto pre-operative CT images, and superimposing the device onto the 3D images to assist physicians in completing the procedure more effectively. A key requirement of this data integration is to find automatically the mapping between the EM and CT coordinate systems. Thus, skin fiducial sensors are attached to patients before acquiring the pre-operative CTs. Those sensors can then be recognized in both CT and EM coordinate systems and used to calculate the transformation matrix. In this paper, to enable the EM-based navigation workflow and reduce procedural preparation time, an automatic fiducial detection method is proposed to obtain the centroids of the sensors from the pre-operative CT. The approach has been applied to 13 rabbit datasets derived from an animal study and eight human images from an observation study. The numerical results show that it is a reliable and efficient method for use in EM-guided applications.
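Because skin fiducial sensors appear as small bright blobs in CT, a minimal version of the detection step is thresholding, connected-component labeling, and centroid extraction. The threshold and size filter below are illustrative assumptions for a sketch, not the paper's detector.

```python
import numpy as np
from scipy import ndimage

def fiducial_centroids(ct, threshold, min_voxels=5):
    """Threshold the CT volume, label connected components, and return
    the centroid (in voxel coordinates) of each component large enough
    to be a skin sensor rather than noise."""
    mask = ct > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_voxels]
    return ndimage.center_of_mass(mask, labels, keep)
```

Matching these CT centroids to the corresponding EM sensor readings (e.g. by least-squares point registration) then yields the EM-to-CT transformation matrix described above.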

  5. Automatic guidance of attention during real-world visual search.

    PubMed

    Seidl-Rathkopf, Katharina N; Turk-Browne, Nicholas B; Kastner, Sabine

    2015-08-01

    Looking for objects in cluttered natural environments is a frequent task in everyday life. This process can be difficult, because the features, locations, and times of appearance of relevant objects often are not known in advance. Thus, a mechanism by which attention is automatically biased toward information that is potentially relevant may be helpful. We tested for such a mechanism across five experiments by engaging participants in real-world visual search and then assessing attentional capture for information that was related to the search set but was otherwise irrelevant. Isolated objects captured attention while participants prepared to search for objects from the same category embedded in a scene, as revealed by lower detection performance (Experiment 1A). This capture effect was driven by a central processing bottleneck rather than the withdrawal of spatial attention (Experiment 1B), occurred automatically even in a secondary task (Experiment 2A), and reflected enhancement of matching information rather than suppression of nonmatching information (Experiment 2B). Finally, attentional capture extended to objects that were semantically associated with the target category (Experiment 3). We conclude that attention is efficiently drawn towards a wide range of information that may be relevant for an upcoming real-world visual search. This mechanism may be adaptive, allowing us to find information useful for our behavioral goals in the face of uncertainty.

  6. Towards automatic patient positioning and scan planning using continuously moving table MR imaging.

    PubMed

    Koken, Peter; Dries, Sebastian P M; Keupp, Jochen; Bystrov, Daniel; Pekar, Vladimir; Börnert, Peter

    2009-10-01

    A concept is proposed to simplify patient positioning and scan planning to improve ease of use and workflow in MR. After patient preparation in front of the scanner the operator selects the anatomy of interest by a single push-button action. Subsequently, the patient table is moved automatically into the scanner, while real-time 3D isotropic low-resolution continuously moving table scout scanning is performed using patient-independent MR system settings. With a real-time organ identification process running in parallel and steering the scanner, the target anatomy can be positioned fully automatically in the scanner's sensitive volume. The desired diagnostic examination of the anatomy of interest can be planned and continued immediately using the geometric information derived from the acquired 3D data. The concept was implemented and successfully tested in vivo in 12 healthy volunteers, focusing on the liver as the target anatomy. The positioning accuracy achieved was on the order of several millimeters, which turned out to be sufficient for initial planning purposes. Furthermore, the impact of nonoptimal system settings on the positioning performance, the signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) was investigated. The present work proved the basic concept of the proposed approach as an element of future scan automation. (c) 2009 Wiley-Liss, Inc.

  7. Advanced and standardized evaluation of neurovascular compression syndromes

    NASA Astrophysics Data System (ADS)

    Hastreiter, Peter; Vega Higuera, Fernando; Tomandl, Bernd; Fahlbusch, Rudolf; Naraghi, Ramin

    2004-05-01

    Caused by contact between vascular structures and the root entry or exit zone of cranial nerves, neurovascular compression syndromes are associated with different neurological diseases (trigeminal neuralgia, hemifacial spasm, vertigo, glossopharyngeal neuralgia) and show a relation to essential arterial hypertension. As presented previously, the semi-automatic segmentation and 3D visualization of strongly T2-weighted MR volumes has proven to be an effective strategy for a better spatial understanding prior to operative microvascular decompression. After explicit segmentation of coarse structures, the tiny target nerves and vessels contained in the area of cerebrospinal fluid are segmented implicitly using direct volume rendering. However, with this strategy the delineation of vessels in the vicinity of the brainstem and those at the border of the segmented CSF subvolume is critical. Therefore, we suggest registration with MR angiography and introduce consecutive fusion after semi-automatic labeling of the vascular information. Additionally, we present an approach for automatic 3D visualization and video generation based on predefined flight paths. Thereby, a standardized evaluation of the fused image data is supported and the visualization results are optimally prepared for intraoperative application. Overall, our new strategy contributes to a significantly improved 3D representation and evaluation of vascular compression syndromes. Its value for diagnosis and surgery is demonstrated with various clinical examples.

  8. Highly Enantioselective Synthesis of syn-β-Hydroxy α-Dibenzylamino Esters via DKR Asymmetric Transfer Hydrogenation and Gram-Scale Preparation of Droxidopa.

    PubMed

    Sun, Guodong; Zhou, Zihong; Luo, Zhonghua; Wang, Hailong; Chen, Lei; Xu, Yongbo; Li, Shun; Jian, Weilin; Zeng, Jiebin; Hu, Benquan; Han, Xiaodong; Lin, Yicao; Wang, Zhongqing

    2017-08-18

    A highly efficient preparation of enantiomerically pure syn aryl β-hydroxy α-dibenzylamino esters is reported. The outcome was achieved via dynamic kinetic resolution and asymmetric transfer hydrogenation of aryl α-dibenzylamino β-keto esters. The desired products were obtained in high yields (up to 98%) with excellent diastereoselectivity (>20:1 dr) and enantioselectivity (up to >99% ee). Furthermore, this method was applied for the gram-scale preparation of droxidopa.

  9. Surrounded by work platforms, the full-scale Orion AFT crew module (center) is undergoing preparations for the first flight test of Orion's launch abort system.

    NASA Image and Video Library

    2008-05-20

    Surrounded by work platforms, NASA's first full-scale Orion abort flight test (AFT) crew module (center) is undergoing preparations at the NASA Dryden Flight Research Center in California for the first flight test of Orion's launch abort system.

  10. A comparison between modeled and measured permafrost temperatures at Ritigraben borehole, Switzerland

    NASA Astrophysics Data System (ADS)

    Mitterer-Hoinkes, Susanna; Lehning, Michael; Phillips, Marcia; Sailer, Rudolf

    2013-04-01

    The area-wide distribution of permafrost is sparsely known in mountainous terrain (e.g. the Alps). Permafrost monitoring can only be based on point or small-scale measurements such as boreholes, active rock glaciers, BTS measurements or geophysical measurements. To get a better understanding of permafrost distribution, it is necessary to focus on modeling permafrost temperatures and permafrost distribution patterns. Much effort on these topics has already been expended using different kinds of models. In this study, the evolution of subsurface temperatures over successive years has been modeled at the Ritigraben borehole site (Mattertal, Switzerland) using the one-dimensional snow cover model SNOWPACK. The model needs meteorological input and, in our case, information on subsurface properties. We used meteorological input variables of the automatic weather station Ritigraben (2630 m) in combination with the automatic weather station Saas Seetal (2480 m). Meteorological data between 2006 and 2011 on an hourly basis were used to drive the model. As former studies have shown, the snow amount and the snow cover duration have a great influence on the thermal regime. Low snow heights allow for deeper penetration of low winter temperatures into the ground, whereas winters with large amounts of snow attenuate this effect. In addition, variations in subsurface conditions highly influence the temperature regime. Therefore, we conducted sensitivity runs by defining a series of different subsurface properties. The modeled subsurface temperature profiles of Ritigraben were then compared to the measured temperatures in the Ritigraben borehole. This allows a validation of the influence of subsurface properties on the temperature regime. As expected, the influence of the snow cover is stronger than the influence of subsurface material properties, which are significant, however. 
The validation presented here serves to prepare a larger spatial simulation with the complex hydro-meteorological 3-dimensional model Alpine 3D, which is based on a distributed application of SNOWPACK.

  11. Study on Misalignment Angle Compensation during Scale Factor Matching for Two Pairs of Accelerometers in a Gravity Gradient Instrument

    PubMed Central

    Huang, Xiangqing; Deng, Zhongguang; Xie, Yafei; Fan, Ji; Hu, Chenyuan

    2018-01-01

    A method for automatic compensation of misalignment angles during the matching of the scale factors of two pairs of accelerometers in the development of the rotating accelerometer gravity gradient instrument (GGI) is proposed and demonstrated in this paper. The purpose of automatic scale factor matching of the four accelerometers in the GGI is to suppress the common-mode acceleration of moving-base platforms. However, taking the full model equation of the accelerometer into consideration, the other two orthogonal axes, namely the pendulous axis and the output axis, will also sense the common-mode acceleration and reduce the suppression performance. The coefficients from the two axes to the output are δO and δP respectively, called the misalignment angles. The angle δO, coupling with the acceleration along the pendulous axis perpendicular to the rotational plane, is not modulated by the rotation and contributes little to the scale factor matching. On the other hand, because it couples with the acceleration along the centripetal direction in the rotating plane, the angle δP produces a component with a 90-degree phase delay relative to the scale factor component. Hence, the δP component coincides exactly with the sensitive direction of the orthogonal accelerometers. To improve the common-mode acceleration rejection, the misalignment angle δP is compensated by injecting a trimming current, proportional to the output of an orthogonal accelerometer, into the torque coil of the accelerometer during the scale factor matching. The experimental results show that the common-mode linear acceleration suppression improved by three orders of magnitude after scale factor balancing and by five orders of magnitude after misalignment angle compensation, which is almost down to the noise level of the accelerometers used, 1~2 × 10−7 g/√Hz (1 g ≈ 9.8 m/s2). PMID:29670021
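
    The quadrature relationship exploited above can be illustrated with a toy simulation (all parameters here are assumed for illustration, not taken from the instrument): the scale-factor mismatch modulates the common-mode acceleration in phase with the rotation, the δP coupling appears 90 degrees delayed, and injecting a signal proportional to the orthogonal accelerometer's output cancels the quadrature term.

```python
import numpy as np

# Hypothetical parameters, for illustration only.
w = 2 * np.pi * 0.5                      # rotation rate (rad/s)
t = np.linspace(0, 20, 20000, endpoint=False)   # 10 full rotation periods
k_mis = 1e-3                             # residual scale-factor mismatch
dP = 2e-4                                # misalignment angle delta_P (rad)
a_c = 1.0                                # common-mode acceleration (g)

# Combined accelerometer output: in-phase term from the scale mismatch,
# quadrature (90-degree delayed) term from the delta_P coupling.
out = a_c * (k_mis * np.cos(w * t) + dP * np.sin(w * t))

# The orthogonal accelerometer senses the same common-mode input
# shifted by 90 degrees, i.e. proportional to sin(w*t).
orth = a_c * np.sin(w * t)

# Lock-in estimate of the quadrature coefficient, then inject a
# trimming signal proportional to the orthogonal output to cancel it.
c = 2 * np.mean(out * np.sin(w * t))
compensated = out - c * orth

quad_before = abs(2 * np.mean(out * np.sin(w * t)))
quad_after = abs(2 * np.mean(compensated * np.sin(w * t)))
```

    Averaged over whole rotation periods, the lock-in demodulation isolates the quadrature amplitude, and after the injection the residual quadrature component essentially vanishes while the in-phase term is untouched.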

  12. Study on Misalignment Angle Compensation during Scale Factor Matching for Two Pairs of Accelerometers in a Gravity Gradient Instrument.

    PubMed

    Huang, Xiangqing; Deng, Zhongguang; Xie, Yafei; Fan, Ji; Hu, Chenyuan; Tu, Liangcheng

    2018-04-18

    A method for automatic compensation of misalignment angles during the matching of the scale factors of two pairs of accelerometers in the development of the rotating accelerometer gravity gradient instrument (GGI) is proposed and demonstrated in this paper. The purpose of automatic scale factor matching of the four accelerometers in the GGI is to suppress the common-mode acceleration of moving-base platforms. However, taking the full model equation of the accelerometer into consideration, the other two orthogonal axes, namely the pendulous axis and the output axis, will also sense the common-mode acceleration and reduce the suppression performance. The coefficients from the two axes to the output are δO and δP respectively, called the misalignment angles. The angle δO, coupling with the acceleration along the pendulous axis perpendicular to the rotational plane, is not modulated by the rotation and contributes little to the scale factor matching. On the other hand, because it couples with the acceleration along the centripetal direction in the rotating plane, the angle δP produces a component with a 90-degree phase delay relative to the scale factor component. Hence, the δP component coincides exactly with the sensitive direction of the orthogonal accelerometers. To improve the common-mode acceleration rejection, the misalignment angle δP is compensated by injecting a trimming current, proportional to the output of an orthogonal accelerometer, into the torque coil of the accelerometer during the scale factor matching. The experimental results show that the common-mode linear acceleration suppression improved by three orders of magnitude after scale factor balancing and by five orders of magnitude after misalignment angle compensation, which is almost down to the noise level of the accelerometers used, 1~2 × 10−7 g/√Hz (1 g ≈ 9.8 m/s²).

  13. Automatic control of cryogenic wind tunnels

    NASA Technical Reports Server (NTRS)

    Balakrishna, S.

    1989-01-01

    Inadequate Reynolds number similarity in testing of scaled models affects the quality of aerodynamic data from wind tunnels. This is due to scale effects of boundary-layer shock wave interaction, which is likely to be severe at transonic speeds. The idea of operating wind tunnels with a test gas cooled to cryogenic temperatures has yielded a quantum jump in the ability to realize full-scale Reynolds number flow similarity in small transonic tunnels. In such tunnels, the basic flow control problem consists of obtaining and maintaining the desired test section flow parameters. Mach number, Reynolds number, and dynamic pressure are the three flow parameters that are usually required to be kept constant during the period of model aerodynamic data acquisition. The series of activities involved in modeling, control law development, mechanization of the control laws on a microcomputer, and the performance of a globally stable automatic control system for the 0.3-m Transonic Cryogenic Tunnel (TCT) are discussed. A lumped multivariable nonlinear dynamic model of the cryogenic tunnel, generation of a set of linear control laws for small perturbations, and a nonlinear control strategy for large set point changes including tunnel trajectory control are described. The details of mechanization of the control laws on a 16-bit microcomputer system, the software features, operator interface, the display and safety are discussed. The controller is shown to provide globally stable and reliable temperature control to + or - 0.2 K, pressure to + or - 0.07 psi and Mach number to + or - 0.002 of the set point value. This performance is obtained both during large set point commands as for a tunnel cooldown, and during aerodynamic data acquisition with intrusive activity such as geometrical changes in the test section, including angle of attack changes, drag rake movements, wall adaptation and sidewall boundary-layer removal. 
Feasibility of the use of an automatic Reynolds number control mode with fixed Mach number control is demonstrated.
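
    As a toy illustration of the set-point regulation problem described above (not the 0.3-m TCT's actual multivariable control laws), a discrete PI loop on an assumed first-order thermal plant settles well inside the ±0.2 K band reported in the abstract. All plant and gain values below are invented for the sketch:

```python
# Hypothetical first-order plant: tau * dT/dt = -(T - T_amb) + u
dt = 0.1                      # controller step (s), assumed
tau, T_amb = 30.0, 300.0      # plant time constant (s) and ambient (K), assumed
kp, ki = 5.0, 0.5             # illustrative PI gains
T, setpoint, integ = 300.0, 100.0, 0.0   # cooldown from ambient to 100 K
for _ in range(20000):        # 2000 s of simulated time
    err = setpoint - T
    integ += err * dt
    u = kp * err + ki * integ # commanded input (negative = cooling)
    T += dt * (-(T - T_amb) + u) / tau
final_err = abs(setpoint - T)
```

    The integral term drives the steady-state error to zero, which is how a controller can hold a set point despite a constant disturbance such as heat leaking in from ambient.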

  14. Automated structure determination of proteins with the SAIL-FLYA NMR method.

    PubMed

    Takeda, Mitsuhiro; Ikeya, Teppei; Güntert, Peter; Kainosho, Masatsune

    2007-01-01

    The labeling of proteins with stable isotopes enhances the NMR method for the determination of 3D protein structures in solution. Stereo-array isotope labeling (SAIL) provides an optimal stereospecific and regiospecific pattern of stable isotopes that yields sharpened lines, spectral simplification without loss of information, and the ability to rapidly collect and fully automatically evaluate the structural restraints required to solve a high-quality solution structure for proteins up to twice as large as those that can be analyzed using conventional methods. Here, we describe a protocol for the preparation of SAIL proteins by cell-free methods, including the preparation of S30 extract, and their automated structure analysis using the FLYA algorithm and the program CYANA. Once efficient cell-free expression of the unlabeled or uniformly labeled target protein has been achieved, the NMR sample preparation of a SAIL protein can be accomplished in 3 d. A fully automated FLYA structure calculation can be completed in 1 d on a powerful computer system.

  15. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    PubMed Central

    Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma

    2014-01-01

    In drug discovery and development, the quantification of drugs in biological samples is an important task for determining the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of metabolites, the selected extraction technique should be virtually exhaustive. Recent developments in sample handling techniques are directed, on the one hand, toward automation and online coupling of sample preparation units. The primary objective of this review is to present recent developments in microextraction sample preparation methods for the analysis of drugs in biological fluids. Microextraction techniques allow for lower consumption of solvents, reagents, and packing materials, and small sample volumes can be used. In this review the use of solid phase microextraction (SPME), microextraction in packed sorbent (MEPS), and stir-bar sorptive extraction (SBSE) in drug analysis is discussed. In addition, the use of new sorbents such as monoliths and molecularly imprinted polymers is presented. PMID:24688797

  16. Application of 2-cyanoethyl N,N,N',N'-tetraisopropylphosphorodiamidite for in situ preparation of deoxyribonucleoside phosphoramidites and their use in polymer-supported synthesis of oligodeoxyribonucleotides.

    PubMed Central

    Nielsen, J; Taagaard, M; Marugg, J E; van Boom, J H; Dahl, O

    1986-01-01

    Deoxyribonucleoside phosphoramidites are prepared in situ from 5'-O,N-protected deoxyribonucleosides and 2-cyanoethyl N,N,N',N'-tetraisopropylphosphorodiamidite with tetrazole as catalyst, and the solutions are applied directly on an automatic solid-phase DNA synthesizer. Using LCAA-CPG support and a cycle time of 12.5 min, oligonucleotides of 16-25 bases are obtained with a DMT efficiency per cycle of 98.0-99.3%. The crude and fully deblocked products are of a purity comparable to that obtained using purified phosphoramidites. In the case of d(G)16 the product was difficult to analyse, and a better product was not obtained using doubly protected (O-6 diphenylcarbamoyl) guanine. PMID:3763407
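
    The reported per-cycle DMT efficiencies compound over the synthesis, so a one-line calculation makes the overall full-length yields concrete. This sketch assumes an n-mer requires n−1 coupling cycles and ignores deprotection and cleavage losses:

```python
def full_length_yield(per_cycle_eff, n_bases):
    # Stepwise yields compound multiplicatively: an n-mer needs
    # n - 1 couplings, so full-length fraction ~ eff**(n - 1).
    return per_cycle_eff ** (n_bases - 1)

worst = full_length_yield(0.980, 16)   # 16-mer at 98.0 % per cycle, ~74 %
best = full_length_yield(0.993, 25)    # 25-mer at 99.3 % per cycle, ~84 %
```

    Even the lower bound of the quoted efficiency range leaves a majority full-length product for a 16-mer, which is consistent with the crude products being of usable purity.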

  17. Image-based automatic recognition of larvae

    NASA Astrophysics Data System (ADS)

    Sang, Ru; Yu, Guiying; Fan, Weijun; Guo, Tiantai

    2010-08-01

    To date, imagoes have been the main objects of research in quarantine pest recognition. However, pests in their larval stage are latent, and larvae spread abroad easily with the circulation of agricultural and forest products. In this paper, larvae are taken as new research objects and recognized by means of machine vision, image processing and pattern recognition. More visual information is preserved and the recognition rate is improved when color image segmentation is applied to images of larvae. Owing to its affine invariance, perspective invariance and brightness invariance, the scale invariant feature transform (SIFT) is adopted for feature extraction. A neural network algorithm is utilized for pattern recognition, and the automatic identification of larva images is successfully achieved with satisfactory results.
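
    SIFT-based recognition hinges on matching local descriptors between a query image and reference images. A minimal, library-free sketch of the usual nearest-neighbour matching with Lowe's ratio test follows; descriptors here are plain tuples for brevity, whereas real SIFT descriptors are 128-dimensional:

```python
import math

def ratio_match(desc_a, desc_b, ratio=0.8):
    # Nearest-neighbour descriptor matching with Lowe's ratio test:
    # accept a match only if the best candidate is clearly better
    # than the second best, which rejects ambiguous correspondences.
    matches = []
    for i, d in enumerate(desc_a):
        dists = sorted((math.dist(d, e), j) for j, e in enumerate(desc_b))
        (d1, j1), (d2, _) = dists[0], dists[1]
        if d1 < ratio * d2:
            matches.append((i, j1))
    return matches
```

    The surviving matches can then be fed to a classifier (a neural network, as in the paper) or counted per reference class as a simple recognition score.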

  18. Image processing in biodosimetry: A proposal of a generic free software platform.

    PubMed

    Dumpelmann, Matthias; Cadena da Matta, Mariel; Pereira de Lemos Pinto, Marcela Maria; de Salazar E Fernandes, Thiago; Borges da Silva, Edvane; Amaral, Ademir

    2015-08-01

    The scoring of chromosome aberrations is the most reliable biological method for evaluating individual exposure to ionizing radiation. However, microscopic analysis of human chromosome metaphases, generally employed to identify aberrations, mainly dicentrics (chromosomes with two centromeres), is a laborious task. This method is time consuming, and its application in biological dosimetry would be almost impossible in the case of a large-scale radiation incident. In this project, generic software was enhanced for automatic chromosome image processing, starting from a framework originally developed for the European Union Framework V project Simbio for applications in the area of source localization from electroencephalographic signals. The platform's capability is demonstrated by a study comparing automatic segmentation strategies for chromosomes in microscopic images.
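
    After thresholding a metaphase image, candidate chromosomes are typically isolated as connected components of the binary mask. A pure-Python sketch of 4-connected labelling via flood fill (real pipelines would use an image-processing library; this only shows the core step):

```python
def label_components(grid):
    # 4-connected component labelling of a binary mask (list of lists
    # of 0/1). Returns (component count, label image).
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not labels[r][c]:
                current += 1                 # found a new blob
                stack = [(r, c)]
                while stack:                 # iterative flood fill
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and grid[y][x] and not labels[y][x]):
                        labels[y][x] = current
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return current, labels
```

    Each labelled blob can then be measured (area, shape, centromere candidates) to separate chromosomes from debris before dicentric scoring.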

  19. Real-time micro-modelling of city evacuations

    NASA Astrophysics Data System (ADS)

    Löhner, Rainald; Haug, Eberhard; Zinggerling, Claudio; Oñate, Eugenio

    2018-01-01

    A methodology to integrate geographical information system (GIS) data with large-scale pedestrian simulations has been developed. Advances in automatic data acquisition and archiving from GIS databases, automatic input for pedestrian simulations, as well as scalable pedestrian simulation tools have made it possible to simulate pedestrians at the individual level for complete cities in real time. An example that simulates the evacuation of the city of Barcelona demonstrates that this is now possible. This is the first step towards a fully integrated crowd prediction and management tool that takes into account not only data gathered in real time from cameras, cell phones or other sensors, but also merges these with advanced simulation tools to predict the future state of the crowd.
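
    The individual-level simulation the abstract describes can be caricatured in a few lines: agents on a grid each take one greedy step toward the exit per tick, with a naive one-agent-per-cell congestion rule. This is an illustrative toy, far simpler than the continuous pedestrian models actually used for city-scale evacuation:

```python
def evacuate(agents, exit_pos):
    # Each tick, every agent takes one greedy step toward the exit
    # (x direction first, then y). At most one agent per cell; a
    # blocked agent waits. Agents reaching the exit leave the domain.
    # Deliberately naive: pathological collisions may merge agents.
    ticks = 0
    agents = set(agents)
    while agents:
        ticks += 1
        nxt = set()
        for (x, y) in sorted(agents):
            dx = (exit_pos[0] > x) - (exit_pos[0] < x)
            dy = (exit_pos[1] > y) - (exit_pos[1] < y)
            step = (x + dx, y) if dx else (x, y + dy)
            if step == exit_pos:
                continue                  # evacuated
            nxt.add(step if step not in nxt else (x, y))
        agents = nxt
    return ticks                          # ticks until everyone is out
```

    Even this toy exposes the key scaling property of such models: per-tick cost is linear in the number of agents, which is what makes real-time city-scale simulation plausible on parallel hardware.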

  20. Functions of nonsuicidal self-injury in Singapore adolescents: Implications for intervention.

    PubMed

    Ong, Say How; Tan, Augustine Chin Yeow; Liang, Wilfred Zhijian

    2017-08-01

    The functions of nonsuicidal self-injury (NSSI) and DSM-IV-TR diagnoses were examined in a sample of thirty ethnic adolescents followed up in a local child and adolescent psychiatric clinic in Singapore. The most commonly endorsed function of NSSI on the Functional Assessment of Self-Mutilation scale was Automatic Negative Reinforcement (A-NR), and the least endorsed was Social Negative Reinforcement (S-NR). Participants were more likely to be diagnosed as having Major Depressive Disorder. Depressed adolescents did not differ from their non-depressed counterparts in their endorsement of social reinforcement functions. The results suggest that specific psychosocial interventions may help address both automatic and social functions of NSSI in Singapore adolescents. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  2. The digital geologic map of Colorado in ARC/INFO format, Part A. Documentation

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

    This geologic map was prepared as part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently, the digital version is at 1:500,000 scale, using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely to add the ability to plot to an HP650c plotter. Two new ARC/INFO plot AMLs along with a lineset and shadeset for the HP650c DesignJet printer have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650c items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format Open-File Report 92-050

  3. The digital geologic map of Colorado in ARC/INFO format, Part B. Common files

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

    This geologic map was prepared as part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently, the digital version is at 1:500,000 scale, using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely to add the ability to plot to an HP650c plotter. Two new ARC/INFO plot AMLs along with a lineset and shadeset for the HP650c DesignJet printer have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650c items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format Open-File Report 92-050

  4. Electro-Chemical-Mechanical, Low Stress, Automatic Polishing (ECMP) Device (Preprint)

    DTIC Science & Technology

    2010-01-01

    [Abstract fragmentary in source. Recoverable details: the text references models that predict mechanical response [4-6]; notes that surface preparation steps are critical to the imaging of ceramic and hybrid materials; reports that the Ti 2p 3/2 XPS peak is initially observed at 458.4 eV, indicating the presence of titanium; and states that average IQ values were higher than 2000, above acceptable limits for both sample sets.]

  5. [Headspace analysis of volatile organic compounds (VOC) in drinking water by the method of gas chromatography].

    PubMed

    Sotnikov, E E; Zagaynov, V F; Mikhaylova, R I; Milochkin, D A; Ryzhova, I N; Kornilov, I O

    2014-01-01

    The paper presents a methodology for headspace analysis of 52 volatile organic compounds in drinking water by gas chromatography, using the "Crystal 5000.2" chromatograph with three detectors and the automatic attachment Lab Hut 200N NT-200 for preparation of the water sample and injection of the vapor phase. The lower limit of detection for all compounds is 2-10 times lower than the corresponding standard value.

  6. Array Automated Assembly Task Low Cost Silicon Solar Array Project, Phase 2

    NASA Technical Reports Server (NTRS)

    Rhee, S. S.; Jones, G. T.; Allison, K. L.

    1978-01-01

    Progress in the development of solar cells and module process steps for low-cost solar arrays is reported. Specific topics covered include: (1) a system to automatically measure solar cell electrical performance parameters; (2) automation of wafer surface preparation, printing, and plating; (3) laser inspection of mechanical defects of solar cells; and (4) a silicon antireflection coating system. Two solar cell process steps, laser trimming and holing automation and spray-on dopant junction formation, are described.

  7. The evolution of automated launch processing

    NASA Technical Reports Server (NTRS)

    Tomayko, James E.

    1988-01-01

    The NASA Launch Processing System (LPS) considered here has arrived at satisfactory solutions to the distributed-computing, user-interface, dissimilar-hardware-interface, and automation-related problems that emerge in the specific arena of spacecraft launch preparations. An aggressive effort was made to apply the lessons learned in the 1960s, during the first attempts at automatic launch vehicle checkout, to the LPS. As the Space Shuttle system continues to evolve, the primary contributor to safety and reliability will be the LPS.

  8. Protecting the axion with local baryon number

    NASA Astrophysics Data System (ADS)

    Duerr, Michael; Schmidt-Hoberg, Kai; Unwin, James

    2018-05-01

    The Peccei-Quinn (PQ) solution to the Strong CP Problem is expected to fail unless the global symmetry U(1)PQ is protected from Planck-scale operators up to high mass dimension. Suitable protection can be achieved if the PQ symmetry is an automatic consequence of some gauge symmetry. We highlight that if baryon number is promoted to a gauge symmetry, the exotic fermions needed for anomaly cancellation can elegantly provide an implementation of the Kim-Shifman-Vainshtein-Zakharov 'hidden axion' mechanism with a PQ symmetry protected from Planck-scale physics.

  9. [Integration and demonstration of key techniques in surveillance and forecast of schistosomiasis in Jiangsu Province III Development of a machine simultaneously integrating mechanized environmental cleaning and automatic mollusciciding].

    PubMed

    Wang, Fu-biao; Ma, Yu-cai; Sun, Le-ping; Hong, Qing-biao; Gao, Yang; Zhang, Chang-lin; Du, Guang-lin; Lu, Da-qin; Sun, Zhi-yong; Wang, Wei; Dai, Jian-rong; Liang, You-sheng

    2016-02-01

    To develop a machine simultaneously integrating mechanized environmental cleaning and automatic mollusciciding, and to evaluate its effectiveness in field application, so as to provide a novel Oncomelania hupensis snail control technique for large-scale marshlands. The machine, suitable for use in complex marshland areas, was developed according to mechanization and automation principles and was used for O. hupensis snail control in the marshland. The effect of the machine on environmental cleaning and ploughing was evaluated, and the distribution of living snails was observed in various soil layers following ploughing. The snail control effects of ploughing alone and ploughing followed by mollusciciding were compared. The machine could simultaneously complete the procedures of cutting down vegetation and chopping it into pieces, ploughing, and snail control by spraying niclosamide. After ploughing, the constituent ratios of living snails were 36.31%, 25.60%, 22.62% and 15.48% in the soil layers at depths of 0-5, 6-10, 11-15 cm and 16-20 cm respectively, and 61.91% of living snails were found in the 0-10 cm soil layers. Seven and fifteen days after the experiment, the mortality rates of snails were 9.38% and 8.29% in the ploughing-alone group, and 63.04% and 80.70% in the ploughing + mollusciciding group, respectively (χ²₇ d = 42.74, χ²₁₅ d = 155.56, both P values < 0.01). Thirty days after the experiment, the densities of snails on the soil surface were 3.02 snails/0.1 m² and 0.53 snails/0.1 m² in the ploughing-alone group and the ploughing + mollusciciding group, decreases of 64.92% and 93.60%, respectively; the decrease in snail density was approximately 30% greater in the ploughing + mollusciciding group than in the ploughing-alone group. 
The machine simultaneously integrating mechanized environmental cleaning and automatic mollusciciding achieves the integration of mechanized environmental cleaning and automatic niclosamide spraying in complex marshland areas, providing a novel technique for field snail control in large-scale settings in China.

  10. Smartphone data as an electronic biomarker of illness activity in bipolar disorder.

    PubMed

    Faurholt-Jepsen, Maria; Vinberg, Maj; Frost, Mads; Christensen, Ellen Margrethe; Bardram, Jakob E; Kessing, Lars Vedel

    2015-11-01

    Objective methods are lacking for continuous monitoring of illness activity in bipolar disorder. Smartphones offer unique opportunities for continuous monitoring and automatic collection of real-time data. The objectives of the paper were to test the hypotheses that (i) daily electronic self-monitored data and (ii) automatically generated objective data collected using smartphones correlate with clinical ratings of depressive and manic symptoms in patients with bipolar disorder. Software for smartphones (the MONARCA I system) that collects automatically generated objective data and self-monitored data on illness activity in patients with bipolar disorder was developed by the authors. A total of 61 patients aged 18-60 years with a diagnosis of bipolar disorder according to ICD-10 used the MONARCA I system for six months. Depressive and manic symptoms were assessed monthly using the 17-item Hamilton Depression Rating Scale (HDRS-17) and the Young Mania Rating Scale (YMRS), respectively. Data are representative of over 400 clinical ratings. Analyses were computed using linear mixed-effects regression models allowing for both between-individual variation and within-individual variation over time. Analyses showed significant positive correlations between the duration of incoming and outgoing calls/day and scores on the HDRS-17, and significant positive correlations between the number and duration of incoming calls/day, the number and duration of outgoing calls/day, and the number of outgoing text messages/day and scores on the YMRS. Analyses showed significant negative correlations between self-monitored data (i.e., mood and activity) and scores on the HDRS-17, and significant positive correlations between self-monitored data (i.e., mood and activity) and scores on the YMRS. Finally, the automatically generated objective data were able to discriminate between affective states.
Automatically generated objective data and self-monitored data collected using smartphones correlate with clinically rated depressive and manic symptoms and differ between affective states in patients with bipolar disorder. Smartphone apps represent an easy and objective way to monitor illness activity with real-time data in bipolar disorder and may serve as an electronic biomarker of illness activity. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
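
    The correlations above come from linear mixed-effects models. As a rough sketch of the idea, the snippet below approximates a random-intercept model by centering each patient's data on their own mean before fitting a pooled slope, on entirely synthetic data (all numbers are invented; the study's actual model and data are richer).

```python
import numpy as np

# Sketch: within-patient association between daily call duration and a
# symptom score. Within-subject centering removes the patient-level
# (random-intercept) variation, leaving the pooled within-person slope.
rng = np.random.default_rng(0)
n_patients, n_visits = 61, 6                         # 61 patients, monthly ratings
patient_effect = rng.normal(0, 2, n_patients)        # random intercepts
call_minutes = rng.uniform(5, 60, (n_patients, n_visits))
# synthetic HDRS-like score that rises with call duration within each patient
hdrs = 10 + patient_effect[:, None] + 0.15 * call_minutes \
       + rng.normal(0, 1, (n_patients, n_visits))

x = call_minutes - call_minutes.mean(axis=1, keepdims=True)
y = hdrs - hdrs.mean(axis=1, keepdims=True)
slope = (x * y).sum() / (x * x).sum()   # pooled within-patient slope
print(round(slope, 3))                  # close to the simulated 0.15
```

    A full analysis would use a dedicated mixed-model routine; the centering trick only illustrates why within-individual variation matters.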

  11. Scale Space for Camera Invariant Features.

    PubMed

    Puig, Luis; Guerrero, José J; Daniilidis, Kostas

    2014-09-01

    In this paper we propose a new approach to computing the scale space of any central projection system, such as catadioptric, fisheye or conventional cameras. Since these systems can be explained using a unified model, the single parameter that defines each type of system is used to automatically compute the corresponding Riemannian metric. This metric, combined with the partial differential equations framework on manifolds, allows us to compute the Laplace-Beltrami (LB) operator, enabling the computation of the scale space of any central projection system. Scale space is essential for intrinsic scale selection and neighborhood description in features like SIFT. We perform experiments with synthetic and real images to validate the generalization of our approach to any central projection system. We compare our approach with the best existing methods, showing competitive results for all types of cameras: catadioptric, fisheye, and perspective.
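
    For the conventional (perspective) special case, this scale space reduces to the familiar linear one generated by the heat equation u_t = Δu. The sketch below builds that flat-image case with a discrete Laplacian; the paper's contribution is to replace Δ with the Laplace-Beltrami operator of the camera-induced Riemannian metric, which this toy example does not attempt.

```python
import numpy as np

# Linear (Euclidean) scale space by explicit Euler steps of the heat
# equation, using a 5-point periodic Laplacian stencil.
def scale_space(image, n_levels=4, steps_per_level=10, dt=0.2):
    u = image.astype(float)
    levels = [u.copy()]
    for _ in range(n_levels):
        for _ in range(steps_per_level):
            lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                   + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
            u = u + dt * lap          # stable for dt <= 0.25
        levels.append(u.copy())
    return levels

img = np.zeros((32, 32)); img[16, 16] = 1.0   # impulse spreads into a blob
levels = scale_space(img)
print(len(levels), round(levels[-1].sum(), 6))  # diffusion preserves total mass
```

    Each level corresponds to a coarser intrinsic scale, which is what SIFT-style scale selection operates on.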

  12. A Hybrid-Cloud Science Data System Enabling Advanced Rapid Imaging & Analysis for Monitoring Hazards

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Moore, A. W.; Fielding, E. J.; Radulescu, C.; Sacco, G.; Stough, T. M.; Mattmann, C. A.; Cervelli, P. F.; Poland, M. P.; Cruz, J.

    2012-12-01

    Volcanic eruptions, landslides, and levee failures are some examples of hazards that can be more accurately forecasted with sufficient monitoring of precursory ground deformation, such as the high-resolution measurements from GPS and InSAR. In addition, coherence and reflectivity change maps can be used to detect surface change due to lava flows, mudslides, tornadoes, floods, and other natural and man-made disasters. However, it is difficult for many volcano observatories and other monitoring agencies to process GPS and InSAR products in the automated scenario needed for continual monitoring of events. Additionally, numerous interoperability barriers exist in multi-sensor observation data access, preparation, and fusion to create actionable products. Combining high spatial resolution InSAR products with high temporal resolution GPS products--and automating this data preparation and processing across global-scale areas of interest--presents an untapped science and monitoring opportunity. The global coverage offered by satellite-based SAR observations, and the rapidly expanding GPS networks, can provide orders of magnitude more data on these hazardous events if we have a data system that can efficiently and effectively analyze the voluminous raw data and provide users the tools to access data from their regions of interest. Currently, combined GPS and InSAR time series are primarily generated for specific research applications; they are not implemented to run on large-scale continuous data sets and delivered to decision-making communities. We are developing an advanced service-oriented architecture for hazard monitoring, leveraging NASA-funded algorithms and data management, to enable both science and decision-making communities to monitor areas of interest via seamless data preparation, processing, and distribution.
Our objectives are to: * Enable high-volume, low-latency automatic generation of NASA Solid Earth science data products (InSAR and GPS) to support hazards monitoring. * Facilitate NASA-USGS collaborations to share NASA InSAR and GPS data products, which are difficult to process at high volume and low latency, for decision support. * Enable interoperable discovery, access, and sharing of NASA observations and derived actionable products between the observation and decision-making communities. * Enable improved understanding of these products through visualization, mining, and cross-agency sharing. Existing InSAR and GPS processing packages and other software are integrated to generate geodetic decision-support monitoring products. We employ semantic and cloud-based data management and processing techniques for handling large data volumes, reducing end-product latency, codifying data system information with semantics, and deploying interoperable services that deliver actionable products to decision-making communities.

  13. High-Temperature Tolerance in Multi-Scale Cermet Solar-Selective Absorbing Coatings Prepared by Laser Cladding.

    PubMed

    Pang, Xuming; Wei, Qian; Zhou, Jianxin; Ma, Huiyang

    2018-06-19

    In order to achieve cermet-based solar absorber coatings with long-term thermal stability at high temperatures, a novel single-layer, multi-scale TiC-Ni/Mo cermet coating was prepared for the first time using laser cladding technology in atmosphere. The results show that the optical properties of the laser-clad cermet coatings were much better than those of the preplaced coating. In addition, the thermal stability of the optical properties of the laser cladding coating was excellent after annealing at 650 °C for 200 h: the solar absorptance and thermal emittance of the multi-scale cermet coating were 85% and 4.7%, respectively, at 650 °C. The results show that multi-scale cermet materials are well suited for solar-selective absorbing coatings, and that laser cladding is a new technology that can be used for the preparation of spectrally-selective coatings.

  14. Scaling analysis of the non-Abelian quasiparticle tunneling in Z_k FQH states

    NASA Astrophysics Data System (ADS)

    Li, Qi; Jiang, Na; Wan, Xin; Hu, Zi-Xiang

    2018-06-01

    Quasiparticle tunneling between two counter-propagating edges through point contacts can provide information on quasiparticle statistics. Previous studies of short-distance tunneling display a scaling behavior, especially in the conformal limit of zero tunneling distance, and the scaling exponents for non-Abelian quasiparticle tunneling exhibit some non-trivial behaviors. In this work, we revisit the quasiparticle tunneling amplitudes and their scaling behavior over the full range of tunneling distances by putting the electrons on the surface of a cylinder. The edge-edge distance can be smoothly tuned by varying the aspect ratio of a finite-size cylinder. We analyze the scaling behavior of the quasiparticles for the Read-Rezayi Z_k states for k = 3 and 4, in both the short and long tunneling distance regions. The finite-size scaling analysis automatically gives a critical length scale at which the anomalous correction appears. We demonstrate that this length scale is related to the size of the quasiparticle, at which the backscattering between the two counter-propagating edges starts to become significant.

  15. Expression of recombinant human flavin monooxygenase and moclobemide-N-oxide synthesis on multi-mg scale.

    PubMed

    Hanlon, Steven P; Camattari, Andrea; Abad, Sandra; Glieder, Anton; Kittelmann, Matthias; Lütz, Stephan; Wirz, Beat; Winkler, Margit

    2012-06-18

    A panel of human flavin monooxygenases was heterologously expressed in E. coli to obtain ready-to-use biocatalysts for the in vitro preparation of human drug metabolites. Moclobemide-N-oxide (65 mg) was the first high-priced metabolite prepared with recombinant hFMO3 on the multi-milligram scale.

  16. Preparative purification of polyethylene glycol derivatives with polystyrene-divinylbenzene beads as chromatographic packing.

    PubMed

    Yu, Pengzhan; Li, Xingqi; Li, Xiunan; Lu, Xiuling; Ma, Guanghui; Su, Zhiguo

    2007-10-15

    A clear and effective chromatographic approach to purifying polyethylene glycol derivatives at a preparative scale is reported, based on polystyrene-divinylbenzene beads with ethanol/water as eluents. The validity of the method was verified using the reaction mixture of mPEG-Glu and mPEG propionaldehyde diethylacetal (ALD-PEG) as the model. The target products were obtained in a single step at >99% purity on the polymer resin column at gram scale. The method is free of the disadvantages associated with conventional approaches, such as the use of toxic solvents and a narrow application scope, and provides an appealing alternative for the purification of PEG derivatives at a preparative scale.

  17. Automatic Tools for Enhancing the Collaborative Experience in Large Projects

    NASA Astrophysics Data System (ADS)

    Bourilkov, D.; Rodriquez, J. L.

    2014-06-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of data analysis gains in importance. A key feature of collaboration in large-scale projects is keeping a log of what is being done and how, for private use and reuse, and for sharing selected parts with collaborators and peers, who are often distributed geographically on an increasingly global scale. Better still, the log can be created automatically, on the fly, while the scientist or software developer works in a habitual way, without extra effort. This saves time and enables a team to do more with the same resources. The CODESH (COllaborative DEvelopment SHell) and CAVES (Collaborative Analysis Versioning Environment System) projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach to any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for the analysis of petascale volumes of data, as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.
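
    The core idea, recording each command and its result automatically as work proceeds, can be sketched in a few lines; the names below are invented for illustration, and CODESH/CAVES themselves are far richer (versioned repositories, sharing, access control).

```python
import subprocess, datetime

# Toy "automatic persistent virtual logbook": every shell command run through
# logged_run() is captured with its timestamp, exit code, and output.
LOG = []

def logged_run(cmd):
    """Run a shell command and automatically append a logbook entry."""
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    LOG.append({
        "time": datetime.datetime.now().isoformat(timespec="seconds"),
        "cmd": cmd,
        "returncode": result.returncode,
        "stdout": result.stdout.strip(),
    })
    return result

logged_run("echo analysis step 1")
print(LOG[0]["cmd"], "->", LOG[0]["stdout"])
```

    A real system would persist LOG to a shared, access-controlled repository so a session can be replayed or browsed later.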

  18. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle.

    PubMed

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P

    2014-02-15

    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established under the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated with complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both for DSM reconstruction and for image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery. Copyright © 2014 Elsevier Ltd. All rights reserved.
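
    The 0.5 m accuracy figure is a root mean square error of DSM-derived terrace heights against field GPS measurements. A minimal sketch of that check, with made-up heights:

```python
import numpy as np

# RMSE of predicted terrace heights against ground truth (values invented).
def rmse(predicted, observed):
    predicted, observed = np.asarray(predicted), np.asarray(observed)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

dsm_heights = [2.1, 1.8, 3.4, 2.9, 1.5]   # hypothetical DSM terrace heights (m)
gps_heights = [2.4, 1.6, 3.1, 3.2, 1.3]   # hypothetical field GPS heights (m)
print(round(rmse(dsm_heights, gps_heights), 3))  # 0.265, below the 0.5 m bound
```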

  19. Automatic nipple detection on 3D images of an automated breast ultrasound system (ABUS)

    NASA Astrophysics Data System (ADS)

    Javanshir Moghaddam, Mandana; Tan, Tao; Karssemeijer, Nico; Platel, Bram

    2014-03-01

    Recent studies have demonstrated that applying automated breast ultrasound in addition to mammography in women with dense breasts can lead to additional detection of small, early stage breast cancers which are occult in the corresponding mammograms. In this paper, we propose a fully automatic method for detecting the nipple location in 3D ultrasound breast images acquired from Automated Breast Ultrasound Systems. The nipple location is a valuable landmark for reporting the position of possible abnormalities in a breast or for guiding image registration. To detect the nipple location, all images were normalized. Subsequently, features were extracted in a multi-scale approach and classification experiments were performed using a gentle boost classifier to identify the nipple location. The method was applied to a dataset of 100 patients with 294 different 3D ultrasound views from Siemens and U-Systems acquisition systems. Our database is a representative sample of cases obtained in clinical practice by four medical centers. The automatic method could accurately locate the nipple in 90% of AP (anterior-posterior) views and in 79% of the other views.

  20. Integrating High-Resolution Taskable Imagery into a Sensorweb for Automatic Space-Based Monitoring of Flooding in Thailand

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mclaren, David; Doubleday, Joshua; Tran, Daniel; Tanpipat, Veerachai; Chitradon, Royol; Boonya-aroonnet, Surajate; Thanapakpawin, Porranee; Mandl, Daniel

    2012-01-01

    Several space-based assets (Terra, Aqua, Earth Observing One) have been integrated into a sensorweb to monitor flooding in Thailand. In this approach, Moderate Resolution Imaging Spectroradiometer (MODIS) data from Terra and Aqua are used to perform broad-scale monitoring that tracks flooding at the regional level (250 m/pixel), and EO-1 is autonomously tasked in response to alerts to acquire higher resolution (30 m/pixel) Advanced Land Imager (ALI) data. This data is then automatically processed to derive products such as surface water extent and volumetric water estimates. These products are then automatically pushed to organizations in Thailand for use in damage estimation, relief efforts, and damage mitigation. More recently, this sensorweb structure has been used to request, access, and process high-resolution (several m to 30 m), targetable imagery from commercial assets including WorldView-2, Ikonos, Radarsat-2, Landsat-7, and GeoEye-1. We describe the overall sensorweb framework as well as new workflows and products made possible via these extensions.
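
    A surface water extent product of the kind described can be sketched by thresholding a water index and summing pixel areas. The snippet below uses McFeeters' NDWI on tiny synthetic green/NIR arrays; operational products involve calibrated reflectances and more careful classification.

```python
import numpy as np

# Toy surface-water-extent product from green and near-infrared reflectance
# (synthetic 2x2 arrays standing in for ALI-like 30 m imagery).
green = np.array([[0.30, 0.28], [0.10, 0.09]])
nir   = np.array([[0.05, 0.06], [0.40, 0.38]])

ndwi = (green - nir) / (green + nir)   # McFeeters NDWI
water_mask = ndwi > 0.0                # positive NDWI suggests open water
pixel_area_m2 = 30 * 30                # 30 m pixels
extent_m2 = int(water_mask.sum()) * pixel_area_m2
print(extent_m2)  # 1800
```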

  1. Automatic Diabetic Macular Edema Detection in Fundus Images Using Publicly Available Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giancardo, Luca; Meriaudeau, Fabrice; Karnowski, Thomas Paul

    2011-01-01

    Diabetic macular edema (DME) is a common vision-threatening complication of diabetic retinopathy. In a large scale screening environment, DME can be assessed by detecting exudates (a type of bright lesion) in fundus images. In this work, we introduce a new methodology for diagnosis of DME using a novel set of features based on colour, wavelet decomposition and automatic lesion segmentation. These features are employed to train a classifier able to automatically diagnose DME. We present a new publicly available dataset with ground-truth data containing 169 patients from various ethnic groups and levels of DME. This and two other publicly available datasets are employed to evaluate our algorithm. We are able to achieve diagnosis performance comparable to retina experts on MESSIDOR (an independently labelled dataset with 1200 images) with cross-dataset testing. Our algorithm is robust to segmentation uncertainties, does not need ground truth at lesion level, and is very fast, generating a diagnosis in 4.4 seconds per image on average on a 2.6 GHz platform with an unoptimised Matlab implementation.

  2. Suspect/foil identification in actual crimes and in the laboratory: a reality monitoring analysis.

    PubMed

    Behrman, Bruce W; Richards, Regina E

    2005-06-01

    Four reality monitoring variables were used to discriminate suspect from foil identifications in 183 actual criminal cases. Four hundred sixty-one identification attempts based on five- and six-person lineups were analyzed. These identification attempts resulted in 238 suspect identifications and 68 foil identifications. Confidence, automatic processing, eliminative processing and feature use comprised the set of reality monitoring variables. Thirty-five verbal confidence phrases taken from police reports were assigned numerical values on a 10-point confidence scale. Automatic processing identifications were those that occurred "immediately" or "without hesitation." Eliminative processing identifications occurred when witnesses compared or eliminated persons in the lineups. Confidence, automatic processing and eliminative processing were significant predictors, but feature use was not. Confidence was the most effective discriminator: in cases that involved substantial evidence extrinsic to the identification, 43% of the suspect identifications were made with high confidence, whereas only 10% of the foil identifications were. The results of a laboratory study using the same predictors generally paralleled the archival results. Forensic implications are discussed.

  3. New approaches to optimization in aerospace conceptual design

    NASA Technical Reports Server (NTRS)

    Gage, Peter J.

    1995-01-01

    Aerospace design can be viewed as an optimization process, but conceptual studies are rarely performed using formal search algorithms. Three issues that restrict the success of automatic search are identified in this work, and new approaches are introduced to address the integration of analyses and optimizers, to avoid the need for accurate gradient information and a smooth search space (required for calculus-based optimization), and to remove the restrictions imposed by fixed-complexity problem formulations. (1) Optimization should be performed in a flexible environment. A quasi-procedural architecture is used to conveniently link analysis modules and automatically coordinate their execution; it efficiently controls large-scale design tasks. (2) Genetic algorithms provide a search method for discontinuous or noisy domains. The utility of genetic optimization is demonstrated here, but parameter encodings and constraint-handling schemes must be carefully chosen to avoid premature convergence to suboptimal designs. The relationship between genetic and calculus-based methods is explored. (3) A variable-complexity genetic algorithm is created to permit flexible parameterization, so that the level of description can change during optimization. This new optimizer automatically discovers novel designs in structural and aerodynamic tasks.
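
    Point (2) refers to genetic search. Below is a toy genetic algorithm of the kind discussed, evolving a bit string toward an all-ones "design" as a stand-in objective; the encoding, selection scheme, and parameters are illustrative, not the thesis's.

```python
import random

# Toy genetic algorithm: truncation selection, one-point crossover, bit-flip
# mutation, maximizing the number of 1s in a fixed-length bit string.
random.seed(0)

def fitness(bits):
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=60, mutation=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mutation else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # at or near the optimum of 20
```

    The premature-convergence caveat in the abstract corresponds to the mutation rate and selection pressure chosen here: too little diversity and the population stalls on a suboptimal string.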

  4. No strong evidence for abnormal levels of dysfunctional attitudes, automatic thoughts, and emotional information-processing biases in remitted bipolar I affective disorder.

    PubMed

    Lex, Claudia; Meyer, Thomas D; Marquart, Barbara; Thau, Kenneth

    2008-03-01

    Beck extended his original cognitive theory of depression by suggesting that mania was a mirror image of depression, characterized by extreme positive cognition about the self, the world, and the future. However, there were no suggestions as to what might be special about the cognitive features of bipolar patients (Mansell & Scott, 2006). We therefore used different indicators to evaluate cognitive processes in bipolar patients and healthy controls. We compared 19 remitted bipolar I patients (BPs) without any Axis I comorbidity with 19 healthy individuals (CG). All participants completed the Beck Depression Inventory, the Dysfunctional Attitude Scale, the Automatic Thoughts Questionnaire, the Emotional Stroop Test, and an incidental recall task. No significant group differences were found in automatic thinking or information-processing styles (Emotional Stroop Test, incidental recall task). Regarding dysfunctional attitudes, we obtained ambiguous results. It appears that individuals with remitted bipolar affective disorder do not show the cognitive vulnerability proposed in Beck's theory of depression if they report only subthreshold levels of depressive symptoms. Perhaps the cognitive vulnerability might only be observable if mood induction procedures are used.

  5. A tool to automatically analyze electromagnetic tracking data from high dose rate brachytherapy of breast cancer patients.

    PubMed

    Götz, Th I; Lahmer, G; Strnad, V; Bert, Ch; Hensel, B; Tomé, A M; Lang, E W

    2017-01-01

    During High Dose Rate Brachytherapy (HDR-BT) the spatial position of the radiation source inside catheters implanted into a female breast is determined via electromagnetic tracking (EMT). Dwell positions and dwell times of the radiation source are established, relative to the patient's anatomy, from an initial X-ray-CT-image. During the irradiation treatment, catheter displacements can occur due to patient movements. The current study develops an automatic analysis tool of EMT data sets recorded with a solenoid sensor to assure concordance of the source movement with the treatment plan. The tool combines machine learning techniques such as multi-dimensional scaling (MDS), ensemble empirical mode decomposition (EEMD), singular spectrum analysis (SSA) and particle filter (PF) to precisely detect and quantify any mismatch between the treatment plan and actual EMT measurements. We demonstrate that movement artifacts as well as technical signal distortions can be removed automatically and reliably, resulting in artifact-free reconstructed signals. This is a prerequisite for a highly accurate determination of any deviations of dwell positions from the treatment plan.
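
    Of the techniques listed, singular spectrum analysis is simple enough to sketch: embed the signal in a Hankel trajectory matrix, truncate its SVD, and average anti-diagonals to reconstruct a denoised series. The example below applies this to a synthetic track; the window and rank values are illustrative, not the tool's settings.

```python
import numpy as np

# Minimal singular spectrum analysis (SSA) denoising.
def ssa_denoise(series, window=30, rank=2):
    n = len(series)
    k = n - window + 1
    # trajectory (Hankel) matrix of lagged windows
    traj = np.column_stack([series[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    low_rank = (u[:, :rank] * s[:rank]) @ vt[:rank]
    # average anti-diagonals to map the low-rank matrix back to a series
    recon = np.zeros(n); counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += low_rank[:, j]
        counts[j:j + window] += 1
    return recon / counts

t = np.linspace(0, 4 * np.pi, 300)
clean = np.sin(t)                      # stand-in for a smooth sensor track
noisy = clean + np.random.default_rng(1).normal(0, 0.3, t.size)
denoised = ssa_denoise(noisy)
print(np.abs(denoised - clean).mean() < np.abs(noisy - clean).mean())  # True
```

    A pure sinusoid has a rank-2 trajectory matrix, which is why rank=2 suffices here; real EMT tracks would need a higher rank and the additional EEMD/particle-filter stages the abstract mentions.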

  6. A tool to automatically analyze electromagnetic tracking data from high dose rate brachytherapy of breast cancer patients

    PubMed Central

    Lahmer, G.; Strnad, V.; Bert, Ch.; Hensel, B.; Tomé, A. M.; Lang, E. W.

    2017-01-01

    During High Dose Rate Brachytherapy (HDR-BT) the spatial position of the radiation source inside catheters implanted into a female breast is determined via electromagnetic tracking (EMT). Dwell positions and dwell times of the radiation source are established, relative to the patient’s anatomy, from an initial X-ray-CT-image. During the irradiation treatment, catheter displacements can occur due to patient movements. The current study develops an automatic analysis tool of EMT data sets recorded with a solenoid sensor to assure concordance of the source movement with the treatment plan. The tool combines machine learning techniques such as multi-dimensional scaling (MDS), ensemble empirical mode decomposition (EEMD), singular spectrum analysis (SSA) and particle filter (PF) to precisely detect and quantify any mismatch between the treatment plan and actual EMT measurements. We demonstrate that movement artifacts as well as technical signal distortions can be removed automatically and reliably, resulting in artifact-free reconstructed signals. This is a prerequisite for a highly accurate determination of any deviations of dwell positions from the treatment plan. PMID:28934238

  7. Application of latent variable model in Rosenberg self-esteem scale.

    PubMed

    Leung, Shing-On; Wu, Hui-Ping

    2013-01-01

    Latent variable models (LVMs) are applied to the Rosenberg Self-Esteem Scale (RSES). Parameter estimation automatically gives negative signs, hence no recoding is necessary for negatively scored items. Bad items can be located through parameter estimates, item characteristic curves and other measures. Two factors are extracted: one on self-esteem and the other on the tendency to take moderate views, the latter not often covered in previous studies. A goodness-of-fit measure based on two-way margins is used, but more work is needed. Results show that the scaling provided by models with a more formal statistical grounding correlates highly with the conventional method, which may provide justification for the usual practice.
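
    The point about negative signs can be illustrated with a simpler latent-variable stand-in, a principal component fitted to synthetic questionnaire data: reverse-worded items come out with negative loadings without any recoding. The data, item count, and loadings below are invented, not the RSES analysis itself.

```python
import numpy as np

# Synthetic data: one latent "self-esteem" factor, items 0-2 positively
# worded, items 3-4 reverse worded (negative sign on the latent factor).
rng = np.random.default_rng(42)
latent = rng.normal(0, 1, 500)
signs = np.array([1, 1, 1, -1, -1])
items = latent[:, None] * signs + rng.normal(0, 0.5, (500, 5))

# first principal component recovers the loading pattern, signs included
cov = np.cov(items, rowvar=False)
vals, vecs = np.linalg.eigh(cov)
loadings = vecs[:, -1]               # eigenvector of the largest eigenvalue
loadings *= np.sign(loadings[0])     # fix the arbitrary overall sign
print(np.sign(loadings).astype(int))  # [ 1  1  1 -1 -1]
```

    The reverse-worded items get negative loadings automatically, which is the "no recoding necessary" behaviour the abstract describes.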

  8. Automation, the Impact of Technological Change.

    ERIC Educational Resources Information Center

    Brozen, Yale

    The scale of educational activities is increasing because mechanization, automation, cybernation, or whatever new technology is called, makes it possible to do more than could formerly be done. If a man helped by an automatic machine can turn out twice as much per hour, then, presumably, only half as many hours of work will be available for each…

  9. Remote Sensing Analysis of Forest Disturbances

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P. (Inventor)

    2015-01-01

    The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.

  10. Remote sensing analysis of forest disturbances

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P. (Inventor)

    2012-01-01

    The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.

  11. A new machine classification method applied to human peripheral blood leukocytes

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.; Fitzpatrick, Steven J.; Vitthal, Sanjay; Ladoulis, Charles T.

    1994-01-01

    Human beings judge images by complex mental processes, whereas computing machines extract features. By reducing scaled human judgments and machine extracted features to a common metric space and fitting them by regression, the judgments of human experts rendered on a sample of images may be imposed on an image population to provide automatic classification.

  12. [A communication campaign to improve how antibiotics are used].

    PubMed

    Héron, Myriam

    2015-01-01

    A wide-scale information campaign, using a memorable slogan, reminded health professionals and users that the prescribing of antibiotics is not 'automatic' in the case of a viral infection. The fight against antibiotic resistant bacteria requires the consumption of these medications to be limited in order to preserve their effectiveness. Copyright © 2015. Published by Elsevier Masson SAS.

  13. The Complete Automation of the Minnesota Multiphasic Personality Inventory and a Study of its Response Latency.

    ERIC Educational Resources Information Center

    Dunn, Thomas G.; And Others

    The feasibility of completely automating the Minnesota Multiphasic Personality Inventory (MMPI) was tested, and item response latencies were compared with other MMPI item characteristics. A total of 26 scales were successfully scored automatically for 165 subjects. The program also typed a Mayo Clinic interpretive report on a computer terminal,…

  14. A modular approach to detection and identification of defects in rough lumber

    Treesearch

    Sang Mook Lee; A. Lynn Abbott; Daniel L. Schmoldt

    2001-01-01

    This paper describes a prototype scanning system that can automatically identify several important defects on rough hardwood lumber. The scanning system utilizes 3 laser sources and an embedded-processor camera to capture and analyze profile and gray-scale images. The modular approach combines the detection of wane (the curved sides of a board, possibly containing...

  15. Long-Term Priming of Visual Search Prevails against the Passage of Time and Counteracting Instructions

    ERIC Educational Resources Information Center

    Kruijne, Wouter; Meeter, Martijn

    2016-01-01

    Studies on "intertrial priming" have shown that in visual search experiments, the preceding trial automatically affects search performance: facilitating it when the target features repeat and giving rise to switch costs when they change--so-called (short-term) intertrial priming. These effects also occur at longer time scales: When 1 of…

  16. Boundary and object detection in real world images. [by means of algorithms

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1974-01-01

    A solution to the problem of automatic location of objects in digital pictures by computer is presented. A self-scaling local edge detector which can be applied in parallel on a picture is described. Clustering algorithms and boundary following algorithms which are sequential in nature process the edge data to locate images of objects.

  17. CPR for Rural School Districts: Emerging Alternatives in Curriculum, Program and Reorganization.

    ERIC Educational Resources Information Center

    Thurston, Paul; Clauss, Joanne

    Justification for school district consolidation is made on the basis of either reducing cost or increasing educational quality. Some cost reduction may be realized through certain economies of scale in some consolidations but it is by no means automatic. The Illinois State Board of Education emphasizes the relationship between high school size and…

  18. Applications of color machine vision in the agricultural and food industries

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Ludas, Laszlo I.; Morgan, Mark T.; Krutz, Gary W.; Precetti, Cyrille J.

    1999-01-01

    Color is an important factor in agriculture and the food industry. Agricultural or prepared food products are often graded by producers and consumers using color parameters. Color is used to estimate maturity and to sort produce for defects, but also to perform genetic screenings or make aesthetic judgements. Sorting produce against a color scale is a complex task that requires special illumination and training, and it cannot be performed for long durations without fatigue and loss of accuracy. This paper describes a machine vision system designed to perform color classification in real time. Applications for sorting a variety of agricultural products are included, e.g. seeds, meat, baked goods, plants and wood. First, the theory of color classification of agricultural and biological materials is introduced. Then, some tools for classifier development are presented. Finally, the implementation of the algorithm on real-time image-processing hardware and example applications for industry are described. The paper also presents an image analysis algorithm and a prototype machine vision system developed for industry. This system automatically locates the surface of some plants using a digital camera and predicts information such as size, potential value and type of the plant. The algorithm developed is feasible for real-time identification in an industrial environment.
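Color grading of this kind often reduces to binning a product's dominant hue into grade classes. A minimal sketch, assuming grading by mean hue in HSV space; the grade boundaries and labels below are invented for illustration and are not from the paper:

```python
# Hypothetical color-grading sketch: convert an RGB measurement to HSV and
# bin the hue into grade classes. Boundaries and labels are illustrative.
import colorsys

GRADES = [(0.00, 0.08, "ripe/red"),
          (0.08, 0.17, "maturing/orange-yellow"),
          (0.17, 0.45, "unripe/green")]

def grade_by_color(rgb):
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    for lo, hi, label in GRADES:
        if lo <= h < hi:
            return label
    return "reject"

print(grade_by_color((200, 40, 30)))   # ripe/red
print(grade_by_color((60, 180, 60)))   # unripe/green
```

A real-time system would apply such a classifier per pixel or per region on dedicated image-processing hardware, as the abstract describes.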

  19. A 16-Channel Nonparametric Spike Detection ASIC Based on EC-PC Decomposition.

    PubMed

    Wu, Tong; Xu, Jian; Lian, Yong; Khalili, Azam; Rastegarnia, Amir; Guan, Cuntai; Yang, Zhi

    2016-02-01

    In extracellular neural recording experiments, detecting neural spikes is an important step for reliable information decoding. A successful implementation in integrated circuits can achieve substantial data volume reduction, potentially enabling wireless operation and closed-loop systems. In this paper, we report a 16-channel neural spike detection chip based on a customized spike detection method named the exponential component-polynomial component (EC-PC) algorithm. This algorithm features a reliable prediction of spikes by applying a probability threshold. The chip takes raw data as input and outputs three data streams simultaneously: field potentials, band-pass filtered neural data, and spiking probability maps. The algorithm parameters are configured on-chip automatically based on the input data, which avoids manual parameter tuning. The chip has been tested with both in vivo experiments for functional verification and bench-top experiments for quantitative performance assessment. The system has a total power consumption of 1.36 mW and occupies an area of 6.71 mm² for 16 channels. When tested on synthesized datasets with spikes and noise segments extracted from in vivo preparations and scaled according to required precisions, the chip outperforms other detectors. A credit-card-sized prototype board was developed to provide power and data management through a USB port.
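The EC-PC algorithm itself fits an exponential (noise) plus polynomial (spike) decomposition of the signal's amplitude statistics, which is beyond a short sketch. Shown instead is the common nonparametric baseline such detectors are typically compared against: a threshold set automatically from a robust, median-based noise estimate, so that, as in the chip above, no manual per-channel tuning is needed.

```python
# Baseline automatic spike detector (NOT the EC-PC algorithm): threshold at
# k times a robust noise estimate derived from the median absolute value.
import numpy as np

def detect_spikes(x, k=5.0):
    sigma = np.median(np.abs(x)) / 0.6745   # robust estimate of noise std
    return np.flatnonzero(np.abs(x) > k * sigma)

rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 2000)
trace[[300, 1200]] += 40.0                  # two injected "spikes"
print(detect_spikes(trace))
```

The median-based estimate is insensitive to the spikes themselves, which is why the threshold can be set from the raw data without supervision.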

  20. Synthesis and characterization of CaCO3 (calcite) nano particles from cockle shells (Anadara granosa Linn) by precipitation method

    NASA Astrophysics Data System (ADS)

    Widyastuti, Sri; Intan Ayu Kusuma, P.

    2017-06-01

    Calcium supplements can reduce the risk of osteoporosis, but they are not automatically absorbed in the gastrointestinal tract. Nanotechnology is presumed to have the capacity to resolve this problem. The preparation and characterization of calcium carbonate nanoparticles to improve solubility were performed. Calcium carbonate nanoparticles were synthesized by a precipitation method from cockle shells (Anadara granosa Linn). Samples of the cockle shells were dried in an oven at a temperature of 50°C for seven days, then crushed and blended into a fine powder that was sieved through a 125-μm sieve. The calcium carbonate nanocrystals were synthesized by extraction with hydrochloric acid, and various concentrations of sodium hydroxide were used to precipitate the calcium carbonate nanoparticles. The size of the nanoparticles was determined by SEM, XRD data, and Fourier transform infrared spectroscopy (FT-IR). The XRD results indicated that the crystalline structure and phase purity corresponded to the typical calcite phase, with CaCO3 particles approximately 300 nm in size. A method for large-scale synthesis of aragonite nanoparticles from a low-cost, abundant natural resource such as cockle shells is still required for potential industrial applications.
