Science.gov

Sample records for pcgc automated structure

  1. Revised users manual, Pulverized Coal Gasification or Combustion: 2-dimensional (87-PCGC-2): Final report, Volume 2. [87-PCGC-2

    SciTech Connect

    Smith, P.J.; Smoot, L.D.; Brewster, B.S.

    1987-12-01

    A two-dimensional, steady-state model for describing a variety of reactive and non-reactive flows, including pulverized coal combustion and gasification, is presented. Recent code revisions and additions are described. The model, referred to as 87-PCGC-2, is applicable to cylindrical, axisymmetric systems. Turbulence is accounted for in both the fluid mechanics equations and the combustion scheme. Radiation from gases, walls, and particles is taken into account using either a flux method or the discrete ordinates method. The particle phase is modeled in a Lagrangian framework, such that mean paths of particle groups are followed. Several multi-step coal devolatilization schemes are included, along with a heterogeneous reaction scheme that allows for both diffusion and chemical reaction. Major gas-phase reactions are modeled assuming local instantaneous equilibrium, so that reaction rates are limited by the turbulent mixing rate. A NOx finite-rate chemistry submodel is included which integrates chemical kinetics and the statistics of the turbulence. The gas phase is described by elliptic partial differential equations that are solved by an iterative line-by-line technique. Under-relaxation is used to achieve numerical stability. The generalized nature of the model allows for calculation of isothermal fluid mechanics, gaseous combustion, droplet combustion, particulate combustion, and various mixtures of the above, including combustion of coal-water and coal-oil slurries. Both combustion and gasification environments are permissible. User information and theory are presented, along with sample problems. 106 refs.
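
    The under-relaxation mentioned above is a standard stabilization device in iterative elliptic solvers: each update is scaled by a factor alpha < 1 before being applied. As a minimal sketch (not code from 87-PCGC-2), the following applies under-relaxed Gauss-Seidel sweeps to a 1D Laplace problem:

```python
def solve_laplace_1d(n=11, alpha=0.7, tol=1e-10, max_iter=10000):
    """Solve phi'' = 0 on [0, 1] with phi(0)=0, phi(1)=1 by
    Gauss-Seidel iteration with under-relaxation factor alpha."""
    phi = [0.0] * n
    phi[-1] = 1.0  # boundary condition phi(1) = 1
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, n - 1):
            phi_new = 0.5 * (phi[i - 1] + phi[i + 1])  # Gauss-Seidel update
            change = alpha * (phi_new - phi[i])        # under-relaxed step
            phi[i] += change
            max_change = max(max_change, abs(change))
        if max_change < tol:
            break
    return phi  # converges to the linear profile phi(x) = x
```

    In a production combustion code the same damping is applied per variable per sweep of the line-by-line solver; the toy problem only illustrates the update rule.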

  2. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; McComb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  3. Automated Tape Laying Machine for Composite Structures.

    DTIC Science & Technology

    The invention comprises an automated tape laying machine for laying tape on a composite structure. The tape laying machine has a tape laying head...neatly cut. The automated tape laying device utilizes narrow-width tape to increase machine flexibility and reduce wastage.

  4. Automated Characterization Of Vibrations Of A Structure

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Yam, Yeung; Mettler, Edward; Hadaegh, Fred Y.; Milman, Mark H.; Scheid, Robert E.

    1992-01-01

    Automated method of characterizing dynamical properties of large flexible structure yields estimates of modal parameters used by robust control system to stabilize structure and minimize undesired motions. Based on extraction of desired modal and control-design data from responses of structure to known vibrational excitations. Applicable to terrestrial structures where vibrations are important - aircraft, buildings, bridges, cranes, and drill strings.

  5. Isolation of individual fatty acids in sediments using preparative capillary gas chromatography (PCGC) for radiocarbon analysis at NIES-TERRA

    NASA Astrophysics Data System (ADS)

    Uchida, Masao; Shibata, Yasuyuki; Kawamura, Kimitaka; Yoneda, Minoru; Mukai, Hitoshi; Tanaka, Atsushi; Uehiro, Takashi; Morita, Masatoshi

    2000-10-01

    Compound-specific radiocarbon analysis (CSRA) of individual fatty acids (140-1190 μg C) in an estuarine sediment sample collected from Tokyo Bay was carried out using a recently developed preparative capillary gas chromatography (PCGC) system and accelerator mass spectrometry (AMS). The results showed that the estimated 14C ages of four components varied greatly, from modern (combined iso and anteiso C15:0, C16:0) to 17,000 years BP (C22:0), while the bulk-phase 14C age of the organic matter was 5000 years BP. The 14C ages of the fatty acids derived from phytoplankton and bacteria are much younger than that of the bulk phase. In contrast, the fatty acid originating from terrestrial higher plants (C22:0) shows a much older 14C age of 17,000 years BP.
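
    The 14C ages quoted above follow the standard radiocarbon convention, in which a conventional age is computed from the measured fraction of modern carbon using the Libby mean life of 8033 years. This is general radiocarbon practice, not a formula taken from this paper:

```python
import math

def c14_age(fraction_modern):
    """Conventional radiocarbon age (years BP) from the measured fraction
    of modern carbon: age = -8033 * ln(F14C), with 8033 the Libby mean life."""
    return -8033.0 * math.log(fraction_modern)
```

    A sample at exactly modern activity gives an age of 0 BP, and each halving of activity adds one Libby half-life (about 5568 years).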

  6. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  7. Automated structure solution with the PHENIX suite

    SciTech Connect

    Terwilliger, Thomas C; Zwart, Peter H; Afonine, Pavel V; Grosse-Kunstleve, Ralf W

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate-resolution and good-quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  8. Automated Structure Solution with the PHENIX Suite

    SciTech Connect

    Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Thomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  9. Automated modeling of RNA 3D structure.

    PubMed

    Rother, Kristian; Rother, Magdalena; Skiba, Pawel; Bujnicki, Janusz M

    2014-01-01

    This chapter gives an overview of the current methods for automated modeling of RNA structures, with emphasis on template-based methods. The approaches currently used for RNA modeling are presented with a side view on the protein world, where many similar ideas have been applied. Two main programs for automated template-based modeling are presented: ModeRNA, which assembles structures from fragments, and MacroMoleculeBuilder, which performs a simulation to satisfy spatial restraints. Both approaches have in common that they require an alignment of the target sequence to a known RNA structure that is used as a modeling template. As a way to find promising template structures and to align the target and template sequences, we propose a pipeline combining the ParAlign and Infernal programs on RNA family data from Rfam. We also briefly summarize template-free methods for RNA 3D structure prediction. Typically, RNA structures generated by automated modeling methods require local or global optimization. Thus, we also discuss methods that can be used for local or global refinement of RNA structures.

  10. Carbohydrate structure: the rocky road to automation.

    PubMed

    Agirre, Jon; Davies, Gideon J; Wilson, Keith S; Cowtan, Kevin D

    2016-12-08

    With the introduction of intuitive graphical software, structural biologists who are not experts in crystallography are now able to build complete protein or nucleic acid models rapidly. In contrast, carbohydrates are in a wholly different situation: scant automation exists, with manual building attempts being sometimes toppled by incorrect dictionaries or refinement problems. Sugars are the most stereochemically complex family of biomolecules and, as pyranose rings, have clear conformational preferences. Despite this, all refinement programs may produce high-energy conformations at medium to low resolution, without any support from the electron density. This problem renders the affected structures unusable in glyco-chemical terms. Bringing structural glycobiology up to 'protein standards' will require a total overhaul of the methodology. Time is of the essence, as the community is steadily increasing the production rate of glycoproteins, and electron cryo-microscopy has just started to image them in precisely that resolution range where crystallographic methods falter most.

  11. Automated S/TEM metrology on advanced semiconductor gate structures

    NASA Astrophysics Data System (ADS)

    Strauss, M.; Arjavac, J.; Horspool, D. N.; Nakahara, K.; Deeb, C.; Hobbs, C.

    2012-03-01

    Alternative techniques for obtaining metrology data from advanced semiconductor device structures may be required. Automated STEM-based dimensional metrology (CD-STEM) was developed for complex 3D geometries in read/write head metrology in the hard disk drive industry. It has been widely adopted and is the process of record for metrology. Fully automated S/TEM metrology on advanced semiconductor gate structures is viable, with good repeatability and robustness. Consistent automated throughput of 10 samples per hour was achieved. Automated sample preparation was developed with sufficient throughput and quality to support the automated CD-STEM.

  12. Hitting the MARC: Database Structure for Library Automation.

    ERIC Educational Resources Information Center

    Calame, Albert P.

    2000-01-01

    Outlines the basic database structure that should be in place in library automation systems. Suggests that any library automation system that includes a catalog should be using a database engine that uses the MARC record structure as its basic structure for record storage. Describes the elements of MARC 21 formats--the new standards for the…

  13. Pricing Structures for Automated Library Consortia.

    ERIC Educational Resources Information Center

    Machovec, George S.

    1993-01-01

    Discusses the development of successful pricing algorithms for cooperative library automation projects. Highlights include desirable characteristics of pricing measures, including simplicity and the ability to allow for system growth; problems with transaction-based systems; and a review of the pricing strategies of seven library consortia.…

  14. On automation of the procedure for crystal structure model refinement

    SciTech Connect

    Dudka, A. P.

    2008-03-15

    The methods of automating the procedure for crystal structure model refinement from experimental diffraction data, implemented in the ASTRA program package, are described. Tools such as statistical tests, parameter scanning, and data scanning reduce the time necessary for structural investigation. When parameters are strongly correlated, especially when the data set is limited, parameter scanning has an advantage over full-matrix refinement.

  15. Automating the determination of 3D protein structure

    SciTech Connect

    Rayl, K.D.

    1993-12-31

    The creation of an automated method for determining 3D protein structure would be invaluable to the field of biology and presents an interesting challenge to computer science. Unfortunately, given the current level of protein knowledge, a completely automated solution method is not yet feasible; therefore, our group has decided to integrate existing databases and theories to create a software system that assists X-ray crystallographers in specifying a particular protein structure. By breaking the problem of determining overall protein structure into small subproblems, we hope to come closer to solving a novel structure by solving each component. By generating necessary information for structure determination, this method provides the first step toward designing a program to determine protein conformation automatically.

  16. The Phenix software for automated determination of macromolecular structures.

    PubMed

    Adams, Paul D; Afonine, Pavel V; Bunkóczi, Gábor; Chen, Vincent B; Echols, Nathaniel; Headd, Jeffrey J; Hung, Li-Wei; Jain, Swati; Kapral, Gary J; Grosse-Kunstleve, Ralf W; McCoy, Airlie J; Moriarty, Nigel W; Oeffner, Robert D; Read, Randy J; Richardson, David C; Richardson, Jane S; Terwilliger, Thomas C; Zwart, Peter H

    2011-09-01

    X-ray crystallography is a critical tool in the study of biological systems. It is able to provide information that has been a prerequisite to understanding the fundamentals of life. It is also a method that is central to the development of new therapeutics for human disease. Significant time and effort are required to determine and optimize many macromolecular structures because of the need for manual interpretation of complex numerical data, often using many different software packages, and the repeated use of interactive three-dimensional graphics. The Phenix software package has been developed to provide a comprehensive system for macromolecular crystallographic structure solution with an emphasis on automation. This has required the development of new algorithms that minimize or eliminate subjective input in favor of built-in expert-systems knowledge, the automation of procedures that are traditionally performed by hand, and the development of a computational framework that allows a tight integration between the algorithms. The application of automated methods is particularly appropriate in the field of structural proteomics, where high throughput is desired. Features in Phenix for the automation of experimental phasing with subsequent model building, molecular replacement, structure refinement and validation are described and examples given of running Phenix from both the command line and graphical user interface.

  17. Automated sizing of large structures by mixed optimization methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.; Loendorf, D.

    1973-01-01

    A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure is demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.
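
    The fully stressed design step used above can be sketched in a few lines. This is an illustrative toy, not the NASA program: it assumes a statically determinate structure, so member forces do not change as areas are resized, and each area is scaled by the ratio of its stress to the allowable stress down to a minimum gauge.

```python
def fsd_iterate(forces, areas, sigma_allow, a_min=1e-4, iters=20):
    """Fully-stressed-design resizing for bar members with fixed axial
    forces: repeatedly set A_new = A * |sigma| / sigma_allow, clipped
    at a minimum-gauge area a_min."""
    for _ in range(iters):
        stresses = [f / a for f, a in zip(forces, areas)]  # sigma = F / A
        areas = [max(a * abs(s) / sigma_allow, a_min)
                 for a, s in zip(areas, stresses)]
    return areas
```

    For fixed forces the iteration converges to |F| / sigma_allow per member (or the minimum gauge); in a real airframe the forces redistribute after each resizing, which is why the full procedure couples this step with reanalysis and mathematical programming for the remaining constraints.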

  18. Automated structural classification of lipids by machine learning.

    PubMed

    Taylor, Ryan; Miller, Ryan H; Miller, Ryan D; Porter, Michael; Dalgleish, James; Prince, John T

    2015-03-01

    Modern lipidomics is largely dependent upon structural ontologies because of the great diversity exhibited in the lipidome, but no automated lipid classification exists to facilitate this partitioning. The size of the putative lipidome far exceeds the number currently classified, despite a decade of work. Automated classification would benefit ongoing classification efforts by decreasing the time needed and increasing the accuracy of classification while providing classifications for mass spectral identification algorithms. We introduce a tool that automates classification into the LIPID MAPS ontology of known lipids with >95% accuracy and novel lipids with 63% accuracy. The classification is based upon simple chemical characteristics and modern machine learning algorithms. The decision trees produced are intelligible and can be used to clarify implicit assumptions about the current LIPID MAPS classification scheme. These characteristics and decision trees are made available to facilitate alternative implementations. We also discovered many hundreds of lipids that are currently misclassified in the LIPID MAPS database, strongly underscoring the need for automated classification. Source code and chemical characteristic lists as SMARTS search strings are available under an open-source license at https://www.github.com/princelab/lipid_classifier.
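
    A decision tree over simple chemical characteristics, as described above, can be pictured with a hand-written sketch. The feature names and branch order below are invented for illustration (the actual tool learns its trees from SMARTS matches); the category names are LIPID MAPS top-level categories:

```python
def classify_lipid(has_glycerol_backbone, has_phosphate, has_sphingoid_base):
    """Toy decision tree assigning a coarse LIPID MAPS category from
    boolean chemical characteristics (features chosen for illustration)."""
    if has_sphingoid_base:
        return "Sphingolipid"
    if has_glycerol_backbone:
        # glycerol backbone plus a phosphate group -> glycerophospholipid
        return "Glycerophospholipid" if has_phosphate else "Glycerolipid"
    return "Fatty Acyl"
```

    The real classifier uses many more such substructure tests, but each internal node has the same shape: a boolean chemical characteristic splitting the candidates.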

  19. Computer automated structure evaluation of quinolone antibacterial agents.

    PubMed Central

    Klopman, G; Macina, O T; Levinson, M E; Rosenkranz, H S

    1987-01-01

    The Computer Automated Structure Evaluation (CASE) program was used to study a series of quinolone antibacterial agents for which experimental data pertaining to DNA gyrase inhibition as well as MICs against several strains of gram-positive and gram-negative bacteria are available. The result of the analysis was the automatic generation of molecular fragments relevant to the respective biological endpoints. The potential significance of these major activating-inactivating fragments to the biological activity is discussed. PMID:2829716

  20. Automated Localization of Multiple Pelvic Bone Structures on MRI.

    PubMed

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2016-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are at present identified manually on MRI to locate reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures automatically. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this paper, we present a model that combines support vector machines and nonlinear regression capturing global and local information to automatically identify the bounding boxes of bone structures on MRI. The model identifies the location of the pelvic bone structures by establishing the association between their relative locations and using local information such as texture features. Results show that the proposed method is able to locate the bone structures of interest accurately (Dice similarity index >0.75) in 87-91% of the images. This research aims to enable accurate, consistent, and fully automated localization of bone structures on MRI to facilitate and improve the diagnosis of health conditions such as female POP.
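
    The Dice similarity index used as the detection criterion above is a standard overlap measure. For axis-aligned bounding boxes it is the intersection area counted twice, divided by the sum of the two box areas (this sketch is the generic definition, not code from the paper):

```python
def dice_index(box_a, box_b):
    """Dice similarity index between two axis-aligned boxes, each given
    as (x_min, y_min, x_max, y_max): 2*|A ∩ B| / (|A| + |B|)."""
    ax0, ay0, ax1, ay1 = box_a
    bx0, by0, bx1, by1 = box_b
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))  # intersection width
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))  # intersection height
    inter = iw * ih
    area_a = (ax1 - ax0) * (ay1 - ay0)
    area_b = (bx1 - bx0) * (by1 - by0)
    return 2.0 * inter / (area_a + area_b) if (area_a + area_b) else 0.0
```

    A predicted box identical to the ground truth scores 1.0, disjoint boxes score 0.0, and the >0.75 threshold corresponds to substantial overlap.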

  21. Fully automated localization of multiple pelvic bone structures on MRI.

    PubMed

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2014-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are currently identified manually on MRI to identify reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures without any user interaction. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this research, we present a model that automatically identifies the bounding boxes of the bone structures on MRI using support vector machine (SVM)-based classification and a nonlinear regression model that captures global and local information. Based on the relative locations of pelvic bones and organs, and on local information such as texture features, the model identifies the location of the pelvic bone structures by establishing the association between their locations. Results show that the proposed method is able to locate the bone structures of interest accurately. The pubic bone, sacral promontory, and coccyx were correctly detected (DSI > 0.75) in 92%, 90%, and 88% of the testing images, respectively. This research aims to enable accurate, consistent, and fully automated identification of pelvic bone structures on MRI to facilitate and improve the diagnosis of female pelvic organ prolapse.

  22. Automated output-only dynamic identification of civil engineering structures

    NASA Astrophysics Data System (ADS)

    Rainieri, C.; Fabbrocino, G.

    2010-04-01

    Modal-based damage detection algorithms are well-known techniques for structural health assessment, but they are not commonly used due to the lack of automated modal identification and tracking procedures. Development of such procedures is not a trivial task, since traditional modal identification requires extensive interaction from an expert user. Computational effort must also be considered carefully: while fast on-line data processing is crucial for systems that vary quickly in time (such as a rocket burning fuel), many vibration-based condition monitoring applications operate at much longer time scales, leaving satisfactory time steps for on-line data analysis. Moreover, promising results in the field of automated modal identification have recently been achieved. In the present paper, a literature review on this topic is presented and recent developments concerning fully automated output-only modal identification procedures are described. Some case studies are also reported in order to validate the approach. They are characterized by different levels of complexity in terms of mode coupling, dynamic interaction effects and level of vibration. Advantages and drawbacks of the proposed approach are pointed out with reference to available experimental results. The final objective is the implementation of a fully automated system for vibration-based structural health monitoring of civil engineering structures, together with the identification of adequate requirements on sensor number and layout, record duration and hardware characteristics to ensure a reliable, low-cost health assessment of constructions. Results of applying the proposed methodology to modal parameter estimation in operational conditions and during ground motions induced by the recent L'Aquila earthquake are finally presented and discussed.

  23. PYMORPH: automated galaxy structural parameter estimation using PYTHON

    NASA Astrophysics Data System (ADS)

    Vikram, Vinu; Wadadekar, Yogesh; Kembhavi, Ajit K.; Vijayagovindan, G. V.

    2010-12-01

    We present a new software pipeline - PYMORPH - for automated estimation of structural parameters of galaxies. Both parametric fits through a two-dimensional bulge-disc decomposition and structural parameter measurements such as concentration and asymmetry are supported. The pipeline is designed to be easy to use yet flexible; individual software modules can be replaced with ease. A find-and-fit mode is available so that all galaxies in an image can be measured with a single command. A parallel version of the PYMORPH pipeline runs on computer clusters, and a Virtual Observatory-compatible, web-enabled interface is under development.
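
    One of the non-parametric measurements mentioned above, concentration, is commonly defined (in the CAS scheme) from the radii enclosing 80% and 20% of a galaxy's light. This is the standard definition, not necessarily the exact one implemented in PYMORPH:

```python
import math

def concentration_index(r80, r20):
    """Concentration parameter C = 5 * log10(r80 / r20), where r80 and
    r20 are the radii enclosing 80% and 20% of the total flux."""
    return 5.0 * math.log10(r80 / r20)
```

    Highly concentrated light profiles (small r20 relative to r80) give larger C; a ratio of 10 between the two radii yields C = 5.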

  24. Automated quantification of lung structures from optical coherence tomography images

    PubMed Central

    Pagnozzi, Alex M.; Kirk, Rodney W.; Kennedy, Brendan F.; Sampson, David D.; McLaughlin, Robert A.

    2013-01-01

    Characterization of the size of lung structures can aid in the assessment of a range of respiratory diseases. In this paper, we present a fully automated segmentation and quantification algorithm for the delineation of large numbers of lung structures in optical coherence tomography images, and the characterization of their size using the stereological measure of median chord length. We demonstrate this algorithm on scans acquired with OCT needle probes in fresh, ex vivo tissues from two healthy animal models: pig and rat. Automatically computed estimates of lung structure size were validated against manual measures. In addition, we present 3D visualizations of the lung structures using the segmentation calculated for each data set. This method has the potential to provide an in vivo indicator of structural remodeling caused by a range of respiratory diseases, including chronic obstructive pulmonary disease and pulmonary fibrosis. PMID:24298402
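
    The stereological measure used above, median chord length, can be illustrated on a single image line: a chord is a consecutive run of pixels inside a segmented structure, and its length is the run length times the pixel size. A minimal sketch (not the paper's algorithm, which operates on full segmented volumes):

```python
from statistics import median

def median_chord_length(mask_row, pixel_size=1.0):
    """Median chord length along one image line. mask_row is a sequence of
    booleans (True = inside a structure); chords are maximal True runs."""
    chords, run = [], 0
    for inside in mask_row:
        if inside:
            run += 1
        elif run:
            chords.append(run * pixel_size)
            run = 0
    if run:  # close a run that reaches the end of the line
        chords.append(run * pixel_size)
    return median(chords) if chords else 0.0
```

    Aggregating chords over many lines and orientations gives a size estimate that is robust to the irregular shapes of alveolar structures.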

  25. A telerobotic system for automated assembly of large space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Wise, Marion A.

    1989-01-01

    Future space missions such as polar platforms and antennas are anticipated to require large truss structures as their primary support system. During the past several years, considerable research has been conducted to develop hardware and construction techniques suitable for astronaut assembly of truss structures in space. A research program has recently been initiated to develop the technology and to demonstrate the potential for automated in-space assembly of large erectable structures. The initial effort will be focused on automated assembly of a tetrahedral truss composed of 2-meter members. The facility is designed as a ground-based system to permit evaluation of assembly concepts and was not designed for space qualification. The system is intended to be used as a tool from which more sophisticated procedures and operations can be developed. The facility includes a truss structure, motion bases, and a robot arm equipped with an end effector. Other considerations and requirements of the structural assembly concern the computer control systems that monitor and control the operations of the assembly facility.

  26. Automated 3D structure composition for large RNAs.

    PubMed

    Popenda, Mariusz; Szachniuk, Marta; Antczak, Maciej; Purzycka, Katarzyna J; Lukasiak, Piotr; Bartol, Natalia; Blazewicz, Jacek; Adamiak, Ryszard W

    2012-08-01

    Understanding the numerous functions that RNAs play in living cells depends critically on knowledge of their three-dimensional structure. Due to the difficulties in experimentally assessing structures of large RNAs, there is currently great demand for new high-resolution structure prediction methods. We present a novel method for the fully automated prediction of RNA 3D structures from a user-defined secondary structure. The concept is founded on a machine translation system. The translation engine operates on the RNA FRABASE database, tailored to a dictionary relating RNA secondary structure and tertiary structure elements. The translation algorithm is very fast; an initial 3D structure is composed within seconds on a single processor. The method assures the prediction of large RNA 3D structures of high quality. Our approach needs neither structural templates nor the RNA sequence alignment required for comparative methods. This enables the building of unresolved yet native, as well as artificial, RNA structures. The method is implemented in a publicly available, user-friendly server, RNAComposer, which works in an interactive mode and a batch mode. The batch mode is designed for large-scale modelling and accepts atomic distance restraints. Presently, the server is set to build RNA structures of up to 500 residues.
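
    The user-defined secondary structure that drives this kind of pipeline is conventionally written in dot-bracket notation, where matched parentheses mark base pairs and dots mark unpaired residues. The sketch below only shows that standard input convention, parsed with a stack; it is not part of the RNAComposer engine:

```python
def parse_dot_bracket(structure):
    """Extract base-pair index tuples from a dot-bracket string,
    e.g. '((..))' -> [(0, 5), (1, 4)]."""
    stack, pairs = [], []
    for i, ch in enumerate(structure):
        if ch == '(':
            stack.append(i)            # open a pair
        elif ch == ')':
            if not stack:
                raise ValueError(f"unmatched ')' at position {i}")
            pairs.append((stack.pop(), i))  # close the innermost open pair
        elif ch != '.':
            raise ValueError(f"unexpected character {ch!r}")
    if stack:
        raise ValueError("unmatched '(' remaining")
    return sorted(pairs)
```

    The resulting pair list is what a secondary-structure-to-3D method decomposes into stems and loops before looking up matching tertiary fragments.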

  27. Automated 3D structure composition for large RNAs

    PubMed Central

    Popenda, Mariusz; Szachniuk, Marta; Antczak, Maciej; Purzycka, Katarzyna J.; Lukasiak, Piotr; Bartol, Natalia; Blazewicz, Jacek; Adamiak, Ryszard W.

    2012-01-01

    Understanding the numerous functions that RNAs play in living cells depends critically on knowledge of their three-dimensional structure. Due to the difficulties in experimentally assessing structures of large RNAs, there is currently great demand for new high-resolution structure prediction methods. We present a novel method for the fully automated prediction of RNA 3D structures from a user-defined secondary structure. The concept is founded on a machine translation system. The translation engine operates on the RNA FRABASE database, tailored to a dictionary relating RNA secondary structure and tertiary structure elements. The translation algorithm is very fast; an initial 3D structure is composed within seconds on a single processor. The method assures the prediction of large RNA 3D structures of high quality. Our approach needs neither structural templates nor the RNA sequence alignment required for comparative methods. This enables the building of unresolved yet native, as well as artificial, RNA structures. The method is implemented in a publicly available, user-friendly server, RNAComposer, which works in an interactive mode and a batch mode. The batch mode is designed for large-scale modelling and accepts atomic distance restraints. Presently, the server is set to build RNA structures of up to 500 residues. PMID:22539264

  8. An automated, integrated approach to Space Station structural modeling

    NASA Technical Reports Server (NTRS)

    Lindenmoyer, Alan J.; Habermeyer, John A.

    1989-01-01

    NASA and its contractors have developed an integrated, interdisciplinary CAD/analysis system designated IDEAS**2 in order to conduct evaluations of alternative Space Station concepts' performance over the projected course of the Station's evolution in orbit. Attention is presently given to the requirements associated with automated FEM-building methods applicable to Space Station system-level structural dynamic analysis, and the ways in which IDEAS**2 addresses these requirements. Advantage is taken of the interactive capabilities of the SUPERTAB FEM preprocessor system for Space Station model manipulation and modification.

  9. Towards automated crystallographic structure refinement with phenix.refine

    PubMed Central

    Afonine, Pavel V.; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W.; Mustyakimov, Marat; Terwilliger, Thomas C.; Urzhumtsev, Alexandre; Zwart, Peter H.; Adams, Paul D.

    2012-01-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods. PMID:22505256

  10. Towards automated crystallographic structure refinement with phenix.refine.

    PubMed

    Afonine, Pavel V; Grosse-Kunstleve, Ralf W; Echols, Nathaniel; Headd, Jeffrey J; Moriarty, Nigel W; Mustyakimov, Marat; Terwilliger, Thomas C; Urzhumtsev, Alexandre; Zwart, Peter H; Adams, Paul D

    2012-04-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods.

  11. ModLoop: automated modeling of loops in protein structures.

    PubMed

    Fiser, András; Sali, Andrej

    2003-12-12

    ModLoop is a web server for automated modeling of loops in protein structures. The input is the atomic coordinates of the protein structure in the Protein Data Bank format, and the specification of the starting and ending residues of one or more segments to be modeled, containing no more than 20 residues in total. The output is the coordinates of the non-hydrogen atoms in the modeled segments. A user provides the input to the server via a simple web interface, and receives the output by e-mail. The server relies on the loop modeling routine in MODELLER that predicts the loop conformations by satisfaction of spatial restraints, without relying on a database of known protein structures. For a rapid response, ModLoop runs on a cluster of Linux PC computers. The server is freely accessible to academic users at http://salilab.org/modloop

  12. Verification Test of Automated Robotic Assembly of Space Truss Structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1995-01-01

    A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  13. The automated strength-aeroelastic design of aerospace structures program

    NASA Technical Reports Server (NTRS)

    Johnson, E. H.; Venkayya, V. B.

    1984-01-01

    An ongoing program whose goal is to develop an automated procedure that can assist in the preliminary design of aircraft and space structures is described. The approach and the capabilities that are to be included in the final procedure are discussed. By using proven engineering software as a basis for the project, a reliable and interdisciplinary procedure is developed. The use of a control language for module sequencing and execution permits efficient development of the procedure and gives the user significant flexibility in altering or enhancing the procedure. The database system provides reliable and efficient access to the large amounts of interrelated data required in an enterprise of this sort. In addition, the database allows interfacing with existing pre- and post-processors in an almost trivial manner. Altogether, the procedure promises to be of considerable utility to preliminary structural design teams.

  14. pmx: Automated protein structure and topology generation for alchemical perturbations.

    PubMed

    Gapsys, Vytautas; Michielssens, Servaas; Seeliger, Daniel; de Groot, Bert L

    2015-02-15

    Computational protein design requires methods to accurately estimate free energy changes in protein stability or binding upon an amino acid mutation. From the different approaches available, molecular dynamics-based alchemical free energy calculations are unique in their accuracy and solid theoretical basis. The challenge in using these methods lies in the need to generate hybrid structures and topologies representing two physical states of a system. A custom made hybrid topology may prove useful for a particular mutation of interest, however, a high throughput mutation analysis calls for a more general approach. In this work, we present an automated procedure to generate hybrid structures and topologies for the amino acid mutations in all commonly used force fields. The described software is compatible with the Gromacs simulation package. The mutation libraries are readily supported for five force fields, namely Amber99SB, Amber99SB*-ILDN, OPLS-AA/L, Charmm22*, and Charmm36.
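    The hybrid structures described above follow the dual-topology bookkeeping common to alchemical setups: atoms shared by both end states carry parameters for each state, while atoms present in only one state become dummies in the other. The sketch below is a cartoon of that bookkeeping only, not pmx's actual topology generation; the atom names, type strings, and `'DUM'` placeholder are our assumptions:

    ```python
    def hybrid_atoms(state_a, state_b):
        """Build a hybrid atom table for an A -> B amino acid mutation.
        state_a / state_b map atom name -> atom type for each end state.
        An atom missing from one state is given a dummy type there,
        as in dual-topology free energy setups."""
        names = sorted(set(state_a) | set(state_b))
        return {n: (state_a.get(n, 'DUM'), state_b.get(n, 'DUM'))
                for n in names}
    ```

    For an Ala -> Ser mutation, the CB atom would carry a type in both states while OG exists only in state B and is a dummy in state A.
    
    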

  15. pmx: Automated protein structure and topology generation for alchemical perturbations

    PubMed Central

    Gapsys, Vytautas; Michielssens, Servaas; Seeliger, Daniel; de Groot, Bert L

    2015-01-01

    Computational protein design requires methods to accurately estimate free energy changes in protein stability or binding upon an amino acid mutation. From the different approaches available, molecular dynamics-based alchemical free energy calculations are unique in their accuracy and solid theoretical basis. The challenge in using these methods lies in the need to generate hybrid structures and topologies representing two physical states of a system. A custom made hybrid topology may prove useful for a particular mutation of interest, however, a high throughput mutation analysis calls for a more general approach. In this work, we present an automated procedure to generate hybrid structures and topologies for the amino acid mutations in all commonly used force fields. The described software is compatible with the Gromacs simulation package. The mutation libraries are readily supported for five force fields, namely Amber99SB, Amber99SB*-ILDN, OPLS-AA/L, Charmm22*, and Charmm36. PMID:25487359

  16. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    PubMed Central

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. To this end, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms, we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
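    The non-structured clustering baselines named above (K-means, GMM) share the same core idea: group voxel intensities around learned class centres with no labelled training data. A generic Lloyd's-algorithm K-means on scalar intensities, as a self-contained sketch (not the paper's pipeline; the synthetic data and function name are ours):

    ```python
    import numpy as np

    def kmeans_1d(x, k, iters=50, seed=0):
        """Plain K-means (Lloyd's algorithm) on scalar intensities.
        Returns per-sample labels and the sorted cluster centres."""
        rng = np.random.default_rng(seed)
        centers = rng.choice(x, size=k, replace=False)
        for _ in range(iters):
            # assign each sample to its nearest centre
            labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
            # move each centre to the mean of its assigned samples
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = x[labels == j].mean()
        return labels, np.sort(centers)
    ```

    On a toy image whose voxels concentrate at two intensity levels, the centres converge to those levels; the GMM variant would additionally model per-class variances and soft memberships.
    
    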

  17. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    PubMed

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. To this end, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms, we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation.

  18. Automated optimum design of wing structures. Deterministic and probabilistic approaches

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1982-01-01

    The automated optimum design of airplane wing structures subjected to multiple behavior constraints is described. The structural mass of the wing is considered the objective function. The maximum stress, wing tip deflection, root angle of attack, and flutter velocity during the pull up maneuver (static load), the natural frequencies of the wing structure, and the stresses induced in the wing structure due to landing and gust loads are suitably constrained. Both deterministic and probabilistic approaches are used for finding the stresses induced in the airplane wing structure due to landing and gust loads. A wing design is represented by a uniform beam with a cross section in the form of a hollow symmetric double wedge. The airfoil thickness and chord length are the design variables, and a graphical procedure is used to find the optimum solutions. A supersonic wing design is represented by finite elements. The thicknesses of the skin and the web and the cross sectional areas of the flanges are the design variables, and nonlinear programming techniques are used to find the optimum solution.

  19. Semi-Automated Discovery of Application Session Structure

    SciTech Connect

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
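    The Poisson-process intuition behind such session mining can be sketched simply: if connections within a session arrive as a Poisson process with rate `rate`, inter-arrival gaps are exponentially distributed, and a gap far out in the exponential tail likely marks a session boundary. The threshold rule below is a simplified illustration of that idea, not the authors' algorithm; the parameter names are ours:

    ```python
    import math

    def split_sessions(times, rate, p=0.001):
        """Group sorted connection timestamps into candidate sessions.
        A gap longer than the (1 - p) quantile of an Exp(rate)
        inter-arrival distribution starts a new session."""
        cutoff = -math.log(p) / rate   # exponential tail quantile
        sessions, cur = [], [times[0]]
        for prev, t in zip(times, times[1:]):
            if t - prev > cutoff:
                sessions.append(cur)
                cur = []
            cur.append(t)
        sessions.append(cur)
        return sessions
    ```

    With `rate = 1` connection/second and `p = 0.001`, the cutoff is about 6.9 s, so a burst at t = 0, 1, 2 and another at t = 100, 101 split into two sessions.
    
    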

  20. An automated approach to network features of protein structure ensembles

    PubMed Central

    Bhattacharyya, Moitrayee; Bhat, Chanda R; Vishveshwara, Saraswathi

    2013-01-01

    Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from PDB and simulations establishes a need to introduce a standalone-efficient program that assembles network concepts/parameters under one hood in an automated manner. Herein, we discuss the development/application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulation/NMR studies or from multiple X-ray structures. The novelty in network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction of a flexible weighing scheme in terms of residue pairwise cross-correlation/interaction energy in PSN-Ensemble brings in dynamical/chemical knowledge into the network representation. Also, the results are mapped on a graphical display of the structure, allowing an easy access of network analysis to a general biological community. The potential of PSN-Ensemble toward examining structural ensemble is exemplified using MD trajectories of an ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single-static structures of active/inactive states of β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html. PMID:23934896
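    At its core, a protein structure network of the kind PSN-Ensemble builds is a graph whose nodes are residues and whose edges are (weighted) side-chain interactions, over which paths quantify long-range communication. A minimal sketch of the shortest-path step over a hypothetical unweighted adjacency list (generic BFS, not PSN-Ensemble's weighted scheme):

    ```python
    from collections import deque

    def shortest_path(adj, src, dst):
        """Breadth-first shortest path between two residues in an
        unweighted residue-interaction graph given as {node: [neighbours]}."""
        prev = {src: None}
        q = deque([src])
        while q:
            u = q.popleft()
            if u == dst:                 # reconstruct path by backtracking
                path = []
                while u is not None:
                    path.append(u)
                    u = prev[u]
                return path[::-1]
            for v in adj.get(u, ()):
                if v not in prev:
                    prev[v] = u
                    q.append(v)
        return None                      # no path: residues not connected
    ```

    In the real tool, edge weights derived from pairwise cross-correlations or interaction energies would replace the unit-cost edges used here.
    
    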

  1. Development of a machine vision system for automated structural assembly

    NASA Technical Reports Server (NTRS)

    Sydow, P. Daniel; Cooper, Eric G.

    1992-01-01

    Research is being conducted at the LaRC to develop a telerobotic assembly system designed to construct large space truss structures. This research program was initiated within the past several years, and a ground-based test-bed was developed to evaluate and expand the state of the art. Test-bed operations currently use predetermined ('taught') points for truss structural assembly. Total dependence on the use of taught points for joint receptacle capture and strut installation is neither robust nor reliable enough for space operations. Therefore, a machine vision sensor guidance system is being developed to locate and guide the robot to a passive target mounted on the truss joint receptacle. The vision system hardware includes a miniature video camera, passive targets mounted on the joint receptacles, target illumination hardware, and an image processing system. Discrimination of the target from background clutter is accomplished through standard digital processing techniques. Once the target is identified, a pose estimation algorithm is invoked to determine the location, in three-dimensional space, of the target relative to the robot's end-effector. Preliminary test results of the vision system in the Automated Structural Assembly Laboratory with a range of lighting and background conditions indicate that it is fully capable of successfully identifying joint receptacle targets throughout the required operational range. Controlled optical bench test results indicate that the system can also provide the pose estimation accuracy to define the target position.

  2. Automated segmentation of tissue structures in optical coherence tomography data

    NASA Astrophysics Data System (ADS)

    Gasca, Fernando; Ramrath, Lukas; Huettmann, Gereon; Schweikard, Achim

    2009-05-01

    Segmentation of optical coherence tomography (OCT) images provides useful information, especially in medical imaging applications. Because OCT images are subject to speckle noise, the identification of structures is complicated. Addressing this issue, two methods for the automated segmentation of arbitrary structures in OCT images are proposed. The methods perform a seeded region growing, applying a model-based analysis of OCT A-scans for the seed's acquisition. The segmentation therefore avoids any user-intervention dependency. The first region-growing algorithm uses an adaptive neighborhood homogeneity criterion based on a model of an OCT intensity course in tissue and a model of speckle noise corruption. It can be applied to an unfiltered OCT image. The second performs region growing on a filtered OCT image applying the local median as a measure for homogeneity in the region. Performance is compared through the quantitative evaluation of artificial data, showing the capabilities of both in terms of structures detected and leakage. The proposed methods were tested on real OCT data in different scenarios and showed promising results for their application in OCT imaging.
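    The second variant described above (region growing with the local median as the homogeneity measure) can be sketched on a toy image. The 4-neighbour connectivity and scalar tolerance are our simplifications of the published criterion:

    ```python
    import numpy as np

    def region_grow(img, seed, tol):
        """Seeded region growing: starting from `seed` (row, col), add any
        4-neighbour whose intensity lies within `tol` of the median
        intensity of the region grown so far."""
        h, w = img.shape
        mask = np.zeros((h, w), bool)
        mask[seed] = True
        stack, vals = [seed], [img[seed]]
        while stack:
            y, x = stack.pop()
            med = np.median(vals)        # running homogeneity reference
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                        and abs(img[ny, nx] - med) <= tol):
                    mask[ny, nx] = True
                    vals.append(img[ny, nx])
                    stack.append((ny, nx))
        return mask
    ```

    Seeding inside a bright square on a dark background recovers exactly the square; in real OCT data the median criterion is what lends robustness against speckle outliers.
    
    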

  3. Automated segmentation of tissue structures in optical coherence tomography data.

    PubMed

    Gasca, Fernando; Ramrath, Lukas; Huettmann, Gereon; Schweikard, Achim

    2009-01-01

    Segmentation of optical coherence tomography (OCT) images provides useful information, especially in medical imaging applications. Because OCT images are subject to speckle noise, the identification of structures is complicated. Addressing this issue, two methods for the automated segmentation of arbitrary structures in OCT images are proposed. The methods perform a seeded region growing, applying a model-based analysis of OCT A-scans for the seed's acquisition. The segmentation therefore avoids any user-intervention dependency. The first region-growing algorithm uses an adaptive neighborhood homogeneity criterion based on a model of an OCT intensity course in tissue and a model of speckle noise corruption. It can be applied to an unfiltered OCT image. The second performs region growing on a filtered OCT image applying the local median as a measure for homogeneity in the region. Performance is compared through the quantitative evaluation of artificial data, showing the capabilities of both in terms of structures detected and leakage. The proposed methods were tested on real OCT data in different scenarios and showed promising results for their application in OCT imaging.

  4. Towards an automated analysis of bacterial peptidoglycan structure.

    PubMed

    Bern, Marshall; Beniston, Richard; Mesnage, Stéphane

    2017-01-01

    Peptidoglycan (PG) is an essential component of the bacterial cell envelope. This macromolecule consists of glycan chains of alternating N-acetylglucosamine and N-acetylmuramic acid residues, cross-linked by short peptides containing nonstandard amino acids. Structural analysis of PG usually involves enzymatic digestion of glycan strands and separation of disaccharide peptides by reversed-phase HPLC, followed by collection of individual peaks for MALDI-TOF and/or tandem mass spectrometry. Here, we report a novel strategy using shotgun proteomics techniques for a systematic and unbiased structural analysis of PG using high-resolution mass spectrometry and automated analysis of HCD and ETD fragmentation spectra with the Byonic software. Using the PG of the nosocomial pathogen Clostridium difficile as a proof of concept, we show that this high-throughput approach allows the identification of all PG monomers and dimers previously described, leaving only disambiguation of 3-3 and 4-3 cross-linking as a manual step. Our analysis confirms previous findings that C. difficile peptidoglycans include mainly deacetylated N-acetylglucosamine residues and 3-3 cross-links. The analysis also revealed a number of low-abundance muropeptides with peptide sequences not previously reported. Graphical Abstract: The bacterial cell envelope includes the plasma membrane, peptidoglycan, and surface layer. Peptidoglycan is unique to bacteria and the target of the most important antibiotics; here it is analyzed by mass spectrometry.

  5. Automated identification of elemental ions in macromolecular crystal structures

    SciTech Connect

    Echols, Nathaniel; Morshed, Nader; Afonine, Pavel V.; McCoy, Airlie J.; Read, Randy J.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-04-01

    The solvent-picking procedure in phenix.refine has been extended and combined with Phaser anomalous substructure completion and analysis of coordination geometry to identify and place elemental ions. Many macromolecular model-building and refinement programs can automatically place solvent atoms in electron density at moderate-to-high resolution. This process frequently builds water molecules in place of elemental ions, the identification of which must be performed manually. The solvent-picking algorithms in phenix.refine have been extended to build common ions based on an analysis of the chemical environment as well as physical properties such as occupancy, B factor and anomalous scattering. The method is most effective for heavier elements such as calcium and zinc, for which a majority of sites can be placed with few false positives in a diverse test set of structures. At atomic resolution, it is observed that it can also be possible to identify tightly bound sodium and magnesium ions. A number of challenges that contribute to the difficulty of completely automating the process of structure completion are discussed.
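    The abstract describes classifying candidate sites by chemical environment (coordination number, bond distances) and physical properties (occupancy, B factor, anomalous scattering). The rule set below only illustrates that style of decision; the thresholds and return labels are invented for illustration and are not those used by phenix.refine:

    ```python
    def classify_site(occupancy, b_factor, n_coord, avg_dist, anomalous=False):
        """Toy heuristic distinguishing a water from a plausible metal site.
        Tight, well-ordered coordination at short contact distances with
        full occupancy suggests an ion; anomalous signal strengthens the
        assignment (here labelled 'ZN' purely as an example)."""
        if n_coord >= 4 and avg_dist < 2.4 and b_factor < 40 and occupancy > 0.9:
            return 'ZN' if anomalous else 'ion?'
        return 'HOH'   # default: model the site as water
    ```

    A site with four coordinating atoms at ~2.1 Å, a low B factor and anomalous signal would be flagged as a likely zinc, whereas a loosely coordinated site falls back to water.
    
    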

  6. Automated web service composition supporting conditional branch structures

    NASA Astrophysics Data System (ADS)

    Wang, Pengwei; Ding, Zhijun; Jiang, Changjun; Zhou, Mengchu

    2014-01-01

    The creation of value-added services by automatic composition of existing ones is gaining significant momentum as the potential silver bullet in service-oriented architecture. However, service composition faces two difficulties. First, users' needs are diverse, uncertain and personalised; second, the existing services run in a real-world environment that is highly complex and dynamically changing. These difficulties may cause nondeterministic choices to emerge in the process of service composition, which goes beyond what existing automated service composition techniques can handle. In most existing methods, the process model of a composite service includes sequence constructs only. This article presents a method to introduce conditional branch structures into the process model of a composite service when needed, in order to satisfy users' diverse and personalised needs and adapt to the dynamic changes of the real-world environment. UML activity diagrams are used to represent dependencies in the composite service. Two types of user preferences are considered in this article, which have been ignored by previous work; a simple programming-language-style expression is adopted to describe them. Two different algorithms are presented to deal with different situations. A real-life case is provided to illustrate the proposed concepts and methods.

  7. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Sweep/Automated Credit Account File Structure D.... Character (25). 12. SW_Sub_Acct_Identifier Sweep/Automated Credit Sub-Account IdentifierIf available, the.... • AI = Deposit Held in an affiliated depository institution. • FF = Federal Funds. • CP =...

  8. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Sweep/Automated Credit Account File Structure D.... Character (25). 12. SW_Sub_Acct_Identifier Sweep/Automated Credit Sub-Account IdentifierIf available, the.... • AI = Deposit Held in an affiliated depository institution. • FF = Federal Funds. • CP =...

  9. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Sweep/Automated Credit Account File Structure D.... Character (25). 12. SW_Sub_Acct_Identifier Sweep/Automated Credit Sub-Account IdentifierIf available, the.... • AI = Deposit Held in an affiliated depository institution. • FF = Federal Funds. • CP =...

  10. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Sweep/Automated Credit Account File Structure D.... Character (25). 12. SW_Sub_Acct_Identifier Sweep/Automated Credit Sub-Account IdentifierIf available, the.... • AI = Deposit Held in an affiliated depository institution. • FF = Federal Funds. • CP =...

  11. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Sweep/Automated Credit Account File Structure D.... Character (25). 12. SW_Sub_Acct_Identifier Sweep/Automated Credit Sub-Account IdentifierIf available, the.... • AI = Deposit Held in an affiliated depository institution. • FF = Federal Funds. • CP =...

  12. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.
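    The pause-and-reverse capability described above amounts to each assembly operation recording enough information to invert itself, so the supervisor can back the system out of any state. A toy sketch of that design (the class and operation names are ours, not the test-bed software's):

    ```python
    class AssemblySequencer:
        """Minimal reversible operation log: every completed step is
        recorded so the most recent step can be undone on request."""

        def __init__(self):
            self.done = []      # history of completed operations
            self.state = []     # struts currently installed

        def install(self, strut):
            self.state.append(strut)
            self.done.append(('install', strut))

        def reverse_last(self):
            """Undo the most recent operation (supervisor error recovery)."""
            op, strut = self.done.pop()
            if op == 'install':
                self.state.remove(strut)
    ```

    Because reversal only ever consumes the top of the history, any prefix of an assembly sequence can be rolled back step by step, which is the property the report's error-recovery procedures rely on.
    
    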

  13. Automated eukaryotic gene structure annotation using EVidenceModeler and the Program to Assemble Spliced Alignments.

    PubMed

    Haas, Brian J; Salzberg, Steven L; Zhu, Wei; Pertea, Mihaela; Allen, Jonathan E; Orvis, Joshua; White, Owen; Buell, C Robin; Wortman, Jennifer R

    2008-01-11

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.
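    The weighted-consensus idea above can be reduced to its simplest form: each evidence source proposes gene structure elements and carries a weight, and the consensus is the proposal with the greatest accumulated evidence. This is a toy reduction, not EVM's actual dynamic-programming algorithm over exon chains:

    ```python
    def consensus_interval(predictions):
        """Pick the (start, end) boundary pair with the highest total
        evidence weight. `predictions` is a list of ((start, end), weight)
        tuples, one per evidence source supporting that boundary pair."""
        score = {}
        for interval, weight in predictions:
            score[interval] = score.get(interval, 0.0) + weight
        return max(score, key=score.get)
    ```

    Two lightly weighted ab initio predictors agreeing on an exon can thus outvote a single heavily weighted alignment that disagrees, or vice versa, depending on the configured weights.
    
    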

  14. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    SciTech Connect

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  15. Automated construction of lightweight, simple, field-erected structures

    NASA Technical Reports Server (NTRS)

    Leonard, R. S.

    1980-01-01

    The feasibility of automation of construction processes which could result in mobile construction robots is examined. The construction of a large photovoltaic power plant with a peak power output of 100 MW is demonstrated. The reasons to automate the construction process, a conventional construction scenario as the reference for evaluation, and a list of potential cost benefits using robots are presented. The technical feasibility of using robots to construct SPS ground stations is addressed.

  16. Towards fully automated structure-based function prediction in structural genomics: a case study.

    PubMed

    Watson, James D; Sanderson, Steve; Ezersky, Alexandra; Savchenko, Alexei; Edwards, Aled; Orengo, Christine; Joachimiak, Andrzej; Laskowski, Roman A; Thornton, Janet M

    2007-04-13

    As the global Structural Genomics projects have picked up pace, the number of structures annotated in the Protein Data Bank as hypothetical protein or unknown function has grown significantly. A major challenge now involves the development of computational methods to assign functions to these proteins accurately and automatically. As part of the Midwest Center for Structural Genomics (MCSG) we have developed a fully automated functional analysis server, ProFunc, which performs a battery of analyses on a submitted structure. The analyses combine a number of sequence-based and structure-based methods to identify functional clues. After the first stage of the Protein Structure Initiative (PSI), we review the success of the pipeline and the importance of structure-based function prediction. As a dataset, we have chosen all structures solved by the MCSG during the 5 years of the first PSI. Our analysis suggests that two of the structure-based methods are particularly successful and provide examples of local similarity that is difficult to identify using current sequence-based methods. No one method is successful in all cases, so, through the use of a number of complementary sequence and structural approaches, the ProFunc server increases the chances that at least one method will find a significant hit that can help elucidate function. Manual assessment of the results is a time-consuming process and subject to individual interpretation and human error. We present a method based on the Gene Ontology (GO) schema using GO-slims that can allow the automated assessment of hits with a success rate approaching that of expert manual assessment.
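The GO-slim based assessment mentioned at the end of the abstract can be sketched roughly as follows. The toy ontology, term-to-slim mapping, and Jaccard-style agreement score are assumptions for illustration, not ProFunc's actual schema.

```python
# Illustrative sketch of GO-slim based automated assessment: detailed GO
# terms from a hit are mapped up to a small "slim" vocabulary, then the
# slim sets of prediction and reference are compared.

SLIM_PARENT = {                        # detailed term -> slim ancestor (toy)
    "GO:0004672": "kinase activity",
    "GO:0016301": "kinase activity",
    "GO:0003677": "nucleic acid binding",
    "GO:0003723": "nucleic acid binding",
}

def to_slim(terms):
    """Map a set of detailed GO terms onto their slim ancestors."""
    return {SLIM_PARENT[t] for t in terms if t in SLIM_PARENT}

def agreement(predicted, reference):
    """Jaccard overlap of the slim-mapped annotation sets."""
    p, r = to_slim(predicted), to_slim(reference)
    return len(p & r) / len(p | r) if p | r else 0.0

# a hit annotated with one kinase term, scored against a two-term reference
score = agreement({"GO:0004672"}, {"GO:0016301", "GO:0003677"})
```

Comparing at the slim level tolerates differences in annotation depth, which is one way an automated scorer can approach the consistency of expert manual assessment.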

  17. Concurrent combined verification: reducing false positives in automated NMR structure verification through the evaluation of multiple challenge control structures.

    PubMed

    Golotvin, Sergey S; Pol, Rostislav; Sasaki, Ryan R; Nikitina, Asya; Keyes, Philip

    2012-06-01

    Automated structure verification using (1)H NMR data or a combination of (1)H and heteronuclear single-quantum correlation (HSQC) data is gaining more interest as a routine application for qualitative evaluation of large compound libraries produced by synthetic chemistry. The goal of this automated software method is to identify a manageable subset of compounds and data that require human review. In practice, the automated method will flag structure and data combinations that exhibit some inconsistency (i.e. strange chemical shifts, conflicts in multiplicity, or overestimated and underestimated integration values) and validate those that appear consistent. One drawback of this approach is that no automated system can guarantee that all passing structures are indeed correct structures. The major reason for this is that approaches using only (1)H or even (1)H and HSQC spectra often do not provide sufficient information to properly distinguish between similar structures. Therefore, current implementations of automated structure verification systems allow, in principle, false positive results. Presented in this work is a method that greatly reduces the probability of an automated validation system passing incorrect structures (i.e. false positives). This novel method was applied to automatically validate 127 non-proprietary compounds from several commercial sources. Presented also is the impact of this approach on false positive and false negative results. Copyright © 2012 John Wiley & Sons, Ltd.

  18. STUDY OF ALUMINA CRYSTAL STRUCTURES (AUTOMATION OF THE VERNEUIL PROCESS).

    DTIC Science & Technology

    A careful analysis of the basic mechanisms of the Verneuil process led to a methodical study of the many parameters associated with it. Among these...powder feed rate. A completely automated Verneuil apparatus, incorporating this and other control systems, was designed and constructed to study crystal

  19. Instrumentation Automation for Concrete Structures: Report 2, Automation Hardware and Retrofitting Techniques, and Report 3, Available Data Collection and Reduction Software

    DTIC Science & Technology

    1987-06-01

US-CE-C Property of the United States Government. REPAIR, EVALUATION, MAINTENANCE, AND REHABILITATION RESEARCH PROGRAM TECHNICAL REPORT REMR-CS-5... INSTRUMENTATION AUTOMATION FOR CONCRETE STRUCTURES. Report 2: AUTOMATION HARDWARE AND RETROFITTING TECHNIQUES, by Aubrey Keeter, Byron Stonecypher... Vicksburg, Mississippi 39180-0631. The following two letters used as part of the number designating technical reports of research published under the Repair...

  20. Exploring representations of protein structure for automated remote homology detection and mapping of protein structure space

    PubMed Central

    2014-01-01

Background Due to rapid sequencing of genomes, there are now millions of deposited protein sequences with no known function. Fast sequence-based comparisons allow detecting close homologs for a protein of interest to transfer functional information from the homologs to the given protein. Sequence-based comparison cannot detect remote homologs, in which evolution has adjusted the sequence while largely preserving structure. Structure-based comparisons can detect remote homologs but most methods for doing so are too expensive to apply at a large scale over structural databases of proteins. Recently, fragment-based structural representations have been proposed that allow fast detection of remote homologs with reasonable accuracy. These representations have also been used to obtain linearly-reducible maps of protein structure space. It has been shown, as additionally supported by the analysis in this paper, that such maps preserve functional co-localization of the protein structure space. Methods Inspired by a recent application of the Latent Dirichlet Allocation (LDA) model for conducting structural comparisons of proteins, we propose higher-order LDA-obtained topic-based representations of protein structures to provide an alternative route for remote homology detection and organization of the protein structure space in few dimensions. Various techniques based on natural language processing are proposed and employed to aid the analysis of topics in the protein structure domain. Results We show that a topic-based representation is just as effective as a fragment-based one at automated detection of remote homologs and organization of protein structure space. We conduct a detailed analysis of the information content in the topic-based representation, showing that topics have semantic meaning. The fragment-based and topic-based representations are also shown to allow prediction of superfamily membership. Conclusions This work opens exciting avenues in designing novel
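The fragment-based comparison underlying this abstract can be sketched minimally: each protein becomes a bag of structural-fragment labels and proteins are compared by vector similarity. The fragment labels and proteins below are made up, and the paper's topic-based variant would replace the raw count vectors with LDA topic mixtures; this is only the bag-of-fragments baseline.

```python
# Minimal sketch of fragment-based remote homology detection: proteins
# are bags of backbone-fragment labels, compared by cosine similarity.
from math import sqrt

def fragment_vector(fragments, vocab):
    """Count occurrences of each vocabulary fragment in a protein."""
    return [fragments.count(f) for f in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

vocab = ["helix_a", "helix_b", "strand_a", "turn_a"]
proteins = {
    "homolog":   ["helix_a", "helix_b", "strand_a"],  # similar fold, toy data
    "unrelated": ["turn_a", "turn_a", "turn_a"],
}
q = fragment_vector(["helix_a", "helix_a", "strand_a"], vocab)  # query

# rank database proteins by similarity to the query
hits = sorted(proteins,
              key=lambda n: cosine(q, fragment_vector(proteins[n], vocab)),
              reverse=True)
```

The remote homolog ranks above the unrelated structure even though no sequence information is used, which is the property the fragment and topic representations exploit at scale.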

  1. Exploiting structure similarity in refinement: automated NCS and target-structure restraints in BUSTER

    PubMed Central

    Smart, Oliver S.; Womack, Thomas O.; Flensburg, Claus; Keller, Peter; Paciorek, Włodek; Sharff, Andrew; Vonrhein, Clemens; Bricogne, Gérard

    2012-01-01

Maximum-likelihood X-ray macromolecular structure refinement in BUSTER has been extended with restraints facilitating the exploitation of structural similarity. The similarity can be between two or more chains within the structure being refined, thus favouring NCS, or to a distinct ‘target’ structure that remains fixed during refinement. The local structural similarity restraints (LSSR) approach considers all distances less than 5.5 Å between pairs of atoms in the chain to be restrained. For each, the difference from the distance between the corresponding atoms in the related chain is found. LSSR applies a restraint penalty on each difference. A functional form that reaches a plateau for large differences is used to avoid the restraints distorting parts of the structure that are not similar. Because LSSR are local, there is no need to separate out domains. Some restraint pruning is still necessary, but this has been automated. LSSR have been available to academic users of BUSTER since 2009 with the easy-to-use -autoncs and -target target.pdb options. The use of LSSR is illustrated in the re-refinement of PDB entries 5rnt, where -target enables the correct ligand-binding structure to be found, and 1osg, where -autoncs contributes to the location of an additional copy of the cyclic peptide ligand. PMID:22505257
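The plateauing penalty the abstract describes can be illustrated with a small sketch. The Geman-McClure-style functional form and its width parameter below are assumptions for illustration; BUSTER's actual LSSR functional form may differ.

```python
# Sketch of a local structural similarity restraint penalty: for each atom
# pair closer than 5.5 Å, the deviation of the working distance from the
# corresponding distance in the related chain is penalized by a function
# that is quadratic for small deviations but plateaus for large ones, so
# genuinely dissimilar regions are not pulled out of shape.

def lssr_penalty(d_work, d_ref, width=1.0):
    delta = d_work - d_ref
    # ~ delta**2 for |delta| << width, -> 1.0 (plateau) for |delta| >> width
    return delta**2 / (delta**2 + width**2)

small = lssr_penalty(2.1, 2.0)   # near-quadratic regime: tiny penalty
large = lssr_penalty(9.0, 2.0)   # saturated: penalty approaches plateau 1.0
```

Because the penalty saturates, a large local difference contributes a bounded gradient, which is what lets a single restraint set cover both conserved and divergent regions without domain splitting.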

  2. Automated frequency domain system identification of a large space structure

    NASA Technical Reports Server (NTRS)

    Yam, Y.; Bayard, D. S.; Hadaegh, F. Y.; Mettler, E.; Milman, M. H.

    1989-01-01

This paper presents the development and experimental results of an automated on-orbit system identification method for large flexible spacecraft that yields estimated quantities to support on-line design and tuning of robust high-performance control systems. The procedure consists of applying an input to the plant, obtaining an output, and then conducting nonparametric identification to yield the spectral estimate of the system transfer function. A parametric model is determined by curve fitting the spectral estimate to a rational transfer function. The identification method has been demonstrated experimentally in the Large Spacecraft Control Laboratory at JPL.
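The two-stage procedure in this abstract (nonparametric spectral estimate, then a rational-transfer-function curve fit) can be demonstrated on a toy discrete-time plant. The one-pole system, impulse input, and frequency bins below are invented for the example and are far simpler than a flexible-spacecraft model.

```python
# Toy illustration of the two-stage identification described above:
# (1) nonparametric estimate of the transfer function from input/output
# DFTs, (2) parametric curve fit of a rational (one-pole) model.
import cmath

a_true, b_true = 0.5, 1.0            # plant: y[n] = a*y[n-1] + b*u[n]
N = 64
u = [1.0 if n == 0 else 0.0 for n in range(N)]   # impulse input
y = []
for n in range(N):
    y.append(a_true * (y[n - 1] if n else 0.0) + b_true * u[n])

def dft(x, k):
    return sum(xn * cmath.exp(-2j * cmath.pi * k * n / N)
               for n, xn in enumerate(x))

# Stage 1: empirical transfer function estimate at two frequency bins
H = {k: dft(y, k) / dft(u, k) for k in (3, 11)}

# Stage 2: fit H(w) = b / (1 - a e^{-iw}), i.e. H = a e^{-iw} H + b,
# which is linear in (a, b); two bins give two equations.
k1, k2 = 3, 11
z1 = cmath.exp(-2j * cmath.pi * k1 / N) * H[k1]
z2 = cmath.exp(-2j * cmath.pi * k2 / N) * H[k2]
a_est = (H[k1] - H[k2]) / (z1 - z2)
b_est = H[k1] - a_est * z1
```

With noise-free data the fit recovers the plant parameters essentially exactly; a realistic on-orbit procedure would average many spectral estimates and fit many frequency points by least squares.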

  3. Finite element based electrostatic-structural coupled analysis with automated mesh morphing

    SciTech Connect

    OWEN,STEVEN J.; ZHULIN,V.I.; OSTERGAARD,D.F.

    2000-02-29

A co-simulation tool based on finite element principles has been developed to solve coupled electrostatic-structural problems. An automated mesh morphing algorithm has been employed to update the field mesh after structural deformation. The co-simulation tool has been successfully applied to the hysteretic behavior of a MEMS switch.

  4. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    PubMed Central

    Huang, Yuanpeng Janet; Mao, Binchen; Xu, Fei; Montelione, Gaetano

    2016-01-01

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD-NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases 15N-1H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD-NMR data. These algorithmic improvements include 1) using a global metric of structural accuracy, the Discriminating Power (DP) score, for guiding model selection during the iterative NOE interpretation process, and 2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta. PMID:26081575

  5. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    NASA Astrophysics Data System (ADS)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  6. Automated hexahedral meshing of anatomic structures using deformable registration.

    PubMed

    Grosland, Nicole M; Bafna, Ritesh; Magnotta, Vincent A

    2009-02-01

This work introduces a novel method of automating the process of patient-specific finite element (FE) model development using a mapped mesh technique. The objective is to map a predefined mesh (template) of high quality directly onto a new bony surface (target) definition, thereby yielding a similar mesh with minimal user interaction. To bring the template mesh into correspondence with the target surface, a deformable registration technique based on the FE method has been adopted. The procedure has been made hierarchical, allowing several levels of mesh refinement to be used, thus reducing the time required to achieve a solution. Our initial efforts have focused on the phalanx bones of the human hand. Mesh quality metrics, such as element volume and distortion, were evaluated. Furthermore, the distance between the target surface and the final mapped mesh was measured. The results have satisfactorily proven the applicability of the proposed method.

  7. Texture analysis for automated classification of geologic structures

    USGS Publications Warehouse

    Shankar, V.; Rodriguez, J.J.; Gettings, M.E.

    2006-01-01

Texture present in aeromagnetic anomaly images offers an abundance of useful geological information for discriminating between rock types, but current analysis of such images still relies on tedious, human interpretation. This study is believed to be the first effort to quantitatively assess the performance of texture-based digital image analysis for this geophysical exploration application. We computed several texture measures and determined the best subset using automated feature selection techniques. Pattern classification experiments measured the ability of various texture measures to automatically predict rock types. The classification accuracy was significantly better than a priori probability and prior weights-of-evidence results. The accuracy rates and choice of texture measures that minimize the error rate are reported. © 2006 IEEE.
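One classic family of texture measures of the kind this abstract evaluates is based on gray-level co-occurrence statistics. The 4x4 "images", single pixel offset, and lone contrast feature below are toy assumptions; the study's actual measures and feature-selection pipeline are not specified here.

```python
# Sketch of a co-occurrence texture measure: contrast of pixel pairs at a
# fixed offset. A rough texture (large gray-level jumps between neighbors)
# scores higher than a smooth one; a classifier would combine several such
# measures to predict rock type.

def glcm_contrast(img, dx=1, dy=0):
    """Mean squared gray-level difference over all pixel pairs at (dx, dy)."""
    pairs = [(img[y][x], img[y + dy][x + dx])
             for y in range(len(img) - dy)
             for x in range(len(img[0]) - dx)]
    return sum((i - j) ** 2 for i, j in pairs) / len(pairs)

smooth = [[0, 0, 1, 1]] * 4      # gently varying toy texture
rough  = [[0, 3, 0, 3]] * 4      # strongly alternating toy texture

c_smooth, c_rough = glcm_contrast(smooth), glcm_contrast(rough)
```

Feature selection over many such measures (contrast, homogeneity, entropy, at several offsets and orientations) is what the study automates before classification.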

  8. SOLVE and RESOLVE: automated structure solution, density modification and model building.

    PubMed

    Terwilliger, Thomas

    2004-01-01

The software SOLVE and RESOLVE can carry out all the steps in macromolecular structure solution, from scaling and heavy-atom location through phasing, density modification and model-building in the MAD, SAD and MIR cases. SOLVE uses a scoring scheme to convert the decision-making in macromolecular structure solution to an optimization problem. RESOLVE carries out the identification of NCS, density modification and automated model-building. The procedure is fully automated and can function at resolutions as low as 3 Å.

  9. Development and verification testing of automation and robotics for assembly of space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  11. Blind testing of routine, fully automated determination of protein structures from NMR data

    PubMed Central

    Rosato, Antonio; Aramini, James M.; Arrowsmith, Cheryl; Bagaria, Anurag; Baker, David; Cavalli, Andrea; Doreleijers, Jurgen F.; Eletsky, Alexander; Giachetti, Andrea; Guerry, Paul; Gutmanas, Aleksandras; Güntert, Peter; He, Yunfen; Herrmann, Torsten; Huang, Yuanpeng J.; Jaravine, Victor; Jonker, Hendrik R.A.; Kennedy, Michael A.; Lange, Oliver F.; Liu, Gaohua; Malliavin, Thérèse E.; Mani, Rajeswari; Mao, Binchen; Montelione, Gaetano T.; Nilges, Michael; Rossi, Paolo; van der Schot, Gijs; Schwalbe, Harald; Szyperski, Thomas A.; Vendruscolo, Michele; Vernon, Robert; Vranken, Wim F.; de Vries, Sjoerd; Vuister, Geerten W.; Wu, Bin; Yang, Yunhuang; Bonvin, Alexandre M.J.J.

    2012-01-01

    SUMMARY The protocols currently used for protein structure determination by NMR depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by using a specific computer program. To assess whether it is indeed possible to generate in a fully automated manner NMR structures adequate for deposition in the Protein Data Bank, we gathered ten experimental datasets with unassigned NOESY peak lists for various proteins of unknown structure, computed structures for each of them using different, fully automatic programs, and compared the results to each other and to the manually solved reference structures that were not available at the time the data were provided. This constitutes a stringent “blind” assessment similar to the CASP and CAPRI initiatives. This study demonstrates the feasibility of routine, fully automated protein structure determination by NMR. PMID:22325772

  12. Automated Detection of Eruptive Structures for Solar Eruption Prediction

    NASA Astrophysics Data System (ADS)

    Georgoulis, Manolis K.

    2012-07-01

    The problem of data processing and assimilation for solar eruption prediction is, for contemporary solar physics, more pressing than the problem of data acquisition. Although critical solar data, such as the coronal magnetic field, are still not routinely available, space-based observatories deliver diverse, high-quality information at such a high rate that a manual or semi-manual processing becomes meaningless. We discuss automated data analysis methods and explain, using basic physics, why some of them are unlikely to advance eruption prediction. From this finding we also understand why solar eruption prediction is likely to remain inherently probabilistic. We discuss some promising eruption prediction measures and report on efforts to adapt them for use with high-resolution, high-cadence photospheric and coronal data delivered by the Solar Dynamics Observatory. Concluding, we touch on the problem of physical understanding and synthesis of different results: combining different measures inferred by different data sets is a yet-to-be-done exercise that, however, presents our best opportunity of realizing benefits in solar eruption prediction via a meaningful, targeted assimilation of solar data.

  13. An automated system for the study of ionospheric spatial structures

    NASA Astrophysics Data System (ADS)

    Belinskaya, I. V.; Boitman, O. N.; Vugmeister, B. O.; Vyborova, V. M.; Zakharov, V. N.; Laptev, V. A.; Mamchenko, M. S.; Potemkin, A. A.; Radionov, V. V.

The system is designed for the study of the vertical distribution of electron density and the parameters of medium-scale ionospheric irregularities over the sounding site as well as the reconstruction of the spatial distribution of electron density within the range of up to 300 km from the sounding location. The system comprises an active central station as well as passive companion stations. The central station is equipped with the digital ionosonde ``Basis'', the measuring-and-computing complex IVK-2, and the receiver-recorder PRK-3M. The companion stations are equipped with receivers-recorders PRK-3. The automated complex software system includes 14 subsystems. Data transfer between them is effected using magnetic disk data sets. The system is operated in both ionogram mode and Doppler shift and angle-of-arrival mode. Using data obtained in these two modes, the reconstruction of the spatial distribution of electron density in the region is carried out. Reconstruction is checked for accuracy using data from companion stations.

  14. Non-uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination

    PubMed Central

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-01-01

High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination through advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [1H,1H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. PMID:26227870

  15. Amazon Forest Structure from IKONOS Satellite Data and the Automated Characterization of Forest Canopy Properties

    Treesearch

Michael Palace; Michael Keller; Gregory P. Asner; Stephen Hagen; Bobby Braswell

    2008-01-01

We developed an automated tree crown analysis algorithm using 1-m panchromatic IKONOS satellite images to examine forest canopy structure in the Brazilian Amazon. The algorithm was calibrated on the landscape level with tree geometry and forest stand data at the Fazenda Cauaxi (3.75° S, 48.37° W) in the eastern Amazon, and then compared with forest...

  16. Automated effective band structures for defective and mismatched supercells

    NASA Astrophysics Data System (ADS)

    Brommer, Peter; Quigley, David

    2014-12-01

    In plane-wave density functional theory codes, defects and incommensurate structures are usually represented in supercells. However, interpretation of E versus k band structures is most effective within the primitive cell, where comparison to ideal structures and spectroscopy experiments are most natural. Popescu and Zunger recently described a method to derive effective band structures (EBS) from supercell calculations in the context of random alloys. In this paper, we present bs_sc2pc, an implementation of this method in the CASTEP code, which generates an EBS using the structural data of the supercell and the underlying primitive cell with symmetry considerations handled automatically. We demonstrate the functionality of our implementation in three test cases illustrating the efficacy of this scheme for capturing the effect of vacancies, substitutions and lattice mismatch on effective primitive cell band structures.
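The spectral-weight idea behind band unfolding can be shown on the smallest possible example: a two-atom supercell of a uniform 1D chain. The tight-binding model, lattice constant of 1, and normalized eigenvectors below are toy assumptions, not the plane-wave machinery of bs_sc2pc.

```python
# Toy 1D illustration of effective band structure (EBS) unfolding: a
# 2-atom supercell of a uniform chain folds the primitive band into two
# branches at each supercell wavevector; projecting each supercell
# eigenstate onto primitive-cell plane waves recovers which primitive k
# it belongs to.
import cmath, math

def spectral_weight(coeffs, positions, k):
    """|projection of a supercell eigenvector onto the plane wave e^{ikx}|^2."""
    amp = sum(c * cmath.exp(-1j * k * x) for c, x in zip(coeffs, positions))
    return abs(amp) ** 2 / len(coeffs)

# Supercell eigenstates at supercell wavevector K = 0: the bonding state is
# uniform (primitive k = 0); the antibonding state alternates sign, i.e. it
# is the primitive k = pi state folded back to K = 0 by the doubled cell.
positions = [0.0, 1.0]
bonding = [1 / math.sqrt(2), 1 / math.sqrt(2)]
antibonding = [1 / math.sqrt(2), -1 / math.sqrt(2)]

w_bond_at_0 = spectral_weight(bonding, positions, 0.0)       # ~1: lives at k=0
w_anti_at_0 = spectral_weight(antibonding, positions, 0.0)   # ~0: not at k=0
w_anti_at_pi = spectral_weight(antibonding, positions, math.pi)  # ~1: k=pi
```

In a defective or mismatched supercell the weights fall between 0 and 1, and plotting them against primitive k and energy gives the smeared effective band structure the abstract describes.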

  17. Automated effective band structures for defective and mismatched supercells.

    PubMed

    Brommer, Peter; Quigley, David

    2014-12-03

    In plane-wave density functional theory codes, defects and incommensurate structures are usually represented in supercells. However, interpretation of E versus k band structures is most effective within the primitive cell, where comparison to ideal structures and spectroscopy experiments are most natural. Popescu and Zunger recently described a method to derive effective band structures (EBS) from supercell calculations in the context of random alloys. In this paper, we present bs_sc2pc, an implementation of this method in the CASTEP code, which generates an EBS using the structural data of the supercell and the underlying primitive cell with symmetry considerations handled automatically. We demonstrate the functionality of our implementation in three test cases illustrating the efficacy of this scheme for capturing the effect of vacancies, substitutions and lattice mismatch on effective primitive cell band structures.

  18. Automated discovery of active motifs in multiple RNA secondary structures

    SciTech Connect

    Wang, J.T.L.; Chang, Chia-Yo; Shapiro, B.A.

    1996-12-31

    In this paper we present a method for discovering approximately common motifs (also known as active motifs) in multiple RNA secondary structures. The secondary structures can be represented as ordered trees (i.e., the order among siblings matters). Motifs in these trees are connected subgraphs that can differ in both substitutions and deletions/insertions. The proposed method consists of two steps: (1) find candidate motifs in a small sample of the secondary structures; (2) search all of the secondary structures to determine how frequently these motifs occur (within the allowed approximation) in the secondary structures. To reduce the running time, we develop two optimization heuristics based on sampling and pattern matching techniques. Experimental results obtained by running these algorithms on both generated data and RNA secondary structures show the good performance of the algorithms. To demonstrate the utility of our algorithms, we discuss their applications to conducting the phylogenetic study of RNA sequences obtained from GenBank.
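The two-step search the abstract outlines can be sketched with strings standing in for linearized secondary-structure trees. The tree matching and mutation model of the paper are simplified here to fixed-length substrings within one substitution, and the data are invented.

```python
# Sketch of the two-step active-motif search: (1) harvest candidate motifs
# from a small sample of structures, (2) count approximate occurrences of
# each candidate across all structures and keep the frequent ones.

def within_one_sub(a, b):
    """Equal length and at most one substituted position."""
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) <= 1

def occurs(motif, structure):
    return any(within_one_sub(motif, structure[i:i + len(motif)])
               for i in range(len(structure) - len(motif) + 1))

def find_active_motifs(sample, all_structures, length=4, min_occ=3):
    # step 1: candidate motifs from a small sample only
    candidates = {s[i:i + length]
                  for s in sample for i in range(len(s) - length + 1)}
    # step 2: keep candidates occurring (approximately) often enough overall
    return sorted(m for m in candidates
                  if sum(occurs(m, s) for s in all_structures) >= min_occ)

data = ["HLHLSS", "HLHLTT", "HLXLSS", "GGGGGG"]
motifs = find_active_motifs(sample=data[:1], all_structures=data)
```

Sampling in step 1 is what keeps the search tractable: the expensive approximate-occurrence counting in step 2 runs only over candidates drawn from a small subset, mirroring the paper's optimization heuristics.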

  19. Automated Finite Element Modeling of Wing Structures for Shape Optimization

    NASA Technical Reports Server (NTRS)

    Harvey, Michael Stephen

    1993-01-01

The displacement formulation of the finite element method is the most general and most widely used technique for structural analysis of airplane configurations. Modern structural synthesis techniques based on the finite element method have reached a certain maturity in recent years, and large airplane structures can now be optimized with respect to sizing-type design variables for many load cases subject to a rich variety of constraints including stress, buckling, frequency, stiffness and aeroelastic constraints (Refs. 1-3). These structural synthesis capabilities use gradient-based nonlinear programming techniques to search for improved designs. For these techniques to be practical, a major improvement was required in the computational cost of finite element analyses (needed repeatedly in the optimization process). Thus, associated with the progress in structural optimization, a new perspective of structural analysis has emerged, namely, structural analysis specialized for design optimization application, or what is known as "design oriented structural analysis" (Ref. 4). This discipline includes approximation concepts and methods for obtaining behavior sensitivity information (Ref. 1), all needed to make the optimization of large structural systems (modeled by thousands of degrees of freedom and thousands of design variables) practical and cost effective.

  20. Automated Process Initialization of Laser Surface Structuring Processes by Inline Process Metrology

    NASA Astrophysics Data System (ADS)

    Schmitt, R.; Mallmann, G.; Winands, K.; Pothen, M.

Laser micro machining as well as laser surface structuring are innovative manufacturing technologies with a wide range of machinable materials and a high level of flexibility. These techniques are characterized by different machine, workpiece and environmental parameters. The large number of process dependencies, however, leads to a time-consuming process initialization and complex process control. Currently no automated solution exists to achieve material-specific process parameters, nor does a sufficient inline process control exist to adapt processing parameters or strategies inline. Therefore a novel scanner-based inline metrology solution and an automated process initialization strategy have been developed.

  1. Automated on-orbit frequency domain identification for large space structures

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.; Hadaegh, F. Y.; Yam, Y.; Scheid, R. E.; Mettler, E.; Milman, M. H.

    1991-01-01

    Recent experiences in the field of flexible structure control in space have indicated a need for on-orbit system identification to support robust control redesign to avoid in-flight instabilities and maintain high spacecraft performance. This paper highlights an automated frequency domain system identification methodology recently developed to fulfill this need. The methodology is focused on supporting (1) the estimation of system quantities useful for robust control analysis and design; (2) experiment design tailored to performing system identification in a typically constrained on-orbit environment; and (3) the automation of operations to reduce 'human in the loop' requirements.

  2. Automated Structure-Activity Relationship Mining: Connecting Chemical Structure to Biological Profiles.

    PubMed

    Wawer, Mathias J; Jaramillo, David E; Dančík, Vlado; Fass, Daniel M; Haggarty, Stephen J; Shamji, Alykhan F; Wagner, Bridget K; Schreiber, Stuart L; Clemons, Paul A

    2014-06-01

    Understanding the structure-activity relationships (SARs) of small molecules is important for developing probes and novel therapeutic agents in chemical biology and drug discovery. Increasingly, multiplexed small-molecule profiling assays allow simultaneous measurement of many biological response parameters for the same compound (e.g., expression levels for many genes or binding constants against many proteins). Although such methods promise to capture SARs with high granularity, few computational methods are available to support SAR analyses of high-dimensional compound activity profiles. Many of these methods are not generally applicable or reduce the activity space to scalar summary statistics before establishing SARs. In this article, we present a versatile computational method that automatically extracts interpretable SAR rules from high-dimensional profiling data. The rules connect chemical structural features of compounds to patterns in their biological activity profiles. We applied our method to data from novel cell-based gene-expression and imaging assays collected on more than 30,000 small molecules. Based on the rules identified for this data set, we prioritized groups of compounds for further study, including a novel set of putative histone deacetylase inhibitors. © 2014 Society for Laboratory Automation and Screening.

  3. MemProtMD: Automated Insertion of Membrane Protein Structures into Explicit Lipid Membranes

    PubMed Central

    Stansfeld, Phillip J.; Goose, Joseph E.; Caffrey, Martin; Carpenter, Elisabeth P.; Parker, Joanne L.; Newstead, Simon; Sansom, Mark S.P.

    2015-01-01

    There has been exponential growth in the number of membrane protein structures determined. Nevertheless, these structures are usually resolved in the absence of their lipid environment. Coarse-grained molecular dynamics (CGMD) simulations enable insertion of membrane proteins into explicit models of lipid bilayers. We have automated the CGMD methodology, enabling membrane protein structures to be identified upon their release into the PDB and embedded into a membrane. The simulations are analyzed for protein-lipid interactions, identifying lipid binding sites, and revealing local bilayer deformations plus molecular access pathways within the membrane. The coarse-grained models of membrane protein/bilayer complexes are transformed to atomistic resolution for further analysis and simulation. Using this automated simulation pipeline, we have analyzed a number of recently determined membrane protein structures to predict their locations within a membrane, their lipid/protein interactions, and the functional implications of an enhanced understanding of the local membrane environment of each protein. PMID:26073602
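    The protein-lipid interaction analysis step can be illustrated with a minimal distance-based contact count. The coordinates and the 6 Å cutoff below are invented placeholders for illustration, not MemProtMD's actual parameters or code.

```python
import numpy as np

def lipid_contacts(protein_xyz, lipid_xyz, cutoff=6.0):
    """Per-residue count of lipid beads within `cutoff` (toy contact analysis).

    protein_xyz: (n_residues, 3); lipid_xyz: (n_lipid_beads, 3).
    Residues with persistently high counts over a trajectory would be
    candidate lipid binding sites in a pipeline like the one described."""
    # Pairwise residue-to-bead distances via broadcasting.
    d = np.linalg.norm(protein_xyz[:, None, :] - lipid_xyz[None, :, :], axis=-1)
    return (d < cutoff).sum(axis=1)
```

    In practice such counts would be averaged over simulation frames and normalized by lipid type before calling a site a binding site.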

  4. Recent developments in automated structure elucidation of natural products.

    PubMed

    Steinbeck, Christoph

    2004-08-01

    Advancements in the field of Computer-Assisted Structure Elucidation (CASE) of Natural Products achieved in the past five years are discussed. This process starts with a dereplication procedure, supported by structure-spectrum databases. Both commercial and free products are available to support the procedure. A number of new programs, as well as advancements in existing ones, are presented. Finally, the option to validate the result by an independent procedure, a high-quality ab initio quantum-mechanical calculation, is discussed.

  5. Automated dynamic analytical model improvement for damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J. S.; Berman, A.

    1985-01-01

    A method is described to improve a linear nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) ability to properly treat complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.

  6. Fully automated high-quality NMR structure determination of small (2)H-enriched proteins.

    PubMed

    Tang, Yuefeng; Schneider, William M; Shen, Yang; Raman, Srivatsan; Inouye, Masayori; Baker, David; Roth, Monica J; Montelione, Gaetano T

    2010-12-01

    Determination of high-quality small protein structures by nuclear magnetic resonance (NMR) methods generally requires acquisition and analysis of an extensive set of structural constraints. The process generally demands extensive backbone and sidechain resonance assignments, and weeks or even months of data collection and interpretation. Here we demonstrate rapid and high-quality protein NMR structure generation using CS-Rosetta with a perdeuterated protein sample made at a significantly reduced cost using new bacterial culture condensation methods. Our strategy provides the basis for a high-throughput approach for routine, rapid, high-quality structure determination of small proteins. As an example, we demonstrate the determination of a high-quality 3D structure of a small 8 kDa protein, E. coli cold shock protein A (CspA), using <4 days of data collection and fully automated data analysis methods together with CS-Rosetta. The resulting CspA structure is highly converged and in excellent agreement with the published crystal structure, with a backbone RMSD value of 0.5 Å, an all atom RMSD value of 1.2 Å to the crystal structure for well-defined regions, and RMSD value of 1.1 Å to crystal structure for core, non-solvent exposed sidechain atoms. Cross validation of the structure with (15)N- and (13)C-edited NOESY data obtained with a perdeuterated (15)N, (13)C-enriched (13)CH(3) methyl protonated CspA sample confirms that essentially all of these independently-interpreted NOE-based constraints are already satisfied in each of the 10 CS-Rosetta structures. By these criteria, the CS-Rosetta structure generated by fully automated analysis of data for a perdeuterated sample provides an accurate structure of CspA. This represents a general approach for rapid, automated structure determination of small proteins by NMR.
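    The quoted RMSD values imply optimal superposition of the compared structures; a standard way to compute such a value is the Kabsch algorithm, sketched below as a generic utility (not the authors' pipeline).

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal superposition.

    Removes translation by centering, then finds the optimal rotation from
    the SVD of the covariance matrix (Kabsch algorithm)."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(V @ Wt))       # guard against improper rotations
    R = V @ np.diag([1.0, 1.0, d]) @ Wt
    diff = P @ R - Q
    return np.sqrt((diff ** 2).sum() / len(P))
```

    A structure-comparison tool would additionally handle atom selection (backbone only, well-defined regions) before calling a routine like this, which is how figures such as "0.5 Å over well-defined regions" arise.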

  7. RNA structure framework: automated transcriptome-wide reconstruction of RNA secondary structures from high-throughput structure probing data.

    PubMed

    Incarnato, Danny; Neri, Francesco; Anselmi, Francesca; Oliviero, Salvatore

    2016-02-01

    The rapidly increasing number of discovered non-coding RNAs makes the understanding of their structure a key feature toward a deeper comprehension of gene expression regulation. Various enzymatic and chemical approaches have recently been developed to allow whole-genome studies of RNA secondary structures. Several methods have been presented that allow high-throughput RNA structure probing (CIRS-seq, Structure-seq, SHAPE-seq, PARS, etc.) and unbiased structural inference of residues within RNAs in their native conformation. Here we present an analysis toolkit, named RNA Structure Framework (RSF), which allows fast and fully-automated analysis of high-throughput structure probing data, from data pre-processing to whole-transcriptome RNA structure inference. RSF is written in Perl and is freely available under the GPLv3 license from http://rsf.hugef-research.org. Contact: salvatore.oliviero@hugef-torino.org. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
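    As an illustration of the kind of pre-processing such a pipeline performs, here is the widely used 2-8% normalization of raw probing reactivities, written in Python rather than the toolkit's Perl. This is a generic scheme common in structure-probing analysis, not necessarily the exact one RSF implements, and the input values in the usage example are invented.

```python
def normalize_reactivities(raw):
    """2-8% normalization of per-nucleotide reactivities.

    Discard the top 2% of values as outliers, then divide every value by
    the mean of the next 8%. `None` marks positions with no data."""
    s = sorted((r for r in raw if r is not None), reverse=True)
    n = len(s)
    top2 = max(1, int(round(0.02 * n)))      # outliers to discard
    next8 = max(1, int(round(0.08 * n)))     # values defining the scale
    norm = sum(s[top2:top2 + next8]) / next8
    return [None if r is None else r / norm for r in raw]
```

    After normalization, reactivities near or above 1 indicate flexible (likely unpaired) positions, which is what a folding engine consumes as soft constraints.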

  8. From bacterial to human dihydrouridine synthase: automated structure determination

    SciTech Connect

    Whelan, Fiona Jenkins, Huw T.; Griffiths, Samuel C.; Byrne, Robert T.; Dodson, Eleanor J.; Antson, Alfred A.

    2015-06-30

    The crystal structure of a human dihydrouridine synthase, an enzyme associated with lung cancer, with 18% sequence identity to a T. maritima enzyme, has been determined at 1.9 Å resolution by molecular replacement after extensive molecular remodelling of the template. The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr-rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer.

  9. Reduced complexity structural modeling for automated airframe synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1987-01-01

    A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.
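    The fully stressed design approach applied to the equivalent beam can be sketched with the classical resizing rule. The member forces, allowable stress, and minimum gauge below are invented, and a real implementation would re-analyze the structural model every cycle because internal forces redistribute as sizes change.

```python
# Fully stressed design resizing: drive every member toward its allowable
# stress with A_new = A_old * |sigma| / sigma_allow. Toy member forces are
# held fixed here; in practice the finite element (or equivalent beam)
# model is re-analyzed between resizing cycles.

sigma_allow = 100.0
forces = [250.0, -120.0, 60.0]       # invented axial forces per member
areas = [1.0, 1.0, 1.0]
A_min = 0.01                         # minimum gauge constraint

for _ in range(10):
    member_stresses = [F / A for F, A in zip(forces, areas)]
    areas = [max(A_min, A * abs(s) / sigma_allow)
             for A, s in zip(areas, member_stresses)]
# With fixed forces this converges to |F| / sigma_allow for each member.
```

    The appeal noted in the abstract is exactly this cheapness: each cycle needs only one analysis and a closed-form resize, so the reduced-order beam model can be optimized rapidly.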

  10. Automated structural health monitoring based on adaptive kernel spectral clustering

    NASA Astrophysics Data System (ADS)

    Langone, Rocco; Reynders, Edwin; Mehrkanoon, Siamak; Suykens, Johan A. K.

    2017-06-01

    Structural health monitoring refers to the process of measuring damage-sensitive variables to assess the functionality of a structure. In principle, vibration data can capture the dynamics of the structure and reveal possible failures, but environmental and operational variability can mask this information. Thus, an effective outlier detection algorithm can be applied only after having performed data normalization (i.e. filtering) to eliminate external influences. Instead, in this article we propose a technique which unifies the data normalization and damage detection steps. The proposed algorithm, called adaptive kernel spectral clustering (AKSC), is initialized and calibrated in a phase when the structure is undamaged. The calibration process is crucial to ensure detection of early damage and minimize the number of false alarms. After the calibration, the method can automatically identify new regimes which may be associated with possible faults. These regimes are discovered by means of two complementary damage (i.e. outlier) indicators. The proposed strategy is validated with a simulated example and with real-life natural frequency data from the Z24 pre-stressed concrete bridge, which was progressively damaged at the end of a one-year monitoring period.
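    A bare-bones version of the kernel spectral clustering idea (RBF affinity, normalized Laplacian embedding, two clusters) can be sketched as follows. This omits AKSC's adaptive calibration and damage indicators entirely, and the rows of `X` stand in for tracked damage-sensitive features such as natural frequencies.

```python
import numpy as np

def two_spectral_clusters(X, sigma=1.0):
    """Minimal kernel spectral clustering into two groups, e.g. a baseline
    regime vs. a newly appeared regime. Not the AKSC algorithm itself."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))             # RBF affinity matrix
    D = W.sum(1)
    L = np.eye(n) - W / np.sqrt(np.outer(D, D))      # normalized Laplacian
    _, vecs = np.linalg.eigh(L)
    E = vecs[:, :2].copy()                           # two smallest eigvecs
    E /= np.linalg.norm(E, axis=1, keepdims=True)    # row-normalize embedding
    # Two-means on the embedding, deterministic farthest-point init.
    c0 = E[0]
    c1 = E[((E - c0) ** 2).sum(1).argmax()]
    for _ in range(20):
        lab = (((E - c0) ** 2).sum(1) > ((E - c1) ** 2).sum(1)).astype(int)
        if (lab == 0).any():
            c0 = E[lab == 0].mean(0)
        if (lab == 1).any():
            c1 = E[lab == 1].mean(0)
    return lab
```

    The adaptive part of AKSC, not shown here, is what lets the model grow new clusters online instead of fixing two groups in advance.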

  11. Application of a hierarchical structure stochastic learning automaton

    NASA Technical Reports Server (NTRS)

    Neville, R. G.; Chrystall, M. S.; Mars, P.

    1979-01-01

    A hierarchical structure automaton was developed using a two-state stochastic learning automaton (SLA) in a time-shared model. Application of the hierarchical SLA to systems with multidimensional, multimodal performance criteria is described. Results of experiments performed with the hierarchical SLA using a performance index with a superimposed noise component of + or - delta, distributed uniformly over the surface, are discussed.

  12. From bacterial to human dihydrouridine synthase: automated structure determination

    PubMed Central

    Whelan, Fiona; Jenkins, Huw T.; Griffiths, Samuel C.; Byrne, Robert T.; Dodson, Eleanor J.; Antson, Alfred A.

    2015-01-01

    The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr_rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer. PMID:26143927

  13. Three-dimensional visualization for evaluating automated, geomorphic pattern-recognition analyses of crustal structures

    NASA Astrophysics Data System (ADS)

    Foley, M. G.

    1991-02-01

    We are developing and applying a suite of automated remote geologic analysis (RGA) methods at Pacific Northwest Laboratory (PNL) for extracting structural and tectonic patterns from digital models of topography and other spatially registered geophysical data. In analyzing a map area, the geologist employs a variety of spatial representations (e.g., topographic maps; oblique, vertical and vertical stereographic aerial photographs; satellite-sensor images) in addition to actual field observations to provide a basis for recognizing features (patterns) diagnostic or suggestive of various geologic and geomorphic features. We intend that our automated analyses of digital models of elevation use the same photogeologic pattern-recognition methods as the geologist's; otherwise there is no direct basis for manually evaluating results of the automated analysis. Any system for automating geologic analysis should extend the geologist's pattern-recognition abilities and quantify them, rather than replace them. This requirement means that results of automated structural pattern-recognition analyses must be evaluated by geologists using the same method that would be employed in manual field checking: visual examination of the three-dimensional relationships among rocks, erosional patterns, and identifiable structures. Interactive computer-graphics in quantitative (i.e., spatially registered), simulated three-dimensional perspective and stereo are thus critical to the integration and interpretation of topography, imagery, point data, RGA-identified fracture/fault planes, stratigraphy, contoured geophysical data, nonplanar surfaces, boreholes, and three-dimensional zones (e.g., crush zones at fracture intersections). This graphical interaction presents the megabytes of digital geologic and geophysical data to the geologist in the same spatial format that field observations would take, permitting direct evaluation of RGA methods and results.

  14. Three-dimensional visualization for evaluating automated, geomorphic pattern-recognition analyses of crustal structures

    SciTech Connect

    Foley, M.G.

    1991-02-01

    We are developing and applying a suite of automated remote geologic analysis (RGA) methods at Pacific Northwest Laboratory (PNL) for extracting structural and tectonic patterns from digital models of topography and other spatially registered geophysical data. In analyzing a map area, the geologist employs a variety of spatial representations (e.g., topographic maps; oblique, vertical and vertical stereographic aerial photographs; satellite-sensor images) in addition to actual field observations to provide a basis for recognizing features (patterns) diagnostic or suggestive of various geologic and geomorphic features. We intend that our automated analyses of digital models of elevation use the same photogeologic pattern-recognition methods as the geologist's; otherwise there is no direct basis for manually evaluating results of the automated analysis. Any system for automating geologic analysis should extend the geologist's pattern-recognition abilities and quantify them, rather than replace them. This requirement means that results of automated structural pattern-recognition analyses must be evaluated by geologists using the same method that would be employed in manual field checking: visual examination of the three-dimensional relationships among rocks, erosional patterns, and identifiable structures. Interactive computer-graphics in quantitative (i.e., spatially registered), simulated three-dimensional perspective and stereo are thus critical to the integration and interpretation of topography, imagery, point data, RGA-identified fracture/fault planes, stratigraphy, contoured geophysical data, nonplanar surfaces, boreholes, and three-dimensional zones (e.g., crush zones at fracture intersections). This graphical interaction presents the megabytes of digital geologic and geophysical data to the geologist in the same spatial format that field observations would take, permitting direct evaluation of RGA methods and results. 5 refs., 2 figs.

  15. Automating the parallel processing of fluid and structural dynamics calculations

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

    The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.

  16. Automating the parallel processing of fluid and structural dynamics calculations

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

    The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.

  17. An automated procedure for covariation-based detection of RNA structure

    SciTech Connect

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs.
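    The covariation evidence such procedures collect is commonly scored with mutual information between alignment columns. The sketch below shows that generic score on invented columns; it is not the authors' program, which additionally weighs counter-evidence from sequences that vary without compensating.

```python
from math import log2
from collections import Counter

def mutual_information(col_i, col_j):
    """Covariation score between two alignment columns, in bits.

    High MI means the residues at the two positions co-vary across the
    alignment, as expected for base-paired (bonded) sites."""
    n = len(col_i)
    fi, fj = Counter(col_i), Counter(col_j)
    fij = Counter(zip(col_i, col_j))
    return sum((c / n) * log2((c / n) / ((fi[a] / n) * (fj[b] / n)))
               for (a, b), c in fij.items())
```

    Scanning all column pairs of an rRNA alignment with a score like this, then filtering for Watson-Crick-consistent pairs, is the essence of covariation-based structure detection.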

  18. Automated Structural Optimization System (ASTROS). Volume 1. Theoretical Manual

    DTIC Science & Technology

    1988-12-01

    Analysis Problem Oriented Language). Such a control language, similar to the DMAP of NASTRAN or the typical query language of a data base management...aerospace environment is addressed by making the ASTROS procedure resemble that of NASTRAN in terms of user input and pre- and post-processor interfaces...While the ASTROS procedure does not contain many of the specialized capabilities available in NASTRAN , the basic structural analysis features have

  19. Automated motif extraction and classification in RNA tertiary structures

    PubMed Central

    Djelloul, Mahassine; Denise, Alain

    2008-01-01

    We used a novel graph-based approach to extract RNA tertiary motifs. We cataloged them all and clustered them using an innovative graph similarity measure. We applied our method to three widely studied structures: Haloarcula marismortui 50S (H.m 50S), Escherichia coli 50S (E. coli 50S), and Thermus thermophilus 16S (T.th 16S) RNAs. We identified 10 known motifs without any prior knowledge of their shapes or positions. We additionally identified four putative new motifs. PMID:18957493

  20. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    NASA Astrophysics Data System (ADS)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models based on Non-Uniform Rational B-Splines (NURBS) with multiple levels of detail (Mixed and Reverse LoD), built from accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of the modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Castel Masegra in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  1. Toolkit for automated and rapid discovery of structural variants.

    PubMed

    Soylev, Arda; Kockan, Can; Hormozdiari, Fereydoun; Alkan, Can

    2017-06-02

    Structural variations (SV) are broadly defined as genomic alterations that affect >50bp of DNA, which are shown to have significant effect on evolution and disease. The advent of high throughput sequencing (HTS) technologies and the ability to perform whole genome sequencing (WGS) make it feasible to study these variants in depth. However, discovery of all forms of SV using WGS has proven to be challenging, as the short reads produced by the predominant HTS platforms (<200bp for current technologies) and the fact that most genomes include large amounts of repeats make it very difficult to unambiguously map and accurately characterize such variants. Furthermore, existing tools for SV discovery are primarily developed for only a few of the SV types, which may have conflicting sequence signatures (i.e. read pairs, read depth, split reads) with other, untargeted SV classes. Here we introduce a new framework, Tardis, which combines multiple read signatures into a single package to characterize most SV types simultaneously, while preventing such conflicts. Tardis also has a modular structure that makes it easy to extend for the discovery of additional forms of SV. Copyright © 2017 Elsevier Inc. All rights reserved.
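    As a sketch of one of the read signatures mentioned above, the toy function below flags windows of unusually low coverage, the read-depth pattern expected over a homozygous deletion. This is illustrative only, not Tardis code, and the window size, threshold ratio, and coverage values are all invented.

```python
def depth_drops(depth, window=5, ratio=0.5):
    """Toy read-depth signature: return (start, end) regions where windowed
    coverage falls below `ratio` times the overall median coverage."""
    med = sorted(depth)[len(depth) // 2]
    lows = [p for p in range(len(depth) - window + 1)
            if sum(depth[p:p + window]) / window < ratio * med]
    calls = []
    for p in lows:                        # merge runs of adjacent low windows
        if calls and p <= calls[-1][1] + 1:
            calls[-1] = (calls[-1][0], p)
        else:
            calls.append((p, p))
    # Convert window-start runs to approximate genomic intervals.
    return [(s, e + window - 1) for s, e in calls]
```

    A combined caller cross-checks such depth drops against discordant read pairs and split reads spanning the same interval, which is how conflicting interpretations between SV classes are resolved.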

  2. Automated analysis of Physarum network structure and dynamics

    NASA Astrophysics Data System (ADS)

    Fricker, Mark D.; Akita, Dai; Heaton, Luke LM; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki

    2017-06-01

    We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015.
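    The graph representation described above, a set of weighted adjacency matrices holding the physical dimensions of each vein, can be sketched as follows. The node names, radii and lengths are invented, and a Poiseuille-style r⁴/L conductance plus a Murray's-law check at one junction stand in for the paper's much richer metric set.

```python
import math

# Toy vein network as weighted adjacency matrices: one matrix per physical
# quantity. All node names, radii and lengths are invented values.
nodes = ["exit", "junction", "tipA", "tipB"]
veins = {("exit", "junction"): (2.0, 10.0),    # (radius, length)
         ("junction", "tipA"): (1.6, 8.0),
         ("junction", "tipB"): (1.6, 12.0)}

n = len(nodes)
idx = {name: i for i, name in enumerate(nodes)}
radius = [[0.0] * n for _ in range(n)]
conduct = [[0.0] * n for _ in range(n)]
for (a, b), (r, length) in veins.items():
    i, j = idx[a], idx[b]
    radius[i][j] = radius[j][i] = r
    # Laminar (Poiseuille) conductance scales as r**4 / length
    # (viscosity constant folded into the units).
    conduct[i][j] = conduct[j][i] = math.pi * r ** 4 / (8.0 * length)

# Murray's law at the junction: parent radius cubed vs. children cubed.
murray_ratio = (1.6 ** 3 + 1.6 ** 3) / 2.0 ** 3   # ~1 means consistent
```

    Predicting net flow from vein and sheet volume changes, as in the abstract, then amounts to solving a flow balance on a conductance matrix like `conduct`.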

  3. Automated determination of fibrillar structures by simultaneous model building and fiber diffraction refinement.

    PubMed

    Potrzebowski, Wojciech; André, Ingemar

    2015-07-01

    For highly oriented fibrillar molecules, three-dimensional structures can often be determined from X-ray fiber diffraction data. However, because of limited information content, structure determination and validation can be challenging. We demonstrate that automated structure determination of protein fibers can be achieved by guiding the building of macromolecular models with fiber diffraction data. We illustrate the power of our approach by determining the structures of six bacteriophage viruses de novo using fiber diffraction data alone and together with solid-state NMR data. Furthermore, we demonstrate the feasibility of molecular replacement from monomeric and fibrillar templates by solving the structure of a plant virus using homology modeling and protein-protein docking. The generated models explain the experimental data to the same degree as deposited reference structures but with improved structural quality. We also developed a cross-validation method for model selection. The results highlight the power of fiber diffraction data as structural constraints.

  4. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm.

    PubMed

    Pizarro, Ricardo A; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A; Goldman, Aaron L; Xiao, Ena; Luo, Qian; Berman, Karen F; Callicott, Joseph H; Weinberger, Daniel R; Mattay, Venkata S

    2016-01-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI, yielding irreproducible results from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.
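    The supervised SVM step can be illustrated with a tiny linear SVM trained by subgradient descent on the regularized hinge loss. The two-dimensional "quality features" and labels below are synthetic stand-ins; the study's actual classifier, kernel choice, and in-house global/ROI features are not reproduced here.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """Linear SVM via subgradient descent on the regularized hinge loss.

    X: (n, d) feature matrix; y: +/-1 labels (e.g. usable volume vs.
    artifact-corrupted volume, mirroring investigator quality ratings)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        mask = y * (X @ w + b) < 1           # samples violating the margin
        gw = lam * w - (y[mask] @ X[mask]) / n
        gb = -y[mask].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def predict(X, w, b):
    return np.where(X @ w + b >= 0, 1, -1)
```

    Reported accuracy then comes from comparing `predict` output on held-out volumes against the investigator-determined labels, just as the abstract describes.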

  5. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm

    PubMed Central

    Pizarro, Ricardo A.; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A.; Goldman, Aaron L.; Xiao, Ena; Luo, Qian; Berman, Karen F.; Callicott, Joseph H.; Weinberger, Daniel R.; Mattay, Venkata S.

    2016-01-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI, yielding irreproducible results from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time-consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI. PMID:28066227

  6. Knowledge structure representation and automated updates in intelligent information management systems

    NASA Technical Reports Server (NTRS)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision-making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be performed automatically, with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends, first, on the design and implementation of information structures that are easily modified and expanded, and, second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability, and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  7. Automated model formulation for time-varying flexible structures

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Hanagud, S.

    1989-01-01

    Presented here is an identification technique that uses sensor information to choose a new model from a finite, discrete model space, in order to follow the observed changes to the given time-varying flexible structure. Boundary condition sets or other information on model variations are used to organize the set of possible models laterally into a search tree, with levels of abstraction used to order the models vertically within branches. An object-oriented programming approach is used to represent the model set in the search tree. A modified A* best-first search algorithm finds the model whose response best matches the current observations. Several extensions to this methodology are discussed. Possible integration of rules with the current search algorithm is considered, to give weight to interpreted trends that may be found in a series of observations. This capability might lead, for instance, to identifying a model that incorporates progressive damage rather than one with incorrect parameters such as added mass. Another new direction is the use of noisy time-domain sensor feedback, rather than frequency-domain information, in the search algorithm to improve the real-time capability of the developed procedure.
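    The search itself can be illustrated with a toy best-first traversal of a model tree: each candidate model predicts a vector of modal frequencies, and the node whose prediction best matches the observations is expanded first. Model names and frequencies below are invented for illustration.

```python
import heapq

observed = [4.9, 12.1, 20.3]        # hypothetical measured modal frequencies

# model name -> (predicted frequencies, child models that refine it)
tree = {
    "nominal":    ([5.0, 12.0, 21.0], ["added_mass", "damaged"]),
    "added_mass": ([4.2, 10.5, 18.0], []),
    "damaged":    ([4.9, 12.1, 20.3], []),
}

def cost(name):                     # squared mismatch to the observations
    freqs, _ = tree[name]
    return sum((f - o) ** 2 for f, o in zip(freqs, observed))

# Best-first search: always expand the currently most promising model.
frontier = [(cost("nominal"), "nominal")]
best_name, best_cost = None, float("inf")
while frontier:
    c, name = heapq.heappop(frontier)
    if c < best_cost:
        best_name, best_cost = name, c
    for child in tree[name][1]:
        heapq.heappush(frontier, (cost(child), child))
```

    A true A* search would add a heuristic estimate of achievable improvement to each node's cost; with a zero heuristic, as here, it reduces to plain best-first search.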

  8. Automated model formulation for time-varying flexible structures

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Hanagud, S.

    1989-01-01

    Presented here is an identification technique that uses sensor information to choose a new model from a finite, discrete model space, in order to follow the observed changes to the given time-varying flexible structure. Boundary condition sets or other information on model variations are used to organize the set of possible models laterally into a search tree, with levels of abstraction used to order the models vertically within branches. An object-oriented programming approach is used to represent the model set in the search tree. A modified A* best-first search algorithm finds the model whose response best matches the current observations. Several extensions to this methodology are discussed. Possible integration of rules with the current search algorithm is considered, to give weight to interpreted trends that may be found in a series of observations. This capability might lead, for instance, to identifying a model that incorporates progressive damage rather than one with incorrect parameters such as added mass. Another new direction is the use of noisy time-domain sensor feedback, rather than frequency-domain information, in the search algorithm to improve the real-time capability of the developed procedure.

  9. Revealing biological information using data structuring and automated learning.

    PubMed

    Mohorianu, Irina; Moulton, Vincent

    2010-11-01

    The intermediary steps between a biological hypothesis, concretized in the input data, and meaningful results, validated using biological experiments, commonly employ bioinformatics tools. Starting with storage of the data and ending with a statistical analysis of the significance of the results, every step in a bioinformatics analysis has been intensively studied and the resulting methods and models patented. This review summarizes the bioinformatics patents that have been developed mainly for the study of genes, and points out the universal applicability of bioinformatics methods to other related studies such as RNA interference. More specifically, we overview the steps undertaken in the majority of bioinformatics analyses, highlighting, for each, various approaches that have been developed to reveal details from different perspectives. First we consider data warehousing, the first task that has to be performed efficiently, optimizing the structure of the database, in order to facilitate both the subsequent steps and the retrieval of information. Next, we review data mining, which occupies the central part of most bioinformatics analyses, presenting patents concerning differential expression, unsupervised and supervised learning. Last, we discuss how networks of interactions of genes or other players in the cell may be created, which help draw biological conclusions and have been described in several patents.

  10. The AUDANA algorithm for automated protein 3D structure determination from NMR NOE data.

    PubMed

    Lee, Woonghee; Petit, Chad M; Cornilescu, Gabriel; Stark, Jaime L; Markley, John L

    2016-06-01

    We introduce AUDANA (Automated Database-Assisted NOE Assignment), an algorithm for determining three-dimensional structures of proteins from NMR data that automates the assignment of 3D-NOE spectra, generates distance constraints, and conducts iterative high temperature molecular dynamics and simulated annealing. The protein sequence, chemical shift assignments, and NOE spectra are the only required inputs. Distance constraints generated automatically from ambiguously assigned NOE peaks are validated during the structure calculation against information from an enlarged version of the freely available PACSY database that incorporates information on protein structures deposited in the Protein Data Bank (PDB). This approach yields robust sets of distance constraints and 3D structures. We evaluated the performance of AUDANA with input data for 14 proteins ranging in size from 6 to 25 kDa that had 27-98 % sequence identity to proteins in the database. In all cases, the automatically calculated 3D structures passed stringent validation tests. Structures were determined with and without database support. In 9/14 cases, database support improved the agreement with manually determined structures in the PDB and in 11/14 cases, database support lowered the r.m.s.d. of the family of 20 structural models.
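    The distance-constraint step can be sketched with the textbook isolated spin-pair approximation, in which NOE intensity scales roughly as r**-6. This is a generic calibration, not AUDANA's actual procedure, and the reference pair and intensities are made up.

```python
# Calibrate distances against a reference pair of known separation:
# r = r_ref * (I_ref / I) ** (1/6)  (isolated spin-pair approximation).
r_ref, i_ref = 2.5, 1.0e6       # known distance (angstrom) and its intensity

def noe_distance(intensity, upper_pad=0.5):
    """Return a (lower, target, upper) distance restraint in angstrom."""
    r = r_ref * (i_ref / intensity) ** (1.0 / 6.0)
    return (1.8, r, r + upper_pad)      # 1.8 A ~ van der Waals contact

lo, target, hi = noe_distance(6.4e4)    # a weak cross peak -> longer distance
```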

  11. Automated protein motif generation in the structure-based protein function prediction tool ProMOL.

    PubMed

    Osipovitch, Mikhail; Lambrecht, Mitchell; Baker, Cameron; Madha, Shariq; Mills, Jeffrey L; Craig, Paul A; Bernstein, Herbert J

    2015-12-01

    ProMOL, a plugin for the PyMOL molecular graphics system, is a structure-based protein function prediction tool. ProMOL includes a set of routines for building motif templates that are used for screening query structures for enzyme active sites. Previously, each motif template was generated manually and required supervision in the optimization of parameters for sensitivity and selectivity. We developed an algorithm and workflow for the automation of motif building and testing routines in ProMOL. The algorithm uses a set of empirically derived parameters for optimization and requires little user intervention. The automated motif generation algorithm was first tested in a performance comparison with a set of manually generated motifs based on identical active sites from the same 112 PDB entries. The two sets of motifs were equally effective in identifying alignments with homologs and in rejecting alignments with unrelated structures. A second set of 296 active site motifs were generated automatically, based on Catalytic Site Atlas entries with literature citations, as an expansion of the library of existing manually generated motif templates. The new motif templates exhibited comparable performance to the existing ones in terms of hit rates against native structures, homologs with the same EC and Pfam designations, and randomly selected unrelated structures with a different EC designation at the first EC digit, as well as in terms of RMSD values obtained from local structural alignments of motifs and query structures. This research is supported by NIH grant GM078077.
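    Motif hits of this kind are typically ranked by RMSD after superposition. Below is a generic Kabsch least-squares superposition (not ProMOL's own code), applied to synthetic coordinates standing in for matched active-site atoms.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Least-squares RMSD between corresponding coordinate sets P and Q."""
    P = P - P.mean(axis=0)                  # remove translation
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(V @ Wt))      # guard against improper rotation
    R = V @ np.diag([1.0, 1.0, d]) @ Wt
    diff = P @ R - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))

Q = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0],
              [1.5, 1.5, 0.0], [0.0, 1.5, 1.5]])    # "motif" atoms
theta = 0.7                                         # fake a rotated query hit
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
P = Q @ Rz.T + np.array([3.0, -2.0, 1.0])
rmsd = kabsch_rmsd(P, Q)                            # ~0 for an exact match
```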

  12. Automated refinement of macromolecular structures at low resolution using prior information

    PubMed Central

    Kovalevskiy, Oleg; Nicholls, Robert A.; Murshudov, Garib N.

    2016-01-01

    Since the ratio of the number of observations to adjustable parameters is small at low resolution, it is necessary to use complementary information for the analysis of such data. ProSMART is a program that can generate restraints for macromolecules using homologous structures, as well as generic restraints for the stabilization of secondary structures. These restraints are used by REFMAC5 to stabilize the refinement of an atomic model. However, the optimal refinement protocol varies from case to case, and it is not always obvious how to select appropriate homologous structure(s), or other sources of prior information, for restraint generation. After running extensive tests on a large data set of low-resolution models, the best-performing refinement protocols and strategies for the selection of homologous structures have been identified. These strategies and protocols have been implemented in the Low-Resolution Structure Refinement (LORESTR) pipeline. The pipeline performs auto-detection of twinning and selects the optimal scaling method and solvent parameters. LORESTR can either use user-supplied homologous structures, or run an automated BLAST search and download homologues from the PDB. The pipeline executes multiple model-refinement instances using different parameters in order to find the best protocol. Tests show that the automated pipeline improves R factors, geometry and Ramachandran statistics for 94% of the low-resolution cases from the PDB included in the test set. PMID:27710936
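    The pipeline's "execute multiple refinement instances, keep the best protocol" logic can be sketched as below; refine() is a stub returning a hypothetical free R factor, where a real pipeline would instead launch REFMAC5 jobs with these settings.

```python
protocols = [
    {"restraints": "none",      "jelly_body": False},
    {"restraints": "homologue", "jelly_body": False},
    {"restraints": "homologue", "jelly_body": True},
]

def refine(protocol):
    # Stub: pretend external restraints and jelly-body refinement each
    # lower R-free a little (entirely made-up numbers).
    r_free = 0.38
    if protocol["restraints"] == "homologue":
        r_free -= 0.04
    if protocol["jelly_body"]:
        r_free -= 0.02
    return round(r_free, 4)

results = [(refine(p), p) for p in protocols]
best_r_free, best_protocol = min(results, key=lambda t: t[0])
```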

  13. Automated refinement of macromolecular structures at low resolution using prior information.

    PubMed

    Kovalevskiy, Oleg; Nicholls, Robert A; Murshudov, Garib N

    2016-10-01

    Since the ratio of the number of observations to adjustable parameters is small at low resolution, it is necessary to use complementary information for the analysis of such data. ProSMART is a program that can generate restraints for macromolecules using homologous structures, as well as generic restraints for the stabilization of secondary structures. These restraints are used by REFMAC5 to stabilize the refinement of an atomic model. However, the optimal refinement protocol varies from case to case, and it is not always obvious how to select appropriate homologous structure(s), or other sources of prior information, for restraint generation. After running extensive tests on a large data set of low-resolution models, the best-performing refinement protocols and strategies for the selection of homologous structures have been identified. These strategies and protocols have been implemented in the Low-Resolution Structure Refinement (LORESTR) pipeline. The pipeline performs auto-detection of twinning and selects the optimal scaling method and solvent parameters. LORESTR can either use user-supplied homologous structures, or run an automated BLAST search and download homologues from the PDB. The pipeline executes multiple model-refinement instances using different parameters in order to find the best protocol. Tests show that the automated pipeline improves R factors, geometry and Ramachandran statistics for 94% of the low-resolution cases from the PDB included in the test set.

  14. I-TASSER: a unified platform for automated protein structure and function prediction.

    PubMed

    Roy, Ambrish; Kucukural, Alper; Zhang, Yang

    2010-04-01

    The iterative threading assembly refinement (I-TASSER) server is an integrated platform for automated protein structure and function prediction based on the sequence-to-structure-to-function paradigm. Starting from an amino acid sequence, I-TASSER first generates three-dimensional (3D) atomic models from multiple threading alignments and iterative structural assembly simulations. The function of the protein is then inferred by structurally matching the 3D models with other known proteins. The output from a typical server run contains full-length secondary and tertiary structure predictions, and functional annotations on ligand-binding sites, Enzyme Commission numbers and Gene Ontology terms. An estimate of the accuracy of the predictions is provided based on the confidence score of the modeling. This protocol provides new insights and guidelines for designing online server systems for state-of-the-art protein structure and function prediction. The server is available at http://zhanglab.ccmb.med.umich.edu/I-TASSER.

  15. Towards automated detection of depression from brain structural magnetic resonance images.

    PubMed

    Kipli, Kuryati; Kouzani, Abbas Z; Williams, Lana J

    2013-05-01

    Depression is a major issue worldwide and is seen as a significant health problem. Stigma and patient denial, clinical experience, time limitations, and reliability of psychometrics are barriers to the clinical diagnoses of depression. Thus, the establishment of an automated system that could detect such abnormalities would assist medical experts in their decision-making process. This paper reviews existing methods for the automated detection of depression from brain structural magnetic resonance images (sMRI). Relevant sources were identified from various databases and online sites using a combination of keywords and terms including depression, major depressive disorder, detection, classification, and MRI databases. Reference lists of chosen articles were further reviewed for associated publications. The paper introduces a generic structure for representing and describing the methods developed for the detection of depression from sMRI of the brain. It consists of a number of components including acquisition and preprocessing, feature extraction, feature selection, and classification. Automated sMRI-based detection methods have the potential to provide an objective measure of depression, hence improving the confidence level in the diagnosis and prognosis of depression.
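    The generic structure described above (preprocessing, feature extraction and selection, classification) reduces to a few lines in a toy setting; the "regional" feature vectors and group differences below are simulated, not clinical data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_regions = 20
controls = rng.normal(0.0, 1.0, size=(30, n_regions))
patients = rng.normal(0.0, 1.0, size=(30, n_regions))
patients[:, :3] += 2.0          # pretend three regions differ between groups

# Preprocessing: z-score each feature.
X = np.vstack([controls, patients])
y = np.array([0] * 30 + [1] * 30)
Xz = (X - X.mean(axis=0)) / X.std(axis=0)

# Feature selection: keep the regions with the largest group contrast.
contrast = np.abs(Xz[y == 1].mean(axis=0) - Xz[y == 0].mean(axis=0))
selected = np.argsort(contrast)[-3:]

# Classification: nearest class centroid in the selected feature space.
c0 = Xz[y == 0][:, selected].mean(axis=0)
c1 = Xz[y == 1][:, selected].mean(axis=0)
d0 = np.linalg.norm(Xz[:, selected] - c0, axis=1)
d1 = np.linalg.norm(Xz[:, selected] - c1, axis=1)
accuracy = float(np.mean((d1 < d0).astype(int) == y))
```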

  16. An automated method for detecting architectural distortions on mammograms using direction analysis of linear structures.

    PubMed

    Matsubara, T; Ito, A; Tsunomori, A; Hara, T; Muramatsu, C; Endo, T; Fujita, H

    2015-08-01

    Architectural distortion is one of the most important findings when evaluating mammograms for breast cancer. Abnormal breast architecture is characterized by the presence of spicules, which are distorted mammary structures that are not accompanied by an increased density or mass. We have been developing an automated method for detecting spiculated architectural distortions by analyzing linear structures extracted by normal curvature. However, some structures that are possibly related to distorted areas are not extracted using this method. The purpose of this study was to develop a new automated method for direction analysis of linear structures to improve detection performance in mammography. The direction of linear structures in each region of interest (ROI) was first determined using a direction filter and a background filter that can define one of eight directions (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, and 157.5°). The concentration and isotropic indexes were calculated using the determined direction of the linear structures in order to extract the candidate areas. Discriminant analysis was performed to eliminate false-positive results. Our database consisted of 168 abnormal images containing 174 distorted areas and 580 normal images. The sensitivity of the new method was 81%. There were 2.6 and 4.2 false positives per image using the new and previous methods, respectively. These findings show that our new method is effective for detecting spiculated architectural distortions.
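    The direction analysis can be mimicked by quantizing each linear structure's orientation into the eight directions above and summarizing their spread. The "concentration" and "isotropy" definitions here are illustrative guesses, not the paper's exact indexes.

```python
BINS = [0.0, 22.5, 45.0, 67.5, 90.0, 112.5, 135.0, 157.5]

def quantize(angle_deg):
    """Map an orientation in [0, 180) to the nearest of the 8 directions."""
    a = angle_deg % 180.0
    return min(range(8),
               key=lambda i: min(abs(a - BINS[i]), 180.0 - abs(a - BINS[i])))

def indexes(angles_deg):
    counts = [0] * 8
    for a in angles_deg:
        counts[quantize(a)] += 1
    concentration = max(counts) / len(angles_deg)   # one dominant direction?
    isotropy = sum(c > 0 for c in counts) / 8.0     # fraction of bins used
    return concentration, isotropy

# Spicules radiating from a distorted area point many ways ...
conc_r, iso_r = indexes([0, 25, 44, 70, 88, 115, 130, 160, 5, 90])
# ... while normal linear structures run roughly parallel.
conc_p, iso_p = indexes([43, 45, 47, 46, 44, 45])
```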

  17. Automated detection of structural alerts (chemical fragments) in (eco)toxicology

    PubMed Central

    Lepailleur, Alban; Poezevara, Guillaume; Bureau, Ronan

    2013-01-01

    This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches in in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process of defining its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data. PMID:24688706
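    The core Emerging Pattern computation contrasts a fragment's support between two classes via its growth rate. The fragments and molecule sets below are hypothetical strings standing in for real substructure matches.

```python
# Each "molecule" is the set of fragments it contains (toy data).
toxic = [{"nitro", "aromatic"}, {"nitro"}, {"nitro", "halide"}, {"ester"}]
non_toxic = [{"ester"}, {"aromatic"}, {"ester", "halide"}, {"aromatic"}]

def support(fragment, molecules):
    return sum(fragment in m for m in molecules) / len(molecules)

def growth_rate(fragment):
    s_tox = support(fragment, toxic)
    s_non = support(fragment, non_toxic)
    if s_non == 0.0:                 # "jumping" emerging pattern
        return float("inf") if s_tox > 0 else 0.0
    return s_tox / s_non

g_nitro = growth_rate("nitro")       # occurs only in the toxic set
g_ester = growth_rate("ester")
```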

  18. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    PubMed Central

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

    The conjunctiva is a densely vascularized mucous membrane covering the sclera of the eye with a unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least-squares regression and Fisher linear discriminant analysis. Conjunctival images between groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method’s discrimination rates were higher than those determined by human observers. The method allowed sensitive and rapid discrimination by assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692
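    A two-class Fisher linear discriminant of the kind used here projects feature vectors onto w = Sw^-1 (m1 - m2) and thresholds at the midpoint of the projected means. The two "microvascular feature" clusters below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal([1.0, 2.0], 0.3, size=(40, 2))   # e.g. non-diabetic group
B = rng.normal([2.0, 1.0], 0.3, size=(40, 2))   # e.g. diabetic group

mA, mB = A.mean(axis=0), B.mean(axis=0)
# Within-class scatter matrix (sum of per-class scatter).
Sw = np.cov(A.T) * (len(A) - 1) + np.cov(B.T) * (len(B) - 1)
w = np.linalg.solve(Sw, mA - mB)                # Fisher direction
threshold = 0.5 * (mA @ w + mB @ w)

pred_A = A @ w > threshold                      # True -> assigned to group A
pred_B = B @ w > threshold
accuracy = float((pred_A.sum() + (~pred_B).sum()) / (len(A) + len(B)))
```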

  19. The immobilization of enzymes on nylon structures and their use in automated analysis

    PubMed Central

    Inman, D. J.; Hornby, W. E.

    1972-01-01

    1. Glucose oxidase (EC 1.1.3.4) and urease (EC 3.5.1.5) were covalently attached through glutaraldehyde to low-molecular-weight nylon powder. 2. Immobilized derivatives of glucose oxidase and urease were prepared by cross-linking the respective enzymes within the matrix of a nylon membrane. 3. An improved process is described for the immobilization of glucose oxidase and urease on the inside surface of partially hydrolysed nylon tube. 4. Automated analytical procedures are described for the determination of glucose with each of the three immobilized glucose oxidase derivatives and for the determination of urea with each of the three immobilized urease derivatives. 5. The efficiencies of the three immobilized enzyme structures as reagents for the automated determination of their substrates were compared. PMID:4643309

  20. Automation of three-dimensional structured mesh generation for turbomachinery blade passages

    NASA Technical Reports Server (NTRS)

    Ascoli, Edward P.; Prueger, George H.

    1995-01-01

    Hybrid tools have been developed which greatly reduce the time required to generate three-dimensional structured CFD meshes for turbomachinery blade passages. RAGGS, an existing Rockwell proprietary, general purpose mesh generation and visualization system, provides the starting point and framework for tool development. Utilities which manipulate and interface with RAGGS tools have been developed to (1) facilitate blade geometry inputs from point or CAD representations, (2) automate auxiliary surface creation, and (3) streamline and automate edge, surface, and subsequent volume mesh generation from minimal inputs. The emphasis of this approach has been to maintain all the functionality of the general purpose mesh generator while simultaneously eliminating the bulk of the repetitive and tedious manual steps in the mesh generation process. Using this approach, mesh generation cycle times have been reduced from the order of days down to the order of hours.
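    One standard building block of structured mesh generation is transfinite interpolation, which blends the boundary curves of a block into interior grid points. The 2-D sketch below uses arbitrary demo curves, not a real blade passage, and is not RAGGS code.

```python
import math

NI, NJ = 11, 6                       # grid points in each direction

def bottom(u): return (u, 0.1 * math.sin(math.pi * u))   # curved lower wall
def top(u):    return (u, 1.0)
def left(v):   return (0.0, v)
def right(v):  return (1.0, v)

def tfi(u, v):
    """Bilinear transfinite interpolation (Coons patch) of the boundaries."""
    b, t, l, r = bottom(u), top(u), left(v), right(v)
    c00, c10, c01, c11 = bottom(0.0), bottom(1.0), top(0.0), top(1.0)
    return tuple(
        (1 - v) * b[k] + v * t[k] + (1 - u) * l[k] + u * r[k]
        - ((1 - u) * (1 - v) * c00[k] + u * (1 - v) * c10[k]
           + (1 - u) * v * c01[k] + u * v * c11[k])
        for k in range(2)
    )

mesh = [[tfi(i / (NI - 1), j / (NJ - 1)) for j in range(NJ)]
        for i in range(NI)]
```

    By construction the interpolant reproduces all four boundary curves exactly, so neighboring blocks sharing a boundary stay point-matched.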

  1. A two-level structure for advanced space power system automation

    NASA Technical Reports Server (NTRS)

    Loparo, Kenneth A.; Chankong, Vira

    1990-01-01

    The tasks to be carried out during the three-year project period are: (1) performing extensive simulation using existing mathematical models to build a specific knowledge base of the operating characteristics of space power systems; (2) carrying out the necessary basic research on hierarchical control structures, real-time quantitative algorithms, and decision-theoretic procedures; (3) developing a two-level automation scheme for fault detection and diagnosis, maintenance and restoration scheduling, and load management; and (4) testing and demonstration. The outlines of the proposed system structure that served as a master plan for this project, work accomplished, concluding remarks, and ideas for future work are also addressed.

  2. A two-level structure for advanced space power system automation

    NASA Astrophysics Data System (ADS)

    Loparo, Kenneth A.; Chankong, Vira

    1990-05-01

    The tasks to be carried out during the three-year project period are: (1) performing extensive simulation using existing mathematical models to build a specific knowledge base of the operating characteristics of space power systems; (2) carrying out the necessary basic research on hierarchical control structures, real-time quantitative algorithms, and decision-theoretic procedures; (3) developing a two-level automation scheme for fault detection and diagnosis, maintenance and restoration scheduling, and load management; and (4) testing and demonstration. The outlines of the proposed system structure that served as a master plan for this project, work accomplished, concluding remarks, and ideas for future work are also addressed.

  3. Automated structure determination of proteins with the SAIL-FLYA NMR method.

    PubMed

    Takeda, Mitsuhiro; Ikeya, Teppei; Güntert, Peter; Kainosho, Masatsune

    2007-01-01

    The labeling of proteins with stable isotopes enhances the NMR method for the determination of 3D protein structures in solution. Stereo-array isotope labeling (SAIL) provides an optimal stereospecific and regiospecific pattern of stable isotopes that yields sharpened lines, spectral simplification without loss of information, and the ability to collect rapidly and evaluate fully automatically the structural restraints required to solve a high-quality solution structure for proteins up to twice as large as those that can be analyzed using conventional methods. Here, we describe a protocol for the preparation of SAIL proteins by cell-free methods, including the preparation of S30 extract, and for their automated structure analysis using the FLYA algorithm and the program CYANA. Once efficient cell-free expression of the unlabeled or uniformly labeled target protein has been achieved, the NMR sample preparation of a SAIL protein can be accomplished in 3 d. A fully automated FLYA structure calculation can be completed in 1 d on a powerful computer system.

  4. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also promote the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  5. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    NASA Astrophysics Data System (ADS)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task for planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that need to be done manually or using semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, we need a geometric description of structural elements in order to obtain a structural model consisting of beam axes and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a quadratic cross section and no bending. Deformations or damage such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.
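    The step from scan points to a beam axis can be sketched with principal component analysis: for one segmented beam, the axis is the centroid plus the dominant direction of the point cloud. The noisy synthetic beam below stands in for real scan data.

```python
import numpy as np

rng = np.random.default_rng(3)
axis_true = np.array([1.0, 2.0, 0.5])
axis_true /= np.linalg.norm(axis_true)
t = rng.uniform(0.0, 4.0, size=500)             # positions along the beam
noise = rng.normal(0.0, 0.02, size=(500, 3))    # cross-section scatter
points = np.array([0.0, 1.0, 5.0]) + t[:, None] * axis_true + noise

centroid = points.mean(axis=0)
_, _, Vt = np.linalg.svd(points - centroid)     # PCA via SVD
axis_est = Vt[0]                                # first principal direction

alignment = float(abs(axis_est @ axis_true))    # 1.0 = perfect recovery
```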

  6. Genius: a genetic algorithm for automated structure elucidation from 13C NMR spectra.

    PubMed

    Meiler, Jens; Will, Martin

    2002-03-06

    The automated structure elucidation of organic molecules from experimentally obtained properties is extended by an entirely new approach. A genetic algorithm is implemented that uses molecular constitution structures as individuals. With this approach, the structure of organic molecules can be optimized to meet experimental criteria, provided that a fast and accurate method for predicting the relevant physical or chemical properties is available. This is demonstrated using the 13C NMR spectrum as readily obtainable information. Artificial neural networks provide such a fast and accurate method for calculating the 13C NMR spectra of the generated structures. The method is implemented and tested successfully for organic molecules with up to 18 non-hydrogen atoms.
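    The skeleton of such a genetic algorithm: candidates are encoded as strings, fitness measures agreement between a predicted and the target spectrum, and selection, crossover and mutation evolve better candidates. Everything below is a toy stand-in; the real program evolves molecular constitutions scored by a neural-network 13C shift predictor.

```python
import random

random.seed(4)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]   # "spectrum" to reproduce

def fitness(candidate):              # smaller = better agreement
    return sum(c != t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=1 / 12):
    return [1 - g if random.random() < rate else g for g in candidate]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(30)]
initial_best = min(fitness(p) for p in pop)
for generation in range(80):
    pop.sort(key=fitness)
    if fitness(pop[0]) == 0:         # exact match found
        break
    parents = pop[:10]               # truncation selection (elitist)
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
final_best = min(fitness(p) for p in pop)
```

    Because the best individuals are carried over unchanged each generation, fitness never degrades as the population evolves.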

  7. Small-scale, semi-automated purification of eukaryotic proteins for structure determination

    PubMed Central

    Frederick, Ronnie O.; Bergeman, Lai; Blommel, Paul G.; Bailey, Lucas J.; McCoy, Jason G.; Song, Jikui; Meske, Louise; Bingman, Craig A.; Riters, Megan; Dillon, Nicholas A.; Kunert, John; Yoon, Jung Whan; Lim, Ahyoung; Cassidy, Michael; Bunge, Jason; Aceti, David J.; Primm, John G.; Markley, John L.; Phillips, George N.

    2007-01-01

    A simple approach that allows cost-effective automated purification of recombinant proteins in levels sufficient for functional characterization or structural studies is described. Studies with four human stem cell proteins, an engineered version of green fluorescent protein, and other proteins are included. The method combines an expression vector (pVP62K) that provides in vivo cleavage of an initial fusion protein, a factorial designed auto-induction medium that improves the performance of small-scale production, and rapid, automated metal affinity purification of His8-tagged proteins. For initial small-scale production screening, single colony transformants were grown overnight in 0.4 ml of auto-induction medium, produced proteins were purified using the Promega Maxwell 16, and purification results were analyzed by Caliper LC90 capillary electrophoresis. The yield of purified [U-15N]-His8-Tcl-1 was 7.5 μg/ml of culture medium, of purified [U-15N]-His8-GFP was 68 μg/ml, and of purified selenomethionine-labeled AIA–GFP (His8 removed by treatment with TEV protease) was 172 μg/ml. The yield information obtained from a successful automated purification from 0.4 ml was used to inform the decision to scale-up for a second meso-scale (10–50 ml) cell growth and automated purification. 1H–15N NMR HSQC spectra of His8-Tcl-1 and of His8-GFP prepared from 50 ml cultures showed excellent chemical shift dispersion, consistent with well folded states in solution suitable for structure determination. Moreover, AIA–GFP obtained by proteolytic removal of the His8 tag was subjected to crystallization screening, and yielded crystals under several conditions. Single crystals were subsequently produced and optimized by the hanging drop method. The structure was solved by molecular replacement at a resolution of 1.7 Å. This approach provides an efficient way to carry out several key target screening steps that are essential for successful operation of proteomics

  8. Automated design optimization of supersonic airplane wing structures under dynamic constraints

    NASA Technical Reports Server (NTRS)

    Fox, R. L.; Miura, H.; Rao, S. S.

    1972-01-01

    The problems of the preliminary and first-level detail design of supersonic aircraft wings are stated as mathematical programs and solved using automated optimum design techniques. The problem is approached in two phases: the first is a simplified equivalent plate model in which the envelope, planform and structural parameters are varied to produce a design; the second is a finite element model with fixed configuration in which the material distribution is varied. Constraints include flutter, aeroelastically computed stresses and deflections, natural frequency and a variety of geometric limitations.

  9. Small-scale, semi-automated purification of eukaryotic proteins for structure determination.

    PubMed

    Frederick, Ronnie O; Bergeman, Lai; Blommel, Paul G; Bailey, Lucas J; McCoy, Jason G; Song, Jikui; Meske, Louise; Bingman, Craig A; Riters, Megan; Dillon, Nicholas A; Kunert, John; Yoon, Jung Whan; Lim, Ahyoung; Cassidy, Michael; Bunge, Jason; Aceti, David J; Primm, John G; Markley, John L; Phillips, George N; Fox, Brian G

    2007-12-01

    A simple approach that allows cost-effective automated purification of recombinant proteins at levels sufficient for functional characterization or structural studies is described. Studies with four human stem cell proteins, an engineered version of green fluorescent protein, and other proteins are included. The method combines an expression vector (pVP62K) that provides in vivo cleavage of an initial fusion protein, a factorial designed auto-induction medium that improves the performance of small-scale production, and rapid, automated metal affinity purification of His8-tagged proteins. For initial small-scale production screening, single colony transformants were grown overnight in 0.4 ml of auto-induction medium, produced proteins were purified using the Promega Maxwell 16, and purification results were analyzed by Caliper LC90 capillary electrophoresis. The yield of purified [U-15N]-His8-Tcl-1 was 7.5 μg/ml of culture medium, of purified [U-15N]-His8-GFP was 68 μg/ml, and of purified selenomethionine-labeled AIA-GFP (His8 removed by treatment with TEV protease) was 172 μg/ml. The yield information obtained from a successful automated purification from 0.4 ml was used to inform the decision to scale up for a second meso-scale (10-50 ml) cell growth and automated purification. 1H-15N NMR HSQC spectra of His8-Tcl-1 and of His8-GFP prepared from 50 ml cultures showed excellent chemical shift dispersion, consistent with well-folded states in solution suitable for structure determination. Moreover, AIA-GFP obtained by proteolytic removal of the His8 tag was subjected to crystallization screening, and yielded crystals under several conditions. Single crystals were subsequently produced and optimized by the hanging drop method. The structure was solved by molecular replacement at a resolution of 1.7 Å. This approach provides an efficient way to carry out several key target screening steps that are essential for successful operation of proteomics pipelines.

  10. Automated design optimization of supersonic airplane wing structures under dynamic constraints.

    NASA Technical Reports Server (NTRS)

    Fox, R. L.; Miura, H.; Rao, S. S.

    1972-01-01

    The problems of the preliminary and first-level detail design of supersonic aircraft wings are stated as mathematical programs and solved using automated optimum design techniques. The problem is approached in two phases: the first is a simplified equivalent plate model in which the envelope, planform and structural parameters are varied to produce a design; the second is a finite element model with fixed configuration in which the material distribution is varied. Constraints include flutter, aeroelastically computed stresses and deflections, natural frequency and a variety of geometric limitations. The Phase I objective is a combination of weight and drag, while the Phase II objective is weight minimization.

  11. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    SciTech Connect

    Li, Fenglei

    2006-08-09

    The purposes of our research were: (1) To develop an economical, easy-to-use, automated, high-throughput system for large-scale protein crystallization screening. (2) To develop a new protein crystallization method with high screening efficiency, low protein consumption and complete compatibility with the high-throughput screening system. (3) To determine the structure of lactate dehydrogenase complexed with NADH by x-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated that large-scale protein crystallization screening can be performed in a high-throughput manner at low cost and with easy operation. The overall system integrates liquid dispensing, crystallization and detection, and serves as a complete solution to protein crystallization screening. The system can dispense protein and multiple different precipitants at nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible light detector for detecting protein crystallization screening results. This detection scheme is capable of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high-throughput and non-destructive manner. The entire system, from liquid dispensing and crystallization to crystal detection, is essentially parallel, high throughput and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption and compatibility with automation and high throughput. In this crystallization method, a gas permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition, which permits exploring more conditions in a phase diagram for a given amount of protein. In addition

  12. Automating crystallographic structure solution and refinement of protein–ligand complexes

    SciTech Connect

    Echols, Nathaniel Moriarty, Nigel W. Klei, Herbert E.; Afonine, Pavel V.; Bunkóczi, Gábor; McCoy, Airlie J.; Oeffner, Robert D.; Read, Randy J.; Adams, Paul D.

    2014-01-01

    A software system for automated protein–ligand crystallography has been implemented in the Phenix suite. This significantly reduces the manual effort required in high-throughput crystallographic studies. High-throughput drug-discovery and mechanistic studies often require the determination of multiple related crystal structures that differ only in the bound ligands, point mutations in the protein sequence and minor conformational changes. If performed manually, solution and refinement require extensive repetition of the same tasks for each structure. To accelerate this process and minimize manual effort, a pipeline encompassing all stages of ligand building and refinement, starting from integrated and scaled diffraction intensities, has been implemented in Phenix. The resulting system is able to successfully solve and refine large collections of structures in parallel without extensive user intervention prior to the final stages of model completion and validation.

  13. New tissue priors for improved automated classification of subcortical brain structures on MRI☆

    PubMed Central

    Lorio, S.; Fresard, S.; Adaszewski, S.; Kherif, F.; Chowdhury, R.; Frackowiak, R.S.; Ashburner, J.; Helms, G.; Weiskopf, N.; Lutti, A.; Draganski, B.

    2016-01-01

    Despite the constant improvement of algorithms for automated brain tissue classification, the accurate delineation of subcortical structures using magnetic resonance imaging (MRI) data remains challenging. The main difficulties arise from the low gray-white matter contrast of iron-rich areas in T1-weighted (T1w) MRI data and from the lack of adequate priors for the basal ganglia and thalamus. The most recent attempts to obtain such priors were based on cohorts of limited size that included subjects in a narrow age range, failing to account for age-related gray-white matter contrast changes. Aiming to improve the anatomical plausibility of automated brain tissue classification from T1w data, we have created new tissue probability maps for subcortical gray matter regions. Supported by atlas-derived spatial information, raters manually labeled subcortical structures in a cohort of healthy subjects using magnetization transfer saturation and R2* MRI maps, which feature optimal gray-white matter contrast in these areas. After assessment of inter-rater variability, the new tissue priors were tested on T1w data within the framework of voxel-based morphometry. The automated detection of gray matter in subcortical areas with our new probability maps was more anatomically plausible than that derived with currently available priors. We provide evidence that the improved delineation compensates for age-related bias in the segmentation of iron-rich subcortical regions. The new tissue priors, allowing robust detection of the basal ganglia and thalamus, have the potential to enhance the sensitivity of voxel-based morphometry in both healthy and diseased brains. PMID:26854557

  14. Automated measurement of CT noise in patient images with a novel structure coherence feature

    NASA Astrophysics Data System (ADS)

    Chun, Minsoo; Choi, Young Hun; Hyo Kim, Jong

    2015-12-01

    While the assessment of CT noise constitutes an important task for the optimization of scan protocols in clinical routine, the majority of noise measurements in practice still rely on manual operation, hence limiting their efficiency and reliability. This study presents an algorithm for the automated measurement of CT noise in patient images with a novel structure coherence feature. The proposed algorithm consists of a four-step procedure: subcutaneous fat tissue selection, calculation of the structure coherence feature, determination of homogeneous ROIs, and estimation of the average noise level. In an evaluation with 94 CT scans (16,517 images) of pediatric and adult patients along with the participation of two radiologists, ROIs were placed on a homogeneous fat region with 99.46% accuracy, and the agreement of the automated noise measurements with the radiologists' reference noise measurements (PCC = 0.86) was substantially higher than the within- and between-rater agreements of noise measurements (PCCwithin = 0.75, PCCbetween = 0.70). In addition, the absolute noise level measurements closely matched the theoretical noise levels generated by a reduced-dose simulation technique. Our proposed algorithm has the potential to be used for examining the appropriateness of radiation dose and the image quality of CT protocols for research purposes as well as clinical routine.
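The four-step procedure above can be illustrated in miniature. The sketch below is not the authors' implementation; it is a minimal stand-in that scores candidate ROIs with a structure-tensor coherence measure (low coherence suggests a homogeneous region) and averages the standard deviation of the accepted ROIs. The ROI size, coherence threshold, and demo noise level are all hypothetical.

```python
import numpy as np

def structure_coherence(patch):
    """Coherence of a 2-D patch from its averaged structure tensor.
    Near 0 for homogeneous/isotropic regions, near 1 for oriented structure."""
    gy, gx = np.gradient(patch.astype(float))
    jxx, jyy, jxy = (gx * gx).mean(), (gy * gy).mean(), (gx * gy).mean()
    # Eigenvalues of the 2x2 tensor [[jxx, jxy], [jxy, jyy]]
    tr, det = jxx + jyy, jxx * jyy - jxy * jxy
    disc = max(tr * tr / 4.0 - det, 0.0) ** 0.5
    lam1, lam2 = tr / 2.0 + disc, tr / 2.0 - disc
    return (lam1 - lam2) / (lam1 + lam2 + 1e-12)

def estimate_noise(image, roi_size=16, coherence_max=0.2):
    """Mean standard deviation over ROIs judged homogeneous (hypothetical thresholds)."""
    sds = []
    for i in range(0, image.shape[0] - roi_size, roi_size):
        for j in range(0, image.shape[1] - roi_size, roi_size):
            roi = image[i:i + roi_size, j:j + roi_size]
            if structure_coherence(roi) < coherence_max:
                sds.append(roi.std())
    return float(np.mean(sds)) if sds else float("nan")

# demo: a synthetic homogeneous region with Gaussian noise of sigma = 5
rng = np.random.default_rng(0)
flat = rng.normal(0.0, 5.0, (64, 64))
noise = estimate_noise(flat)
```

On real CT images the fat-tissue selection step would first restrict the search region; here the whole synthetic image is homogeneous, so the estimate recovers the injected noise level.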

  15. Automated measurement of lysosomal structure alterations in oocytes of mussels exposed to petroleum hydrocarbons.

    PubMed

    Cajaraville, M P; Marigómez, J A; Angulo, E

    1991-09-01

    The present study examines the structure of the lysosomal system of mature oocytes in mussels, Mytilus galloprovincialis, after a 21 day exposure to the water accommodated fraction (WAF) of two crude oils (types Ural and Maya) and of a commercial lubricant oil. The automated image analysis indicates that lysosomes, showing cytochemically demonstrable beta-glucuronidase activity, are smaller and much more numerous in oocytes of mussels treated with a 40% dose of Ural- and Lubricant-WAF when compared to controls. It is suggested that the structure of the lysosomal system of oocytes is different from that of somatic cells (i.e., digestive cells) and that budding or "fission" into smaller bodies occurs in oocyte lysosomes under certain petroleum hydrocarbon-exposure conditions. These changes in the lysosomal compartment appear to be associated with the process of gamete release or spawning.

  16. An expert system executive for automated assembly of large space truss structures

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1993-01-01

    Langley Research Center developed a unique test bed for investigating the practical problems associated with the assembly of large space truss structures using robotic manipulators. The test bed is the result of an interdisciplinary effort that encompasses the full spectrum of assembly problems - from the design of mechanisms to the development of software. The automated structures assembly test bed and its operation are described, the expert system executive and its development are detailed, and the planned system evolution is discussed. Emphasis is on the expert system implementation of the program executive. The executive program must direct and reliably perform complex assembly tasks with the flexibility to recover from realistic system errors. The employment of an expert system permits information that pertains to the operation of the system to be encapsulated concisely within a knowledge base. This consolidation substantially reduced code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.

  17. Automated assembly of large space structures using an expert system executive

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1993-01-01

    NASA LaRC has developed a unique testbed for investigating the practical problems associated with the assembly of large space structures using robotic manipulators. The testbed is an interdisciplinary effort which considers the full spectrum of assembly problems, from the design of mechanisms to the development of software. This paper will describe the automated structures assembly testbed and its operation, detail the expert system executive and its development, and discuss the planned system evolution. Emphasis will be placed on the expert system development of the program executive. The executive program must be capable of directing and reliably performing complex assembly tasks with the flexibility to recover from realistic system errors. By employing an expert system, information pertaining to the operation of the system was encapsulated concisely within a knowledge base. This led to a substantial reduction in code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.

  18. Automating gene library synthesis by structure-based combinatorial protein engineering: examples from plant sesquiterpene synthases.

    PubMed

    Dokarry, Melissa; Laurendon, Caroline; O'Maille, Paul E

    2012-01-01

    Structure-based combinatorial protein engineering (SCOPE) is a homology-independent recombination method to create multiple crossover gene libraries by assembling defined combinations of structural elements ranging from single mutations to domains of protein structure. SCOPE was originally inspired by DNA shuffling, which mimics recombination during meiosis, where mutations from parental genes are "shuffled" to create novel combinations in the resulting progeny. DNA shuffling utilizes sequence identity between parental genes to mediate template-switching events (the annealing and extension of one parental gene fragment on another) in PCR reassembly reactions to generate crossovers and hence recombination between parental genes. In light of the conservation of protein structure and degeneracy of sequence, SCOPE was developed to enable the "shuffling" of distantly related genes with no requirement for sequence identity. The central principle involves the use of oligonucleotides to encode for crossover regions to choreograph template-switching events during PCR assembly of gene fragments to create chimeric genes. This approach was initially developed to create libraries of hybrid DNA polymerases from distantly related parents, and later developed to create a combinatorial mutant library of sesquiterpene synthases to explore the catalytic landscapes underlying the functional divergence of related enzymes. This chapter presents a simplified protocol of SCOPE that can be integrated with different mutagenesis techniques and is suitable for automation by liquid-handling robots. Two examples are presented to illustrate the application of SCOPE to create gene libraries using plant sesquiterpene synthases as the model system. In the first example, we outline how to create an active-site library as a series of complex mixtures of diverse mutants. 
In the second example, we outline how to create a focused library as an array of individual clones to distil minimal combinations of

  19. A method for fully automated measurement of neurological structures in MRI

    NASA Astrophysics Data System (ADS)

    Ashton, Edward A.; Riek, Jonathan K.; Molinelli, Larry; Berg, Michel J.; Parker, Kevin J.

    2003-05-01

    A method for fully automating the measurement of various neurological structures in MRI is presented. This technique uses an atlas-based trained maximum likelihood classifier. The classifier requires a map of prior probabilities, which is obtained by registering a large number of previously classified data sets to the atlas and calculating the resulting probability that each represented tissue type or structure will appear at each voxel in the data set. Classification is then carried out using the standard maximum likelihood discriminant function, assuming normal statistics. The results of this classification process can be used in three ways, depending on the type of structure that is being detected or measured. In the most straightforward case, measurement of a normal neural sub-structure such as the hippocampus, the results of the classifier provide a localization point for the initiation of a deformable template model, which is then optimized with respect to the original data. The detection and measurement of abnormal structures, such as white matter lesions in multiple sclerosis patients, requires a slightly different approach. Lesions are detected through the application of a spectral matched filter to areas identified by the classifier as white matter. Finally, detection of unknown abnormalities can be accomplished through anomaly detection.
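The classification step described above (a maximum likelihood discriminant under normal statistics, combined with atlas-derived prior probabilities per voxel) can be sketched as follows. This is an illustrative reconstruction rather than the authors' code: the per-class means, variances and spatial priors are assumed to be given, and the demo values are hypothetical.

```python
import numpy as np

def classify_voxels(intensities, priors, means, variances):
    """MAP tissue label per voxel: Gaussian log-likelihood of the voxel
    intensity under each class, plus the log of an atlas-derived prior.
    intensities: (V,); priors: (V, K); means, variances: (K,)."""
    x = intensities[:, None]                                  # (V, 1)
    log_like = -0.5 * (np.log(2 * np.pi * variances)
                       + (x - means) ** 2 / variances)        # (V, K)
    log_post = log_like + np.log(priors + 1e-12)              # add log prior
    return np.argmax(log_post, axis=1)                        # best class per voxel
```

With uniform priors this reduces to a plain maximum likelihood classifier; a strongly skewed atlas prior can override an ambiguous intensity, which is the role the registered probability map plays in the method above.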

  20. A script for automated 3-dimentional structure generation and conformer search from 2-dimentional chemical drawing.

    PubMed

    Ishikawa, Yoshinobu

    2013-01-01

    Building 3-dimensional (3D) molecules is the starting point in molecular modeling. Conformer search and identification of a global energy minimum structure are often performed computationally during spectral analysis of data from NMR, IR, and VCD or during rational drug design through ligand-based, structure-based, and QSAR approaches. I herein report a convenient script that allows for automated building of 3D structures and conformer searching from 2-dimensional (2D) drawing of chemical structures. With this Bash shell script, which runs on Mac OS X and the Linux platform, the tasks are consecutively and iteratively executed without a 3D molecule builder via the command line interface of the free (academic) software OpenBabel, Balloon, and MOPAC2012. A large number of 2D chemical drawing files can be processed simultaneously, and the script functions with stereoisomers. Semi-empirical quantum chemical calculation ensures reliable ranking of the generated conformers on the basis of energy. In addition to an energy-sorted list of file names of the conformers, their Gaussian input files are provided for ab initio and density functional theory calculations to predict rigorous electronic energies, structures, and properties. This script is freely available to all scientists.

  1. A Script for Automated 3-Dimentional Structure Generation and Conformer Search from 2-Dimentional Chemical Drawing

    PubMed Central

    Ishikawa, Yoshinobu

    2013-01-01

    Building 3-dimensional (3D) molecules is the starting point in molecular modeling. Conformer search and identification of a global energy minimum structure are often performed computationally during spectral analysis of data from NMR, IR, and VCD or during rational drug design through ligand-based, structure-based, and QSAR approaches. I herein report a convenient script that allows for automated building of 3D structures and conformer searching from 2-dimensional (2D) drawing of chemical structures. With this Bash shell script, which runs on Mac OS X and the Linux platform, the tasks are consecutively and iteratively executed without a 3D molecule builder via the command line interface of the free (academic) software OpenBabel, Balloon, and MOPAC2012. A large number of 2D chemical drawing files can be processed simultaneously, and the script functions with stereoisomers. Semi-empirical quantum chemical calculation ensures reliable ranking of the generated conformers on the basis of energy. In addition to an energy-sorted list of file names of the conformers, their Gaussian input files are provided for ab initio and density functional theory calculations to predict rigorous electronic energies, structures, and properties. This script is freely available to all scientists. PMID:24391363

  2. Improved reliability, accuracy and quality in automated NMR structure calculation with ARIA.

    PubMed

    Mareuil, Fabien; Malliavin, Thérèse E; Nilges, Michael; Bardiaux, Benjamin

    2015-08-01

    In biological NMR, assignment of NOE cross-peaks and calculation of atomic conformations are critical steps in the determination of reliable high-resolution structures. ARIA is an automated approach that performs NOE assignment and structure calculation in a concomitant manner in an iterative procedure. The log-harmonic shape for the distance restraint potential and the Bayesian weighting of distance restraints, recently introduced in ARIA, were shown to significantly improve the quality and the accuracy of determined structures. In this paper, we propose two modifications of the ARIA protocol: (1) the softening of the force field together with adapted hydrogen radii, which is meaningful in the context of the log-harmonic potential with Bayesian weighting, (2) a procedure that automatically adjusts the violation tolerance used in the selection of active restraints, based on the fitting of the structure to the input data sets. The new ARIA protocols were fine-tuned on a set of eight protein targets from the CASD-NMR initiative. As a result, the convergence problems previously observed for some targets were resolved and the obtained structures exhibited better quality. In addition, the new ARIA protocols were applied for the structure calculation of ten new CASD-NMR targets in a blind fashion, i.e. without knowing the actual solution. Even though optimisation of parameters and pre-filtering of unrefined NOE peak lists were necessary for half of the targets, ARIA consistently and reliably determined very precise and highly accurate structures for all cases. In the context of integrative structural biology, an increasing number of experimental methods are used that produce distance data for the determination of 3D structures of macromolecules, stressing the importance of methods that successfully make use of ambiguous and noisy distance data.

  3. Automated structure refinement of macromolecular assemblies from cryo-EM maps using Rosetta.

    PubMed

    Wang, Ray Yu-Ruei; Song, Yifan; Barad, Benjamin A; Cheng, Yifan; Fraser, James S; DiMaio, Frank

    2016-09-26

    Cryo-EM has revealed the structures of many challenging yet exciting macromolecular assemblies at near-atomic resolution (3-4.5Å), providing biological phenomena with molecular descriptions. However, at these resolutions, accurately positioning individual atoms remains challenging and error-prone. Manually refining thousands of amino acids - typical in a macromolecular assembly - is tedious and time-consuming. We present an automated method that can improve the atomic details in models that are manually built in near-atomic-resolution cryo-EM maps. Applying the method to three systems recently solved by cryo-EM, we are able to improve model geometry while maintaining the fit-to-density. Backbone placement errors are automatically detected and corrected, and the refinement shows a large radius of convergence. The results demonstrate that the method is amenable to structures with symmetry, of very large size, and containing RNA as well as covalently bound ligands. The method should streamline the cryo-EM structure determination process, providing accurate and unbiased atomic structure interpretation of such maps.

  4. Automated structure refinement of macromolecular assemblies from cryo-EM maps using Rosetta

    PubMed Central

    Wang, Ray Yu-Ruei; Song, Yifan; Barad, Benjamin A; Cheng, Yifan; Fraser, James S; DiMaio, Frank

    2016-01-01

    Cryo-EM has revealed the structures of many challenging yet exciting macromolecular assemblies at near-atomic resolution (3–4.5Å), providing biological phenomena with molecular descriptions. However, at these resolutions, accurately positioning individual atoms remains challenging and error-prone. Manually refining thousands of amino acids – typical in a macromolecular assembly – is tedious and time-consuming. We present an automated method that can improve the atomic details in models that are manually built in near-atomic-resolution cryo-EM maps. Applying the method to three systems recently solved by cryo-EM, we are able to improve model geometry while maintaining the fit-to-density. Backbone placement errors are automatically detected and corrected, and the refinement shows a large radius of convergence. The results demonstrate that the method is amenable to structures with symmetry, of very large size, and containing RNA as well as covalently bound ligands. The method should streamline the cryo-EM structure determination process, providing accurate and unbiased atomic structure interpretation of such maps. DOI: http://dx.doi.org/10.7554/eLife.17219.001 PMID:27669148

  5. Automated torso organ segmentation from 3D CT images using structured perceptron and dual decomposition

    NASA Astrophysics Data System (ADS)

    Nimura, Yukitaka; Hayashi, Yuichiro; Kitasaka, Takayuki; Mori, Kensaku

    2015-03-01

    This paper presents a method for torso organ segmentation from abdominal CT images using structured perceptron and dual decomposition. Many methods have been proposed to enable automated extraction of organ regions from volumetric medical images; however, their empirical parameters must be adjusted to obtain precise organ regions. This paper proposes an organ segmentation method using structured output learning. Our method utilizes a graphical model and binary features which represent the relationship between voxel intensities and organ labels. We also optimize the weights of the graphical model by structured perceptron and estimate the best organ label for a given image by dynamic programming and dual decomposition. The experimental results revealed that the proposed method can extract organ regions automatically using structured output learning. The error of organ label estimation was 4.4%. The DICE coefficients of left lung, right lung, heart, liver, spleen, pancreas, left kidney, right kidney, and gallbladder were 0.91, 0.95, 0.77, 0.81, 0.74, 0.08, 0.83, 0.84, and 0.03, respectively.
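The learning machinery named above, dynamic-programming inference over a labeled graph plus structured-perceptron weight updates, can be illustrated on a toy model. The sketch below deliberately simplifies the paper's setting (a richer graphical model solved with dual decomposition) to a linear chain with unary and pairwise scores; the features, labels and dimensions are all hypothetical.

```python
import numpy as np

def viterbi(unary, pairwise):
    """Highest-scoring label sequence for a chain: unary is (T, K)
    per-position scores, pairwise is (K, K) transition scores."""
    T, K = unary.shape
    score = unary[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + pairwise + unary[t]   # (prev K, cur K)
        back[t] = np.argmax(cand, axis=0)
        score = np.max(cand, axis=0)
    labels = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):
        labels.append(int(back[t, labels[-1]]))
    return labels[::-1]

def perceptron_step(feats, gold, unary_w, pairwise_w, lr=1.0):
    """One structured-perceptron update: decode with current weights,
    then move weights toward the gold labeling and away from the guess."""
    pred = viterbi(feats @ unary_w, pairwise_w)
    for t, (g, p) in enumerate(zip(gold, pred)):
        if g != p:
            unary_w[:, g] += lr * feats[t]
            unary_w[:, p] -= lr * feats[t]
    for t in range(1, len(gold)):
        pairwise_w[gold[t - 1], gold[t]] += lr
        pairwise_w[pred[t - 1], pred[t]] -= lr
    return pred

# toy demo: 4-position chain, 2 labels, one-hot features
feats = np.eye(2)[[0, 1, 0, 1]]
gold = [0, 1, 0, 1]
unary_w = np.zeros((2, 2))
pairwise_w = np.zeros((2, 2))
for _ in range(5):
    perceptron_step(feats, gold, unary_w, pairwise_w)
```

In the paper's setting the chain would be replaced by a voxel graph whose exact decoding is intractable, which is where dual decomposition comes in; the update rule itself is unchanged.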

  6. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    NASA Astrophysics Data System (ADS)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche-safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe, and a transducer, the latter measuring the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations σ(z) = σ_c0 + Σ_n [σ_n(z) + a_n·z + b_n·z²] and dσ/dz = Σ_n [dσ_n(z)/dz + Fr_n(z)], where σ_c0 and σ_n are the plastic deformation stresses for the surface and the nth soil structure (e.g. soil crust, layer, horizon or void) respectively, and Fr_n(z)·dz is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance dz through the nth layer. Both σ_n(z) and Fr_n(z) are related to soil structure, and together they determine the form of σ(z) measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the
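As a numerical illustration of the model equations, the sketch below evaluates the total penetration resistance σ(z) for a hypothetical two-layer profile (a stiff crust over a weaker subsoil). All parameter values are invented for illustration and are not taken from the study.

```python
# Illustrative evaluation of the PR model
#   sigma(z) = sigma_c0 + sum_n [ sigma_n(z) + a_n * z + b_n * z**2 ]
def sigma(z, sigma_c0, layers):
    """Total penetration resistance at depth z; each layer is a dict with
    a plastic-deformation term 'sigma_n' (a callable of z) and linear and
    quadratic depth coefficients 'a' and 'b'."""
    total = sigma_c0
    for layer in layers:
        total += layer["sigma_n"](z) + layer["a"] * z + layer["b"] * z ** 2
    return total

# invented parameters: a crust active above 2 depth units, subsoil below;
# a pore would appear as a depth interval where the deformation term is zero
crust = {"sigma_n": lambda z: 50.0 if z < 2.0 else 0.0, "a": 1.5, "b": 0.02}
subsoil = {"sigma_n": lambda z: 10.0 if z >= 2.0 else 0.0, "a": 0.5, "b": 0.0}
```

The step in σ_n at the crust/subsoil boundary is what shows up as a discontinuity in the measured profile, while the friction term Fr_n(z) (omitted here) would add a depth-accumulating contribution to dσ/dz.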

  7. Description and recognition of regular and distorted secondary structures in proteins using the automated protein structure analysis method.

    PubMed

    Ranganathan, Sushilee; Izotov, Dmitry; Kraka, Elfi; Cremer, Dieter

    2009-08-01

    The Automated Protein Structure Analysis (APSA) method, which describes the protein backbone as a smooth line in three-dimensional space and characterizes it by curvature kappa and torsion tau as a function of arc length s, was applied to 77 proteins to determine all secondary structural units via specific kappa(s) and tau(s) patterns. A total of 533 alpha-helices and 644 beta-strands were recognized by APSA, whereas DSSP gives 536 and 651 units, respectively. Kinks and distortions were quantified and the boundaries (entry and exit) of secondary structures were classified. Similarity between proteins can be easily quantified using APSA, as was demonstrated for the roll architecture of the proteins ubiquitin and spinach ferredoxin. A twenty-by-twenty comparison of all alpha domains showed that the curvature-torsion patterns generated by APSA provide an accurate and meaningful similarity measurement for secondary, super secondary, and tertiary protein structure. APSA is shown to accurately reflect the conformation of the backbone, effectively reducing three-dimensional structure information to two-dimensional representations that are easy to interpret and understand. 2008 Wiley-Liss, Inc.

  8. Managing expectations: assessment of chemistry databases generated by automated extraction of chemical structures from patents.

    PubMed

    Senger, Stefan; Bartek, Luca; Papadatos, George; Gaulton, Anna

    2015-12-01

    First public disclosure of new chemical entities often takes place in patents, which makes them an important source of information. However, with an ever increasing number of patent applications, manual processing and curation on such a large scale becomes even more challenging. An alternative approach better suited for this large corpus of documents is the automated extraction of chemical structures. A number of patent chemistry databases generated by using the latter approach are now available, but little is known that can help manage expectations when using them. This study aims to address this by comparing two such freely available sources, SureChEMBL and IBM SIIP (IBM Strategic Intellectual Property Insight Platform), with manually curated commercial databases. When looking at the percentage of chemical structures successfully extracted from a set of patents, using SciFinder as our reference, 59% and 51% of the structures were also found in SureChEMBL and IBM SIIP, respectively. When performing this comparison with compounds as the starting point, i.e. establishing whether, for a list of compounds, the databases provide the links between chemical structures and the patents they appear in, we obtained similar results. SureChEMBL and IBM SIIP found 62% and 59%, respectively, of the compound-patent pairs obtained from Reaxys. In our comparison of automatically generated vs. manually curated patent chemistry databases, the former successfully provided approximately 60% of the links between chemical structures and patents. It needs to be stressed that only a very limited number of patents and compound-patent pairs were used for our comparison. Nevertheless, our results will hopefully help to manage expectations of users of patent chemistry databases of this type and provide a useful framework for more studies like ours as well as guide future developments of the workflows used for the automated extraction of chemical structures from patents. The challenges we have encountered

  9. Vibration based structural health monitoring of an arch bridge: From automated OMA to damage detection

    NASA Astrophysics Data System (ADS)

    Magalhães, F.; Cunha, A.; Caetano, E.

    2012-04-01

    In order to evaluate the usefulness of approaches based on modal parameter tracking for structural health monitoring of bridges, a dynamic monitoring system was installed in September 2007 on a concrete arch bridge in the city of Porto, Portugal. The implementation of algorithms to perform continuous on-line identification of modal parameters from structural responses to ambient excitation (automated Operational Modal Analysis) has permitted the creation of a very complete database with the time evolution of the bridge's modal characteristics over more than two years. This paper describes the strategy followed to minimize the effects of environmental and operational factors on the bridge's natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, damage identification is attempted with control charts. Finally, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
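    The two-stage procedure this abstract describes (regress environmental effects out of the natural frequencies, then watch the residuals with a control chart) can be sketched in a few lines. Everything below is synthetic and illustrative: the frequency, temperature slope, noise level and damage size are made-up numbers, not data from the Porto bridge.

    ```python
    import random
    import statistics

    random.seed(0)

    # Synthetic monitoring data: a natural frequency (Hz) that drifts with
    # temperature (deg C) plus measurement noise. All numbers are invented.
    def observe(temp, damaged=False):
        f = 0.810 - 0.0004 * temp          # assumed linear temperature effect
        if damaged:
            f *= 1.0 - 0.002               # simulated 0.2% frequency drop
        return f + random.gauss(0.0, 0.0002)

    baseline = [(t, observe(t)) for t in (random.uniform(-5, 35) for _ in range(300))]

    # Step 1: regress frequency on temperature (ordinary least squares).
    temps = [t for t, _ in baseline]
    freqs = [f for _, f in baseline]
    t_mean, f_mean = statistics.fmean(temps), statistics.fmean(freqs)
    slope = sum((t - t_mean) * (f - f_mean) for t, f in baseline) / \
            sum((t - t_mean) ** 2 for t in temps)
    intercept = f_mean - slope * t_mean

    def residual(temp, freq):
        return freq - (intercept + slope * temp)

    # Step 2: Shewhart-style 3-sigma control limits from baseline residuals.
    res = [residual(t, f) for t, f in baseline]
    sigma = statistics.stdev(res)
    ucl, lcl = 3 * sigma, -3 * sigma

    # Step 3: new observations from a (simulated) damaged state trigger alarms.
    new = [(t, observe(t, damaged=True)) for t in (random.uniform(-5, 35) for _ in range(50))]
    alarms = sum(1 for t, f in new if not lcl <= residual(t, f) <= ucl)
    print(f"slope={slope:.6f} Hz/degC, alarms on damaged data: {alarms}/50")
    ```

    With the noise level assumed here, a 0.2% frequency drop pushes the residuals well outside the control limits once the temperature trend has been removed, which is the mechanism the paper relies on.
    
    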

  10. Automated mutual exclusion rules discovery for structured observational codes in echocardiography reporting

    PubMed Central

    Forsberg, Thomas A.; Sevenster, Merlijn; Bieganski, Szymon; Bhagat, Puran; Kanasseril, Melvin; Jia, Yugang; Spencer, Kirk T.

    2015-01-01

    Structured reporting in medicine has been argued to support and enhance machine-assisted processing and communication of pertinent information. Retrospective studies showed that structured echocardiography reports, constructed through point-and-click selection of finding codes (FCs), contain pair-wise contradictory FCs (e.g., “No tricuspid regurgitation” and “Severe regurgitation”), downgrading report quality and reliability. In a prospective study, contradictions were detected automatically using an extensive rule set that encodes mutual exclusion patterns between FCs. Rule creation is a labor- and knowledge-intensive task that could benefit from automation. We propose a machine-learning approach to discover mutual exclusion rules in a corpus of 101,211 structured echocardiography reports through semantic and statistical analysis. Ground truth is derived from the extensive prospectively evaluated rule set. On the unseen test set, F-measure (0.439) and above-chance level AUC (0.885) show that our approach can potentially support the manual rule creation process. Per expert review, our method also discovered previously unknown rules. PMID:26958191
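    As a rough illustration of how mutual-exclusion candidates might be mined from such a corpus, the sketch below flags pairs of finding codes that are each individually frequent yet never co-occur in the same report. The finding codes, counts and support threshold are invented, and this statistical step is only part of the paper's approach, which also uses semantic analysis.

    ```python
    from itertools import combinations
    from collections import Counter

    # Toy corpus of structured reports, each a set of finding codes (FCs).
    # Codes and frequencies are made up for illustration.
    reports = (
        [{"TR_NONE", "LV_NORMAL"}] * 40 +
        [{"TR_SEVERE", "LV_DILATED"}] * 30 +
        [{"TR_NONE", "LV_DILATED"}] * 20 +
        [{"LV_NORMAL", "AO_NORMAL"}] * 10
    )

    single = Counter(fc for r in reports for fc in r)
    pair = Counter(frozenset(p) for r in reports for p in combinations(sorted(r), 2))

    def exclusion_candidates(min_support=20):
        """Pairs of individually frequent FCs that never co-occur in a report."""
        out = []
        for a, b in combinations(sorted(single), 2):
            if single[a] >= min_support and single[b] >= min_support \
                    and pair[frozenset((a, b))] == 0:
                out.append((a, b))
        return out

    print(exclusion_candidates())
    ```

    The support threshold matters: rarely used codes that happen never to co-occur would otherwise produce spurious rules, which is one reason a learned, statistically scored approach beats raw co-occurrence counting.
    
    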

  11. Automated metric characterization of urban structure using building decomposition from very high resolution imagery

    NASA Astrophysics Data System (ADS)

    Heinzel, Johannes; Kemper, Thomas

    2015-03-01

    Classification approaches for urban areas are mostly of a qualitative and semantic nature. They produce interpreted classes similar to those from land cover and land use classifications. As a complement to those classes, quantitative measures derived directly from the image can lead to a metric characterization of the urban area. While these metrics lack qualitative interpretation, they provide an objective measure of urban structures. Such quantitative measures are especially important in rapidly growing cities since, besides growth in area, they can provide structural information for specific areas and detect changes. Rustenburg, which serves as the test area for the present study, is amongst the fastest growing cities in South Africa. It reveals a heterogeneous face of housing and building structures reflecting social and/or economic differences, often linked to the spatial distribution of industrial and local mining sites. Up-to-date coverage with aerial photographs is provided by aerial surveys at regular intervals, and recent satellite systems also provide imagery of suitable resolution. Using such a set of very high resolution images, a fully automated algorithm has been developed which outputs metric classes by systematically combining important measures of building structure. The measurements are obtained by decomposing buildings directly from the imagery using methods from mathematical morphology. The decomposed building objects serve as the basis for the computation of grid statistics. Finally, a systematic combination of the single features leads to combined metric classes. For the dominant urban structures, verification results indicate an overall accuracy of at least 80% at the single-feature level and 70% for the combined classes.
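    One ingredient named above, mathematical morphology, can be demonstrated on a toy binary grid: an opening (erosion followed by dilation) with a 3x3 square removes structures thinner than the element, so two building footprints joined by a narrow bridge fall apart into separate objects that can then be measured individually. The grid and shapes below are invented stand-ins for the paper's VHR imagery.

    ```python
    # Toy demonstration of morphological building decomposition.
    def erode(img):
        h, w = len(img), len(img[0])
        return [[1 if 0 < y < h - 1 and 0 < x < w - 1 and
                 all(img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1))
                 else 0 for x in range(w)] for y in range(h)]

    def dilate(img):
        h, w = len(img), len(img[0])
        return [[1 if any(0 <= y + dy < h and 0 <= x + dx < w and img[y + dy][x + dx]
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1))
                 else 0 for x in range(w)] for y in range(h)]

    def opening(img):
        # Erosion then dilation: removes anything thinner than the 3x3 element.
        return dilate(erode(img))

    def components(img):
        """Count 4-connected foreground components by flood fill."""
        h, w = len(img), len(img[0])
        seen = [[False] * w for _ in range(h)]
        n = 0
        for y in range(h):
            for x in range(w):
                if img[y][x] and not seen[y][x]:
                    n += 1
                    stack = [(y, x)]
                    seen[y][x] = True
                    while stack:
                        cy, cx = stack.pop()
                        for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                            if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not seen[ny][nx]:
                                seen[ny][nx] = True
                                stack.append((ny, nx))
        return n

    # Two 5x5 "buildings" connected by a 1-pixel-wide bridge.
    img = [[0] * 16 for _ in range(10)]
    for y in range(2, 7):
        for x in list(range(2, 7)) + list(range(9, 14)):
            img[y][x] = 1
    img[4][7] = img[4][8] = 1

    opened = opening(img)
    print(components(img), "->", components(opened))
    ```

    Per-object statistics (area, compactness, spacing) computed on the separated components are the kind of metric features the paper then aggregates on a grid.
    
    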

  12. ABodyBuilder: Automated antibody structure prediction with data–driven accuracy estimation

    PubMed Central

    Leem, Jinwoo; Dunbar, James; Georges, Guy; Shi, Jiye; Deane, Charlotte M.

    2016-01-01

    Computational modeling of antibody structures plays a critical role in therapeutic antibody design. Several antibody modeling pipelines exist, but no freely available methods currently model nanobodies, provide estimates of expected model accuracy, or highlight potential issues with the antibody's experimental development. Here, we describe our automated antibody modeling pipeline, ABodyBuilder, designed to overcome these issues. The algorithm itself follows the standard 4 steps of template selection, orientation prediction, complementarity-determining region (CDR) loop modeling, and side chain prediction. ABodyBuilder then annotates the ‘confidence’ of the model as a probability that a component of the antibody (e.g., CDRL3 loop) will be modeled within a root-mean-square deviation threshold. It also flags structural motifs on the model that are known to cause issues during in vitro development. ABodyBuilder was tested on 4 separate datasets, including the 11 antibodies from the Antibody Modeling Assessment–II competition. ABodyBuilder builds models that are of similar quality to other methodologies, with sub–Angstrom predictions for the ‘canonical’ CDR loops. Its ability to model nanobodies, and rapidly generate models (∼30 seconds per model) widens its potential usage. ABodyBuilder can also help users in decision–making for the development of novel antibodies because it provides model confidence and potential sequence liabilities. ABodyBuilder is freely available at http://opig.stats.ox.ac.uk/webapps/abodybuilder. PMID:27392298

  13. [Automated morphometric evaluation of the chromatin structure of liver cell nuclei after vagotomy].

    PubMed

    Butusova, N N; Zhukotskiĭ, A V; Sherbo, I V; Gribkov, E N; Dubovaia, T K

    1989-05-01

    A morphometric analysis of interphase chromatin structure in hepatic cell nuclei was carried out on the automated TV image-analysis system IBAS-2 (Opton, FRG) using 50 optical and geometric parameters at various periods (1, 2 and 4 weeks) after vagotomy. The supramolecular organization of chromatin undergoes the greatest changes one week after the operation, and changes in the granular component are more informative than changes in the nongranular component (by a difference of 15-20%). It was also revealed that the chromatin components differ in tinctorial properties, which evidently depend on the physicochemical characteristics of the chromatin under various functional conditions of the cell. Correlation analysis revealed a group of morphometric indices of chromatin structure that are highly correlated with the level of chromatin transcriptional activity at various times after denervation; the correlation coefficients of these parameters are 0.85-0.97. In summary, vagal denervation of the liver causes changes in the morphofunctional organization of chromatin.

  14. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    SciTech Connect

    Girolamo, D.; Yuan, F. G.; Girolamo, L.

    2015-03-31

    Nondestructive evaluation (NDE) for the detection and quantification of damage in composite materials is fundamental to assessing the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damage by propagating ultrasonic waves normal to the surface. However, they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. SWA processes the wave field to estimate spatially dependent wavenumber values related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high fidelity damage imaging. While SWA can be used to locate the impact damage in the honeycomb panel, SWEA produces damage images in good agreement with X-ray computed tomography (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative that overcomes the limitations of conventional NDE technologies.

  15. E-novo: an automated workflow for efficient structure-based lead optimization.

    PubMed

    Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit

    2009-07-01

    An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities was used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol, using published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set employing a hand-built binding model. Least-squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries.

  16. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines.

    PubMed

    Leung, Wai Yi; Marschall, Tobias; Paudel, Yogesh; Falquet, Laurent; Mei, Hailiang; Schönhuth, Alexander; Maoz Moss, Tiffanie Yael

    2015-03-25

    Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology, developed for humans, for other species as well. The objectives of this work included: (a) creating an automated, standardized pipeline for SV prediction; (b) identifying the best tool(s) for SV prediction through benchmarking; and (c) providing a statistically sound method for merging SV calls. The SV-AUTOPILOT meta-tool platform is an automated pipeline for standardization of SV prediction and SV tool development in paired-end next-generation sequencing (NGS) analysis. SV-AUTOPILOT comes in the form of a virtual machine, which includes all datasets, tools and algorithms presented here. The virtual machine easily allows one to add, replace and update genomes, SV callers and post-processing routines and therefore provides an easy, out-of-the-box environment for complex SV discovery tasks. SV-AUTOPILOT was used to make a direct comparison between 7 popular SV tools on the Arabidopsis thaliana genome, using the Landsberg (Ler) ecotype as a standardized dataset. Recall and precision measurements suggest that Pindel and Clever were the most adaptable to this dataset across all size ranges, while Delly performed well for SVs larger than 250 nucleotides. A novel, statistically sound merging process, which can control the false discovery rate, reduced the false positive rate on the Arabidopsis benchmark dataset used here by >60%. SV-AUTOPILOT provides a meta-tool platform for future SV tool development and for benchmarking tools on other genomes using a standardized pipeline. It optimizes detection of SVs in non-human genomes using statistically robust merging. The benchmarking in this study has demonstrated the power of 7 different SV tools for analyzing different size classes and types of structural variants. The optional merge
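    The merging of SV calls from multiple callers can be illustrated with a reciprocal-overlap heuristic commonly used when comparing SV call sets. This sketch implements only the interval-merging step, not the statistically sound FDR-controlling procedure the paper actually introduces; the caller names and coordinates are invented.

    ```python
    # Merge deletion calls from multiple SV callers: calls on the same chromosome
    # whose intervals reciprocally overlap by >= 50% are treated as one variant,
    # supported by every caller that reported it. Thresholds are illustrative.
    def reciprocal_overlap(a, b):
        ov = min(a[1], b[1]) - max(a[0], b[0])
        if ov <= 0:
            return 0.0
        return min(ov / (a[1] - a[0]), ov / (b[1] - b[0]))

    def merge_calls(calls, min_ro=0.5):
        """calls: list of (chrom, start, end, caller); returns merged records."""
        merged = []
        for chrom, start, end, caller in sorted(calls):
            for rec in merged:
                if rec["chrom"] == chrom and reciprocal_overlap(
                        (start, end), (rec["start"], rec["end"])) >= min_ro:
                    rec["start"] = min(rec["start"], start)
                    rec["end"] = max(rec["end"], end)
                    rec["callers"].add(caller)
                    break
            else:
                merged.append({"chrom": chrom, "start": start, "end": end,
                               "callers": {caller}})
        return merged

    calls = [
        ("chr1", 1000, 2000, "pindel"),
        ("chr1", 1050, 1950, "clever"),  # same event, slightly shifted breakpoints
        ("chr1", 1020, 2010, "delly"),
        ("chr1", 5000, 5300, "delly"),   # caller-specific call, weak support
    ]
    merged = merge_calls(calls)
    for rec in merged:
        print(rec["chrom"], rec["start"], rec["end"], sorted(rec["callers"]))
    ```

    The number of supporting callers per merged record is the kind of evidence a downstream statistical filter can use to control the false discovery rate.
    
    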

  17. Repair, Evaluation, Maintenance, and Rehabilitation Research Program. Instrumentation Automation for Concrete Structures. Report 2. Automation Hardware and Retrofitting Techniques.

    DTIC Science & Technology

    1987-06-01

    [OCR-degraded equipment listing; recoverable fragments:] Power requirements and recommendations: 115 VAC (90-132 V tolerance), 50/60 Hz, 168 watts. Compatible equipment: RS-232-C and 8...structures, charge controllers, meters, alarms, and batteries. Comments: none. [Figure: solar electric charging unit (photo courtesy of ACO

  18. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges

    NASA Astrophysics Data System (ADS)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.

    2011-06-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  19. Application of an automated wireless structural monitoring system for long-span suspension bridges

    SciTech Connect

    Kurata, M.; Lynch, J. P.; Linden, G. W. van der; Hipley, P.; Sheng, L.-H.

    2011-06-23

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  20. Automated foveola localization in retinal 3D-OCT images using structural support vector machine prediction.

    PubMed

    Liu, Yu-Ying; Ishikawa, Hiroshi; Chen, Mei; Wollstein, Gadi; Schuman, Joel S; Rehg, James M

    2012-01-01

    We develop an automated method to determine the foveola location in macular 3D-OCT images in either healthy or pathological conditions. Structural Support Vector Machine (S-SVM) is trained to directly predict the location of the foveola, such that the score at the ground truth position is higher than that at any other position by a margin scaling with the associated localization loss. This S-SVM formulation directly minimizes the empirical risk of localization error, and makes efficient use of all available training data. It deals with the localization problem in a more principled way compared to the conventional binary classifier learning that uses zero-one loss and random sampling of negative examples. A total of 170 scans were collected for the experiment. Our method localized 95.1% of testing scans within the anatomical area of the foveola. Our experimental results show that the proposed method can effectively identify the location of the foveola, facilitating diagnosis around this important landmark.
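    The margin requirement described here (the score at the ground-truth position must exceed the score at any other position by a margin that scales with the localization loss) corresponds to a margin-rescaled structured hinge loss. A minimal 1-D sketch, with made-up scores and a simple distance-proportional loss:

    ```python
    # Margin-rescaled structured hinge loss for 1-D landmark localization, a
    # simplified sketch of the S-SVM objective described above. scores[y] is the
    # model's score for placing the landmark at position y; Delta(y, y_true)
    # grows with the localization error.
    def localization_loss(y, y_true, scale=0.1):
        return scale * abs(y - y_true)

    def structured_hinge(scores, y_true):
        # max over positions y of  score(y) + Delta(y, y_true) - score(y_true);
        # zero iff the ground truth outscores every rival by the required margin.
        return max(scores[y] + localization_loss(y, y_true) - scores[y_true]
                   for y in range(len(scores)))

    peaked = [0.1, 0.3, 2.0, 0.4, 0.2]   # clear peak at the true position 2
    rival = [0.1, 2.1, 2.0, 0.4, 0.2]    # a rival position violates the margin
    print(structured_hinge(peaked, y_true=2), structured_hinge(rival, y_true=2))
    ```

    Because every position contributes to the max, all locations act as graded negatives, which is the advantage over a binary classifier trained with zero-one loss on randomly sampled negatives.
    
    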

  1. Combined computational metabolite prediction and automated structure-based analysis of mass spectrometric data.

    PubMed

    Stranz, David D; Miao, Shichang; Campbell, Scott; Maydwell, George; Ekins, Sean

    2008-01-01

    As high-throughput technologies have developed in the pharmaceutical industry, the demand for identification of possible metabolites using predominantly liquid chromatographic/mass spectrometry-mass spectrometry/mass spectrometry (LC/MS-MS/MS) for a large number of molecules in drug discovery has also increased. In parallel, computational technologies have also been developed to generate predictions for metabolites alongside methods to predict MS spectra and score the quality of the match with experimental spectra. The goal of the current study was to generate metabolite predictions from molecular structure with a software product, MetaDrug. In vitro microsomal incubations were used to ultimately produce MS data that could be used to verify the predictions with Apex, which is a new software tool that can predict the molecular ion spectrum and a fragmentation spectrum, automating the detailed examination of both MS and MS/MS spectra. For the test molecule imipramine used to illustrate the combined in vitro/in silico process proposed, MetaDrug predicts 16 metabolites. Following rat microsomal incubations with imipramine and analysis of the MS(n) data using the Apex software, strong evidence was found for imipramine and five metabolites and weaker evidence for five additional metabolites. This study suggests a new approach to streamline MS data analysis using a combination of predictive computational approaches with software capable of comparing the predicted metabolite output with empirical data when looking at drug metabolites.

  2. The Chemical Validation and Standardization Platform (CVSP): large-scale automated validation of chemical structure datasets.

    PubMed

    Karapetyan, Karen; Batchelor, Colin; Sharpe, David; Tkachenko, Valery; Williams, Antony J

    2015-01-01

    There are presently hundreds of online databases hosting millions of chemical compounds and associated data. As a result of the number of cheminformatics software tools that can be used to produce the data, subtle differences between the various cheminformatics platforms, and the naivety of software users, a myriad of issues can exist with chemical structure representations online. In order to help facilitate validation and standardization of chemical structure datasets from various sources, we have delivered a freely available internet-based platform to the community for the processing of chemical compound datasets. The chemical validation and standardization platform (CVSP) both validates and standardizes chemical structure representations according to sets of systematic rules. The chemical validation algorithms detect issues with submitted molecular representations using pre-defined or user-defined dictionary-based molecular patterns that are chemically suspicious or potentially require manual review. Each identified issue is assigned one of three levels of severity - Information, Warning, and Error - in order to conveniently inform the user of the need to browse and review subsets of their data. The validation process covers atoms and bonds (e.g., flagging query atoms and bonds), valences, and stereochemistry. The standard form of submission of collections of data, the SDF file, allows the user to map the data fields to predefined CVSP fields for the purpose of cross-validating associated SMILES and InChIs with the connection tables contained within the SDF file. This platform has been applied to the analysis of a large number of datasets prepared for deposition to our ChemSpider database and in preparation of data for the Open PHACTS project. In this work we review the results of the automated validation of the DrugBank dataset, a popular drug and drug target database utilized by the community, and the ChEMBL 17 data set
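    The dictionary-based pattern matching with three severity levels can be illustrated with a miniature, made-up analogue: each rule is a pattern over a SMILES string plus a severity. The patterns, messages and severities below are invented for illustration and are not CVSP's actual rule set.

    ```python
    import re

    # Toy dictionary-based validation rules over SMILES strings.
    RULES = [
        (re.compile(r"\*"),        "Error",       "query/'any' atom '*' present"),
        (re.compile(r"~"),         "Error",       "query ('any') bond present"),
        (re.compile(r"\[C[+-]\]"), "Warning",     "charged carbon, review valence"),
        (re.compile(r"@"),         "Information", "stereocentre annotated"),
    ]

    def validate(smiles):
        """Return (worst severity or None, list of (severity, message))."""
        issues = [(sev, msg) for pat, sev, msg in RULES if pat.search(smiles)]
        worst = next((s for s in ("Error", "Warning", "Information")
                      if any(sev == s for sev, _ in issues)), None)
        return worst, issues

    print(validate("C[C@H](N)C(=O)O"))  # alanine: stereo annotation only
    print(validate("*c1ccccc1"))        # query atom
    ```

    Grouping hits by worst severity is what lets a user browse only the Error subset of a large deposition rather than reviewing every record.
    
    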

  3. Towards Automated Seismic Moment Tensor Inversion in Australia Using 3D Structural Model

    NASA Astrophysics Data System (ADS)

    Hingee, M.; Tkalcic, H.; Fichtner, A.; Sambridge, M.; Kennett, B. L.; Gorbatov, A.

    2009-12-01

    functions. Implementation of this 3D model will improve warning systems, and we present results that are an important step towards automated MT inversion in Australia. [1] Fichtner, A., Kennett, B.L.N., Igel, H., Bunge, H.-P., 2009. Full seismic waveform tomography for upper-mantle structure in the Australasian region using adjoint methods. Geophys. J. Int., in press.

  4. Using Automated Morphometry to Detect Associations Between ERP Latency and Structural Brain MRI in Normal Adults

    PubMed Central

    Cardenas, Valerie A.; Chao, Linda L.; Blumenfeld, Rob; Song, Enmin; Meyerhoff, Dieter J.; Weiner, Michael W.; Studholme, Colin

    2008-01-01

    Despite the clinical significance of event-related potential (ERP) latency abnormalities, little attention has focused on the anatomic substrate of latency variability. Volume conduction models do not identify the anatomy responsible for delayed neural transmission between neural sources. To explore the anatomic substrate of ERP latency variability in normal adults using automated measures derived from magnetic resonance imaging (MRI), ERPs were recorded in the visual three-stimulus oddball task in 59 healthy participants. Latencies of the P3a and P3b components were measured at the vertex. Measures of local anatomic size in the brain were estimated from structural MRI, using tissue segmentation and deformation morphometry. A general linear model was fitted relating latency to measures of local anatomic size, covarying for intracranial vault volume. Longer P3b latencies were related to contractions in thalamus extending superiorly into the corpus callosum, white matter (WM) anterior to the central sulcus on the left and right, left temporal WM, the right anterior limb of the internal capsule extending into the lenticular nucleus, and larger cerebrospinal fluid volumes. There was no evidence for a relationship between gray matter (GM) volumes and P3b latency. Longer P3a latencies were related to contractions in left temporal WM, and left parietal GM and WM near the interhemispheric fissure. P3b latency variability is related chiefly to WM, thalamus, and lenticular nucleus, whereas P3a latency variability is not related as strongly to anatomy. These results imply that the WM connectivity between generators influences P3b latency more than the generators themselves do. PMID:15834860

  5. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler☆

    PubMed Central

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  6. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    SciTech Connect

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-19

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs.

  7. AutoQSAR: an automated machine learning tool for best-practice quantitative structure-activity relationship modeling.

    PubMed

    Dixon, Steven L; Duan, Jianxin; Smith, Ethan; Von Bargen, Christopher D; Sherman, Woody; Repasky, Matthew P

    2016-10-01

    We introduce AutoQSAR, an automated machine-learning application to build, validate and deploy quantitative structure-activity relationship (QSAR) models. The process of descriptor generation, feature selection and the creation of a large number of QSAR models has been automated into a single workflow within AutoQSAR. The models are built using a variety of machine-learning methods, and each model is scored using a novel approach. The effectiveness of the method is demonstrated through comparison with literature QSAR models built on identical datasets for six end points: protein-ligand binding affinity, solubility, blood-brain barrier permeability, carcinogenicity, mutagenicity and bioaccumulation in fish. AutoQSAR demonstrates predictive performance similar to or better than published results for four of the six end points, while requiring minimal human time and expertise.
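    The enumerate-and-score idea behind such a workflow can be sketched minimally: try several candidate model settings, score each by cross-validation, and keep the best. The data and the nearest-neighbour-style model below are toy stand-ins, not AutoQSAR's actual descriptors, learners or scoring function.

    ```python
    import statistics

    # Toy (descriptor, activity) pairs, roughly linear for illustration.
    data = [(0.5, 1.1), (1.0, 2.1), (1.5, 2.9), (2.0, 4.2),
            (2.5, 5.0), (3.0, 6.1), (3.5, 6.8), (4.0, 8.1)]

    def knn_predict(train, x, k):
        # Average the activities of the k nearest training descriptors.
        nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
        return statistics.fmean(y for _, y in nearest)

    def loo_mae(k):
        # Leave-one-out cross-validated mean absolute error for this setting.
        errs = [abs(knn_predict(data[:i] + data[i + 1:], x, k) - y)
                for i, (x, y) in enumerate(data)]
        return statistics.fmean(errs)

    # Automated model selection: enumerate settings, score, pick the best.
    scores = {k: loo_mae(k) for k in (1, 2, 3, 5)}
    best_k = min(scores, key=scores.get)
    print(scores, "-> selected k =", best_k)
    ```

    Scoring every candidate on held-out data, rather than on the training fit, is what keeps an automated pipeline like this from simply selecting the most overfit model.
    
    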

  8. Automated longitudinal registration of high resolution structural MRI brain sub-volumes in non-human primates

    PubMed Central

    Lecoeur, Jérémy; Wang, Feng; Chen, Li Min; Li, Rui; Avison, Malcolm J.; Dawant, Benoit M.

    2011-01-01

    Accurate anatomic co-registration is a prerequisite for identifying structural and functional changes in longitudinal studies of brain plasticity. Current MRI methods permit collection of brain images across multiple scales, ranging from whole brain at relatively low resolution (≥1 mm), to local brain areas at the level of cortical layers and columns (~100 µm) in the same session, allowing detection of subtle structural changes on a similar spatial scale. To measure these changes reliably, high resolution structural and functional images of local brain regions must be registered accurately across imaging sessions. The present study describes a robust fully automated strategy for the registration of high resolution structural images of brain sub-volumes to lower resolution whole brain images collected within a session, and the registration of partially overlapping high resolution MRI sub-volumes (“slabs”) across imaging sessions. In high field (9.4 T) reduced field-of-view high resolution structural imaging studies using a surface coil in an anesthetized non-human primate model, this fully automated coregistration pipeline was robust in the face of significant inhomogeneities in image intensity and tissue contrast arising from the spatially inhomogeneous transmit and receive properties of the surface coil, achieving a registration accuracy of 30 ± 15 µm between sessions. PMID:21920386

  9. Cryo automated electron tomography: towards high-resolution reconstructions of plastic-embedded structures.

    PubMed

    Braunfeld, M B; Koster, A J; Sedat, J W; Agard, D A

    1994-05-01

    The use of fully automated data collection methods for electron tomography allows a substantial reduction in beam dose. The goal has been to develop new protocols for data collection defining optimal approaches for maintaining data self-consistency and maximizing the useful resolution of the reconstruction. The effects of irradiation and post-cure microwaving were examined for a variety of embedding media (Epon, Epox, Lowicryl) in order to quantify beam damage with the goal of identifying the most beam stable embedding medium. Surprisingly, the substantial dose reduction made possible by automated data collection did not result in a significant decrease in specimen shrinkage even for samples stabilized by pre-irradiation. We believe that the accelerated shrinkage is a direct consequence of the stroboscopic illumination patterns inherent to automated data collection. Furthermore neither the choice of embedding resin nor microwave post-curing greatly affected shrinkage. Finally, cryogenic data collection was investigated as a means to minimize the effects of secondary radiation damage. Minimal pre-irradiation coupled with low-temperature automated data collection greatly reduces shrinkage and should result in high-quality data for three-dimensional reconstructions.

  10. Computer Automated Structure Evaluation (CASE) of the teratogenicity of retinoids with the aid of a novel geometry index

    NASA Astrophysics Data System (ADS)

    Klopman, Gilles; Dimayuga, Mario L.

    1990-06-01

    The CASE (Computer Automated Structure Evaluation) program, with the aid of a geometry index for discriminating cis and trans isomers, has been used to study a set of retinoids tested for teratogenicity in hamsters. CASE identified 8 fragments, the most important representing the non-polar terminus of a retinoid with an additional ring system which introduces some rigidity in the isoprenoid side chain. The geometry index helped to identify relevant fragments with an all-trans configuration and to distinguish them from irrelevant fragments with other configurations.
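
    The cis/trans discrimination that a geometry index provides can be grounded in a signed dihedral angle around a bond. The sketch below is an illustration of that idea in plain NumPy, not CASE's actual index; the atom coordinates are invented for the example.

```python
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Signed dihedral angle in degrees around the p1-p2 bond."""
    b0 = p0 - p1
    b1 = p2 - p1
    b1 = b1 / np.linalg.norm(b1)
    b2 = p3 - p2
    # Project the flanking bonds onto the plane perpendicular to the axis b1.
    v = b0 - np.dot(b0, b1) * b1
    w = b2 - np.dot(b2, b1) * b1
    x = np.dot(v, w)
    y = np.dot(np.cross(b1, v), w)
    return np.degrees(np.arctan2(y, x))

# Four atoms around a double bond: substituents on opposite sides (trans)
# give ~180 degrees, substituents on the same side (cis) give ~0 degrees.
trans = dihedral(np.array([0.0, 1, 0]), np.array([0.0, 0, 0]),
                 np.array([1.0, 0, 0]), np.array([1.0, -1, 0]))
cis = dihedral(np.array([0.0, 1, 0]), np.array([0.0, 0, 0]),
               np.array([1.0, 0, 0]), np.array([1.0, 1, 0]))
```

    A fragment classifier could then label a configuration all-trans when every backbone dihedral is near 180 degrees.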

  11. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis

    NASA Astrophysics Data System (ADS)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku

    2014-03-01

    In this paper, we propose an automated biliary tract extraction method from abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No extraction method has been reported for the automated extraction of the biliary tract from common contrast CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures and its intensities are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter based on a local intensity structure analysis method using the eigenvalues of the Hessian matrix for the IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 CT volumes. The average Dice coefficient of the extraction results was 66.7%.
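
    The core of a Hessian-eigenvalue "dark line" filter of the kind described above can be sketched compactly. The 2D version below (the paper works on 3D CT volumes; the simplification and all parameters are our assumptions) responds where the dominant curvature is large and positive, i.e. at dark ridges on a bright background.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dark_line_enhance(image, sigma=1.5):
    """Enhance dark linear structures via per-pixel Hessian eigenvalues.

    For a dark line on a bright background the second derivative across
    the line is strongly positive, so the larger Hessian eigenvalue is
    large and positive while the one along the line stays near zero.
    """
    smoothed = gaussian_filter(image.astype(float), sigma)
    # Finite differences approximate the Hessian entries.
    gy, gx = np.gradient(smoothed)
    hxx = np.gradient(gx, axis=1)
    hxy = np.gradient(gx, axis=0)
    hyy = np.gradient(gy, axis=0)
    # Closed-form eigenvalues of the symmetric 2x2 Hessian per pixel.
    trace = hxx + hyy
    det = hxx * hyy - hxy ** 2
    disc = np.sqrt(np.maximum((trace / 2) ** 2 - det, 0.0))
    lam_hi = trace / 2 + disc   # larger eigenvalue
    # Keep only positive dominant curvature (dark ridge response).
    return np.maximum(lam_hi, 0.0)

# Toy test image: a dark horizontal line in a bright field.
img = np.full((32, 32), 200.0)
img[16, 4:28] = 50.0
resp = dark_line_enhance(img)
```

    A 3D vessel/duct filter follows the same pattern with a 3x3 Hessian and a combination of its three eigenvalues.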

  12. Revisiting automated G-protein coupled receptor modeling: the benefit of additional template structures for a neurokinin-1 receptor model.

    PubMed

    Kneissl, Benny; Leonhardt, Bettina; Hildebrandt, Andreas; Tautermann, Christofer S

    2009-05-28

    The feasibility of automated procedures for the modeling of G-protein coupled receptors (GPCR) is investigated using the example of the human neurokinin-1 (NK1) receptor. We use a combined method of homology modeling and molecular docking and analyze the information content of the resulting docking complexes regarding the binding mode for further refinements. Moreover, we explore the impact of different template structures: the bovine rhodopsin structure, the human beta(2) adrenergic receptor, and in particular a combination of both templates to include backbone flexibility in the target conformational space. Our results for NK1 modeling demonstrate that model selection from a set of decoys cannot, in general, rely solely on docking experiments but still requires additional mutagenesis data. However, an enrichment factor of 2.6 in a nearly fully automated approach indicates that reasonable models can be created automatically if both available templates are used for model construction. Thus, the recently resolved GPCR structures open new ways to improve model building fundamentally.

  13. The scheme of combined application of optimization and simulation models for formation of an optimum structure of an automated control system of space systems

    NASA Astrophysics Data System (ADS)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Nikiforov, A. Yu; Zelenkov, P. V.

    2016-11-01

    With the development of automated control systems for space systems, new classes of spacecraft have emerged that require improvements to their structure and an expansion of their functions. Designing the automated control system of a space system involves various tasks, such as determining the location of elements and subsystems in space, selecting hardware, and distributing the set of functions performed by the system units, all under certain conditions on the quality of control and the connectivity of components. The problem of synthesizing the structure of an automated control system of space systems is formalized using discrete variables at various levels of system detail. A sequence of tasks and stages for forming the structure of an automated control system of space systems is developed. The authors have developed and proposed a scheme for the combined implementation of optimization and simulation models to ensure a rational distribution of functions between the automated control system complex and the rest of the system units. The proposed approach makes it possible to select hardware on a reasoned basis, taking into account the different requirements for the operation of automated control systems of space systems.

  14. Evaluation of stereo-array isotope labeling (SAIL) patterns for automated structural analysis of proteins with CYANA.

    PubMed

    Ikeya, Teppei; Terauchi, Tsutomu; Güntert, Peter; Kainosho, Masatsune

    2006-07-01

    Recently we have developed the stereo-array isotope labeling (SAIL) technique to overcome the conventional molecular size limitation in NMR protein structure determination by employing complete stereo- and regiospecific patterns of stable isotopes. SAIL sharpens signals and simplifies spectra without the loss of requisite structural information, thus making large classes of proteins newly accessible to detailed solution structure determination. The automated structure calculation program CYANA can efficiently analyze SAIL-NOESY spectra and calculate structures without manual analysis. Nevertheless, the original SAIL method might not be capable of determining the structures of proteins larger than 50 kDa or membrane proteins, for which the spectra are characterized by many broadened and overlapped peaks. Here we have carried out simulations of new SAIL patterns optimized for minimal relaxation and overlap, to evaluate the combined use of SAIL and CYANA for solving the structures of larger proteins and membrane proteins. The modified approach reduces the number of peaks to nearly half of that observed with uniform labeling, while still yielding well-defined structures and is expected to enable NMR structure determinations of these challenging systems.

  15. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    PubMed

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  16. Automated protein structure modeling in CASP9 by I-TASSER pipeline combined with QUARK-based ab initio folding and FG-MD-based structure refinement.

    PubMed

    Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang

    2011-01-01

    I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In CASP9 experiments, two new algorithms, QUARK and fragment-guided molecular dynamics (FG-MD), were added to the I-TASSER pipeline for improving the structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from the PDB structures to guide molecular dynamics simulation and improve the local structure of predicted model, including hydrogen-bonding networks, torsion angles, and steric clashes. Despite considerable progress in both the template-based and template-free structure modeling, significant improvements on protein target classification, domain parsing, model selection, and ab initio folding of β-proteins are still needed to further improve the I-TASSER pipeline. Copyright © 2011 Wiley-Liss, Inc.

  17. Automated protein structure modeling in CASP9 by I-TASSER pipeline combined with QUARK-based ab initio folding and FG-MD-based structure refinement

    PubMed Central

    Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang

    2011-01-01

    I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In CASP9 experiments, two new algorithms, QUARK and FG-MD, were added to the I-TASSER pipeline for improving the structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from the PDB structures to guide molecular dynamics simulation and improve the local structure of predicted model, including hydrogen-bonding networks, torsion angles and steric clashes. Despite considerable progress in both the template-based and template-free structure modeling, significant improvements on protein target classification, domain parsing, model selection, and ab initio folding of beta-proteins are still needed to further improve the I-TASSER pipeline. PMID:22069036

  18. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013.

    PubMed

    Rosato, Antonio; Vranken, Wim; Fogh, Rasmus H; Ragan, Timothy J; Tejero, Roberto; Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H; Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H; Kennedy, Michael; Acton, Thomas B; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T; Vuister, Geerten W

    2015-08-01

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in either curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71% of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results, with up to 100% of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90% of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.

  19. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans

    PubMed Central

    Zhan, Mei; Crane, Matthew M.; Entchev, Eugeni V.; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch’ng, QueeLim; Lu, Hang

    2015-01-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. 
Using these examples as a guide, we envision
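
    The multi-tiered classification idea described above can be sketched with two chained support vector machines: a cheap first tier rejects obvious negatives, and a richer second tier runs only on the survivors. Everything below (the synthetic region features, the tier split, the scikit-learn SVC stand-in) is an illustrative assumption, not the paper's actual toolset.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic candidate regions described by [mean_intensity, eccentricity].
# The "target structure" class is both bright and elongated.
n = 200
intensity = rng.uniform(0, 1, n)
eccent = rng.uniform(0, 1, n)
labels = ((intensity > 0.5) & (eccent > 0.5)).astype(int)
X = np.column_stack([intensity, eccent])

# Tier 1: a cheap filter on intensity alone rejects obvious negatives.
tier1 = SVC(kernel="linear").fit(X[:, :1], (intensity > 0.5).astype(int))
keep = tier1.predict(X[:, :1]) == 1

# Tier 2: a richer classifier runs only on tier-1 survivors.
tier2 = SVC(kernel="rbf").fit(X[keep], labels[keep])

final = np.zeros(n, dtype=int)
final[keep] = tier2.predict(X[keep])
accuracy = (final == labels).mean()
```

    The tiering keeps the expensive model off most of the data, which is the point of the cascade architecture the authors describe.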

  20. Semi-automated processing and routing within indoor structures for emergency response applications

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Lyons, Kyle; Subramanian, Kalpathi; Ribarsky, William

    2010-04-01

    In this work, we propose new automation tools to process 2D building geometry data for effective communication and timely response to critical events in commercial buildings. Given the scale and complexity of commercial buildings, robust and visually rich tools are needed during an emergency. Our data processing pipeline consists of three major components: (1) adjacency graph construction, representing spatial relationships within a building (between hallways, offices, stairways, elevators), (2) identification of elements involved in evacuation routes (hallways, stairways), (3) 3D building network construction, by connecting the floor elements via stairways and elevators. We have used these tools to process a cluster of five academic buildings. Our automation tools (despite some needed manual processing) show a significant advantage over manual processing (a few minutes vs. 2-4 hours). Designed as a client-server model, our system supports analytical capabilities to determine dynamic routing within a building under constraints (parts of the building blocked during emergencies, for instance). Visualization capabilities are provided for easy interaction with the system, on both desktop (command post) stations as well as mobile hand-held devices, simulating a command post-responder scenario.
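
    Once the adjacency graph exists, constrained routing reduces to a graph search that skips blocked nodes. The sketch below shows the idea with a breadth-first search over a hypothetical mini-building; the node names and topology are invented for illustration, not taken from the paper.

```python
from collections import deque

# Rooms, hallways, and vertical connectors become graph nodes;
# adjacency edges encode which spaces open onto which.
adjacency = {
    "office_101": ["hall_A"],
    "office_102": ["hall_A"],
    "hall_A": ["office_101", "office_102", "stair_1", "hall_B"],
    "hall_B": ["hall_A", "elevator", "exit"],
    "stair_1": ["hall_A", "exit"],
    "elevator": ["hall_B"],
    "exit": ["hall_B", "stair_1"],
}

def evacuation_route(start, goal, blocked=frozenset()):
    """Shortest hop-count route from start to goal avoiding blocked nodes."""
    if start in blocked:
        return None
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in seen and nxt not in blocked:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no route survives the constraints

route = evacuation_route("office_101", "exit")
rerouted = evacuation_route("office_101", "exit", blocked={"hall_B"})
```

    Dynamic constraints (a blocked hallway during a fire, say) simply enter as the `blocked` set, so the server can recompute routes as conditions change.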

  1. Production implementation of fully automated, closed loop cure control for advanced composite structures

    NASA Astrophysics Data System (ADS)

    Johnson, Sean A.; Roberts, Nancy K.

    Economical production of advanced composite parts requires the development and use of the most aggressive cure cycles possible without sacrificing quality. As cure cycles are shortened and heating rates increase, tolerance windows for process parameters become increasingly narrow. These factors are intensified by condensation curing systems, which generate large amounts of volatiles. Managing this situation requires fully automated, closed loop process control and a fundamental understanding of the material system used for the application. No turnkey system for this application is currently available. General Dynamics Pomona Division (GD/PD) has developed an integrated closed loop control system which is now being proofed in production. Realization of this system will enable cure time reductions of nearly 50 percent, while increasing yield and maintaining quality.
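
    The closed-loop idea can be sketched with a minimal thermostat-style controller driving a first-order thermal model toward a cure setpoint. This is a deliberately simplified stand-in, not GD/PD's system; the hysteresis band, heating rate, and plant constants are all illustrative assumptions.

```python
def run_cure_cycle(setpoint=180.0, band=2.0, steps=600, dt=1.0,
                   ambient=20.0, tau=120.0, heat_rate=4.0):
    """On/off closed-loop control of a simple first-order thermal plant.

    The heater switches off above setpoint+band and on below
    setpoint-band, so the part temperature oscillates tightly around
    the cure setpoint instead of following a fixed open-loop schedule.
    """
    temp, heater_on, history = ambient, True, []
    for _ in range(steps):
        if temp > setpoint + band:
            heater_on = False
        elif temp < setpoint - band:
            heater_on = True
        heating = heat_rate if heater_on else 0.0
        # First-order plant: heater input minus loss toward ambient.
        temp += dt * (heating - (temp - ambient) / tau)
        history.append(temp)
    return history

history = run_cure_cycle()
```

    A production system would close the loop on cure-state sensors (e.g. dielectric or viscosity measurements) rather than temperature alone, and would use a tuned PID or model-based controller instead of a hysteresis switch.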

  2. Automating tasks in protein structure determination with the Clipper Python module.

    PubMed

    McNicholas, Stuart; Croll, Tristan; Burnley, Tom; Palmer, Colin M; Hoh, Soon Wen; Jenkins, Huw T; Dodson, Eleanor; Cowtan, Kevin; Agirre, Jon

    2017-09-13

    Scripting programming languages provide the fastest means of prototyping complex functionality. Those with a syntax and grammar resembling human language also greatly enhance the maintainability of the produced source code. Furthermore, the combination of a powerful, machine-independent scripting language with binary libraries tailored for each computer architecture allows programs to break free from the tight boundaries of efficiency traditionally associated with scripts. In the present work, we describe how an efficient C++ crystallographic library such as Clipper can be wrapped, adapted and generalised for use in both crystallographic and electron cryo-microscopy applications, scripted with the Python language. We shall also place an emphasis on best practices in automation, illustrating how this can be achieved with this new Python module. This article is protected by copyright. All rights reserved. © 2017 The Protein Society.

  3. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds

    PubMed Central

    Farine, Damien R.; Firth, Josh A.; Aplin, Lucy M.; Crates, Ross A.; Culina, Antica; Garroway, Colin J.; Hinde, Camilla A.; Kidd, Lindall R.; Milligan, Nicole D.; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C.

    2015-01-01

    Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission–fusion dynamics, can interact to drive phenotypic structure in animal populations. PMID:26064644
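
    The null-model framework described above boils down to comparing an observed group statistic against the same statistic under random reassignment of individuals to groups. The toy sketch below tests sex-ratio evenness this way; the flock data and the imbalance statistic are invented for illustration, not the study's actual models.

```python
import random

random.seed(1)

# Toy flock data: each group lists the sexes of its members.
groups = [["M", "F", "M", "F"], ["F", "M"],
          ["M", "F", "F", "M"], ["F", "M", "M", "F"]]

def mean_imbalance(groups):
    """Average |males - females| per group; lower means more even sex ratios."""
    return sum(abs(g.count("M") - g.count("F")) for g in groups) / len(groups)

observed = mean_imbalance(groups)

# Null model: shuffle individuals across groups while fixing group sizes,
# breaking any social preference but preserving the group-size structure.
pool = [s for g in groups for s in g]
sizes = [len(g) for g in groups]
null_stats = []
for _ in range(2000):
    random.shuffle(pool)
    it = iter(pool)
    null_stats.append(mean_imbalance([[next(it) for _ in range(k)]
                                      for k in sizes]))

# One-sided p-value: how often are random groups at least this even?
p = sum(s <= observed for s in null_stats) / len(null_stats)
```

    Spatial structure can be handled the same way by constraining the shuffles (e.g. permuting individuals only within a locality), which is how null models separate social from spatial drivers.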

  4. AIDA: ab initio domain assembly for automated multi-domain protein structure prediction and domain–domain interaction prediction

    PubMed Central

    Xu, Dong; Jaroszewski, Lukasz; Li, Zhanwen; Godzik, Adam

    2015-01-01

    Motivation: Most proteins consist of multiple domains, independent structural and evolutionary units that are often reshuffled in genomic rearrangements to form new protein architectures. Template-based modeling methods can often detect homologous templates for individual domains, but templates that could be used to model the entire query protein are often not available. Results: We have developed a fast docking algorithm ab initio domain assembly (AIDA) for assembling multi-domain protein structures, guided by the ab initio folding potential. This approach can be extended to discontinuous domains (i.e. domains with ‘inserted’ domains). When tested on experimentally solved structures of multi-domain proteins, the relative domain positions were accurately found among top 5000 models in 86% of cases. AIDA server can use domain assignments provided by the user or predict them from the provided sequence. The latter approach is particularly useful for automated protein structure prediction servers. The blind test consisting of 95 CASP10 targets shows that domain boundaries could be successfully determined for 97% of targets. Availability and implementation: The AIDA package as well as the benchmark sets used here are available for download at http://ffas.burnham.org/AIDA/. Contact: adam@sanfordburnham.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25701568

  5. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds.

    PubMed

    Farine, Damien R; Firth, Josh A; Aplin, Lucy M; Crates, Ross A; Culina, Antica; Garroway, Colin J; Hinde, Camilla A; Kidd, Lindall R; Milligan, Nicole D; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C

    2015-04-01

    Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission-fusion dynamics, can interact to drive phenotypic structure in animal populations.

  6. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  7. Method and system for automated on-chip material and structural certification of MEMS devices

    DOEpatents

    Sinclair, Michael B.; DeBoer, Maarten P.; Smith, Norman F.; Jensen, Brian D.; Miller, Samuel L.

    2003-05-20

    A new approach toward MEMS quality control and materials characterization is provided by a combined test structure measurement and mechanical response modeling approach. Simple test structures are cofabricated with the MEMS devices being produced. These test structures are designed to isolate certain types of physical response, so that measurement of their behavior under applied stress can be easily interpreted as quality control and material properties information.

  8. Automated detection and labeling of high-density EEG electrodes from structural MR images

    NASA Astrophysics Data System (ADS)

    Marino, Marco; Liu, Quanying; Brem, Silvia; Wenderoth, Nicole; Mantini, Dante

    2016-10-01

    Objective. Accurate knowledge about the positions of electrodes in electroencephalography (EEG) is very important for precise source localizations. Direct detection of electrodes from magnetic resonance (MR) images is particularly interesting, as it is possible to avoid errors of co-registration between electrode and head coordinate systems. In this study, we propose an automated MR-based method for electrode detection and labeling, particularly tailored to high-density montages. Approach. Anatomical MR images were processed to create an electrode-enhanced image in individual space. Image processing included intensity non-uniformity correction, background noise and goggles artifact removal. Next, we defined a search volume around the head where electrode positions were detected. Electrodes were identified as local maxima in the search volume and registered to the Montreal Neurological Institute standard space using an affine transformation. This allowed the matching of the detected points with the specific EEG montage template, as well as their labeling. Matching and labeling were performed by the coherent point drift method. Our method was assessed on 8 MR images collected in subjects wearing a 256-channel EEG net, using the displacement with respect to manually selected electrodes as performance metric. Main results. The average displacement achieved by our method was significantly lower compared to alternative techniques, such as the photogrammetry technique. For more than 99% of the electrodes, the displacement was lower than 1 cm, which is typically considered an acceptable upper limit for errors in electrode positioning. Our method showed robustness and reliability, even in suboptimal conditions, such as in the case of net rotation, imprecisely gathered wires, electrode detachment from the head, and MR image ghosting. Significance. We showed that our method provides objective, repeatable and precise estimates of EEG electrode coordinates.
We hope our work
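
    The local-maxima step at the heart of this detection scheme is compact to express: a voxel is an electrode candidate when it equals the maximum of its neighbourhood and exceeds an intensity threshold. The sketch below uses SciPy on a toy volume; the threshold, neighbourhood size, and test data are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def detect_electrodes(volume, threshold, size=3):
    """Return voxel coordinates of local intensity maxima above a threshold.

    This mirrors the candidate-detection stage described above; labeling
    would follow by registering these points to a montage template.
    """
    local_max = maximum_filter(volume, size=size) == volume
    peaks = local_max & (volume > threshold)
    return np.argwhere(peaks)

# Toy "electrode-enhanced image": two bright blobs in a dark volume.
vol = np.zeros((20, 20, 20))
vol[5, 5, 5] = 10.0
vol[12, 14, 3] = 8.0
coords = detect_electrodes(vol, threshold=1.0)
```

    On real data the threshold would be set relative to the enhanced image's intensity distribution, and the candidates would then be matched to the 256-channel template (e.g. by coherent point drift) for labeling.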

  9. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data that can be created by the system. This paper reports on the progress of the creation of cyberinfrastructure tools which hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental condition (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties, etc.). Sample data interrogation clients include those for the detection of faulty sensors and automated modal parameter extraction.
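
    One of the interrogation clients mentioned above, faulty-sensor detection, can be sketched with simple per-channel statistics: a flat-lined channel has near-zero variance, and a channel dominated by spikes has an excess of extreme outliers. The thresholds and the two failure modes chosen here are illustrative assumptions, not the project's actual client.

```python
import numpy as np

def flag_faulty_sensors(readings, var_floor=1e-6, spike_z=6.0):
    """Flag sensor channels that are flat-lined or dominated by spikes.

    readings: array of shape (n_sensors, n_samples).
    Returns the indices of channels judged faulty.
    """
    faulty = []
    for i, chan in enumerate(readings):
        if chan.var() < var_floor:            # dead / flat-lined channel
            faulty.append(i)
            continue
        z = np.abs(chan - chan.mean()) / chan.std()
        if (z > spike_z).mean() > 0.01:       # >1% extreme outliers
            faulty.append(i)
    return faulty

rng = np.random.default_rng(0)
healthy = rng.normal(0, 1, 1000)   # plausible ambient-vibration channel
dead = np.zeros(1000)              # flat-lined channel
faulty = flag_faulty_sensors(np.stack([healthy, dead]))
```

    In the deployed architecture such a check would run server-side against the database, so downstream clients (e.g. modal parameter extraction) only consume vetted channels.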

  10. Semi-automated measurement of anatomical structures using statistical and morphological priors

    NASA Astrophysics Data System (ADS)

    Ashton, Edward A.; Du, Tong

    2004-05-01

    Rapid, accurate and reproducible delineation and measurement of arbitrary anatomical structures in medical images is a widely held goal, with important applications in both clinical diagnostics and, perhaps more significantly, pharmaceutical trial evaluation. This process requires the ability first to localize a structure within the body, and then to find a best approximation of the structure's boundaries within a given scan. Structures that are tortuous and small in cross section, such as the hippocampus in the brain or the abdominal aorta, present a particular challenge. Their apparent shape and position can change significantly from slice to slice, and accurate prior shape models for such structures are often difficult to form. In this work, we have developed a system that makes use of both a user-defined shape model and a statistical maximum likelihood classifier to identify and measure structures of this sort in MRI and CT images. Experiments show that this system can reduce analysis time by 75% or more with respect to manual tracing with no loss of precision or accuracy.
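
    A per-pixel maximum likelihood classifier of the kind mentioned above assigns each pixel to the class whose intensity model gives it the highest likelihood. The minimal sketch below assumes one-dimensional Gaussian class models with known parameters; in practice the means and variances would be estimated from training regions, and the shape prior would constrain the result.

```python
import numpy as np

def ml_classify(pixels, means, variances):
    """Assign each pixel to the class with maximum Gaussian log-likelihood."""
    pixels = np.asarray(pixels, dtype=float)[:, None]
    means = np.asarray(means, dtype=float)[None, :]
    variances = np.asarray(variances, dtype=float)[None, :]
    # Log-likelihood of each pixel under each class's Gaussian model.
    log_lik = (-0.5 * np.log(2 * np.pi * variances)
               - (pixels - means) ** 2 / (2 * variances))
    return np.argmax(log_lik, axis=1)

# Two tissue classes: dark background (mean 30) and bright structure (mean 120).
labels = ml_classify([25, 40, 110, 130], means=[30, 120], variances=[100, 100])
```

    Unequal class variances shift the decision boundary away from the midpoint between the means, which is exactly why the likelihood formulation is preferred over a simple threshold.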

  11. Automating crystallographic structure solution and refinement of protein–ligand complexes

    PubMed Central

    Echols, Nathaniel; Moriarty, Nigel W.; Klei, Herbert E.; Afonine, Pavel V.; Bunkóczi, Gábor; Headd, Jeffrey J.; McCoy, Airlie J.; Oeffner, Robert D.; Read, Randy J.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-01-01

    High-throughput drug discovery and mechanistic studies often require the determination of multiple related crystal structures that differ only in the bound ligands, point mutations in the protein sequence and minor conformational changes. If performed manually, solution and refinement require extensive repetition of the same tasks for each structure. To accelerate this process and minimize manual effort, a pipeline encompassing all stages of ligand building and refinement, starting from integrated and scaled diffraction intensities, has been implemented in Phenix. The resulting system is able to successfully solve and refine large collections of structures in parallel without extensive user intervention prior to the final stages of model completion and validation. PMID:24419387

  12. Automated Lipid A Structure Assignment from Hierarchical Tandem Mass Spectrometry Data

    NASA Astrophysics Data System (ADS)

    Ting, Ying S.; Shaffer, Scott A.; Jones, Jace W.; Ng, Wailap V.; Ernst, Robert K.; Goodlett, David R.

    2011-05-01

    Infusion-based electrospray ionization (ESI) coupled to multiple-stage tandem mass spectrometry (MSn) is a standard methodology for investigating lipid A structural diversity (Shaffer et al. J. Am. Soc. Mass. Spectrom. 18(6), 1080-1092, 2007). Annotation of these MSn spectra, however, has remained a manual, expert-driven process. In order to keep up with the data acquisition rates of modern instruments, we devised a computational method to annotate lipid A MSn spectra rapidly and automatically, which we refer to as the hierarchical tandem mass spectrometry (HiTMS) algorithm. As a first-pass tool, HiTMS aids expert interpretation of lipid A MSn data by providing the analyst with a set of candidate structures that may then be confirmed or rejected. HiTMS deciphers the signature ions (e.g., A-, Y-, and Z-type ions) and neutral losses of MSn spectra using a species-specific library based on general prior structural knowledge of the given lipid A species under investigation. Candidates are selected by calculating the correlation between theoretical and acquired MSn spectra. At a false discovery rate of less than 0.01, HiTMS correctly assigned 85% of the structures in a library of 133 manually annotated Francisella tularensis subspecies novicida lipid A structures. Additionally, HiTMS correctly assigned 85% of the structures in a smaller library of lipid A species from Yersinia pestis, demonstrating that it may be used across species.
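
    The candidate-scoring step described above can be sketched in a few lines: theoretical fragment masses for a candidate structure are binned onto a common m/z grid together with the acquired peaks, and a Pearson correlation scores the match. The grid, tolerance, and peak lists below are hypothetical illustrations, not values taken from HiTMS itself.

```python
# Sketch: score a candidate lipid A structure by correlating its theoretical
# fragment spectrum with an acquired MSn spectrum, in the spirit of HiTMS.
# The m/z grid, tolerance, and peak lists are hypothetical illustrations.

def binned_vector(peaks, mz_grid, tol=0.5):
    """Map (m/z, intensity) peaks onto a fixed m/z grid within a tolerance."""
    vec = [0.0] * len(mz_grid)
    for mz, inten in peaks:
        for i, ref in enumerate(mz_grid):
            if abs(mz - ref) <= tol:
                vec[i] += inten
    return vec

def pearson(x, y):
    """Pearson correlation of two equal-length intensity vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Theoretical fragments of a candidate structure vs. an acquired spectrum.
grid = [512.3, 696.5, 1224.8, 1404.9]            # hypothetical signature ions
theoretical = [(512.3, 1.0), (1224.8, 0.8)]
acquired = [(512.4, 900.0), (1224.7, 700.0), (1405.0, 50.0)]

score = pearson(binned_vector(theoretical, grid), binned_vector(acquired, grid))
```

    A candidate whose signature ions line up with the acquired peaks scores close to 1; an unrelated candidate scores near 0.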

  13. Automated delineation of brain structures in patients undergoing radiotherapy for primary brain tumors: from atlas to dose-volume histograms.

    PubMed

    Conson, Manuel; Cella, Laura; Pacelli, Roberto; Comerci, Marco; Liuzzi, Raffaele; Salvatore, Marco; Quarantelli, Mario

    2014-09-01

    To implement and evaluate a magnetic resonance imaging atlas-based automated segmentation (MRI-ABAS) procedure for cortical and sub-cortical grey matter area definition, suitable for dose-distribution analyses in brain tumor patients undergoing radiotherapy (RT). 3T-MRI scans performed before RT in ten brain tumor patients were used. The MRI-ABAS procedure consists of grey matter classification and atlas-based regions-of-interest definition. The Simultaneous Truth and Performance Level Estimation (STAPLE) algorithm was applied to structures manually delineated by four experts to generate the standard reference. Performance was assessed by comparing multiple geometrical metrics (including the Dice Similarity Coefficient, DSC). Dosimetric parameters from dose-volume histograms were also generated and compared. Compared with manual delineation, MRI-ABAS showed excellent reproducibility [median DSC = 1.0 (95% CI, 0.97-1.0) for MRI-ABAS vs. 0.90 (0.73-0.98) for manual delineation], acceptable accuracy [DSC = 0.81 (0.68-0.94) vs. 0.90 (0.76-0.98)], and an overall 90% reduction in delineation time. Dosimetric parameters obtained using MRI-ABAS were comparable with those obtained by manual contouring. The speed, reproducibility, and robustness of the process make MRI-ABAS a valuable tool for investigating radiation dose-volume effects in non-target brain structures, providing additional standardized data without additional time-consuming procedures. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
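
    The Dice Similarity Coefficient used above is a standard overlap measure between two segmentations. A minimal sketch, with toy binary masks rather than real MRI labels:

```python
# Minimal sketch of the Dice Similarity Coefficient (DSC): the overlap between
# an automated segmentation and a manual reference, DSC = 2|A ∩ B| / (|A| + |B|).
# The flat binary masks here are toy data, not real grey matter labels.

def dice(mask_a, mask_b):
    """DSC for two equal-length flat binary masks."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    size = sum(mask_a) + sum(mask_b)
    return 2.0 * inter / size if size else 1.0

auto   = [1, 1, 1, 0, 0, 1, 0, 0]   # automated segmentation
manual = [1, 1, 0, 0, 0, 1, 1, 0]   # manual reference

dsc = dice(auto, manual)   # 2*3 / (4+4) = 0.75
```

    A DSC of 1.0 means perfect overlap; values around 0.8-0.9, as reported above, indicate good but imperfect agreement.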

  14. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction

    PubMed Central

    Puton, Tomasz; Kozlowski, Lukasz P.; Rother, Kristian M.; Bujnicki, Janusz M.

    2013-01-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in future editions of the CompaRNA benchmarks. PMID:23435231
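
    Benchmarks of this kind typically score a prediction by comparing its set of base pairs against the reference structure. A hedged sketch of two common measures, sensitivity and positive predictive value, on invented base-pair sets (CompaRNA's exact scoring may differ):

```python
# Sketch of base-pair scoring for RNA secondary structure benchmarks:
# structures as sets of (i, j) pairs, sensitivity = TP / |reference|,
# PPV = TP / |predicted|. The pairs below are illustrative, not benchmark data.

def pair_metrics(predicted, reference):
    tp = len(predicted & reference)                  # correctly predicted pairs
    sensitivity = tp / len(reference) if reference else 1.0
    ppv = tp / len(predicted) if predicted else 1.0
    return sensitivity, ppv

reference = {(1, 20), (2, 19), (3, 18), (8, 14)}     # "true" structure
predicted = {(1, 20), (2, 19), (8, 14), (9, 13)}     # a method's output

sens, ppv = pair_metrics(predicted, reference)       # 3/4 and 3/4
```

    A combined score such as the Matthews correlation coefficient or the F-measure can then rank methods across many test structures.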

  15. Direct-method SAD phasing with partial-structure iteration: towards automation.

    PubMed

    Wang, J W; Chen, J R; Gu, Y X; Zheng, C D; Fan, H F

    2004-11-01

    The probability formula of direct-method SAD (single-wavelength anomalous diffraction) phasing proposed by Fan & Gu (1985, Acta Cryst. A41, 280-284) contains partial-structure information in the form of a Sim-weighting term. Previously, only the substructure of anomalous scatterers has been included in this term. In the case that the subsequent density modification and model building yield only structure fragments, which do not straightforwardly lead to the complete solution, the partial structure can be fed back into the Sim-weighting term of the probability formula in order to strengthen its phasing power and to benefit the subsequent automatic model building. The procedure has been tested with experimental SAD data from two known proteins with copper and sulfur as the anomalous scatterers.

  16. Automated Aufbau of antibody structures from given sequences using Macromoltek's SmrtMolAntibody.

    PubMed

    Berrondo, Monica; Kaufmann, Susana; Berrondo, Manuel

    2014-08-01

    This study was a part of the second Antibody Modeling Assessment. The assessment is a blind study of the performance of multiple software programs used for antibody homology modeling. In the study, research groups were given sequences for 11 antibodies and asked to predict their corresponding structures. The results were measured using the root-mean-square deviation (rmsd) between the submitted models and X-ray crystal structures. In 10 of 11 cases, the results using SmrtMolAntibody show good agreement between the submitted models and X-ray crystal structures. In the first stage, the average rmsd was 1.4 Å; average rmsd values were 1.2 Å for the framework and 3.0 Å for the H3 loop. In stage two, there was a slight improvement, with an rmsd for the H3 loop of 2.9 Å.

  17. Automated Quantification of Arbitrary Arm-Segment Structure in Spiral Galaxies

    NASA Astrophysics Data System (ADS)

    Davis, Darren Robert

    This thesis describes a system that, given approximately-centered images of spiral galaxies, produces quantitative descriptions of spiral galaxy structure without the need for per-image human input. This structure information consists of a list of spiral arm segments, each associated with a fitted logarithmic spiral arc and a pixel region. This list-of-arcs representation allows description of arbitrary spiral galaxy structure: the arms do not need to be symmetric, may have forks or bends, and, more generally, may be arranged in any manner with a consistent spiral-pattern center (non-merging galaxies have a sufficiently well-defined center). Such flexibility is important in order to accommodate the myriad structure variations observed in spiral galaxies. From the arcs produced by our method it is possible to calculate measures of spiral galaxy structure such as winding direction, winding tightness, arm counts, asymmetry, or other values of interest (including user-defined measures). In addition to providing information about the spiral arm "skeleton" of each galaxy, our method can enable analyses of brightness within individual spiral arms, since we provide the pixel regions associated with each spiral arm segment. For winding direction, arm tightness, and arm count, comparable information is available (to various extents) from previous efforts; to the extent that such information is available, we find strong correspondence with our output. We also characterize the changes to (and invariances in) our output as a function of modifications to important algorithm parameters. By enabling generation of extensive data about spiral galaxy structure from large-scale sky surveys, our method will enable new discoveries and tests regarding the nature of galaxies and the universe, and will facilitate subsequent work to automatically fit detailed brightness models of spiral galaxies.
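
    The logarithmic spiral arc at the heart of the representation, r = r0 · exp(b·θ), can be fitted to arm-segment pixels by a simple least-squares line fit of ln(r) against θ; the pitch (winding tightness) follows from b. A sketch on synthetic points, not on real galaxy data:

```python
import math

# Sketch: fit a logarithmic spiral r = r0 * exp(b * theta) to arm-segment
# points via least squares on ln(r) = ln(r0) + b * theta. The sample points
# below are synthetic; a real pipeline would fit pixel regions from images.

def fit_log_spiral(thetas, radii):
    """Return (r0, b) minimising squared error of ln r = ln r0 + b * theta."""
    n = len(thetas)
    logs = [math.log(r) for r in radii]
    mt, ml = sum(thetas) / n, sum(logs) / n
    b = sum((t - mt) * (l - ml) for t, l in zip(thetas, logs)) \
        / sum((t - mt) ** 2 for t in thetas)
    r0 = math.exp(ml - b * mt)
    return r0, b

# Synthetic arm with r0 = 2.0, b = 0.2 (exact, so the fit should recover them).
thetas = [0.1 * k for k in range(20)]
radii = [2.0 * math.exp(0.2 * t) for t in thetas]

r0, b = fit_log_spiral(thetas, radii)
pitch_deg = math.degrees(math.atan(b))   # a common winding-tightness measure
```

    The sign of b gives the winding direction, and comparing fitted arcs across arms yields asymmetry-style measures.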

  18. Automated test bench for simulation of radiation electrification of spacecraft structural dielectrics

    NASA Astrophysics Data System (ADS)

    Vladimirov, A. M.; Bezhayev, A. Yu; Zykov, V. M.; Isaychenko, V. I.; Lukashchuk, A. A.; Lukonin, S. E.

    2017-01-01

    The paper describes the test bench “Prognoz-2”, designed at the Testing Center of the Institute of Non-Destructive Testing, Tomsk Polytechnic University, which can be used: for ground testing of individual samples of spacecraft structural materials (e.g. thermal control coatings, cover glasses for solar batteries, or ceramics of the plasma thruster discharge channel) and whole spacecraft units or instruments (e.g. instruments of solar and stellar orientation, or correcting plasma thrusters) exposed to radiation electrification factors; and to verify mathematical models of radiation electrification of structural dielectrics under the impact of space factors in different orbits.

  19. A method for automated determination of the crystal structures from X-ray powder diffraction data

    SciTech Connect

    Hofmann, D. W. M.; Kuleshova, L. N.

    2006-05-15

    An algorithm is proposed for determining the crystal structure of compounds. In the framework of this algorithm, X-ray powder diffraction patterns are compared using a new similarity index. Unlike the indices traditionally employed in X-ray powder diffraction analysis, the new similarity index can be applied even in the case of overlapping peaks and large differences in unit cell parameters. The capabilities of the proposed procedure are demonstrated by solving the crystal structures of a number of organic pigments (PY111, PR181, Me-PR170).

  20. Evolutionary Trace Annotation Server: automated enzyme function prediction in protein structures using 3D templates

    PubMed Central

    Matthew Ward, R.; Venner, Eric; Daines, Bryce; Murray, Stephen; Erdin, Serkan; Kristensen, David M.; Lichtarge, Olivier

    2009-01-01

    Summary: The Evolutionary Trace Annotation (ETA) Server predicts enzymatic activity. ETA starts with a structure of unknown function, such as those from structural genomics, and with no prior knowledge of its mechanism uses the phylogenetic Evolutionary Trace (ET) method to extract key functional residues and propose a function-associated 3D motif, called a 3D template. ETA then searches previously annotated structures for geometric template matches that suggest molecular and thus functional mimicry. In order to maximize the predictive value of these matches, ETA next applies distinctive specificity filters: evolutionary similarity, function plurality and match reciprocity. In large-scale controls on enzymes, prediction coverage is 43% but the positive predictive value rises to 92%, thus minimizing false annotations. Users may modify any search parameter, including the template. ETA thus expands the ET suite for protein structure annotation, and can contribute to the annotation efforts of metaservers. Availability: The ETA Server is a web application available at http://mammoth.bcm.tmc.edu/eta/. Contact: lichtarge@bcm.edu PMID:19307237

  1. Combining structure and sequence information allows automated prediction of substrate specificities within enzyme families.

    PubMed

    Röttig, Marc; Rausch, Christian; Kohlbacher, Oliver

    2010-01-08

    An important aspect of the functional annotation of enzymes is not only the type of reaction catalysed by an enzyme, but also the substrate specificity, which can vary widely within the same family. In many cases, prediction of family membership and even substrate specificity is possible from enzyme sequence alone, using a nearest neighbour classification rule. However, the combination of structural information and sequence information can improve the interpretability and accuracy of predictive models. The method presented here, Active Site Classification (ASC), automatically extracts the residues lining the active site from one representative three-dimensional structure and the corresponding residues from sequences of other members of the family. From a set of representatives with known substrate specificity, a Support Vector Machine (SVM) can then learn a model of substrate specificity. Applied to a sequence of unknown specificity, the SVM can then predict the most likely substrate. The models can also be analysed to reveal the underlying structural reasons determining substrate specificities and thus yield valuable insights into mechanisms of enzyme specificity. We illustrate the high prediction accuracy achieved on two benchmark data sets and the structural insights gained from ASC by a detailed analysis of the family of decarboxylating dehydrogenases. The ASC web service is available at http://asc.informatik.uni-tuebingen.de/.
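
    The sequence-only baseline mentioned above, nearest-neighbour classification, can be sketched on active-site residue signatures of the kind ASC extracts. The signatures and substrate labels below are invented for illustration; ASC itself trains an SVM on such features rather than using plain Hamming distance.

```python
# Hedged sketch of nearest-neighbour substrate-specificity prediction on
# active-site residue signatures (equal-length strings, Hamming distance).
# Signatures and labels are hypothetical; ASC uses an SVM on such features.

def hamming(a, b):
    """Number of differing positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def predict_specificity(query, training):
    """training: list of (signature, substrate); return the nearest label."""
    return min(training, key=lambda item: hamming(query, item[0]))[1]

training = [
    ("DRKHS", "isocitrate"),
    ("DRKYA", "isopropylmalate"),
    ("NRKHS", "isocitrate"),
]

label = predict_specificity("DRKHT", training)   # nearest signature: "DRKHS"
```

    Restricting the comparison to active-site residues, rather than the whole sequence, is what lets the structural information sharpen an otherwise purely sequence-based rule.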

  2. Automated Remote Focusing, Drift Correction, and Photostimulation to Evaluate Structural Plasticity in Dendritic Spines

    PubMed Central

    Evans, Paul R.; Garrett, Tavita R.; Yan, Long; Yasuda, Ryohei

    2017-01-01

    Long-term structural plasticity of dendritic spines plays a key role in synaptic plasticity, the cellular basis for learning and memory. This process is mediated by a complex network of signaling proteins in spines. Two-photon imaging techniques combined with two-photon glutamate uncaging allow researchers to induce and quantify structural plasticity in single dendritic spines. However, this method is laborious and slow, making it unsuitable for high-throughput screening of factors necessary for structural plasticity. Here we introduce a MATLAB-based module built for ScanImage to automatically track, image, and stimulate multiple dendritic spines. We implemented an electrically tunable lens in combination with a drift correction algorithm to rapidly and continuously track targeted spines and correct for sample movements. With a straightforward user interface to design custom multi-position experiments, we were able to adequately image and produce targeted plasticity in multiple dendritic spines using glutamate uncaging. Our methods are inexpensive, open source, and provide up to a five-fold increase in throughput for quantifying the structural plasticity of dendritic spines. PMID:28114380

  3. Automated assignment of NMR chemical shifts based on a known structure and 4D spectra.

    PubMed

    Trautwein, Matthias; Fredriksson, Kai; Möller, Heiko M; Exner, Thomas E

    2016-08-01

    Apart from its central role during 3D structure determination of proteins, the backbone chemical shift assignment is the basis for a number of applications, like chemical shift perturbation mapping and studies on the dynamics of proteins. This assignment is not a trivial task even if a 3D protein structure is known, and needs almost as much effort as the assignment for structure prediction if performed manually. We present here a new algorithm based solely on 4D [(1)H,(15)N]-HSQC-NOESY-[(1)H,(15)N]-HSQC spectra which is able to assign a large percentage of chemical shifts (73-82%) unambiguously, demonstrated with proteins up to a size of 250 residues. For the remaining residues, a small number of possible assignments is filtered out. This is done by comparing distances in the 3D structure to restraints obtained from the peak volumes in the 4D spectrum. Using dead-end elimination, assignments are removed in which at least one of the restraints is violated. Including additional information from chemical shift predictions, a complete unambiguous assignment was obtained for Ubiquitin and 95% of the residues were correctly assigned in the 251-residue N-terminal domain of enzyme I. The program including source code is available at https://github.com/thomasexner/4Dassign.
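
    The pruning idea above, discarding an assignment when a distance restraint cannot be satisfied by any compatible assignment of the linked peak, can be sketched as follows. The toy coordinates, candidate sets, and 6 Å cutoff are hypothetical, not values from the 4Dassign program.

```python
import math

# Sketch of dead-end elimination over NOE restraints: a candidate assignment
# of peak A to a residue is removed if no candidate assignment of the linked
# peak B places the two residues within the restraint distance in the known
# 3D structure. Coordinates, peaks, and the 6 Å cutoff are hypothetical.

coords = {  # residue -> amide-proton position in a toy structure (Å)
    "R1": (0.0, 0.0, 0.0),
    "R2": (3.0, 0.0, 0.0),
    "R3": (20.0, 0.0, 0.0),
}

def dist(a, b):
    return math.dist(coords[a], coords[b])

# Each peak may be assigned to several residues; one NOE links peaks A and B.
candidates = {"A": {"R1", "R3"}, "B": {"R2"}}
noe_cutoff = 6.0  # Å, upper bound derived from the peak volume

# Keep an assignment of A only if some assignment of B can satisfy the NOE.
candidates["A"] = {
    r for r in candidates["A"]
    if any(dist(r, s) <= noe_cutoff for s in candidates["B"])
}
```

    Here R3 is eliminated because its 17 Å separation from the only candidate for peak B violates the restraint, leaving an unambiguous assignment for peak A.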

  4. Low-Cost Impact Detection and Location for Automated Inspections of 3D Metallic Based Structures

    PubMed Central

    Morón, Carlos; Portilla, Marina P.; Somolinos, José A.; Morales, Rafael

    2015-01-01

    This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material allows certain details of the impact to be automatically determined by measuring the time delays of acoustic wave propagation throughout the 3D structure. The location of strategic piezoelectric sensors on the structure and an electronic-computerized system has allowed us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection and the task of inspecting the point or zone at which this impact occurs. What is more, the proposed method can be easily integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach. PMID:26029951
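
    The time-delay idea can be illustrated with a small multilateration sketch: given a known propagation speed and the differences in arrival times at a few sensors, a grid search finds the point whose predicted delays best match the measurements. This is an illustrative reconstruction under assumed values (sensor layout, 5000 m/s wave speed), not the authors' implementation.

```python
import math

# Illustrative sketch: locate an impact on a 1 m x 1 m plate from differences
# in acoustic arrival times at three sensors, assuming a known wave speed and
# searching a coarse grid for the best-matching point. All values are assumed.

V = 5000.0  # m/s, assumed acoustic wave speed in the metal
SENSORS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]

def arrival_times(p):
    return [math.dist(p, s) / V for s in SENSORS]

def tdoa(times):
    """Arrival-time differences relative to the first sensor."""
    return [t - times[0] for t in times[1:]]

def locate(measured_tdoa, step=0.05):
    best, best_err = None, float("inf")
    for i in range(int(1 / step) + 1):
        for j in range(int(1 / step) + 1):
            p = (i * step, j * step)
            err = sum((a - b) ** 2
                      for a, b in zip(tdoa(arrival_times(p)), measured_tdoa))
            if err < best_err:
                best, best_err = p, err
    return best

true_impact = (0.30, 0.70)
estimate = locate(tdoa(arrival_times(true_impact)))
```

    Using time differences rather than absolute times removes the need to know the (unknown) instant of impact; refining the grid or switching to a closed-form hyperbolic solution improves precision.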

  5. Combining Structure and Sequence Information Allows Automated Prediction of Substrate Specificities within Enzyme Families

    PubMed Central

    Röttig, Marc; Rausch, Christian; Kohlbacher, Oliver

    2010-01-01

    An important aspect of the functional annotation of enzymes is not only the type of reaction catalysed by an enzyme, but also the substrate specificity, which can vary widely within the same family. In many cases, prediction of family membership and even substrate specificity is possible from enzyme sequence alone, using a nearest neighbour classification rule. However, the combination of structural information and sequence information can improve the interpretability and accuracy of predictive models. The method presented here, Active Site Classification (ASC), automatically extracts the residues lining the active site from one representative three-dimensional structure and the corresponding residues from sequences of other members of the family. From a set of representatives with known substrate specificity, a Support Vector Machine (SVM) can then learn a model of substrate specificity. Applied to a sequence of unknown specificity, the SVM can then predict the most likely substrate. The models can also be analysed to reveal the underlying structural reasons determining substrate specificities and thus yield valuable insights into mechanisms of enzyme specificity. We illustrate the high prediction accuracy achieved on two benchmark data sets and the structural insights gained from ASC by a detailed analysis of the family of decarboxylating dehydrogenases. The ASC web service is available at http://asc.informatik.uni-tuebingen.de/. PMID:20072606

  6. Low-cost impact detection and location for automated inspections of 3D metallic based structures.

    PubMed

    Morón, Carlos; Portilla, Marina P; Somolinos, José A; Morales, Rafael

    2015-05-28

    This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material allows certain details of the impact to be automatically determined by measuring the time delays of acoustic wave propagation throughout the 3D structure. The location of strategic piezoelectric sensors on the structure and an electronic-computerized system has allowed us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection and the task of inspecting the point or zone at which this impact occurs. What is more, the proposed method can be easily integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach.

  7. Using Structure-Based Organic Chemistry Online Tutorials with Automated Correction for Student Practice and Review

    ERIC Educational Resources Information Center

    O'Sullivan, Timothy P.; Hargaden, Gráinne C.

    2014-01-01

    This article describes the development and implementation of an open-access organic chemistry question bank for online tutorials and assessments at University College Cork and Dublin Institute of Technology. SOCOT (structure-based organic chemistry online tutorials) may be used to supplement traditional small-group tutorials, thereby allowing…

  9. Automated antibody structure prediction using Accelrys tools: results and best practices.

    PubMed

    Fasnacht, Marc; Butenhof, Ken; Goupil-Lamy, Anne; Hernandez-Guzman, Francisco; Huang, Hongwei; Yan, Lisa

    2014-08-01

    We describe the methodology and results from our participation in the second Antibody Modeling Assessment experiment. During the experiment we predicted the structure of eleven unpublished antibody Fv fragments. Our prediction methods centered on template-based modeling; potential templates were selected from an antibody database based on their sequence similarity to the target in the framework regions. Depending on the quality of the templates, we constructed models of the antibody framework regions using either a single, chimeric or multiple template approach. The hypervariable loop regions in the initial models were rebuilt by grafting the corresponding regions from suitable templates onto the model. For the H3 loop region, we further refined models using ab initio methods. The final models were subjected to constrained energy minimization to resolve severe local structural problems. The analysis of the submitted models shows that Accelrys tools allow for the construction of quite accurate models for the framework and the canonical CDR regions, with RMSDs to the X-ray structure on average below 1 Å for most of these regions. The results show that accurate prediction of the H3 hypervariable loops remains a challenge. Furthermore, model quality assessment of the submitted models shows that they are of quite high quality, with local geometry assessment scores similar to those of the target X-ray structures.
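
    The RMSD figure quoted above is the root-mean-square deviation over matched atom coordinates of a model and a reference structure. A minimal sketch on toy coordinates, assuming the two structures are already superposed (real assessments first perform an optimal alignment):

```python
import math

# Minimal sketch of coordinate RMSD between a model and a reference structure,
# assumed already superposed. The 3D coordinates below are toy values, not
# atoms from an antibody Fv fragment.

def rmsd(model, reference):
    """Root-mean-square deviation over matched coordinate pairs (Å)."""
    n = len(model)
    return math.sqrt(
        sum(math.dist(a, b) ** 2 for a, b in zip(model, reference)) / n
    )

model = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.2, 0.0)]
reference = [(0.0, 0.1, 0.0), (1.4, 0.0, 0.0), (3.0, 0.0, 0.0)]

value = rmsd(model, reference)   # small deviations -> RMSD well under 1 Å
```

    Sub-ångström RMSDs, as reported for the framework regions, indicate near-atomic agreement; the ~3 Å H3-loop values reflect the much harder loop-prediction problem.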

  10. Semi-automated structural analysis of high resolution magnetic and gamma-ray spectrometry airborne surveys

    NASA Astrophysics Data System (ADS)

    Debeglia, N.; Martelet, G.; Perrin, J.; Truffert, C.; Ledru, P.; Tourlière, B.

    2005-08-01

    A user-controlled procedure was implemented for the structural analysis of geophysical maps. Local edge segments are first extracted using a suitable edge detector function, then linked into straight discontinuities and, finally, organised into complex boundary lines that best delineate geophysical features. Final boundary lines may be attributed by a geologist to lithological contacts and/or structural geological features. Tests of several edge detectors, (i) the horizontal gradient magnitude (HGM), (ii) various orders of the analytic signal (An), reduced to the pole or not, (iii) the enhanced horizontal derivative (EHD), and (iv) the composite analytic signal (CAS), were performed on synthetic magnetic data (with and without noise). As a result of these comparisons, the horizontal gradient appears to remain the best operator for the analysis of magnetic data. Computation of gradients in the frequency domain, including filtering and upward continuation of noisy data, is well suited to the extraction of magnetic gradients associated with deep sources, while space-domain smoothing and differentiation techniques are generally preferable in the case of shallow magnetic sources, or for gamma-ray spectrometry analysis. Algorithms for edge extraction, segment linking, and line following can be controlled by choosing an adequate edge detector and processing parameters, which allows adaptation to a desired scale of interpretation. Tests on synthetic and real data demonstrate the adaptability of the procedure and its ability to produce a basic layer for multi-data analysis. The method was applied to the interpretation of high-resolution airborne magnetic and gamma-ray spectrometry data collected in northern Namibia. It allowed the delineation of dyke networks concealed by superficial weathering and demonstrated the presence of lithological variations in alluvial flows. The output of the structural analysis procedure is compatible with standard GIS software and enables the geologist to (i) compare
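
    The horizontal gradient magnitude favoured above is just HGM = sqrt((∂F/∂x)² + (∂F/∂y)²), which can be sketched with central finite differences on a gridded field. The tiny synthetic grid below stands in for a real survey map:

```python
# Sketch of the horizontal gradient magnitude (HGM) edge detector:
# HGM = sqrt((dF/dx)^2 + (dF/dy)^2) via central differences on a gridded
# field. The 4x4 step anomaly is synthetic; real surveys use large grids.

def hgm(grid, dx=1.0, dy=1.0):
    """HGM at interior grid points; borders are left at zero."""
    ny, nx = len(grid), len(grid[0])
    out = [[0.0] * nx for _ in range(ny)]
    for y in range(1, ny - 1):
        for x in range(1, nx - 1):
            gx = (grid[y][x + 1] - grid[y][x - 1]) / (2 * dx)
            gy = (grid[y + 1][x] - grid[y - 1][x]) / (2 * dy)
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A step anomaly along x: the contact shows up as a ridge of high HGM.
field = [
    [0, 0, 10, 10],
    [0, 0, 10, 10],
    [0, 0, 10, 10],
    [0, 0, 10, 10],
]
edges = hgm(field)
```

    Edge extraction then keeps local maxima of this ridge, which are subsequently linked into the straight discontinuities and boundary lines described above.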

  11. Automating unambiguous NOE data usage in NVR for NMR protein structure-based assignments.

    PubMed

    Akhmedov, Murodzhon; Çatay, Bülent; Apaydın, Mehmet Serkan

    2015-12-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is an important technique for determining protein structure in solution. An important problem in protein structure determination using NMR spectroscopy is the mapping of peaks to corresponding amino acids, also known as the assignment problem. Structure-Based Assignment (SBA) is an approach to solving this problem using a template structure that is homologous to the target. Our previously developed approach, Nuclear Vector Replacement-Binary Integer Programming (NVR-BIP), computed the optimal solution for small proteins but was unable to solve the assignments of large proteins. NVR-Ant Colony Optimization (NVR-ACO) extended the applicability of the NVR approach to such proteins. One of the inputs utilized in these approaches is Nuclear Overhauser Effect (NOE) data. An NOE is an interaction observed between two protons if the protons are located close in space. These protons can be amide protons, protons attached to the alpha-carbon atom in the backbone of the protein, or side-chain protons. NVR uses only backbone protons. In this paper, we reformulate the NVR-BIP model to distinguish the type of proton in the NOE data and use the corresponding proton coordinates in the extended formulation. In addition, the threshold value on interproton distances is set in a standard manner for all proteins by extracting the NOE upper-bound distance information from the data. We also convert NOE intensities into distance thresholds. Our new approach thus handles the NOE data correctly and without manually determined parameters. We accordingly adapt the NVR-ACO solution methodology to these changes. Computational results show that our approaches obtain optimal solutions for small proteins. For large proteins, our ant colony optimization-based approach obtains promising results.
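
    The intensity-to-distance conversion mentioned above rests on the r⁻⁶ falloff of the NOE: an unknown separation can be calibrated against a reference pair of known distance, d = d_ref · (I_ref / I)^(1/6). The reference distance and intensities below are illustrative values, not parameters of NVR:

```python
# Sketch of converting an NOE cross-peak intensity into a distance threshold
# using the r^-6 dependence and a reference pair of known separation:
#     d = d_ref * (I_ref / I) ** (1/6)
# The 1.8 Å reference and the intensities are illustrative values.

def noe_distance(intensity, ref_intensity, ref_distance):
    return ref_distance * (ref_intensity / intensity) ** (1.0 / 6.0)

# Calibrate with a fixed-distance pair at 1.8 Å, then convert a weaker peak.
d = noe_distance(intensity=2.0e4, ref_intensity=1.28e6, ref_distance=1.8)
```

    Because of the sixth root, even a 64-fold drop in intensity only doubles the inferred distance, which is why such thresholds are robust enough to use as upper bounds in the assignment model.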

  12. Enhanced Aircraft Design Capability for the Automated Structural Optimization System. Phase 1.

    DTIC Science & Technology

    1996-01-31

    creation and evolution. The creation takes place in the GOA in the form of a finite number of designs randomly generated to form the initial population...are feasible or not. Evolution is then applied to the population to produce a new population of, hopefully, better designs. The evolution of a..."chromosomal" diploid strings that are closer, in structure, to human codings than traditional GOA haploid strings. For example, the human code carries 23 pairs

  13. Automated conductivity profiler for multilayer GaAs-(AlGa)As structures

    NASA Astrophysics Data System (ADS)

    Stiles, K. R.; Lee, J. W.

    1982-09-01

    An apparatus for automatic conductivity profiling of GaAs-(AlGa)As multilayer structures is described. The apparatus includes a microprocessor which controls a solenoid valve sequence in order to chemically etch the sample, and a programmable calculator which calculates sample conductance versus number of etch steps from which layer conductivities are calculated. Conductivity profiles of multilayer GaAs-(AlGa)As heterostructure laser material are presented and compared to profiles done by an earlier manual technique.

  14. Identifying relevant biomarkers of brain injury from structural MRI: Validation using automated approaches in children with unilateral cerebral palsy.

    PubMed

    Pagnozzi, Alex M; Dowson, Nicholas; Doecke, James; Fiori, Simona; Bradley, Andrew P; Boyd, Roslyn N; Rose, Stephen

    2017-01-01

    Previous studies have proposed that the early elucidation of brain injury from structural Magnetic Resonance Images (sMRI) is critical for the clinical assessment of children with cerebral palsy (CP). Although distinct aetiologies, including cortical maldevelopments, white and grey matter lesions and ventricular enlargement, have been categorised, these injuries are commonly assessed only in a qualitative fashion. As a result, sMRI remains relatively underexploited for clinical assessments, despite its widespread use. In this study, several automated and validated techniques to automatically quantify these three classes of injury were applied in a large cohort of children (n = 139) aged 5-17, including 95 children diagnosed with unilateral CP. Using a feature selection approach on a training data set (n = 97) to find severity-of-injury biomarkers predictive of clinical function (motor, cognitive, communicative and visual function), cortical shape and regional lesion burden were the biomarkers most often selected as associated with clinical function. Validating the best models on the unseen test data (n = 42), correlation values ranged between 0.545 and 0.795 (p<0.008), indicating significant associations with clinical function. The measured prevalence of injury, including ventricular enlargement (70%), white and grey matter lesions (55%) and cortical malformations (30%), was similar to the prevalence observed in other cohorts of children with unilateral CP. These findings support the early characterisation of injury from sMRI into previously defined aetiologies as part of standard clinical assessment. Furthermore, the strong and significant associations between quantifications of injury observed on structural MRI and multiple clinical scores accord with empirically established structure-function relationships.

  15. Automated polyp measurement based on colon structure decomposition for CT colonography

    NASA Astrophysics Data System (ADS)

    Wang, Huafeng; Li, Lihong C.; Han, Hao; Peng, Hao; Song, Bowen; Wei, Xinzhou; Liang, Zhengrong

    2014-03-01

    Accurate assessment of colorectal polyp size is of great significance for early diagnosis and management of colorectal cancers. Due to the complexity of colon structure, polyps with diverse geometric characteristics grow from different landform surfaces. In this paper, we present a new colon decomposition approach for polyp measurement. We first apply an efficient maximum a posteriori expectation-maximization (MAP-EM) partial volume segmentation algorithm to achieve effective electronic cleansing of the colon. The global colon structure is then decomposed into different kinds of morphological shapes, e.g. haustral folds or haustral wall. Meanwhile, the polyp location is identified by an automatic computer-aided detection algorithm. By integrating the colon structure decomposition with the computer-aided detection system, a patch volume of colon polyps is extracted. Thus, polyp size assessment can be achieved by finding the abnormal protrusion on a relatively uniform morphological surface in the decomposed colon landform. We evaluated our method on physical phantom and clinical datasets. Experimental results demonstrate the feasibility of our method in consistently quantifying polyp volume and, therefore, facilitating polyp characterization for clinical management.

  16. Automated method for the identification and analysis of vascular tree structures in retinal vessel network

    NASA Astrophysics Data System (ADS)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2011-03-01

    Structural analysis of the retinal vessel network has so far served in the diagnosis of retinopathies and systemic diseases. Retinopathies are known to affect the morphologic properties of retinal vessels such as course, shape, caliber, and tortuosity. Whether the arteries and the veins respond to these changes together or independently has always been a topic of discussion. However, diseases such as diabetic retinopathy and retinopathy of prematurity have been diagnosed from morphologic changes specific either to arteries or to veins. Thus a method for separating the retinal vessel trees imaged in a two-dimensional color fundus image may assist in artery-vein classification and quantitative assessment of morphologic changes particular to arteries or veins. We propose a method based on mathematical morphology and graph search to identify and label the retinal vessel trees, which provides a structural mapping of the vessel network in terms of each individual primary vessel, its branches, and the spatial positions of branching and cross-over points. The method was evaluated on a dataset of 15 fundus images, resulting in an accuracy of 92.87% correctly assigned vessel pixels when compared with manual labeling of the separated vessel trees. Accordingly, the structural mapping method performs well, and we are currently investigating its potential for evaluating the characteristic properties specific to arteries or veins.

  17. Automated preliminary design of simplified wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.; Dexter, C. B.; Stein, M.

    1972-01-01

    A simple structural model of an aircraft wing is used to show the effects of strength (stress) and flutter requirements on the design of minimum-weight aircraft-wing structures. The wing is idealized as an isotropic sandwich plate with a variable cover thickness distribution and a variable depth between covers. Plate theory is used for the structural analysis, and piston theory is used for the unsteady aerodynamics in the flutter analysis. Mathematical programming techniques are used to find the minimum-weight cover thickness distribution which satisfies flutter, strength, and minimum-gage constraints. The method of solution, some sample results, and the computer program used to obtain these results are presented. The results indicate that the cover thickness distribution obtained when designing for the strength requirement alone may be quite different from the cover thickness distribution obtained when designing for either the flutter requirement alone or for both the strength and flutter requirements concurrently. This conclusion emphasizes the need for designing for both flutter and strength from the outset.
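The design problem described above, minimizing weight subject to strength, flutter, and minimum-gage constraints, can be illustrated with a deliberately tiny surrogate: two cover-thickness design variables, a linear stand-in for the flutter speed, and a coarse grid search in place of the paper's mathematical programming techniques. All numbers are hypothetical.

```python
def weight(t):
    # structural weight proxy: sum of cover thicknesses
    return sum(t)

def stress_ok(t, load=10.0, allowable=40.0):
    # strength constraint: stress = load / thickness must not exceed allowable
    return all(load / ti <= allowable for ti in t)

def flutter_ok(t, required_speed=3.0):
    # toy flutter surrogate: flutter speed grows with cover thickness,
    # weighted more heavily toward the first (inboard) design variable
    return 2.0 * t[0] + 1.0 * t[1] >= required_speed

def design(min_gage=0.25, step=0.05, n=40):
    # coarse grid search over both thicknesses, from min_gage upward
    best = None
    for i in range(n + 1):
        for j in range(n + 1):
            t = (min_gage + i * step, min_gage + j * step)
            if stress_ok(t) and flutter_ok(t) and \
               (best is None or weight(t) < weight(best)):
                best = t
    return best
```

As in the paper, the active flutter constraint pushes material toward where it raises flutter speed most, so the flutter-constrained optimum differs from a strength-only design.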

  18. Automated Voxel Model from Point Clouds for Structural Analysis of Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Bitelli, G.; Castellazzi, G.; D'Altri, A. M.; De Miranda, S.; Lambertini, A.; Selvaggi, I.

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim to decrease operator intervention in the workflow and obtain a better description of the structure. In order to achieve this result a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
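The point-cloud-to-voxel conversion at the heart of the procedure can be sketched by binning point coordinates into a regular grid; occupied cells become voxels. This toy version omits the interior filling and variable resolution of the actual workflow, and the coordinates are invented.

```python
def voxelize(points, voxel_size):
    # map each (x, y, z) point to the integer index of the grid cell
    # that contains it; the set of occupied cells is the voxel model
    occupied = set()
    for x, y, z in points:
        occupied.add((int(x // voxel_size),
                      int(y // voxel_size),
                      int(z // voxel_size)))
    return occupied

# hypothetical survey points (e.g. from laser scanning), metres
cloud = [(0.1, 0.2, 0.0), (0.3, 0.1, 0.2), (1.2, 0.1, 0.1), (1.4, 1.6, 0.3)]
voxels = voxelize(cloud, voxel_size=0.5)
```

A FEM mesh generator could then treat each occupied voxel as a hexahedral element, which is the motivation for producing a filled voxel model in the first place.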

  19. Automated procedure for design of wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1973-01-01

    A pilot computer program was developed for the design of minimum-mass wing structures under flutter, strength, and minimum-gage constraints. The wing structure is idealized by finite elements, and second-order piston theory aerodynamics is used in the flutter calculation. Mathematical programming methods are used for the optimization. Computation times during the design process are reduced by three techniques. First, iterative analysis methods are used to significantly reduce reanalysis times. Second, the number of design variables is kept small by not using a one-to-one correspondence between finite elements and design variables. Third, a technique for using approximate second derivatives with Newton's method for the optimization is incorporated. The program output is compared with previously published results. It is found that some flutter characteristics, such as the flutter speed, can display a discontinuous dependence on the design variables (which are the thicknesses of the structural elements). It is concluded that it is undesirable to use such quantities in the formulation of the flutter constraint.

  20. Psi4 1.1: An Open-Source Electronic Structure Program Emphasizing Automation, Advanced Libraries, and Interoperability.

    PubMed

    Parrish, Robert M; Burns, Lori A; Smith, Daniel G A; Simmonett, Andrew C; DePrince, A Eugene; Hohenstein, Edward G; Bozkaya, Uğur; Sokolov, Alexander Yu; Di Remigio, Roberto; Richard, Ryan M; Gonthier, Jérôme F; James, Andrew M; McAlexander, Harley R; Kumar, Ashutosh; Saitow, Masaaki; Wang, Xiao; Pritchard, Benjamin P; Verma, Prakash; Schaefer, Henry F; Patkowski, Konrad; King, Rollin A; Valeev, Edward F; Evangelista, Francesco A; Turney, Justin M; Crawford, T Daniel; Sherrill, C David

    2017-07-11

    Psi4 is an ab initio electronic structure program providing methods such as Hartree-Fock, density functional theory, configuration interaction, and coupled-cluster theory. The 1.1 release represents a major update meant to automate complex tasks, such as geometry optimization using complete-basis-set extrapolation or focal-point methods. Conversion of the top-level code to a Python module means that Psi4 can now be used in complex workflows alongside other Python tools. Several new features have been added with the aid of libraries providing easy access to techniques such as density fitting, Cholesky decomposition, and Laplace denominators. The build system has been completely rewritten to simplify interoperability with independent, reusable software components for quantum chemistry. Finally, a wide range of new theoretical methods and analyses have been added to the code base, including functional-group and open-shell symmetry adapted perturbation theory, density-fitted coupled cluster with frozen natural orbitals, orbital-optimized perturbation and coupled-cluster methods (e.g., OO-MP2 and OO-LCCD), density-fitted multiconfigurational self-consistent field, density cumulant functional theory, algebraic-diagrammatic construction excited states, improvements to the geometry optimizer, and the "X2C" approach to relativistic corrections, among many other improvements.

  1. Upper-mantle shear-wave structure under East and Southeast Asia from Automated Multimode Inversion of waveforms

    NASA Astrophysics Data System (ADS)

    Legendre, C. P.; Zhao, L.; Chen, Q.-F.

    2015-10-01

    We present a new Sv-velocity model of the upper mantle under East and Southeast Asia constrained by the inversion of seismic waveforms recorded by broad-band stations. Seismograms from earthquakes that occurred between 1977 and 2012 were collected from about 4786 permanent and temporary stations in the region, whenever and wherever available. Automated Multimode Inversion of surface and multiple-S waveforms is applied to extract structural information from the seismograms, in the form of linear equations with uncorrelated uncertainties. The equations are then solved for the seismic velocity perturbations in the crust and upper mantle with respect to a three-dimensional (3-D) reference model and a realistic crust. Major features of the lithosphere-asthenosphere system in East and Southeast Asia are identified in the resulting model. At lithospheric depths, low velocities can be seen beneath Tibet, whereas high velocities are found beneath cratonic regions, such as the Siberian, North China, Yangtze, Tarim, and Dharwar cratons. A number of microplates are mapped and their interaction with neighbouring plates is discussed. Slabs from the Pacific and Indian Oceans can be seen in the upper mantle. Passive marginal basins and subduction zones are also properly resolved.

  2. Automated tracing of open-field coronal structures for an optimized large-scale magnetic field reconstruction

    NASA Astrophysics Data System (ADS)

    Uritsky, V. M.; Davila, J. M.; Jones, S. I.

    2014-12-01

    Solar Probe Plus and Solar Orbiter will provide detailed measurements in the inner heliosphere magnetically connected with the topologically complex and eruptive solar corona. Interpretation of these measurements will require accurate reconstruction of the large-scale coronal magnetic field. In a related presentation by S. Jones et al., we argue that such reconstruction can be performed using photospheric extrapolation methods constrained by white-light coronagraph images. Here, we present the image-processing component of this project dealing with an automated segmentation of fan-like coronal loop structures. In contrast to the existing segmentation codes designed for detecting small-scale closed loops in the vicinity of active regions, we focus on the large-scale geometry of the open-field coronal features observed at significant radial distances from the solar surface. The coronagraph images used for the loop segmentation are transformed into a polar coordinate system and undergo radial detrending and initial noise reduction. The preprocessed images are subject to an adaptive second order differentiation combining radial and azimuthal directions. An adjustable thresholding technique is applied to identify candidate coronagraph features associated with the large-scale coronal field. A blob detection algorithm is used to extract valid features and discard noisy data pixels. The obtained features are interpolated using higher-order polynomials which are used to derive empirical directional constraints for magnetic field extrapolation procedures based on photospheric magnetograms.

  3. Automated transient thermography for the inspection of CFRP structures: experimental results and developed procedures

    NASA Astrophysics Data System (ADS)

    Theodorakeas, P.; Avdelidis, N. P.; Hrissagis, K.; Ibarra-Castanedo, C.; Koui, M.; Maldague, X.

    2011-05-01

    In thermography surveys, the inspector uses the camera to acquire images from the examined part. Common problems are the lack of repeatability when trying to repeat the scanning process, the need to carry the equipment during scanning, and long setting-up times. The aim of this paper is to present transient thermography results on CFRP plates for assessing different types of fabricated defects (impact damage, inclusions for delaminations, etc.), as well as to discuss and present a prototype robotic scanner for applying non-destructive testing (thermographic scanning) to materials and structures. Currently, the scanning process is not automatic. The equipment to be developed will be able to perform thermal NDT scanning on structures, create the appropriate scanning conditions (material thermal excitation), and ensure precision and tracking of the scanning process. A thermographic camera used for image acquisition during the non-destructive inspection will be installed on an x, y, z linear manipulator's end effector and surrounded by the excitation sources (optical lamps) required for the application of transient thermography. In this work, various CFRP samples of different shape, thickness and geometry were investigated using two different thermographic systems in order to compare and evaluate their effectiveness concerning internal defect detectability under different testing conditions.

  4. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
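The DSM-clustering idea can be sketched with a small genetic algorithm: a chromosome assigns each element to a cluster, and the fitness penalizes both dependencies that cross cluster boundaries and oversized clusters. The toy DSM, penalty weights, and GA settings below are illustrative, not those of the Excel-macro tool described in the paper.

```python
import random

DSM = [  # 1 = dependency between elements i and j (symmetric toy example)
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
N, K = len(DSM), 2  # number of elements and maximum number of clusters

def cost(assign):
    # penalize dependencies crossing cluster boundaries, plus a size term
    # so that the trivial "one big cluster" solution is not optimal
    crossing = sum(1 for i in range(N) for j in range(i + 1, N)
                   if DSM[i][j] and assign[i] != assign[j])
    sizes = [assign.count(k) for k in set(assign)]
    return 10 * crossing + sum(s * s for s in sizes)

def evolve(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(K) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]          # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N)            # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # occasional mutation
                child[rng.randrange(N)] = rng.randrange(K)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

For this DSM the two blocks of three mutually coupled elements form the natural clustering; a spreadsheet DSM would be read into `DSM` in the same 0/1 form.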

  5. Endoscopic system for automated high dynamic range inspection of moving periodic structures

    NASA Astrophysics Data System (ADS)

    Hahlweg, Cornelius; Rothe, Hendrik

    2015-09-01

    In the current paper an advanced endoscopic system for high resolution and high dynamic range inspection of periodic structures in rotating machines is presented. We address the system architecture, short time illumination, special optical problems, such as excluding the specular reflex, image processing, forward velocity prediction and metrological image processing. There are several special requirements to be met, such as thermal stability above 100°C, robustness of the image field, illumination in the view direction and the separation of metallic surface diffuse scatter. To find a compromise between image resolution and frame rate, an external sensor system was applied for synchronization with the moving target. The system was originally intended for inspection of thermal engines, but turned out to be of more general use. Besides the theoretical part and dimensioning issues, practical examples and measurement results are included.

  6. Automated diffeomorphic registration of anatomical structures with rigid parts: application to dynamic cervical MRI.

    PubMed

    Commowick, Olivier; Wiest-Daesslé, Nicolas; Prima, Sylvain

    2012-01-01

    We propose an iterative two-step method to compute a diffeomorphic non-rigid transformation between images of anatomical structures with rigid parts, without any user intervention or prior knowledge on the image intensities. First we compute spatially sparse, locally optimal rigid transformations between the two images using a new block matching strategy and an efficient numerical optimiser (BOBYQA). Then we derive a dense, regularised velocity field based on these local transformations using matrix logarithms and M-smoothing. These two steps are iterated until convergence and the final diffeomorphic transformation is defined as the exponential of the accumulated velocity field. We show our algorithm to outperform the state-of-the-art log-domain diffeomorphic demons method on dynamic cervical MRI data.
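The final step, exponentiating the accumulated velocity field, is commonly computed by scaling and squaring: scale the field down, treat it as a small displacement, then compose it with itself repeatedly. Below is a minimal one-dimensional sketch of that step with a constant toy field; it is not the paper's 3-D registration pipeline.

```python
def interp(field, x):
    # linear interpolation into a 1-D field with edge clamping
    x = min(max(x, 0.0), len(field) - 1.0)
    i = min(int(x), len(field) - 2)
    t = x - i
    return (1 - t) * field[i] + t * field[i + 1]

def exponentiate(velocity, n_steps=6):
    # phi = exp(v): start from v / 2**n_steps as a small displacement,
    # then square (self-compose) n_steps times
    disp = [v / (2 ** n_steps) for v in velocity]
    for _ in range(n_steps):
        # composition: new displacement at x is d(x) + d(x + d(x))
        disp = [d + interp(disp, x + d) for x, d in enumerate(disp)]
    return disp  # displacement field of the resulting diffeomorphism

vel = [0.5] * 20          # constant velocity: expect a uniform 0.5 shift
phi = exponentiate(vel)
```

Because composition of small displacements preserves invertibility, the result is diffeomorphic by construction, which is the property the log-domain methods compared in the paper also rely on.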

  7. Automated structure–activity relationship mining: connecting chemical structure to biological profiles

    PubMed Central

    Wawer, Mathias J.; Jaramillo, David E.; Dancik, Vlado; Fass, Daniel M.; Haggarty, Stephen J.; Shamji, Alykhan F.; Wagner, Bridget K.; Schreiber, Stuart L.; Clemons, Paul A.

    2017-01-01

    Understanding structure–activity relationships (SARs) of small molecules is important for developing probes and novel therapeutic agents in chemical biology and drug discovery. Increasingly multiplexed small-molecule profiling assays allow simultaneous measurement of many biological response parameters for the same compound, e.g. expression levels for many genes or binding constants against many proteins. While such methods promise to capture SARs with high granularity, few computational methods are available to support SAR analyses of high-dimensional compound activity profiles. Many of these methods are not generally applicable or reduce the activity space to scalar summary statistics before establishing SARs. In this article, we present a versatile computational method that automatically extracts interpretable SAR rules from high-dimensional profiling data. The rules connect chemical structural features of compounds to patterns in their biological activity profiles. We applied our method to data from novel cell-based gene-expression and imaging assays collected on more than 30,000 small molecules. Based on the rules identified for this dataset, we prioritized groups of compounds for further study, including a novel set of putative histone deacetylase inhibitors. PMID:24710340

  8. Automated grid generation from models of complex geologic structure and stratigraphy

    SciTech Connect

    Gable, C.; Trease, H.; Cherry, T.

    1996-04-01

    The construction of computational grids which accurately reflect complex geologic structure and stratigraphy for flow and transport models poses a formidable task. With an understanding of stratigraphy, material properties and boundary and initial conditions, the task of incorporating this data into a numerical model can be difficult and time consuming. Most GIS tools for representing complex geologic volumes and surfaces are not designed for producing optimal grids for flow and transport computation. We have developed a tool, GEOMESH, for generating finite element grids that maintain the geometric integrity of input volumes, surfaces, and geologic data and produce an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. GEOMESH also satisfies the constraint that the geometric coupling coefficients of the grid are positive for all elements. GEOMESH generates grids for two dimensional cross sections, three dimensional regional models, represents faults and fractures, and has the capability of including finer grids representing tunnels and well bores into grids. GEOMESH also permits adaptive grid refinement in three dimensions. The tools to glue, merge and insert grids together demonstrate how complex grids can be built from simpler pieces. The resulting grid can be utilized by unstructured finite element or integrated finite difference computational physics codes.

  9. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  10. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease.

    PubMed

    Tuszynski, Tobias; Rullmann, Michael; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Gertz, Hermann-Josef; Hesse, Swen; Seese, Anita; Lobsien, Donald; Sabri, Osama; Barthel, Henryk

    2016-06-01

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis.
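The quantification described above reduces to two small computations: an SUVR is a regional mean uptake divided by the cerebellar-cortex mean, and group separation is summarized with Cohen's d using the pooled standard deviation. The uptake values below are invented for illustration, not the study's data.

```python
from statistics import mean, stdev

def suvr(region_uptake, cerebellum_uptake):
    # standardized uptake value ratio with cerebellar cortex as reference
    return mean(region_uptake) / mean(cerebellum_uptake)

def cohens_d(group_a, group_b):
    # effect size between two groups, using the pooled standard deviation
    na, nb = len(group_a), len(group_b)
    pooled = (((na - 1) * stdev(group_a) ** 2 +
               (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled

# hypothetical composite neocortex SUVRs for patients and controls
ad_suvrs = [1.8, 1.7, 1.9, 1.6]
hc_suvrs = [1.1, 1.2, 1.0, 1.1]
effect = cohens_d(ad_suvrs, hc_suvrs)
```

Running `cohens_d` on per-VOI SUVRs from each automated tool and comparing against the manual-VOI value mirrors the paper's d = 1.38 to 1.62 comparison.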

  11. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers in the conduct of their work, and to assess the human acceptance difficulties which may accompany the transition to a significantly changed work environment. The improved productivity and communications which result from application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated and these will be explored in detail.

  12. Internal Transcribed Spacer 2 (nu ITS2 rRNA) Sequence-Structure Phylogenetics: Towards an Automated Reconstruction of the Green Algal Tree of Life

    PubMed Central

    Buchheim, Mark A.; Keller, Alexander; Koetschan, Christian; Förster, Frank; Merget, Benjamin; Wolf, Matthias

    2011-01-01

    Background Chloroplast-encoded genes (matK and rbcL) have been formally proposed for use in DNA barcoding efforts targeting embryophytes. Extending such a protocol to chlorophytan green algae, though, is fraught with problems including non-homology (matK) and heterogeneity that prevents the creation of a universal PCR toolkit (rbcL). Some have advocated the use of the nuclear-encoded internal transcribed spacer two (ITS2) as an alternative to the traditional chloroplast markers. However, the ITS2 is broadly perceived to be insufficiently conserved or to be confounded by introgression or biparental inheritance patterns, precluding its broad use in phylogenetic reconstruction or as a DNA barcode. A growing body of evidence has shown that simultaneous analysis of nucleotide data with secondary structure information can overcome at least some of the limitations of ITS2. The goal of this investigation was to assess the feasibility of an automated, sequence-structure approach for analysis of ITS2 data from a large sampling of phylum Chlorophyta. Methodology/Principal Findings Sequences and secondary structures from 591 chlorophycean, 741 trebouxiophycean and 938 ulvophycean algae, all obtained from the ITS2 Database, were aligned using a sequence structure-specific scoring matrix. Phylogenetic relationships were reconstructed by Profile Neighbor-Joining coupled with a sequence structure-specific, general time reversible substitution model. Results from analyses of the ITS2 data were robust at multiple nodes and showed considerable congruence with results from published phylogenetic analyses. Conclusions/Significance Our observations on the power of automated, sequence-structure analyses of ITS2 to reconstruct phylum-level phylogenies of the green algae validate this approach to assessing diversity for large sets of chlorophytan taxa.
Moreover, our results indicate that objections to the use of ITS2 for DNA barcoding should be weighed against the utility of an automated

  13. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system.

    PubMed

    Ikeya, Teppei; Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune; Güntert, Peter

    2009-08-01

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional "through-bond" spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.

  14. Automated clustering of probe molecules from solvent mapping of protein surfaces: new algorithms applied to hot-spot mapping and structure-based drug design

    NASA Astrophysics Data System (ADS)

    Lerner, Michael G.; Meagher, Kristin L.; Carlson, Heather A.

    2008-10-01

    Use of solvent mapping, based on multiple-copy minimization (MCM) techniques, is common in structure-based drug discovery. The minima of small-molecule probes define locations for complementary interactions within a binding pocket. Here, we present improved methods for MCM. In particular, a Jarvis-Patrick (JP) method is outlined for grouping the final locations of minimized probes into physical clusters. This algorithm has been tested through a study of protein-protein interfaces, showing the process to be robust, deterministic, and fast in the mapping of protein "hot spots." Improvements in the initial placement of probe molecules are also described. A final application to HIV-1 protease shows how our automated technique can be used to partition data too complicated to analyze by hand. These new automated methods may be easily and quickly extended to other protein systems, and our clustering methodology may be readily incorporated into other clustering packages.
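Jarvis-Patrick clustering, as used above, merges two points when they are mutual k-nearest neighbors that share at least j of those neighbors. A minimal sketch on toy 2-D "probe" positions follows; the paper applies the idea to minimized probe molecules in 3-D, and the parameters here are illustrative.

```python
def knn(points, k):
    # k nearest neighbors of each point under squared Euclidean distance
    ids = range(len(points))
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(points[a], points[b]))
    return {i: set(sorted((j for j in ids if j != i),
                          key=lambda j: d2(i, j))[:k])
            for i in ids}

def jarvis_patrick(points, k=3, j_min=2):
    neigh = knn(points, k)
    # union-find: merge mutual neighbors sharing >= j_min common neighbors
    parent = list(range(len(points)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a in range(len(points)):
        for b in range(a + 1, len(points)):
            mutual = a in neigh[b] and b in neigh[a]
            if mutual and len(neigh[a] & neigh[b]) >= j_min:
                parent[find(a)] = find(b)
    return [find(i) for i in range(len(points))]

# two well-separated squares of four probe positions each
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (9, 9), (9, 10), (10, 9), (10, 10)]
labels = jarvis_patrick(pts)
```

The shared-neighbor criterion is what makes the method deterministic and robust to density variations, which matches the properties reported for the hot-spot mapping application.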

  15. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind its design, the nuances pertaining to founding and tuning have largely been empirical, and conveyed from one generation to the next. Post-production assessment for bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future model designs. To establish confidence in the FEA process it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically to collect the data for an EMA, the vibratory response of the structure is measured with the application of accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect due to the non-contact nature of the technique; resulting in higher accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution resulting in a higher confidence EMA. 
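
    The FEA-to-EMA correlation described above is usually quantified mode by mode; a common metric for this (not named in the abstract) is the Modal Assurance Criterion (MAC). A minimal sketch with hypothetical mode-shape data:

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode-shape vectors.

    Returns 1.0 for perfectly correlated shapes, 0.0 for orthogonal ones.
    """
    num = abs(np.dot(phi_a, phi_b)) ** 2
    den = np.dot(phi_a, phi_a) * np.dot(phi_b, phi_b)
    return num / den

# Hypothetical mode shapes: FEA prediction vs. EMA measurement at 4 points.
fea = np.array([1.0, 0.6, -0.6, -1.0])
ema = np.array([0.98, 0.62, -0.58, -1.02])
high_mac = mac(fea, ema)  # close to 1: the measured mode matches the model
```

A full correlation would assemble a MAC matrix over all mode pairs; off-diagonal values near zero indicate well-separated, correctly paired modes.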

  16. Human brain atlas for automated region of interest selection in quantitative susceptibility mapping: application to determine iron content in deep gray matter structures.

    PubMed

    Lim, Issel Anne L; Faria, Andreia V; Li, Xu; Hsu, Johnny T C; Airan, Raag D; Mori, Susumu; van Zijl, Peter C M

    2013-11-15

    The purpose of this paper is to extend the single-subject Eve atlas from Johns Hopkins University, which currently contains diffusion tensor and T1-weighted anatomical maps, by including contrast based on quantitative susceptibility mapping. The new atlas combines a "deep gray matter parcellation map" (DGMPM) derived from a single-subject quantitative susceptibility map with the previously established "white matter parcellation map" (WMPM) from the same subject's T1-weighted and diffusion tensor imaging data into an MNI coordinate map named the "Everything Parcellation Map in Eve Space," also known as the "EvePM." It allows automated segmentation of gray matter and white matter structures. Quantitative susceptibility maps from five healthy male volunteers (30 to 33 years of age) were coregistered to the Eve Atlas with AIR and Large Deformation Diffeomorphic Metric Mapping (LDDMM), and the transformation matrices were applied to the EvePM to produce automated parcellation in subject space. Parcellation accuracy was measured with a kappa analysis for the left and right structures of six deep gray matter regions. For multi-orientation QSM images, the Kappa statistic was 0.85 between automated and manual segmentation, with the inter-rater reproducibility Kappa being 0.89 for the human raters, suggesting "almost perfect" agreement between all segmentation methods. Segmentation seemed slightly more difficult for human raters on single-orientation QSM images, with the Kappa statistic being 0.88 between automated and manual segmentation, and 0.85 and 0.86 between human raters. Overall, this atlas provides a time-efficient tool for automated coregistration and segmentation of quantitative susceptibility data to analyze many regions of interest. These data were used to establish a baseline for normal magnetic susceptibility measurements for over 60 brain structures of 30- to 33-year-old males. 
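
    The kappa analysis used above to compare automated and manual parcellations can be sketched as Cohen's kappa over per-voxel labels (toy data here, not the study's):

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters' label sequences (e.g. voxel labels).

    Corrects the observed agreement for the agreement expected by chance.
    """
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_exp = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories
    )
    return (p_obs - p_exp) / (1 - p_exp)
```

Values above roughly 0.8 are conventionally read as "almost perfect" agreement, the range reported in the abstract.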

  17. Human brain atlas for automated region of interest selection in quantitative susceptibility mapping: application to determine iron content in deep gray matter structures

    PubMed Central

    Lim, Issel Anne L.; Faria, Andreia V.; Li, Xu; Hsu, Johnny T.C.; Airan, Raag D.; Mori, Susumu; van Zijl, Peter C. M.

    2013-01-01

    The purpose of this paper is to extend the single-subject Eve atlas from Johns Hopkins University, which currently contains diffusion tensor and T1-weighted anatomical maps, by including contrast based on quantitative susceptibility mapping. The new atlas combines a “deep gray matter parcellation map” (DGMPM) derived from a single-subject quantitative susceptibility map with the previously established “white matter parcellation map” (WMPM) from the same subject’s T1-weighted and diffusion tensor imaging data into an MNI coordinate map named the “Everything Parcellation Map in Eve Space,” also known as the “EvePM.” It allows automated segmentation of gray matter and white matter structures. Quantitative susceptibility maps from five healthy male volunteers (30 to 33 years of age) were coregistered to the Eve Atlas with AIR and Large Deformation Diffeomorphic Metric Mapping (LDDMM), and the transformation matrices were applied to the EvePM to produce automated parcellation in subject space. Parcellation accuracy was measured with a kappa analysis for the left and right structures of six deep gray matter regions. For multi-orientation QSM images, the Kappa statistic was 0.85 between automated and manual segmentation, with the inter-rater reproducibility Kappa being 0.89 for the human raters, suggesting “almost perfect” agreement between all segmentation methods. Segmentation seemed slightly more difficult for human raters on single-orientation QSM images, with the Kappa statistic being 0.88 between automated and manual segmentation, and 0.85 and 0.86 between human raters. Overall, this atlas provides a time-efficient tool for automated coregistration and segmentation of quantitative susceptibility data to analyze many regions of interest. These data were used to establish a baseline for normal magnetic susceptibility measurements for over 60 brain structures of 30- to 33-year-old males.

  18. Office automation.

    PubMed

    Arenson, R L

    1986-03-01

    By now, the term "office automation" should have more meaning for those readers who are not intimately familiar with the subject. Not all of the preceding material pertains to every department or practice, but certainly, word processing and simple telephone management are key items. The size and complexity of the organization will dictate the usefulness of electronic mail and calendar management, and the individual radiologist's personal needs and habits will determine the usefulness of the home computer. Perhaps the most important ingredient for success in the office automation arena relates to the ability to integrate information from various systems in a simple and flexible manner. Unfortunately, this is perhaps the one area that most office automation systems have ignored or handled poorly. In the personal computer world, there has been much emphasis recently on integration of packages such as spreadsheet, database management, word processing, graphics, time management, and communications. This same philosophy of integration has been applied to a few office automation systems, but these are generally vendor-specific and do not allow for a mixture of foreign subsystems. During the next few years, it is likely that a few vendors will emerge as dominant in this integrated office automation field and will stress simplicity and flexibility as major components.

  19. Phytoplankton community structure in the North Sea: coupling between remote sensing and automated in situ analysis at the single cell level

    NASA Astrophysics Data System (ADS)

    Thyssen, M.; Alvain, S.; Lefèbvre, A.; Dessailly, D.; Rijkeboer, M.; Guiselin, N.; Creach, V.; Artigas, L.-F.

    2014-11-01

    Phytoplankton observation in the ocean can be a challenge in oceanography. Accurate estimations of their biomass and dynamics will help to understand ocean ecosystems and refine global climate models. This requires relevant datasets of phytoplankton at a functional level and on a daily and sub-mesoscale basis. In order to achieve this, an automated, high-frequency, dedicated scanning flow cytometer (SFC, Cytobuoy, NL) has been developed to cover the entire size range of phytoplankton cells whilst simultaneously taking pictures of the largest of them. This cytometer was directly connected to the water inlet of a pocket Ferry Box during a cruise in the North Sea, 8-12 May 2011 (DYMAPHY project, INTERREG IV A "2 Seas"), in order to identify the phytoplankton community structure of near-surface waters (6 m) on a high-resolution spatial basis (2.2 ± 1.8 km). Ten groups of cells, distinguished on the basis of their optical pulse shapes, were described (abundance, size estimate, red fluorescence per unit volume). Abundances varied depending on the hydrological status of the traversed waters, reflecting different stages of the North Sea blooming period. Comparisons between several techniques analyzing chlorophyll a and the scanning flow cytometer, using the integrated red fluorescence emitted by each counted cell, showed significant correlations. The community structure observed from the automated flow cytometry was compared with the PHYSAT reflectance anomalies over a daily scale. The number of matchups observed between the SFC automated high-frequency in situ sampling and the remote sensing was found to be two to three times better than when using traditional water sampling strategies. Significant differences in the phytoplankton community structure within the two days for which matchups were available suggest that it is possible to label PHYSAT anomalies not only with dominant groups, but at the level of the community structure.
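
    The significant correlations reported between chlorophyll a measurements and the cytometer's integrated red fluorescence are the kind of check a Pearson coefficient captures; a minimal sketch with illustrative numbers, not the cruise data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical station values: chlorophyll a vs. summed per-cell red fluorescence.
chl_a = [0.8, 1.5, 2.1, 3.0, 4.2]
red_fl = [10.1, 17.8, 26.0, 35.5, 51.2]
r = pearson_r(chl_a, red_fl)  # strongly positive for co-varying proxies
```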

  20. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  1. Fully automated protein purification

    PubMed Central

    Camper, DeMarco V.; Viola, Ronald E.

    2009-01-01

    Obtaining highly purified proteins is essential to begin investigating their functional and structural properties. The steps that are typically involved in purifying proteins can include an initial capture, intermediate purification, and a final polishing step. Completing these steps can take several days and require frequent attention to ensure success. Our goal was to design automated protocols that will allow the purification of proteins with minimal operator intervention. Separate methods have been produced and tested that automate the sample loading, column washing, sample elution and peak collection steps for ion-exchange, metal affinity, hydrophobic interaction and gel filtration chromatography. These individual methods are designed to be coupled and run sequentially in any order to achieve a flexible and fully automated protein purification protocol. PMID:19595984
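
    The key design point above is that individually automated steps can be coupled and run sequentially in any order; that composition can be sketched generically (step names here are placeholders, not the authors' method files):

```python
def run_protocol(steps, sample):
    """Run purification steps in sequence, each transforming the sample."""
    for step in steps:
        sample = step(sample)
    return sample

# Toy steps: each records its action in the sample's processing history.
def ion_exchange(s):   return s + ["ion-exchange"]
def metal_affinity(s): return s + ["metal-affinity"]
def gel_filtration(s): return s + ["gel-filtration"]

# Any ordering is valid; capture -> intermediate -> polishing shown here.
purified = run_protocol([metal_affinity, ion_exchange, gel_filtration], [])
```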

  2. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and for the data mining and analysis tools needed to generate new leads from experimental protein drug target structures.

  3. Significant reduction in errors associated with nonbonded contacts in protein crystal structures: automated all-atom refinement with PrimeX.

    PubMed

    Bell, Jeffrey A; Ho, Kenneth L; Farid, Ramy

    2012-08-01

    All-atom models are essential for many applications in molecular modeling and computational chemistry. Nonbonded atomic contacts much closer than the sum of the van der Waals radii of the two atoms (clashes) are commonly observed in such models derived from protein crystal structures. A set of 94 recently deposited protein structures in the resolution range 1.5-2.8 Å were analyzed for clashes by the addition of all H atoms to the models followed by optimization and energy minimization of the positions of just these H atoms. The results were compared with the same set of structures after automated all-atom refinement with PrimeX and with nonbonded contacts in protein crystal structures at a resolution equal to or better than 0.9 Å. The additional PrimeX refinement produced structures with reasonable summary geometric statistics and similar R(free) values to the original structures. The frequency of clashes at less than 0.8 times the sum of van der Waals radii was reduced over fourfold compared with that found in the original structures, to a level approaching that found in the ultrahigh-resolution structures. Moreover, severe clashes at less than or equal to 0.7 times the sum of atomic radii were reduced 15-fold. All-atom refinement with PrimeX produced improved crystal structure models with respect to nonbonded contacts and yielded changes in structural details that dramatically impacted on the interpretation of some protein-ligand interactions.

  4. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  5. Structure_threader: An improved method for automation and parallelization of programs structure, fastStructure and MavericK on multicore CPU systems.

    PubMed

    Pina-Martins, Francisco; Silva, Diogo N; Fino, Joana; Paulo, Octávio S

    2017-08-04

    Structure_threader is a program to parallelize multiple runs of genetic clustering software that does not make use of multithreading technology (structure, fastStructure and MavericK) on multicore computers. Our approach was benchmarked across multiple systems and displayed great speed improvements relative to the single-threaded implementation, scaling very close to linearly with the number of physical cores used. Structure_threader was compared to previous software written for the same task (ParallelStructure and StrAuto) and was shown to be the faster wrapper (up to 25% faster) under all tested scenarios. Furthermore, Structure_threader can perform several automatic and convenient operations, assisting the user in assessing the most biologically likely value of 'K' via implementations such as the "Evanno" or "Thermodynamic Integration" tests, and automatically draws the "meanQ" plots (static or interactive) for each value of K (or even combined plots). Structure_threader is written in Python 3 and licensed under the GPLv3. It can be downloaded free of charge at https://github.com/StuntsPT/Structure_threader. © 2017 John Wiley & Sons Ltd.
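
    The wrapper's core job, fanning independent single-threaded runs out across cores, can be sketched as below; the `structure` command-line flags shown are hypothetical, not the program's actual interface:

```python
from concurrent.futures import ThreadPoolExecutor

def make_jobs(k_min, k_max, replicates):
    """One command per (K, replicate) pair, mirroring how a wrapper like
    Structure_threader fans out runs of a single-threaded program."""
    return [
        ["structure", "-K", str(k), "-o", f"K{k}_rep{r}"]  # hypothetical flags
        for k in range(k_min, k_max + 1)
        for r in range(1, replicates + 1)
    ]

def run_all(jobs, workers=4, runner=print):
    """Dispatch jobs across `workers` concurrent slots.

    In a real wrapper `runner` would be subprocess.run; it is injectable
    here so the dispatch logic can be exercised without the binary.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(runner, jobs))
```

Scaling is near-linear in physical cores precisely because each run is independent; the wrapper only schedules and collects.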

  6. SU-C-9A-02: Structured Noise Index as An Automated Quality Control for Nuclear Medicine: A Two Year Experience

    SciTech Connect

    Nelson, J; Christianson, O; Samei, E

    2014-06-01

    Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics, which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection that is subjective and time-demanding. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues, reporting issues in a timely manner for efficient correction prior to clinical involvement, all incorporated into an automated, effortless workflow, and to characterize the program over a two-year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed based on expert observer visual analysis. The metric, termed Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in non-uniformity identification in about 12% of daily flood images. In addition, due to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of gamma camera uniformity performance. It operates seamlessly across a fleet of multiple camera models. The automated process provides an effective workflow within NM between physicist, technologist, and clinical engineer.
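
    The SNI is built on 2D noise power spectrum metrology; a minimal sketch of the underlying NPS computation follows (the SNI's frequency weighting and observer model are omitted, so this is only the first stage):

```python
import numpy as np

def noise_power_spectrum(image, pixel_size=1.0):
    """2D noise power spectrum of a mean-subtracted uniformity image.

    Structured (non-quantum) noise concentrates at particular spatial
    frequencies; a metric like the SNI weights this spectrum to flag it.
    """
    residual = image - image.mean()
    nps = np.abs(np.fft.fftshift(np.fft.fft2(residual))) ** 2
    return nps * (pixel_size ** 2) / image.size
```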

  7. Prediction of the three-dimensional structures of the biotinylated domain from yeast pyruvate carboxylase and of the lipoylated H-protein from the pea leaf glycine cleavage system: a new automated method for the prediction of protein tertiary structure.

    PubMed Central

    Brocklehurst, S. M.; Perham, R. N.

    1993-01-01

    A new, automated, knowledge-based method for the construction of three-dimensional models of proteins is described. Geometric restraints on target structures are calculated from a consideration of homologous template structures and the wider knowledge base of unrelated protein structures. Three-dimensional structures are calculated from initial partly folded states by high-temperature molecular dynamics simulations followed by slow cooling of the system (simulated annealing) using nonphysical potentials. Three-dimensional models for the biotinylated domain from the pyruvate carboxylase of yeast and the lipoylated H-protein from the glycine cleavage system of pea leaf were constructed, based on the known structures of two lipoylated domains of 2-oxo acid dehydrogenase multienzyme complexes. Despite their weak sequence similarity, the three proteins are predicted to have similar three-dimensional structures, representative of a new protein module. Implications for the mechanisms of posttranslational modification of these proteins and their catalytic function are discussed. PMID:8518734
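
    The simulated-annealing stage described above (high-temperature exploration followed by slow cooling) can be sketched on a toy one-dimensional potential; the Metropolis rule and geometric cooling schedule below are illustrative, not the paper's MD protocol:

```python
import math
import random

def anneal(energy, x0, step=0.4, t_hot=5.0, t_cold=1e-3, n_steps=4000, seed=1):
    """Minimize `energy` by simulated annealing with Metropolis acceptance.

    Structure prediction applies the same schedule to molecular-dynamics
    moves under restraint-based pseudo-potentials; here the move is a
    random 1D perturbation.
    """
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    cool = (t_cold / t_hot) ** (1.0 / n_steps)  # geometric cooling factor
    t = t_hot
    for _ in range(n_steps):
        cand = x + rng.uniform(-step, step)
        e_cand = energy(cand)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if e_cand < e or rng.random() < math.exp((e - e_cand) / t):
            x, e = cand, e_cand
        t *= cool
    return x, e

# Toy "restraint" potential with its minimum at x = 2.
x_min, e_min = anneal(lambda x: (x - 2.0) ** 2, x0=-3.0)
```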

  8. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  9. Developing a Graphical User Interface to Automate the Estimation and Prediction of Risk Values for Flood Protective Structures using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Hasan, M.; Helal, A.; Gabr, M.

    2014-12-01

    In this project, we focus on providing a computer-automated platform for a better assessment of the potential failures and retrofit measures of flood-protecting earth structures, e.g., dams and levees. Such structures play an important role during extreme flooding events as well as during normal operating conditions. Furthermore, they are part of other civil infrastructures such as water storage and hydropower generation. Hence, there is a clear need for accurate evaluation of stability and functionality levels during their service lifetime so that rehabilitation and maintenance costs are effectively guided. Among condition assessment approaches based on the factor of safety, the limit states (LS) approach utilizes numerical modeling to quantify the probability of potential failures. The parameters for LS numerical modeling include i) geometry and side slopes of the embankment, ii) loading conditions in terms of rate of rise and duration of high water levels in the reservoir, and iii) cycles of rising and falling water levels simulating the effect of consecutive storms throughout the service life of the structure. Sample data regarding the correlations of these parameters are available through previous research studies. We have unified these criteria and extended the risk assessment in terms of loss of life through the implementation of a graphical user interface that automates the input parameters, divides the data into training and testing sets, and then feeds them into an Artificial Neural Network (ANN) tool through MATLAB programming. The ANN modeling allows us to predict risk values of flood protective structures based on user feedback quickly and easily. In the future, we expect to fine-tune the software by adding extensive data on variations of parameters.
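
    The division of sample data into training and testing sets before feeding the ANN can be sketched as follows (the fraction and seed are illustrative, and the study used MATLAB's ANN toolbox rather than Python):

```python
import random

def train_test_split(rows, test_fraction=0.3, seed=42):
    """Shuffle records (e.g. geometry/loading parameter sets with observed
    risk values) and split them into training and testing sets."""
    rng = random.Random(seed)
    shuffled = rows[:]          # copy so the caller's data stays intact
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]
```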

  10. Automated RTOP Management System

    NASA Technical Reports Server (NTRS)

    Hayes, P.

    1984-01-01

    The structure of NASA's Office of Aeronautics and Space Technology electronic information system network from 1983 to 1985 is illustrated. The RTOP automated system takes advantage of existing hardware, software, and expertise, and provides: (1) computerized cover sheet and resources forms; (2) electronic signature and transmission; (3) a data-based information system; (4) graphics; (5) intercenter communications; (6) management information; and (7) text editing. The system is coordinated with Headquarters efforts in codes R, E, and T.

  11. Accuracy and Reliability of Automated Gray Matter Segmentation Pathways on Real and Simulated Structural Magnetic Resonance Images of the Human Brain

    PubMed Central

    Eggert, Lucas D.; Sommer, Jens; Jansen, Andreas; Kircher, Tilo; Konrad, Carsten

    2012-01-01

    Automated gray matter segmentation of magnetic resonance imaging data is essential for morphometric analyses of the brain, particularly when large sample sizes are investigated. However, although detection of small structural brain differences may fundamentally depend on the method used, both accuracy and reliability of different automated segmentation algorithms have rarely been compared. Here, performance of the segmentation algorithms provided by SPM8, VBM8, FSL and FreeSurfer was quantified on simulated and real magnetic resonance imaging data. First, accuracy was assessed by comparing segmentations of twenty simulated and 18 real T1 images with corresponding ground truth images. Second, reliability was determined in ten T1 images from the same subject and in ten T1 images of different subjects scanned twice. Third, the impact of preprocessing steps on segmentation accuracy was investigated. VBM8 showed a very high accuracy and a very high reliability. FSL achieved the highest accuracy but demonstrated poor reliability and FreeSurfer showed the lowest accuracy, but high reliability. A universally valid recommendation on how to implement morphometric analyses is not warranted due to the vast number of scanning and analysis parameters. However, our analysis suggests that researchers can optimize their individual processing procedures with respect to final segmentation quality and exemplifies adequate performance criteria. PMID:23028771
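
    Accuracy against a ground-truth image, as assessed here, is commonly summarized by an overlap score such as the Dice coefficient (the paper's exact metric may differ); a minimal sketch over flattened binary masks:

```python
def dice_coefficient(mask_a, mask_b):
    """Dice overlap between two binary segmentation masks (flattened).

    1.0 means perfect overlap with the ground truth; 0.0 means none.
    """
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2.0 * inter / total if total else 1.0
```

Reliability, by contrast, would apply the same score to repeated segmentations of the same subject rather than to a ground truth.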

  12. Accuracy and reliability of automated gray matter segmentation pathways on real and simulated structural magnetic resonance images of the human brain.

    PubMed

    Eggert, Lucas D; Sommer, Jens; Jansen, Andreas; Kircher, Tilo; Konrad, Carsten

    2012-01-01

    Automated gray matter segmentation of magnetic resonance imaging data is essential for morphometric analyses of the brain, particularly when large sample sizes are investigated. However, although detection of small structural brain differences may fundamentally depend on the method used, both accuracy and reliability of different automated segmentation algorithms have rarely been compared. Here, performance of the segmentation algorithms provided by SPM8, VBM8, FSL and FreeSurfer was quantified on simulated and real magnetic resonance imaging data. First, accuracy was assessed by comparing segmentations of twenty simulated and 18 real T1 images with corresponding ground truth images. Second, reliability was determined in ten T1 images from the same subject and in ten T1 images of different subjects scanned twice. Third, the impact of preprocessing steps on segmentation accuracy was investigated. VBM8 showed a very high accuracy and a very high reliability. FSL achieved the highest accuracy but demonstrated poor reliability and FreeSurfer showed the lowest accuracy, but high reliability. A universally valid recommendation on how to implement morphometric analyses is not warranted due to the vast number of scanning and analysis parameters. However, our analysis suggests that researchers can optimize their individual processing procedures with respect to final segmentation quality and exemplifies adequate performance criteria.

  13. Significant reduction in errors associated with nonbonded contacts in protein crystal structures: automated all-atom refinement with PrimeX

    PubMed Central

    Bell, Jeffrey A.; Ho, Kenneth L.; Farid, Ramy

    2012-01-01

    All-atom models are essential for many applications in molecular modeling and computational chemistry. Nonbonded atomic contacts much closer than the sum of the van der Waals radii of the two atoms (clashes) are commonly observed in such models derived from protein crystal structures. A set of 94 recently deposited protein structures in the resolution range 1.5–2.8 Å were analyzed for clashes by the addition of all H atoms to the models followed by optimization and energy minimization of the positions of just these H atoms. The results were compared with the same set of structures after automated all-atom refinement with PrimeX and with nonbonded contacts in protein crystal structures at a resolution equal to or better than 0.9 Å. The additional PrimeX refinement produced structures with reasonable summary geometric statistics and similar R free values to the original structures. The frequency of clashes at less than 0.8 times the sum of van der Waals radii was reduced over fourfold compared with that found in the original structures, to a level approaching that found in the ultrahigh-resolution structures. Moreover, severe clashes at less than or equal to 0.7 times the sum of atomic radii were reduced 15-fold. All-atom refinement with PrimeX produced improved crystal structure models with respect to nonbonded contacts and yielded changes in structural details that dramatically impacted on the interpretation of some protein–ligand interactions. PMID:22868759

  14. Three-dimensional structure of lipid vesicles embedded in vitreous ice and investigated by automated electron tomography.

    PubMed

    Dierksen, K; Typke, D; Hegerl, R; Walz, J; Sackmann, E; Baumeister, W

    1995-04-01

    Automated electron tomography is shown to be a suitable means to visualize the shape of phospholipid vesicles embedded in vitrified ice. With a slow-scan charge-coupled device camera as a recording device, the cumulative electron dose needed to record a data set of 60 projections at a magnification of 20,000X can be kept as low as 15 e-/A2 (or 1500 electrons/nm2). The membrane of the three-dimensionally reconstructed vesicles is clearly visible in two-dimensional sections through the three-dimensionally reconstructed volume. Some edges indicating a polygonal shape of the vesicles, frozen from the gel phase, are also clearly recognized. Because of the presently limited tilt angle range (+/- 60 degrees), the upper and lower "caps" of the vesicles (representing about 35% of the surface of the ellipsoidal particles) remain invisible in the three-dimensional reconstruction.
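
    The limited +/-60 degree tilt range mentioned above leaves a "missing wedge" of unsampled information, and the low-dose constraint spreads a fixed budget over all projections; a sketch of the related bookkeeping (note the abstract's 35% invisible-surface figure is a separate, shape-dependent quantity for ellipsoidal vesicles, not the Fourier wedge fraction):

```python
def missing_wedge_fraction(max_tilt_deg):
    """Fraction of Fourier space unsampled by a single-axis tilt series
    limited to +/- max_tilt_deg."""
    return (90.0 - max_tilt_deg) / 90.0

def per_projection_dose(total_dose, n_projections):
    """Uniform electron-dose allocation across a tilt series.

    The abstract's 60 projections within a 15 e-/A^2 budget imply
    0.25 e-/A^2 per projection.
    """
    return total_dose / n_projections
```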

  15. H++ 3.0: automating pK prediction and the preparation of biomolecular structures for atomistic molecular modeling and simulations.

    PubMed

    Anandakrishnan, Ramu; Aguilar, Boris; Onufriev, Alexey V

    2012-07-01

    The accuracy of atomistic biomolecular modeling and simulation studies depends on the accuracy of the input structures. Preparing these structures for an atomistic modeling task, such as molecular dynamics (MD) simulation, can involve the use of a variety of different tools for: correcting errors, adding missing atoms, filling valences with hydrogens, predicting pK values for titratable amino acids, assigning predefined partial charges and radii to all atoms, and generating force field parameter/topology files for MD. Identifying, installing and effectively using the appropriate tools for each of these tasks can be difficult for novice users and time-consuming for experienced ones. H++ (http://biophysics.cs.vt.edu/) is a free open-source web server that automates the above key steps in the preparation of biomolecular structures for molecular modeling and simulations. H++ also performs extensive error and consistency checking, providing error/warning messages together with the suggested corrections. In addition to numerous minor improvements, the latest version of H++ includes several new capabilities and options: fix erroneous (flipped) side chain conformations for HIS, GLN and ASN, include a ligand in the input structure, process nucleic acid structures and generate a solvent box with specified number of common ions for explicit solvent MD.
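
    Once a pK has been predicted, the protonation state at a target pH follows from the Henderson-Hasselbalch relation; a minimal sketch (H++'s actual predictions come from continuum-electrostatics calculations, so this formula only illustrates the final pK-to-state step):

```python
def protonated_fraction(pka, ph):
    """Fraction of a titratable site that is protonated at a given pH
    (Henderson-Hasselbalch)."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

def assign_state(pka, ph):
    """Pick the majority protonation state, as a structure-preparation
    step must ultimately do for each titratable residue."""
    return "protonated" if protonated_fraction(pka, ph) >= 0.5 else "deprotonated"
```

For example, a histidine-like site with pK near 6 is predominantly deprotonated at physiological pH 7.4.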

  16. Beyond the Twilight Zone: automated prediction of structural properties of proteins by recursive neural networks and remote homology information.

    PubMed

    Mooney, Catherine; Pollastri, Gianluca

    2009-10-01

    The prediction of 1D structural properties of proteins is an important step toward the prediction of protein structure and function, not only in the ab initio case but also when homology information to known structures is available. Despite this, the vast majority of 1D predictors do not incorporate homology information into the prediction process. We develop a novel structural alignment method, SAMD, which we use to build alignments of putative remote homologues that we compress into templates of structural frequency profiles. We use these templates as additional input to ensembles of recursive neural networks, which we specialise for the prediction of query sequences that show only remote homology to any Protein Data Bank structure. We predict four 1D structural properties: secondary structure, relative solvent accessibility, backbone structural motifs, and contact density. Secondary structure prediction accuracy, tested by five-fold cross-validation on a large set of proteins allowing less than 25% sequence identity between training and test set and between query sequences and templates, exceeds 82%, outperforming its ab initio counterpart, other state-of-the-art secondary structure predictors (Jpred 3 and PSIPRED) and two other systems based on PSI-BLAST and COMPASS templates. We show that structural information from homologues improves prediction accuracy well beyond the Twilight Zone of sequence similarity, even below 5% sequence identity, for all four structural properties. Significant improvement over the extraction of structural information directly from PDB templates suggests that the combination of sequence and template information is more informative than templates alone.
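
    The secondary structure accuracy figure quoted above is conventionally the per-residue three-state score Q3; a minimal sketch of that measure over helix/strand/coil strings:

```python
def q3_accuracy(predicted, observed):
    """Three-state (H/E/C) per-residue secondary-structure accuracy, in %.

    Both arguments are equal-length strings of per-residue state labels.
    """
    assert len(predicted) == len(observed)
    correct = sum(p == o for p, o in zip(predicted, observed))
    return 100.0 * correct / len(observed)
```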

  17. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    To run a targeted campaign involves coordination and management across numerous organizations and complex process flows. Every step, from market analytics on customer databases through acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving through production and fulfillment, to evaluating results, is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted is how the complexity of running a targeted campaign is hidden from the user through technology, all while providing the benefits of a professionally managed campaign.

  18. The automation of science.

    PubMed

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-03

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge.

  19. FIJI Macro 3D ART VeSElecT: 3D Automated Reconstruction Tool for Vesicle Structures of Electron Tomograms

    PubMed Central

    Kaltdorf, Kristin Verena; Schulze, Katja; Helmprobst, Frederik; Kollmannsberger, Philip; Stigloher, Christian

    2017-01-01

    Automatic image reconstruction is critical to cope with steadily increasing data from advanced microscopy. We describe here the Fiji macro 3D ART VeSElecT, which we developed to study synaptic vesicles in electron tomograms. We apply this tool to quantify vesicle properties (i) in embryonic Danio rerio at 4 and 8 days post fertilization (dpf) and (ii) to compare Caenorhabditis elegans N2 neuromuscular junctions (NMJ) of the wild-type and its septin mutant (unc-59(e261)). We demonstrate development-specific and mutant-specific changes in synaptic vesicle pools in both models. We confirm the functionality of our macro by applying 3D ART VeSElecT to the zebrafish NMJ, showing smaller vesicles in 8 dpf embryos than at 4 dpf, which was validated by manual reconstruction of the vesicle pool. Furthermore, we analyze the impact of the C. elegans septin mutant unc-59(e261) on vesicle pool formation and vesicle size. Automated vesicle registration and characterization was implemented in Fiji as two macros (registration and measurement). This flexible arrangement allows, in particular, reducing false positives by an optional manual revision step. Preprocessing and contrast enhancement work on image stacks of 1 nm/pixel in the x and y directions. Semi-automated cell selection was integrated. 3D ART VeSElecT removes interfering components, detects vesicles by 3D segmentation, and calculates vesicle volume and diameter (spherical approximation, inner/outer diameter). Results are collected in color using the RoiManager plugin, including the possibility of manual removal of non-matching confounder vesicles. Detailed evaluation considered performance (detected vesicles) and specificity (true vesicles) as well as precision and recall. We furthermore show a gain in segmentation and morphological filtering compared to learning-based methods, and a large time gain compared to manual segmentation: 3D ART VeSElecT shows small error rates and can be up to 68 times faster than manual annotation.
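
    The spherical approximation mentioned above reduces to a one-line formula: a vesicle's reported diameter is that of a sphere with the same volume as its segmented voxel count. A sketch, with an illustrative voxel size rather than the paper's calibration:

```python
import math

# Spherical approximation: diameter of a sphere whose volume equals the
# segmented voxel volume. The 1 nm voxel size is an assumed example value.

def diameter_from_volume(n_voxels, voxel_nm=1.0):
    """Diameter (nm) of a sphere with the same volume as the voxel count."""
    volume = n_voxels * voxel_nm ** 3
    return (6.0 * volume / math.pi) ** (1.0 / 3.0)

# A sphere of radius 10 voxels contains (4/3)*pi*10^3 voxels; we should
# recover a 20 nm diameter.
n = (4.0 / 3.0) * math.pi * 10 ** 3
print(round(diameter_from_volume(n), 2))  # 20.0
```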

  20. Automated discovery of structural features of the optic nerve head on the basis of image and genetic data

    NASA Astrophysics Data System (ADS)

    Christopher, Mark; Tang, Li; Fingert, John H.; Scheetz, Todd E.; Abramoff, Michael D.

    2014-03-01

    Evaluation of optic nerve head (ONH) structure is a commonly used clinical technique for both diagnosis and monitoring of glaucoma. Glaucoma is associated with characteristic changes in the structure of the ONH. We present a method for computationally identifying ONH structural features using both imaging and genetic data from a large cohort of participants at risk for primary open angle glaucoma (POAG). Using 1054 participants from the Ocular Hypertension Treatment Study (OHTS), ONH structure was measured by application of a stereo correspondence algorithm to stereo fundus images. In addition, the genotypes of several known POAG genetic risk factors were considered for each participant. ONH structural features were discovered using both a principal component analysis approach to identify the major modes of variance within structural measurements and a linear discriminant analysis approach to capture the relationship between genetic risk factors and ONH structure. The identified ONH structural features were evaluated based on the strength of their associations with genotype and development of POAG by the end of the OHTS study. ONH structural features with strong associations with genotype were identified for each of the genetic loci considered. Several identified ONH structural features were significantly associated (p < 0.05) with the development of POAG after Bonferroni correction. Further, incorporation of genetic risk status was found to substantially increase performance of early POAG prediction. These results suggest incorporating both imaging and genetic data into ONH structural modeling significantly improves the ability to explain POAG-related changes to ONH structure.
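
    The first feature-discovery step above is standard PCA: find the major modes of variance in a participants-by-measurements matrix. A dependency-free sketch using power iteration on synthetic two-column data (the real measurements are high-dimensional stereo-correspondence outputs):

```python
# Minimal PCA sketch: power iteration on the sample covariance matrix to
# find the leading mode of variance. Data are synthetic, roughly y = 2x.

def leading_mode(data, iters=200):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d  # initial guess; converges to the dominant eigenvector
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

data = [[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.2]]
mode = leading_mode(data)  # should point roughly along (1, 2)
```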

  1. Structure tensor based automated detection of macular edema and central serous retinopathy using optical coherence tomography images.

    PubMed

    Hassan, Bilal; Raja, Gulistan; Hassan, Taimur; Usman Akram, M

    2016-04-01

    Macular edema (ME) and central serous retinopathy (CSR) are two macular diseases that affect the central vision of a person if they are left untreated. Optical coherence tomography (OCT) imaging is the latest eye examination technique that shows a cross-sectional region of the retinal layers and that can be used to detect many retinal disorders at an early stage. Many researchers have done clinical studies on ME and CSR and reported significant findings in macular OCT scans. This paper proposes an automated method for the classification of ME and CSR from OCT images using a support vector machine (SVM) classifier. Five distinct features (three based on the thickness profiles of the sub-retinal layers and two based on cyst fluids within the sub-retinal layers) are extracted from 30 labeled images (10 ME, 10 CSR, and 10 healthy), and the SVM is trained on these. We applied our proposed algorithm to 90 time-domain OCT (TD-OCT) images (30 ME, 30 CSR, 30 healthy) of 73 patients. Our algorithm correctly classified 88 out of 90 subjects with accuracy, sensitivity, and specificity of 97.77%, 100%, and 93.33%, respectively.
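
    The pipeline above is "five features per scan, then a trained classifier". The paper uses an SVM; as a dependency-free stand-in, this sketch runs a nearest-centroid classifier on invented 5-element feature vectors (three thickness-like values, two cyst-fluid-like counts) purely to illustrate the data flow:

```python
# Stand-in for the paper's SVM step: nearest-centroid classification on
# synthetic 5-feature vectors. All feature values and labels are invented.

def centroid(rows):
    n = len(rows)
    return [sum(r[j] for r in rows) / n for j in range(len(rows[0]))]

def classify(x, centroids):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

train = {
    "ME":      [[310, 295, 300, 2, 1], [320, 300, 310, 3, 2]],
    "CSR":     [[280, 260, 270, 0, 4], [285, 265, 275, 1, 5]],
    "healthy": [[250, 240, 245, 0, 0], [255, 245, 250, 0, 0]],
}
centroids = {label: centroid(rows) for label, rows in train.items()}
print(classify([315, 298, 305, 2, 1], centroids))  # ME
```

    In practice an SVM with a tuned kernel replaces the centroid rule, but the train-on-labeled-features / predict-on-new-scans structure is the same.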

  2. Elucidating structural order and disorder phenomena in mullite-type Al4B2O9 by automated electron diffraction tomography

    NASA Astrophysics Data System (ADS)

    Zhao, Haishuang; Krysiak, Yaşar; Hoffmann, Kristin; Barton, Bastian; Molina-Luna, Leopoldo; Neder, Reinhard B.; Kleebe, Hans-Joachim; Gesing, Thorsten M.; Schneider, Hartmut; Fischer, Reinhard X.; Kolb, Ute

    2017-05-01

    The crystal structure and disorder phenomena of Al4B2O9, an aluminum borate from the mullite-type family, were studied using automated diffraction tomography (ADT), a recently established method for the collection and analysis of electron diffraction data. Al4B2O9, prepared by a sol-gel approach, crystallizes in the monoclinic space group C2/m. The ab initio structure determination based on three-dimensional electron diffraction data from single ordered crystals reveals that edge-connected AlO6 octahedra expanding along the b axis constitute the backbone. The ordered structure (A) was confirmed by TEM and HAADF-STEM images. Furthermore, disordered crystals with diffuse scattering along the b axis are observed. Analysis of the modulation pattern implies a mean superstructure (AAB) with a threefold b axis, where B corresponds to an A layer shifted by ½a and ½c. Diffraction patterns simulated for the AAB sequence, including additional stacking disorder, are in good agreement with experimental electron diffraction patterns.

  3. An automated approach for defining core atoms and domains in an ensemble of NMR-derived protein structures.

    PubMed

    Kelley, L A; Gardner, S P; Sutcliffe, M J

    1997-06-01

    A single NMR-derived protein structure is usually deposited as an ensemble containing many structures, each consistent with the restraint set used. The number of NMR-derived structures deposited in the Protein Data Bank (PDB) is increasing rapidly. In addition, many of the structures deposited in an ensemble exhibit variation in only some regions of the structure, often with the majority of the structure remaining largely invariant across the family of structures. Therefore it is useful to determine the set of atoms whose positions are 'well defined' across an ensemble (also known as the 'core' atoms). We have developed a computer program, NMRCORE, which automatically defines (i) the core atoms, and (ii) the rigid body(ies), or domain(s), in which they occur. The program uses a sorted list of the variances in individual dihedral angles across the ensemble to define the core, followed by the automatic clustering of the variances in pairwise inter-atom distances across the ensemble to define the rigid body(ies) which comprise the core. The program is freely available via the World Wide Web (http://neon.chem.le.ac.uk/nmrcore/).
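
    The core-definition step above ranks dihedral angles by their variance across the ensemble. A standard way to score angular spread, sketched here on illustrative angle lists (not NMRCORE's exact statistic), is the circular variance 1 − |mean resultant vector|:

```python
import math

# Circular variance of a dihedral angle across ensemble members: 0 for a
# perfectly well-defined angle, up to 1 for a fully disordered one.
# The angle lists below are illustrative, not from a real PDB ensemble.

def circular_variance(angles_deg):
    n = len(angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg) / n
    s = sum(math.sin(math.radians(a)) for a in angles_deg) / n
    return 1.0 - math.hypot(c, s)

# One well-defined and one disordered dihedral across four models:
well_defined = [-60.0, -58.0, -62.0, -61.0]
disordered = [-60.0, 120.0, 30.0, -150.0]

# Sorting by variance puts the well-defined dihedral first, i.e. in the core.
core = sorted([well_defined, disordered], key=circular_variance)[0]
```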

  4. Structure-Function Modeling of Optical Coherence Tomography and Standard Automated Perimetry in the Retina of Patients with Autosomal Dominant Retinitis Pigmentosa.

    PubMed

    Smith, Travis B; Parker, Maria; Steinkamp, Peter N; Weleber, Richard G; Smith, Ning; Wilson, David J

    2016-01-01

    To assess relationships between structural and functional biomarkers, including new topographic measures of visual field sensitivity, in patients with autosomal dominant retinitis pigmentosa. Spectral domain optical coherence tomography line scans and hill of vision (HOV) sensitivity surfaces from full-field standard automated perimetry were semi-automatically aligned for 60 eyes of 35 patients. Structural biomarkers were extracted from outer retina b-scans along horizontal and vertical midlines. Functional biomarkers were extracted from local sensitivity profiles along the b-scans and from the full visual field. These included topographic measures of functional transition such as the contour of most rapid sensitivity decline around the HOV, herein called HOV slope for convenience. Biomarker relationships were assessed pairwise by coefficients of determination (R²) from mixed-effects analysis with automatic model selection. Structure-function relationships were accurately modeled (conditional R² > 0.8 in most cases). The best-fit relationship models and correlation patterns for horizontally oriented biomarkers were different than vertically oriented ones. The structural biomarker with the largest number of significant functional correlates was the ellipsoid zone (EZ) width, followed by the total photoreceptor layer thickness. The strongest correlation observed was between EZ width and HOV slope distance (marginal R² = 0.85, p < 10⁻¹⁰). The mean sensitivity defect at the EZ edge was 7.6 dB. Among all functional biomarkers, the HOV slope mean value, HOV slope mean distance, and maximum sensitivity along the b-scan had the largest number of significant structural correlates. Topographic slope metrics show promise as functional biomarkers relevant to the transition zone. EZ width is strongly associated with the location of most rapid HOV decline.

  5. Towards fully automated structure-based NMR resonance assignment of ¹⁵N-labeled proteins from automatically picked peaks.

    PubMed

    Jang, Richard; Gao, Xin; Li, Ming

    2011-03-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both ¹⁵N-labeled and ¹³C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only ¹⁵N-labeled NMR data and avoid the added expense of ¹³C labeling. We propose a novel integer programming framework for structure-based backbone resonance assignment using ¹⁵N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant.
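
    At its core, the second model above matches spin systems to residues so that the total compatibility (from amino-acid typing and structural constraints) is maximal. The paper solves this with integer programming; as a toy stand-in, this sketch brute-forces the same matching over permutations, with an invented score matrix:

```python
from itertools import permutations

# Toy stand-in for the assignment step: pick the spin-system-to-residue
# matching with maximal total compatibility score. Brute force replaces the
# paper's integer program; the scores below are invented for illustration.

def best_assignment(score):
    """score[i][j] = compatibility of spin system i with residue j."""
    n = len(score)
    best = max(permutations(range(n)),
               key=lambda p: sum(score[i][p[i]] for i in range(n)))
    return list(best)

score = [
    [0.9, 0.1, 0.2],
    [0.2, 0.8, 0.1],
    [0.3, 0.2, 0.7],
]
print(best_assignment(score))  # [0, 1, 2]
```

    Brute force is exponential; the integer-programming formulation is what makes the full-size problem, with noisy automatically picked peaks, tractable.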

  6. Structure-Function Modeling of Optical Coherence Tomography and Standard Automated Perimetry in the Retina of Patients with Autosomal Dominant Retinitis Pigmentosa

    PubMed Central

    Smith, Travis B.; Parker, Maria; Steinkamp, Peter N.; Weleber, Richard G.; Smith, Ning; Wilson, David J.

    2016-01-01

    Purpose To assess relationships between structural and functional biomarkers, including new topographic measures of visual field sensitivity, in patients with autosomal dominant retinitis pigmentosa. Methods Spectral domain optical coherence tomography line scans and hill of vision (HOV) sensitivity surfaces from full-field standard automated perimetry were semi-automatically aligned for 60 eyes of 35 patients. Structural biomarkers were extracted from outer retina b-scans along horizontal and vertical midlines. Functional biomarkers were extracted from local sensitivity profiles along the b-scans and from the full visual field. These included topographic measures of functional transition such as the contour of most rapid sensitivity decline around the HOV, herein called HOV slope for convenience. Biomarker relationships were assessed pairwise by coefficients of determination (R²) from mixed-effects analysis with automatic model selection. Results Structure-function relationships were accurately modeled (conditional R² > 0.8 in most cases). The best-fit relationship models and correlation patterns for horizontally oriented biomarkers were different than vertically oriented ones. The structural biomarker with the largest number of significant functional correlates was the ellipsoid zone (EZ) width, followed by the total photoreceptor layer thickness. The strongest correlation observed was between EZ width and HOV slope distance (marginal R² = 0.85, p < 10⁻¹⁰). The mean sensitivity defect at the EZ edge was 7.6 dB. Among all functional biomarkers, the HOV slope mean value, HOV slope mean distance, and maximum sensitivity along the b-scan had the largest number of significant structural correlates. Conclusions Topographic slope metrics show promise as functional biomarkers relevant to the transition zone. EZ width is strongly associated with the location of most rapid HOV decline. PMID:26845445

  7. Automated 3D architecture reconstruction from photogrammetric structure-and-motion: A case study of the One Pilla pagoda, Hanoi, Vietnam

    NASA Astrophysics Data System (ADS)

    To, T.; Nguyen, D.; Tran, G.

    2015-04-01

    Vietnam's heritage sites have declined because of poor conservation conditions. Sustainable development requires firm control, spatial planning, and reasonable investment. Moreover, in the field of Cultural Heritage, automated photogrammetric systems based on Structure-from-Motion (SfM) techniques are widely used. With their potential for high resolution, low cost, large field of view, ease of use, rapidity, and completeness, the derivation of 3D metric information from Structure-and-Motion images is receiving great attention. In addition, heritage objects in the form of 3D physical models are recorded not only for documentation, but also for historical interpretation, restoration, and cultural and educational purposes. This study presents the archaeological documentation of the One Pilla pagoda in Hanoi, Vietnam. The data were acquired with a Canon EOS 550D digital camera (CMOS APS-C sensor, 22.3 x 14.9 mm). Camera calibration and orientation were carried out with the VisualSFM, CMPMVS (Multi-View Reconstruction), and SURE (Photogrammetric Surface Reconstruction from Imagery) software. The final result is a scaled 3D model of the One Pilla pagoda, displayed in different views in the MeshLab software.

  8. Automated three-dimensional registration and volume rebuilding for wide-field angiographic and structural optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Zang, Pengxiao; Liu, Gangjun; Zhang, Miao; Wang, Jie; Hwang, Thomas S.; Wilson, David J.; Huang, David; Li, Dengwang; Jia, Yali

    2017-02-01

    We propose a three-dimensional (3-D) registration method to correct motion artifacts and construct the volume structure for angiographic and structural optical coherence tomography (OCT). This algorithm is particularly suitable for nonorthogonal wide-field OCT scans acquired by an ultrahigh-speed swept-source system (>200 kHz A-scan rate). First, the transverse motion artifacts are corrected by between-frame registration based on en face OCT angiography (OCTA). After A-scan transverse translation between B-frames, the axial motions are corrected based on the rebuilt boundary of the inner limiting membrane. Finally, a within-frame registration is performed for local optimization based on cross-sectional OCTA. We evaluated this algorithm on retinal volumes of six normal subjects. The results showed significantly improved retinal smoothness in 3-D-registered structural OCT and image contrast on en face OCTA.
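
    The between-frame step above amounts to finding, for each B-frame, the transverse shift that best aligns it with its neighbor. A 1-D pure-Python sketch of the idea, maximizing correlation over candidate offsets (the paper works on 2-D en face OCTA, not toy profiles like these):

```python
# Toy between-frame registration: find the integer shift that maximizes the
# correlation between a reference profile and a motion-displaced profile.

def best_shift(ref, moving, max_shift=5):
    def corr(shift):
        pairs = [(ref[i], moving[i - shift])
                 for i in range(len(ref))
                 if 0 <= i - shift < len(moving)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_shift, max_shift + 1), key=corr)

ref = [0, 0, 0, 1, 5, 9, 5, 1, 0, 0]
moving = [0, 1, 5, 9, 5, 1, 0, 0, 0, 0]  # same profile displaced left by 2
print(best_shift(ref, moving))  # 2
```

    The axial correction then plays the same game along depth, using the rebuilt inner-limiting-membrane boundary as the reference.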

  9. Characterization of PZT Capacitor Structures with Various Electrode Materials Processed In-Situ Using AN Automated, Rotating Elemental Target, Ion Beam Deposition System

    NASA Astrophysics Data System (ADS)

    Gifford, Kenneth Douglas

    Ferroelectric thin film capacitor structures containing lead zirconate titanate (PZT) as the dielectric, with the chemical formula Pb(ZrxTi1-x)O3, were synthesized in-situ with an automated ion beam sputter deposition system. Platinum (Pt), conductive ruthenium oxide (RuO2), and two types of Pt-RuO2 hybrid electrodes were used as the electrode materials. The capacitor structures are characterized in terms of microstructure and electrical characteristics. Reduction or elimination of non-ferroelectric phases, which nucleate during PZT processing on Pt/TiO2/MgO and RuO2/MgO substrates, is achieved by reducing the thickness of the individually deposited layers and by interposing a buffer layer (~100-200 Å) of PbTiO3 (PT) between the bottom electrode and the PZT film. Capacitor structures containing a Pt electrode exhibit poor fatigue resistance, regardless of the PZT microstructure or the use of a PT buffer layer. From these results, and results from similar capacitors synthesized with sol-gel and laser ablation, PZT-based capacitor structures containing Pt electrodes are considered to be unsuitable for use in memory devices. Using a PT buffer layer in capacitor structures containing RuO2 top and bottom electrodes and polycrystalline, highly (101) oriented PZT reduces or eliminates the nucleation of zirconium-titanium oxide, non-ferroelectric species at the bottom electrode interface during processing. This results in good fatigue resistance up to ~2×10^10 switching cycles. DC leakage current density vs. time measurements follow the Curie-von Schweidler law, J(t) ~ t^(-n). Identification of the high electric field current conduction mechanism is inconclusive. The good fatigue resistance, low dc leakage current, and excellent retention qualify these capacitor structures for use in non-volatile random access (NVRAM) and dynamic random access (DRAM) memory devices. 
Excellent fatigue resistance (10% loss in remanent polarization up to
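
    The Curie-von Schweidler law J(t) ~ t^(-n) cited above is linear in log-log coordinates, so the exponent n is just the negative slope of log J versus log t. A sketch with synthetic data generated at n = 0.3 (not the dissertation's measurements):

```python
import math

# Fit the Curie-von Schweidler exponent n from J(t) ~ t^(-n) by ordinary
# least squares on log-transformed data. Data are synthetic with n = 0.3.

def fit_exponent(ts, js):
    xs = [math.log(t) for t in ts]
    ys = [math.log(j) for j in js]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # J decays, so the exponent is the negative slope

ts = [1.0, 10.0, 100.0, 1000.0]
js = [t ** -0.3 for t in ts]
print(round(fit_exponent(ts, js), 3))  # 0.3
```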

  10. Reliability and validity of MRI-based automated volumetry software relative to auto-assisted manual measurement of subcortical structures in HIV-infected patients from a multisite study.

    PubMed

    Dewey, Jeffrey; Hana, George; Russell, Troy; Price, Jared; McCaffrey, Daniel; Harezlak, Jaroslaw; Sem, Ekta; Anyanwu, Joy C; Guttmann, Charles R; Navia, Bradford; Cohen, Ronald; Tate, David F

    2010-07-15

    The automated volumetric output of FreeSurfer and Individual Brain Atlases using Statistical Parametric Mapping (IBASPM), two widely used and well published software packages, was examined for accuracy and consistency relative to auto-assisted manual (AAM) tracings (i.e., manual correction of automated output) when measuring the caudate, putamen, amygdala, and hippocampus in the baseline scans of 120 HIV-infected patients (86.7% male, 47.3 ± 6.3 y.o., mean HIV duration 12.0 ± 6.3 years) from the NIH-funded HIV Neuroimaging Consortium (HIVNC) cohort. The data were examined for accuracy and consistency relative to auto-assisted manual tracing, and construct validity was assessed by correlating automated and AAM volumetric measures with relevant clinical measures of HIV progression. When results were averaged across all patients in the eight structures examined, FreeSurfer achieved a lower absolute volume difference in five, higher sensitivity in seven, and higher spatial overlap in all eight structures. Additionally, FreeSurfer results exhibited less variability in all measures. Output from both methods identified discrepant correlations with clinical measures of HIV progression relative to AAM segmented data. Overall, FreeSurfer proved more effective in the context of subcortical volumetry in HIV patients, particularly in a multisite cohort study such as this. These findings emphasize that regardless of the automated method used, visual inspection of segmentation output, along with manual correction if necessary, remains critical to ensuring the validity of reported results.
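
    The "spatial overlap" compared above is conventionally scored with the Dice coefficient between the automated and the auto-assisted manual masks. A sketch on tiny synthetic voxel sets (not HIVNC data):

```python
# Dice coefficient between two segmentations represented as voxel-index
# sets: 2|A ∩ B| / (|A| + |B|). Masks below are synthetic examples.

def dice(mask_a, mask_b):
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * len(a & b) / (len(a) + len(b))

auto = {(1, 1), (1, 2), (2, 1), (2, 2)}
manual = {(1, 1), (1, 2), (2, 1), (3, 1)}
print(dice(auto, manual))  # 0.75
```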

  11. Using a semi-automated filtering process to improve large footprint lidar sub-canopy elevation models and forest structure metrics

    NASA Astrophysics Data System (ADS)

    Fricker, G. A.; Saatchi, S.; Meyer, V.; Gillespie, T.; Sheng, Y.

    2011-12-01

    Quantification of sub-canopy topography and forest structure is important for developing a better understanding of how forest ecosystems function. This study focuses on a three-step method to adapt discrete return lidar (DRL) filtering techniques to Laser Vegetation Imaging Sensor (LVIS) large-footprint lidar (LFL) waveforms to improve the accuracy of both sub-canopy digital elevation models (DEMs) and forest structure measurements. The results of the experiment demonstrate that LFL ground surfaces can be effectively filtered using methods adapted from DRL point filtering, and the resulting data produce more accurate digital elevation models as well as improved estimates of forest structure. The first step quantifies the slope present at the center of each LFL pulse, and the average error expected at each particular degree of slope is modeled. Areas of high terrain slope show consistently more error in LFL ground detection, and empirical relationships between terrain angle and expected LVIS ground detection error are established. These relationships are then used to create an algorithm for LFL ground elevation correction. The second step uses an iterative, expanding-window filter to identify outlier points which are not part of the ground surface, as well as manual editing to identify laser pulses which are not at ground level. The semi-automated methods markedly improved the LVIS DEM accuracy by identifying significant outliers in the LVIS point cloud. The final step develops an approach which utilizes both the filtered LFL DEMs and the modeled error introduced by terrain slope to improve both sub-canopy elevation models and above-ground LFL waveform metrics. DRL and LVIS data from Barro Colorado Island, Panama, and La Selva, Costa Rica were used to develop and test the algorithm. Acknowledgements: Special thanks to Dr. Jim Dilling for providing the DRL lidar data for Barro Colorado Island.
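
    The second step above (an iterative, expanding-window outlier filter) can be sketched in 1-D: flag any elevation that sits far above the median of a neighborhood window, then repeat with a larger window. Window sizes and the threshold here are illustrative, not the study's settings:

```python
# Toy expanding-window outlier filter: drop ground candidates that sit more
# than `threshold` above the median of their neighbors, growing the window
# each pass. Canopy returns sit above the true ground, hence the one-sided test.

def filter_outliers(elevs, windows=(1, 2, 3), threshold=5.0):
    keep = list(elevs)
    for w in windows:
        flagged = set()
        for i in range(len(keep)):
            nbrs = [keep[j] for j in range(max(0, i - w),
                                           min(len(keep), i + w + 1)) if j != i]
            med = sorted(nbrs)[len(nbrs) // 2]
            if keep[i] - med > threshold:
                flagged.add(i)
        keep = [e for i, e in enumerate(keep) if i not in flagged]
    return keep

elevs = [100.0, 100.5, 130.0, 101.0, 101.2, 100.8]  # 130 m = canopy return
cleaned = filter_outliers(elevs)
```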

  12. Intensity targeted radial structure tensor analysis and its application for automated mediastinal lymph node detection from CT volumes

    NASA Astrophysics Data System (ADS)

    Oda, Hirohisa; Nimura, Yukitaka; Oda, Masahiro; Kitasaka, Takayuki; Iwano, Shingo; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku

    2016-03-01

    This paper presents a new blob-like enhancement filter based on Intensity Targeted Radial Structure Tensor (ITRST) analysis to improve mediastinal lymph node detection from chest CT volumes. A blob-like structure enhancement filter based on Radial Structure Tensor (RST) analysis can be utilized for initial detection of lymph node candidate regions. However, some lymph nodes cannot be detected because RST analysis is influenced by neighboring regions whose intensity is very high or low, such as contrast-enhanced blood vessels and air. To overcome this problem, we propose ITRST analysis, which integrates prior knowledge of the detection target intensity into RST analysis. Our lymph node detection method consists of two steps. First, candidate regions are obtained by ITRST analysis. Second, false positives (FPs) are removed by a Support Vector Machine (SVM) classifier. We applied the proposed method to 47 cases. Among the 19 lymph nodes whose short axis is no less than 10 mm, 100.0% were detected with 247.7 FPs/case by ITRST analysis, while only 80.0% were detected with 123.0 FPs/case by RST analysis. After FP reduction by SVM, ITRST analysis outperformed RST analysis in lymph node detection performance.
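
    The key idea of "intensity targeting" is to weight the blob response by how close each voxel's intensity is to the expected lymph-node intensity, so contrast-enhanced vessels and air cannot dominate. This toy 1-D sketch (a simple center-surround contrast, not the actual radial structure tensor) illustrates only that weighting idea; the signal and intensities are invented:

```python
import math

# Toy "intensity targeted" blob score: center-surround contrast multiplied
# by a Gaussian weight on the distance from the target intensity. The real
# method uses a radial structure tensor in 3-D; this is a 1-D illustration.

def targeted_blob_score(signal, target, sigma=30.0):
    scores = []
    for i in range(1, len(signal) - 1):
        contrast = signal[i] - (signal[i - 1] + signal[i + 1]) / 2.0
        weight = math.exp(-((signal[i] - target) ** 2) / (2 * sigma ** 2))
        scores.append(contrast * weight)
    return scores

# A soft-tissue blob (~60 HU) next to a contrast-enhanced vessel (~300 HU):
signal = [0, 0, 60, 0, 0, 300, 0, 0]
scores = targeted_blob_score(signal, target=60.0)
# The 60 HU blob scores high; the 300 HU vessel is suppressed by the weight.
```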

  13. Automated Building Extraction from High-Resolution Satellite Imagery in Urban Areas Using Structural, Contextual, and Spectral Information

    NASA Astrophysics Data System (ADS)

    Jin, Xiaoying; Davis, Curt H.

    2005-12-01

    High-resolution satellite imagery provides an important new data source for building extraction. We demonstrate an integrated strategy for identifying buildings in 1-meter resolution satellite imagery of urban areas. Buildings are extracted using structural, contextual, and spectral information. First, a series of geodesic opening and closing operations are used to build a differential morphological profile (DMP) that provides image structural information. Building hypotheses are generated and verified through shape analysis applied to the DMP. Second, shadows are extracted using the DMP to provide reliable contextual information to hypothesize the position and size of adjacent buildings. Seed building rectangles are verified and grown on a finely segmented image. Next, bright buildings are extracted using spectral information. The extraction results from the different information sources are combined after independent extraction. Performance evaluation of the building extraction on an urban test site using IKONOS satellite imagery of the City of Columbia, Missouri, is reported. With the combination of structural, contextual, and spectral information, the fraction of building areas extracted and the corresponding quality percentage are reported as inline equations available only in the full text.
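
    A differential morphological profile like the one above is built from grayscale openings at increasing structuring-element sizes, then differencing successive openings: a structure "pops out" in the difference at the scale that first removes it. A 1-D pure-Python sketch with flat structuring elements (the paper uses 2-D geodesic operators):

```python
# Toy 1-D differential morphological profile: openings with growing flat
# structuring elements, then differences between successive openings.

def opening(signal, size):
    def erode(s):
        return [min(s[max(0, i - size):i + size + 1]) for i in range(len(s))]
    def dilate(s):
        return [max(s[max(0, i - size):i + size + 1]) for i in range(len(s))]
    return dilate(erode(signal))

def dmp(signal, sizes=(1, 2, 3)):
    profile, prev = [], signal
    for k in sizes:
        opened = opening(signal, k)
        profile.append([p - o for p, o in zip(prev, opened)])
        prev = opened
    return profile

signal = [0, 0, 9, 9, 9, 0, 0, 0]  # a bright "building" 3 pixels wide
profile = dmp(signal)
# The 3-pixel structure survives the size-1 opening (zero difference) and
# disappears at size 2, so it appears in the second difference layer.
```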

  14. Automated method for determination of dissolved organic carbon-water distribution constants of structurally diverse pollutants using pre-equilibrium solid-phase microextraction.

    PubMed

    Ripszam, Matyas; Haglund, Peter

    2015-02-01

    Dissolved organic carbon (DOC) plays a key role in determining the environmental fate of semivolatile organic environmental contaminants. The goal of the present study was to develop a method using commercially available hardware to rapidly characterize the sorption properties of DOC in water samples. The resulting method uses negligible-depletion direct immersion solid-phase microextraction (SPME) and gas chromatography-mass spectrometry. Its performance was evaluated using Nordic reference fulvic acid and 40 priority environmental contaminants that cover a wide range of physicochemical properties. Two SPME fibers had to be used to cope with the span of properties, one coated with polydimethylsiloxane and one coated with polystyrene divinylbenzene polydimethylsiloxane, for nonpolar and semipolar contaminants, respectively. The measured DOC-water distribution constants showed reasonably good reproducibility (standard deviation ≤ 0.32) and good correlation (R² = 0.80) with log octanol-water partition coefficients for nonpolar persistent organic pollutants. The sample pretreatment is limited to filtration, and the method is easy to adjust to different DOC concentrations. These experiments also utilized the latest SPME automation, which largely decreases the total cycle time (to 20 min or shorter) and increases sample throughput, which is advantageous when many samples of DOC must be characterized or when the determinations must be performed quickly, for example, to avoid precipitation, aggregation, and other changes of DOC structure and properties. The data generated by this method are valuable as a basis for transport and fate modeling studies.
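
    Once negligible-depletion SPME yields the freely dissolved concentration, the DOC-water distribution constant follows from a simple mass balance: the DOC-bound fraction is the total minus the free concentration. A sketch with illustrative numbers (not measured values from the study):

```python
import math

# Mass-balance sketch: K_DOC = (C_bound / C_free) / [DOC], in L/kg.
# All concentrations below are invented for illustration.

def log_kdoc(c_total, c_free, doc_kg_per_l):
    """log10 of the DOC-water distribution constant (L/kg)."""
    c_bound = c_total - c_free
    return math.log10(c_bound / (c_free * doc_kg_per_l))

# 100 ng/L total, 20 ng/L freely dissolved, 10 mg/L DOC = 1e-5 kg/L:
print(round(log_kdoc(100.0, 20.0, 1e-5), 2))  # 5.6
```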

  15. Role of molar concentration in structural, optical and gas sensing performance of anatase phase TiO2 nanofilms: automated nebulizer spray pyrolysis (ANSP) technique

    NASA Astrophysics Data System (ADS)

    Gopala Krishnan, V.; Elango, P.; Ganesan, V.

    2017-07-01

    TiO2 nanofilms were deposited on glass substrates at 500 °C using automated nebulizer spray pyrolysis. XRD showed an anatase polycrystalline structure, with the grain size and the texture coefficients (Tc) of the preferred planes influenced by the molar concentration. AFM showed that the average roughness increased with increasing molar concentration. A granular, domain-like microstructure with crack- and void-free particles was examined by FESEM. The maximum transmittance of 95.5% (at 529.6 nm) was obtained for x = 0.05 M/L; further increases in molar concentration decreased the transmittance and red-shifted the absorption edge, with calculated band gap values of Eg = 3.53-3.20 eV. The gas sensing performance of the films was studied with respect to various gas sensing parameters; ammonia (NH3) gas showed the best sensing response (Smax = 89%) at 150 °C for a 300 ppm gas concentration compared with the other gases tested (C2H6O, CH4O, C3H8O and C3H6O).
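
    Band gaps like the Eg = 3.53-3.20 eV values above are typically extracted from transmittance spectra via a Tauc plot: the absorption coefficient is alpha = -ln(T)/d, and extrapolating (alpha*h*nu)^2 versus photon energy to zero gives a direct-gap estimate. The thickness, transmittances, and energies below are invented numbers chosen only to land in the paper's reported range:

```python
import math

# Two-point direct-gap Tauc extrapolation from transmittance data.
# alpha = -ln(T)/d; plot (alpha*E)^2 vs E and find the energy-axis intercept.
# All input values are illustrative, not the paper's measurements.

def tauc_gap(points, thickness_cm):
    """points: [(photon_energy_eV, transmittance)], in the linear Tauc region."""
    xy = []
    for e, t in points:
        alpha = -math.log(t) / thickness_cm
        xy.append((e, (alpha * e) ** 2))
    (x1, y1), (x2, y2) = xy[0], xy[-1]
    slope = (y2 - y1) / (x2 - x1)
    return x2 - y2 / slope  # extrapolate the line to (alpha*E)^2 = 0

points = [(3.4, 0.60), (3.6, 0.20)]          # (eV, transmittance)
eg = tauc_gap(points, thickness_cm=3e-5)      # ~300 nm film
```

    A real analysis fits the full linear region of the Tauc plot rather than two points; this sketch only shows the extrapolation step.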

  16. Determination of Molecular Structures of HIV Envelope Glycoproteins using Cryo-Electron Tomography and Automated Sub-tomogram Averaging

    PubMed Central

    Meyerson, Joel R.; White, Tommi A.; Bliss, Donald; Moran, Amy; Bartesaghi, Alberto; Borgnia, Mario J.; de la Cruz, M. Jason V.; Schauder, David; Hartnell, Lisa M.; Nandwani, Rachna; Dawood, Moez; Kim, Brianna; Kim, Jun Hong; Sununu, John; Yang, Lisa; Bhatia, Siddhant; Subramaniam, Carolyn; Hurt, Darrell E.; Gaudreault, Laurent; Subramaniam, Sriram

    2011-01-01

    Since its discovery nearly 30 years ago, more than 60 million people have been infected with the human immunodeficiency virus (HIV) (www.usaid.gov). The virus infects and destroys CD4+ T-cells thereby crippling the immune system, and causing an acquired immunodeficiency syndrome (AIDS) (2). Infection begins when the HIV Envelope glycoprotein "spike" makes contact with the CD4 receptor on the surface of the CD4+ T-cell. This interaction induces a conformational change in the spike, which promotes interaction with a second cell surface co-receptor (5,9). The significance of these protein interactions in the HIV infection pathway makes them of profound importance in fundamental HIV research, and in the pursuit of an HIV vaccine. The need to better understand the molecular-scale interactions of HIV cell contact and neutralization motivated the development of a technique to determine the structures of the HIV spike interacting with cell surface receptor proteins and molecules that block infection. Using cryo-electron tomography and 3D image processing, we recently demonstrated the ability to determine such structures on the surface of native virus, at ˜20 Å resolution (9,14). This approach is not limited to resolving HIV Envelope structures, and can be extended to other viral membrane proteins and proteins reconstituted on a liposome. In this protocol, we describe how to obtain structures of HIV envelope glycoproteins starting from purified HIV virions and proceeding stepwise through preparing vitrified samples, collecting cryo-electron microscopy data, reconstituting and processing 3D data volumes, averaging and classifying 3D protein subvolumes, and interpreting results to produce a protein model. The computational aspects of our approach were adapted into modules that can be accessed and executed remotely using the Biowulf GNU/Linux parallel processing cluster at the NIH (http://biowulf.nih.gov). This remote access, combined with low-cost computer hardware and high

  17. Determination of molecular structures of HIV envelope glycoproteins using cryo-electron tomography and automated sub-tomogram averaging.

    PubMed

    Meyerson, Joel R; White, Tommi A; Bliss, Donald; Moran, Amy; Bartesaghi, Alberto; Borgnia, Mario J; de la Cruz, M Jason V; Schauder, David; Hartnell, Lisa M; Nandwani, Rachna; Dawood, Moez; Kim, Brianna; Kim, Jun Hong; Sununu, John; Yang, Lisa; Bhatia, Siddhant; Subramaniam, Carolyn; Hurt, Darrell E; Gaudreault, Laurent; Subramaniam, Sriram

    2011-12-01

    Since its discovery nearly 30 years ago, more than 60 million people have been infected with the human immunodeficiency virus (HIV) (www.usaid.gov). The virus infects and destroys CD4+ T-cells thereby crippling the immune system, and causing an acquired immunodeficiency syndrome (AIDS) (2). Infection begins when the HIV Envelope glycoprotein "spike" makes contact with the CD4 receptor on the surface of the CD4+ T-cell. This interaction induces a conformational change in the spike, which promotes interaction with a second cell surface co-receptor (5,9). The significance of these protein interactions in the HIV infection pathway makes them of profound importance in fundamental HIV research, and in the pursuit of an HIV vaccine. The need to better understand the molecular-scale interactions of HIV cell contact and neutralization motivated the development of a technique to determine the structures of the HIV spike interacting with cell surface receptor proteins and molecules that block infection. Using cryo-electron tomography and 3D image processing, we recently demonstrated the ability to determine such structures on the surface of native virus, at ˜20 Å resolution (9,14). This approach is not limited to resolving HIV Envelope structures, and can be extended to other viral membrane proteins and proteins reconstituted on a liposome. In this protocol, we describe how to obtain structures of HIV envelope glycoproteins starting from purified HIV virions and proceeding stepwise through preparing vitrified samples, collecting cryo-electron microscopy data, reconstituting and processing 3D data volumes, averaging and classifying 3D protein subvolumes, and interpreting results to produce a protein model. The computational aspects of our approach were adapted into modules that can be accessed and executed remotely using the Biowulf GNU/Linux parallel processing cluster at the NIH (http://biowulf.nih.gov). This remote access, combined with low-cost computer hardware and

  18. Use of conditional rule structure to automate clinical decision support: a comparison of artificial intelligence and deterministic programming techniques.

    PubMed

    Friedman, R H; Frank, A D

    1983-08-01

    A rule-based computer system was developed to perform clinical decision-making support within a medical information system, oncology practice, and clinical research. This rule-based system, which has been programmed using deterministic rules, possesses features of generalizability, modularity of structure, convenience in rule acquisition, explainability, and utility for patient care and teaching, features which have been identified as advantages of artificial intelligence (AI) rule-based systems. Formal rules are primarily represented as conditional statements; common conditions and actions are stored in system dictionaries so that they can be recalled at any time to form new decision rules. Important similarities and differences exist in the structure of this system and clinical computer systems utilizing artificial intelligence (AI) production rule techniques. The non-AI rule-based system possesses advantages in cost and ease of implementation. The degree to which significant medical decision problems can be solved by this technique remains uncertain as does whether the more complex AI methodologies will be required.
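The conditional rule structure the abstract describes, with conditions and actions held in system dictionaries and recombined into new rules, might be sketched as follows (the condition names, thresholds, and advice texts are hypothetical, not taken from the system described):

```python
# Shared dictionaries of reusable conditions and actions, mirroring the
# "system dictionaries" described in the abstract.
CONDITIONS = {
    "anc_low": lambda pt: pt["anc"] < 1500,
    "on_chemo": lambda pt: "chemotherapy" in pt["treatments"],
}
ACTIONS = {
    "hold_chemo": "Recommend holding chemotherapy dose",
    "recheck_cbc": "Recommend repeat CBC in 48 hours",
}
# A rule is a conditional statement: if all named conditions hold,
# emit the named actions.
RULES = [
    {"if": ["anc_low", "on_chemo"], "then": ["hold_chemo", "recheck_cbc"]},
]

def evaluate(patient):
    """Deterministically evaluate every rule against one patient record."""
    advice = []
    for rule in RULES:
        if all(CONDITIONS[c](patient) for c in rule["if"]):
            advice.extend(ACTIONS[a] for a in rule["then"])
    return advice

print(evaluate({"anc": 900, "treatments": ["chemotherapy"]}))
```

A deterministic engine like this trades the flexibility of AI production-rule systems for the cost and ease-of-implementation advantages the authors note.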

  19. A linear programming approach to reconstructing subcellular structures from confocal images for automated generation of representative 3D cellular models.

    PubMed

    Wood, Scott T; Dean, Brian C; Dean, Delphine

    2013-04-01

    This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery.
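The core idea, reconstructing a representative fiber network as a linear superposition of candidate fibers that minimizes discrepancy with the image, can be sketched in miniature. A least-squares solve stands in here for the paper's linear program, and the tiny "image" and fiber masks are invented for illustration:

```python
import numpy as np

# Candidate fiber templates rasterized onto a tiny 4-pixel image
# (rows = pixels, columns = fibers); masks are invented for illustration.
fibers = np.array([
    [1.0, 1.0, 0.0, 0.0],   # fiber 0 covers pixels 0-1
    [0.0, 0.0, 1.0, 1.0],   # fiber 1 covers pixels 2-3
    [1.0, 0.0, 0.0, 1.0],   # fiber 2 covers pixels 0 and 3
]).T
observed = np.array([2.0, 1.0, 0.5, 1.5])   # observed F-actin intensities

# Find fiber weights whose linear superposition best matches the image
# (least squares here; the paper minimizes the discrepancy with an LP).
weights, *_ = np.linalg.lstsq(fibers, observed, rcond=None)
reconstruction = fibers @ weights
print(np.round(weights, 3))
```

The selected weights define which candidate fibers (and at what intensity) make up the representative network exported to the finite element model.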

  20. Use of conditional rule structure to automate clinical decision support: a comparison of artificial intelligence and deterministic programming techniques

    SciTech Connect

    Friedman, R.H.; Frank, A.D.

    1983-08-01

    A rule-based computer system was developed to perform clinical decision-making support within a medical information system, oncology practice, and clinical research. This rule-based system, which has been programmed using deterministic rules, possesses features of generalizability, modularity of structure, convenience in rule acquisition, explanability, and utility for patient care and teaching, features which have been identified as advantages of artificial intelligence (AI) rule-based systems. Formal rules are primarily represented as conditional statements; common conditions and actions are stored in system dictionaries so that they can be recalled at any time to form new decision rules. Important similarities and differences exist in the structure of this system and clinical computer systems utilizing artificial intelligence (AI) production rule techniques. The non-AI rule-based system posesses advantages in cost and ease of implementation. The degree to which significant medical decision problems can be solved by this technique remains uncertain as does whether the more complex AI methodologies will be required. 15 references.

  1. Automated Approaches to RFI Flagging

    NASA Astrophysics Data System (ADS)

    Garimella, Karthik; Momjian, Emmanuel

    2017-01-01

    It is known that Radio Frequency Interference (RFI) is a major issue in centimeter wavelength radio astronomy. Radio astronomy software packages include tools to excise RFI, both manual and automated, utilizing the visibilities (the uv data). Here we present results on an automated RFI flagging approach that utilizes a uv-grid, which is the intermediate product when converting uv data points to an image. It is a well-known fact that any signal that appears widespread in a given domain (e.g., image domain) is compact in the Fourier domain (uv-grid domain), i.e., RFI sources that appear as large-scale structures (e.g., stripes) in images can be located and flagged using the uv-grid data set. We developed several automated uv-grid based flagging algorithms to detect and excise RFI. These algorithms will be discussed, and results of applying them to measurement sets will be presented.
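The uv-grid flagging idea, image-plane stripes collapsing to a few compact Fourier cells that can be sigma-clipped, can be demonstrated with a synthetic example; the threshold rule and the fake "stripe" RFI below are illustrative, not one of the authors' algorithms:

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(0.0, 1.0, (64, 64))                 # noise-like sky
image += 5.0 * np.sin(2 * np.pi * np.arange(64) / 8)   # vertical RFI stripes

uv = np.fft.fftshift(np.fft.fft2(image))               # the uv-grid
amp = np.abs(uv)

# Sigma-clip: the stripes, widespread in the image, are compact here,
# so a simple amplitude threshold isolates them.
flags = amp > np.median(amp) + 5.0 * amp.std()
uv[flags] = 0.0                                        # excise flagged cells

cleaned = np.fft.ifft2(np.fft.ifftshift(uv)).real
```

The sinusoidal stripes land in only a couple of uv cells, so zeroing those cells removes the stripes while leaving the noise background essentially untouched.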

  2. Space power subsystem automation technology

    NASA Technical Reports Server (NTRS)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  3. Combining automated peak tracking in SAR by NMR with structure-based backbone assignment from 15N-NOESY

    PubMed Central

    2012-01-01

    Background Chemical shift mapping is an important technique in NMR-based drug screening for identifying the atoms of a target protein that potentially bind to a drug molecule upon the molecule's introduction in increasing concentrations. The goal is to obtain a mapping of peaks with known residue assignment from the reference spectrum of the unbound protein to peaks with unknown assignment in the target spectrum of the bound protein. Although a series of perturbed spectra help to trace a path from reference peaks to target peaks, a one-to-one mapping generally is not possible, especially for large proteins, due to errors, such as noise peaks, missing peaks, missing but then reappearing, overlapped, and new peaks not associated with any peaks in the reference. Due to these difficulties, the mapping is typically done manually or semi-automatically, which is not efficient for high-throughput drug screening. Results We present PeakWalker, a novel peak walking algorithm for fast-exchange systems that models the errors explicitly and performs many-to-one mapping. On the proteins: hBclXL, UbcH5B, and histone H1, it achieves an average accuracy of over 95% with less than 1.5 residues predicted per target peak. Given these mappings as input, we present PeakAssigner, a novel combined structure-based backbone resonance and NOE assignment algorithm that uses just 15N-NOESY, while avoiding TOCSY experiments and 13C-labeling, to resolve the ambiguities for a one-to-one mapping. On the three proteins, it achieves an average accuracy of 94% or better. Conclusions Our mathematical programming approach for modeling chemical shift mapping as a graph problem, while modeling the errors directly, is potentially a time- and cost-effective first step for high-throughput drug screening based on limited NMR data and homologous 3D structures. PMID:22536902

  4. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum.

    PubMed

    Brunger, Axel T; Das, Debanu; Deacon, Ashley M; Grant, Joanna; Terwilliger, Thomas C; Read, Randy J; Adams, Paul D; Levitt, Michael; Schröder, Gunnar F

    2012-04-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence.

  5. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum

    PubMed Central

    Brunger, Axel T.; Das, Debanu; Deacon, Ashley M.; Grant, Joanna; Terwilliger, Thomas C.; Read, Randy J.; Adams, Paul D.; Levitt, Michael; Schröder, Gunnar F.

    2012-01-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence. PMID:22505259

  6. Autonomy and Automation

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  7. A method and knowledge base for automated inference of patient problems from structured data in an electronic medical record

    PubMed Central

    Pang, Justine; Feblowitz, Joshua C; Maloney, Francine L; Wilcox, Allison R; Ramelson, Harley Z; Schneider, Louise I; Bates, David W

    2011-01-01

    Background Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. Objective To develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. Study design and methods We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100 000 patient records to assess their performance compared to gold standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100 000 records to assess its accuracy. Results Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100 000 randomly selected patients showed high sensitivity (range: 62.8–100.0%) and positive predictive value (range: 79.8–99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. Conclusion We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts. PMID:21613643
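One such inference rule might look like the following sketch, combining laboratory results, medications, and billing codes; the thresholds and code lists are illustrative assumptions, not the validated rules from the study:

```python
def infer_diabetes(record):
    """Infer a 'diabetes' problem from structured EMR data.

    The threshold and code lists below are illustrative, not the
    study's validated rules.
    """
    if record.get("hba1c", 0.0) >= 6.5:                      # laboratory result
        return True
    if {"insulin", "metformin"} & set(record.get("medications", [])):
        return True
    if "250.00" in record.get("billing_codes", []):          # ICD-9 diabetes code
        return True
    return False

# Problem inferred from medications even with no lab result or billing code:
print(infer_diabetes({"medications": ["metformin"]}))  # prints True
```

Combining several structured sources in one rule is what lets this approach outperform the problem list or billing data alone.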

  8. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  9. ELECTRIC WELDING EQUIPMENT AND AUTOMATION OF WELDING IN CONSTRUCTION,

    DTIC Science & Technology

    WELDING, *ARC WELDING, AUTOMATION, CONSTRUCTION, INDUSTRIES, POWER EQUIPMENT, GENERATORS, POWER TRANSFORMERS, RESISTANCE WELDING, SPOT WELDING, MACHINES, AUTOMATIC, STRUCTURES, WIRING DIAGRAMS, USSR.

  10. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  11. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors will use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems, some milestones are highlighted. Special attention is given to the influence of standards and guidelines on the development of automation systems.

  12. Shoe-String Automation

    SciTech Connect

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  13. Automating the multiprocessing environment

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.

    1989-01-01

    An approach to automate the programming and operation of tree-structured networks of multiprocessor systems is discussed. A conceptual, knowledge-based operating environment is presented, and requirements for two major technology elements are identified as follows: (1) An intelligent information translator is proposed for implementing information transfer between dissimilar hardware and software, thereby enabling independent and modular development of future systems and promoting language independence of codes and information; (2) A resident system activity manager, which recognizes the system's capabilities and monitors the status of all systems within the environment, is proposed for integrating dissimilar systems into effective parallel processing resources to optimally meet user needs. Finally, key computational capabilities which must be provided before the environment can be realized are identified.

  14. Distributed Experiment Automation System

    NASA Astrophysics Data System (ADS)

    Lebedev, Gennadi

    2003-03-01

    A module-based distributed system for controlling and automating scientific experiments was developed. The system divides into five main layers: 1. Data processing and presentation modules. 2. Controllers, which support primary command evaluation, data analysis, and synchronization between device drivers. 3. Data server, which provides real-time data storage and management. 4. Device drivers, which support communication, preliminary signal acquisition, and control of peripheral devices. 5. Utilities, which handle batch processing, logging, execution-error handling, persistent storage and management of experimental data, module and device monitoring, alarm states, and remote component messaging and notification processing. The system uses networking (the DCOM protocol) for communication between distributed modules. Configuration, module parameters, and data and command links are defined in a scripting file (XML format). This modular structure allows great flexibility and extensibility, as modules can be added and configured as required without extensive programming.
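An XML scripting file of the kind described, declaring modules, their layers, parameters, and the links between them, might be parsed as follows; the element and attribute names are hypothetical, not the system's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical configuration: one device driver and one controller linked to it.
CONFIG = """
<experiment>
  <module name="motor_driver" layer="device_driver" port="COM3"/>
  <module name="sync_controller" layer="controller">
    <link target="motor_driver"/>
  </module>
</experiment>
"""

root = ET.fromstring(CONFIG)
# Map each module to its layer, and collect the declared command/data links.
modules = {m.get("name"): m.get("layer") for m in root.findall("module")}
links = [(m.get("name"), link.get("target"))
         for m in root.findall("module") for link in m.findall("link")]
```

Keeping the wiring in a declarative file like this is what lets modules be added or reconfigured without reprogramming the system.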

  15. Protein fabrication automation

    PubMed Central

    Cox, J. Colin; Lape, Janel; Sayed, Mahmood A.; Hellinga, Homme W.

    2007-01-01

    Facile “writing” of DNA fragments that encode entire gene sequences potentially has widespread applications in biological analysis and engineering. Rapid writing of open reading frames (ORFs) for expressed proteins could transform protein engineering and production for protein design, synthetic biology, and structural analysis. Here we present a process, protein fabrication automation (PFA), which facilitates the rapid de novo construction of any desired ORF from oligonucleotides with low effort, high speed, and little human interaction. PFA comprises software for sequence design, data management, and the generation of instruction sets for liquid-handling robotics, a liquid-handling robot, a robust PCR scheme for gene assembly from synthetic oligonucleotides, and a genetic selection system to enrich correctly assembled full-length synthetic ORFs. The process is robust and scalable. PMID:17242375

  16. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and possibly different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  17. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds

  18. The disc damage likelihood scale: Diagnostic accuracy and correlations with cup-to-disc ratio, structural tests and standard automated perimetry.

    PubMed

    Kara-José, Andrea C; Melo, Luiz Alberto S; Esporcatte, Bruno L B; Endo, Angelica T N H; Leite, Mauro Toledo; Tavares, Ivan Maynart

    2017-01-01

    Our objective was to compare the diagnostic accuracies of and to determine the correlations between the disc damage likelihood scale (DDLS) and anatomical and functional tests used for glaucoma detection. A total of 54 healthy subjects (54 eyes) and 47 primary open-angle glaucoma patients (47 eyes) were included in this cross-sectional observational study. DDLS scores and cup-to-disc (C/D) ratios were evaluated. Subjects underwent standard automated perimetry (SAP), optic disc and retinal nerve fiber layer (RNFL) imaging with time and spectral-domain optical coherence tomography (TD and SD-OCT), Heidelberg Retina Tomograph (HRT II), and scanning laser polarimetry (GDx-VCC). Areas under the receiver operating characteristic curves (AROCs) for DDLS and diagnostic tests parameters were calculated. DDLS correlations (Spearman's rank) among these parameters were analyzed. Fifty-four eyes were healthy and 47 had glaucoma, including 16 preperimetric glaucoma. DDLS, vertical and horizontal C/D ratios had the largest AROCs (0.92, 0.94 and 0.91, respectively). DDLS diagnostic accuracy was better than the accuracies of HRT II parameters, TD and SD-OCT RNFL thicknesses, and SAP mean deviation (MD) index. There were no significant differences between the accuracies of the DDLS and the C/D ratios, TD-OCT vertical (0.89) and horizontal (0.86) C/D ratios, TD-OCT C/D area ratio (0.89), and GDx-VCC NFI (0.81). DDLS showed significant strong correlations with vertical (r = 0.79) and horizontal (0.74) C/D ratios, and with the parameters vertical C/D ratio and C/D area ratio from HRT II (both 0.77) and TD-OCT (0.75 and 0.72, respectively). DDLS had significant moderate correlations with most of the other structural measurements and SAP MD. The optic disc clinical evaluation with DDLS system and C/D ratio demonstrated excellent accuracy in distinguishing glaucomatous from healthy eyes. DDLS had moderate to strong correlations with most structural and functional parameters. These
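The AROCs reported above can be computed directly from per-eye scores with the Mann-Whitney formulation: the area equals the probability that a randomly chosen glaucomatous eye scores higher than a randomly chosen healthy eye, with ties counting one half. The scores below are invented for illustration:

```python
def aroc(diseased_scores, healthy_scores):
    """Area under the ROC curve via the Mann-Whitney statistic."""
    wins = 0.0
    for d in diseased_scores:
        for h in healthy_scores:
            if d > h:
                wins += 1.0
            elif d == h:
                wins += 0.5      # ties count one half
    return wins / (len(diseased_scores) * len(healthy_scores))

# Invented DDLS-like scores: higher = more disc damage.
print(aroc([7, 8, 6, 9], [3, 4, 6, 2]))  # prints 0.96875
```

An AROC of 1.0 means the score separates the groups perfectly; 0.5 means it is no better than chance.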

  19. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  20. Management Planning for Workplace Automation.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  1. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory.

  2. Complacency and Automation Bias in the Use of Imperfect Automation.

    PubMed

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  3. Hardware flexibility of laboratory automation systems: analysis and new flexible automation architectures.

    PubMed

    Najmabadi, Peyman; Goldenberg, Andrew A; Emili, Andrew

    2007-03-01

    Development of flexible laboratory automation systems has attracted tremendous attention in recent years, as biotechnology scientists perform diverse types of protocols and tend to continuously modify them as part of their research. This article is a system-level study of the hardware flexibility of laboratory automation architectures for high-throughput automation of various sample preparation protocols. Hardware flexibility (system components' adaptability to protocol variations) of automation systems is addressed through the introduction of three main parametric flexibility measures: functional, structural, and throughput. A new quantitative measurement method for these parameters in the realm of the Axiomatic Theory is introduced in this article. The method relies on defining probability-of-success functions for flexibility parameters and calculating their information contents. As flexibility information content decreases, automation system flexibility increases.
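    The information-content measure described above can be sketched in a few lines. In axiomatic design, the information content of a requirement with probability of success p is I = log2(1/p); the probabilities below are hypothetical placeholders, not values from the article:

    ```python
    import math

    def information_content(p_success: float) -> float:
        """Axiomatic-design information content: I = log2(1 / p)."""
        if not 0.0 < p_success <= 1.0:
            raise ValueError("probability of success must lie in (0, 1]")
        return math.log2(1.0 / p_success)

    # Hypothetical probabilities that a workcell accommodates a protocol change,
    # one per flexibility measure named in the abstract.
    p = {"functional": 0.9, "structural": 0.75, "throughput": 0.6}
    total_information = sum(information_content(v) for v in p.values())
    # Lower total information content implies a more flexible automation system.
    ```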

  4. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.

  5. Automating checks of plan check automation.

    PubMed

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  6. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  7. Order Division Automated System.

    ERIC Educational Resources Information Center

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  8. More Benefits of Automation.

    ERIC Educational Resources Information Center

    Getz, Malcolm

    1988-01-01

    Describes a study that measured the benefits of an automated catalog and automated circulation system from the library user's point of view in terms of the value of time saved. Topics discussed include patterns of use, access time, availability of information, search behaviors, and the effectiveness of the measures used. (seven references)…

  9. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  10. The Automated Office.

    ERIC Educational Resources Information Center

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  11. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Sherron, Gene T.

    1982-01-01

    The steps taken toward office automation by the University of Maryland are described. Office automation is defined and some types of word processing systems are described. Policies developed in the writing of a campus plan are listed, followed by a section on procedures adopted to implement the plan. (Author/MLW)

  12. WANTED: Fully Automated Indexing.

    ERIC Educational Resources Information Center

    Purcell, Royal

    1991-01-01

    Discussion of indexing focuses on the possibilities of fully automated indexing. Topics discussed include controlled indexing languages such as subject heading lists and thesauri, free indexing languages, natural indexing languages, computer-aided indexing, expert systems, and the need for greater creativity to further advance automated indexing.…

  13. Automation, parallelism, and robotics for proteomics.

    PubMed

    Alterovitz, Gil; Liu, Jonathan; Chow, Jijun; Ramoni, Marco F

    2006-07-01

    The speed of the human genome project (Lander, E. S., Linton, L. M., Birren, B., Nusbaum, C. et al., Nature 2001, 409, 860-921) was made possible, in part, by developments in automation of sequencing technologies. Before these technologies, sequencing was a laborious, expensive, and personnel-intensive task. Similarly, automation and robotics are changing the field of proteomics today. Proteomics is defined as the effort to understand and characterize proteins in the categories of structure, function and interaction (Englbrecht, C. C., Facius, A., Comb. Chem. High Throughput Screen. 2005, 8, 705-715). As such, this field nicely lends itself to automation technologies since these methods often require large economies of scale in order to achieve cost and time-saving benefits. This article describes some of the technologies and methods being applied in proteomics in order to facilitate automation within the field as well as in linking proteomics-based information with other related research areas.

  14. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  15. Making the transition to automation

    SciTech Connect

    Christenson, D.J.

    1992-10-01

    By 1995, the Bureau of Reclamation's hydropower plant near Hungry Horse, Montana, will be remotely operated from Grand Coulee dam (about 300 miles away) in Washington State. Automation at Hungry Horse will eliminate the need for four full-time power plant operators. Between now and then, a transition plan that offers employees choices for retraining, transferring, or taking early retirement will smooth the transition in reducing from five operators to one. The transition plan also includes the use of temporary employees to offset risks of reducing staff too soon. When completed in 1953, the Hungry Horse structure was the world's fourth largest and fourth highest concrete dam. The arch-gravity structure has a crest length of 2,115 feet; it is 3,565 feet above sea level. The four turbine-generator units in the powerhouse total 284 MW, and supply approximately 1 billion kilowatt-hours of electricity annually to the federal power grid managed by the Bonneville Power Administration. In 1988, Reclamation began to automate operations at many of its hydro plants, and to establish centralized control points. The control center concept will increase efficiency. It also will coordinate water movements and power supply throughout the West. In the Pacific Northwest, the Grand Coulee and Black Canyon plants are automated control centers. Several Reclamation-owned facilities in the Columbia River Basin, including Hungry Horse, will be connected to these centers via microwave and telephone lines. When automation is complete, constant monitoring by computer will replace hourly manual readings and equipment checks. Computers also are expected to increase water use efficiency by 1 to 2 percent by ensuring operation for maximum turbine efficiency. Unit efficiency curves for various heads will be programmed into the system.
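    The idea of programming unit efficiency curves into the control system can be illustrated with a minimal sketch: interpolate each unit's efficiency at the current head and pick the most efficient unit. The head and efficiency numbers below are invented for illustration and are not Hungry Horse data:

    ```python
    # Hypothetical per-unit efficiency curves: (head in feet, efficiency) pairs.
    CURVES = {
        "unit1": [(400, 0.86), (450, 0.90), (500, 0.88)],
        "unit2": [(400, 0.84), (450, 0.89), (500, 0.91)],
    }

    def interp(curve, head):
        """Linearly interpolate efficiency at the given head, clamping at the ends."""
        pts = sorted(curve)
        if head <= pts[0][0]:
            return pts[0][1]
        if head >= pts[-1][0]:
            return pts[-1][1]
        for (h0, e0), (h1, e1) in zip(pts, pts[1:]):
            if h0 <= head <= h1:
                return e0 + (e1 - e0) * (head - h0) / (h1 - h0)

    def best_unit(head):
        """Choose the unit with the highest interpolated efficiency at this head."""
        return max(CURVES, key=lambda u: interp(CURVES[u], head))
    ```

    A real dispatch scheme would also account for requested total discharge and split load across several units, but the curve lookup is the core of the 1 to 2 percent efficiency gain the article projects.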

  16. Definitive Metabolite Identification Coupled with Automated Ligand Identification System (ALIS) Technology: A Novel Approach to Uncover Structure-Activity Relationships and Guide Drug Design in a Factor IXa Inhibitor Program.

    PubMed

    Zhang, Ting; Liu, Yong; Yang, Xianshu; Martin, Gary E; Yao, Huifang; Shang, Jackie; Bugianesi, Randal M; Ellsworth, Kenneth P; Sonatore, Lisa M; Nizner, Peter; Sherer, Edward C; Hill, Susan E; Knemeyer, Ian W; Geissler, Wayne M; Dandliker, Peter J; Helmy, Roy; Wood, Harold B

    2016-03-10

    A potent and selective Factor IXa (FIXa) inhibitor was subjected to a series of liver microsomal incubations, which generated a number of metabolites. Using automated ligand identification system-affinity selection (ALIS-AS) methodology, metabolites in the incubation mixture were prioritized by their binding affinities to the FIXa protein. Microgram quantities of the metabolites of interest were then isolated through microisolation analytical capabilities, and structurally characterized using MicroCryoProbe heteronuclear 2D NMR techniques. The isolated metabolites recovered from the NMR experiments were then submitted directly to an in vitro FIXa enzymatic assay. The order of the metabolites' binding affinity to the Factor IXa protein from the ALIS assay was completely consistent with the enzymatic assay results. This work showcases an innovative and efficient approach to uncover structure-activity relationships (SARs) and guide drug design via microisolation-structural characterization and ALIS capabilities.

  17. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.

  18. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  19. Automated Fabrication Technologies for High Performance Polymer Composites

    NASA Technical Reports Server (NTRS)

    Shuart, M. J.; Johnston, N. J.; Dexter, H. B.; Marchello, J. M.; Grenoble, R. W.

    1998-01-01

    New fabrication technologies are being exploited for building high-performance graphite-fiber-reinforced composite structures. Stitched fiber preforms and resin film infusion have been successfully demonstrated for large, composite wing structures. Other automated processes being developed include automated placement of tacky, drapable epoxy towpreg, automated heated head placement of consolidated ribbon/tape, and vacuum-assisted resin transfer molding. These methods have the potential to yield low-cost, high-performance structures by fabricating composite structures to net shape out-of-autoclave.

  20. Automation synthesis modules review.

    PubMed

    Boschi, S; Lodi, F; Malizia, C; Cicoria, G; Marengo, M

    2013-06-01

    The introduction of (68)Ga labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived (68)Ge/(68)Ga generator has been at the basis of the development of (68)Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues, and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for (68)Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed. PMID:18925018

  2. Automating network meta-analysis.

    PubMed

    van Valkenhoef, Gert; Lu, Guobing; de Brock, Bert; Hillege, Hans; Ades, A E; Welton, Nicky J

    2012-12-01

    Mixed treatment comparison (MTC) (also called network meta-analysis) is an extension of traditional meta-analysis to allow the simultaneous pooling of data from clinical trials comparing more than two treatment options. Typically, MTCs are performed using general-purpose Markov chain Monte Carlo software such as WinBUGS, requiring a model and data to be specified using a specific syntax. It would be preferable if, for the most common cases, both could be derived from a well-structured data file that can be easily checked for errors. Automation is particularly valuable for simulation studies in which the large number of MTCs that have to be estimated may preclude manual model specification and analysis. Moreover, automated model generation raises issues that provide additional insight into the nature of MTC. We present a method for the automated generation of Bayesian homogeneous variance random effects consistency models, including the choice of basic parameters and trial baselines, priors, and starting values for the Markov chain(s). We validate our method against the results of five published MTCs. The method is implemented in freely available open source software. This means that performing an MTC no longer requires manually writing a statistical model. This reduces time and effort, and facilitates error checking of the dataset. Copyright © 2012 John Wiley & Sons, Ltd.
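    One step the abstract names, the automated choice of basic parameters, can be sketched as selecting a spanning tree of the treatment network rooted at a reference treatment: each tree edge becomes a basic parameter, and the remaining comparisons are expressed through consistency relations. This is a simplified illustration of the idea, not the authors' implementation:

    ```python
    from collections import defaultdict, deque

    def basic_parameters(comparisons, reference):
        """Pick basic parameters for an MTC model by growing a BFS spanning
        tree of the treatment network from the reference treatment.
        comparisons: iterable of (treatment_a, treatment_b) pairs with trial data."""
        graph = defaultdict(set)
        for a, b in comparisons:
            graph[a].add(b)
            graph[b].add(a)
        seen = {reference}
        basic = []
        queue = deque([reference])
        while queue:
            u = queue.popleft()
            for v in sorted(graph[u]):  # deterministic order
                if v not in seen:
                    seen.add(v)
                    basic.append((u, v))  # tree edge => basic parameter d_uv
                    queue.append(v)
        return basic

    # Triangle network A-B, B-C, A-C: two basic parameters, one consistency relation.
    params = basic_parameters([("A", "B"), ("B", "C"), ("A", "C")], "A")
    ```

    For the triangle above, the comparison B versus C is then modeled as d_AC - d_AB by consistency rather than as a free parameter.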

  3. Fully automated segmentation of cartilage from the MR images of knee using a multi-atlas and local structural analysis method

    PubMed Central

    Lee, June-Goo; Gumus, Serter; Moon, Chan Hong; Kwoh, C. Kent; Bae, Kyongtae Ty

    2014-01-01

    Purpose: To develop a fully automated method to segment cartilage from the magnetic resonance (MR) images of knee and to evaluate the performance of the method on a public, open dataset. Methods: The segmentation scheme consisted of three procedures: multiple-atlas building, applying a locally weighted vote (LWV), and region adjustment. In the atlas building procedure, all training cases were registered to a target image by a nonrigid registration scheme and the best matched atlases selected. A LWV algorithm was applied to merge the information from these atlases and generate the initial segmentation result. Subsequently, for the region adjustment procedure, the statistical information of bone, cartilage, and surrounding regions was computed from the initial segmentation result. The statistical information directed the automated determination of the seed points inside and outside bone regions for the graph-cut based method. Finally, the region adjustment was conducted by the revision of outliers and the inclusion of abnormal bone regions. Results: A total of 150 knee MR images from a public, open dataset (available at www.ski10.org) were used for the development and evaluation of this approach. The 150 cases were divided into the training set (100 cases) and the test set (50 cases). The cartilages were segmented successfully in all test cases in an average of 40 min computation time. The average dice similarity coefficient was 71.7% ± 8.0% for femoral and 72.4% ± 6.9% for tibial cartilage. Conclusions: The authors have developed a fully automated segmentation program for knee cartilage from MR images. The performance of the program based on 50 test cases was highly promising. PMID:25186408
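    The locally weighted vote and the Dice coefficient used for evaluation can be sketched as below, assuming binary label maps from registered atlases and precomputed local similarity weights. The arrays are toy data, not the SKI10 pipeline:

    ```python
    import numpy as np

    def locally_weighted_vote(labels, weights):
        """Fuse binary label maps from several registered atlases.
        labels, weights: arrays of shape (n_atlases, *image_shape), where
        weights encode local intensity similarity to the target image."""
        weighted = (labels * weights).sum(axis=0)
        return weighted / np.maximum(weights.sum(axis=0), 1e-12) > 0.5

    def dice(a, b):
        """Dice similarity coefficient between two binary masks."""
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    # Three toy atlases voting on a 4-voxel image.
    labels = np.array([[1, 1, 0, 0], [1, 0, 0, 1], [1, 1, 0, 0]])
    weights = np.array([[0.9, 0.8, 0.9, 0.9],
                        [0.2, 0.3, 0.2, 0.2],
                        [0.7, 0.6, 0.8, 0.7]])
    fused = locally_weighted_vote(labels, weights)
    ```

    Voxels where well-matched atlases agree dominate the vote, which is why atlas selection precedes fusion in the scheme above.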

  4. Fully automated segmentation of cartilage from the MR images of knee using a multi-atlas and local structural analysis method.

    PubMed

    Lee, June-Goo; Gumus, Serter; Moon, Chan Hong; Kwoh, C Kent; Bae, Kyongtae Ty

    2014-09-01

    To develop a fully automated method to segment cartilage from the magnetic resonance (MR) images of knee and to evaluate the performance of the method on a public, open dataset. The segmentation scheme consisted of three procedures: multiple-atlas building, applying a locally weighted vote (LWV), and region adjustment. In the atlas building procedure, all training cases were registered to a target image by a nonrigid registration scheme and the best matched atlases selected. A LWV algorithm was applied to merge the information from these atlases and generate the initial segmentation result. Subsequently, for the region adjustment procedure, the statistical information of bone, cartilage, and surrounding regions was computed from the initial segmentation result. The statistical information directed the automated determination of the seed points inside and outside bone regions for the graph-cut based method. Finally, the region adjustment was conducted by the revision of outliers and the inclusion of abnormal bone regions. A total of 150 knee MR images from a public, open dataset (available at www.ski10.org) were used for the development and evaluation of this approach. The 150 cases were divided into the training set (100 cases) and the test set (50 cases). The cartilages were segmented successfully in all test cases in an average of 40 min computation time. The average dice similarity coefficient was 71.7%±8.0% for femoral and 72.4%±6.9% for tibial cartilage. The authors have developed a fully automated segmentation program for knee cartilage from MR images. The performance of the program based on 50 test cases was highly promising.

  5. Monitoring of the physical status of Mars-500 subjects as a model of structuring an automated system in support of the training process in an exploration mission

    NASA Astrophysics Data System (ADS)

    Fomina, Elena; Savinkina, Alexandra; Kozlovskaya, Inesa; Lysova, Nataliya; Angeli, Tomas; Chernova, Maria; Uskov, Konstantin; Kukoba, Tatyana; Sonkin, Valentin; Ba, Norbert

    Physical training sessions aboard the ISS are performed under the permanent continuous control from Earth. Every week the instructors give their recommendations on how to proceed with the training considering the results of analysis of the daily records of training cosmonauts and data of the monthly fitness testing. It is obvious that in very long exploration missions this system of monitoring will be inapplicable. For this reason we venture to develop an automated system to control the physical training process using the current ISS locomotion test parameters as the leading criteria. Simulation of an extended exploration mission in experiment MARS-500 enabled the trial application of the automated system for assessing shifts in cosmonauts’ physical status in response to exercises of varying category and dismissal periods. Methods. Six subjects spent 520 days in the analog of an interplanetary vehicle at IBMP (Moscow). A variety of training regimens and facilities were used to maintain a high level of physical performance of the subjects. The resistance exercises involved expanders, strength training device (MDS) and vibrotraining device (Galileo). The cycling exercises were performed on the bicycle ergometer (VB-3) and a treadmill with the motor in or out of motion. To study the effect of prolonged periods of dismissal from training on physical performance, the training flow was interrupted for a month once in the middle and then at the end of isolation. In addition to the in-flight locomotion test integrated into the automated training control system, the physical status of subjects was attested by analysis of the records of the monthly incremental testing on the bicycle ergometer and MDS. Results. It was demonstrated that the recommended training regimens maintained high physical performance levels despite the limited motor activities in isolation. According to the locomotion testing, the subjects increased velocity significantly and reduced the physiological

  6. Automated fiber placement: Evolution and current demonstrations

    NASA Technical Reports Server (NTRS)

    Grant, Carroll G.; Benson, Vernon M.

    1993-01-01

    The automated fiber placement process has been in development at Hercules since 1980. Fiber placement is being developed specifically for aircraft and other high performance structural applications. Several major milestones have been achieved during process development. These milestones are discussed in this paper. The automated fiber placement process is currently being demonstrated on the NASA ACT program. All demonstration projects to date have focused on fiber placement of transport aircraft fuselage structures. Hercules has worked closely with Boeing and Douglas on these demonstration projects. This paper gives a description of demonstration projects and results achieved.

  7. Xenon International Automated Control

    SciTech Connect

    2016-08-05

    The Xenon International Automated Control software monitors, displays status, and allows for manual operator control as well as fully automatic control of multiple commercial and PNNL designed hardware components to generate and transmit atmospheric radioxenon concentration measurements every six hours.

  8. Automating the Media Center.

    ERIC Educational Resources Information Center

    Holloway, Mary A.

    1988-01-01

    Discusses the need to develop more efficient information retrieval skills by the use of new technology. Lists four stages used in automating the media center. Describes North Carolina's pilot programs. Proposes benefits and looks at the media center's future. (MVL)

  9. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  10. Automated decision stations

    NASA Technical Reports Server (NTRS)

    Tischendorf, Mark

    1990-01-01

    This paper discusses the combination of software robots and expert systems to automate everyday business tasks: tasks that require people to interact repetitively with multiple system screens as well as multiple systems.

  11. [Equipment components and system board for an automated microscopy system].

    PubMed

    Medovyĭ, V S; Parpara, A A; Sokolinskiĭ, B Z; Dem'ianov, V L

    2007-01-01

    Characteristics of modern equipment components for general- or special-purpose modular systems of automated microscopy are considered. Medical, technical, and economic assessment of various configurations of automated microscopy systems is performed. These systems can be used in telemedicine and hematological, cytological, and parasitologic analysis. Schemes for optimization of the configuration of automated microscopy systems are described. The structure of MECOS-Ts2 automated microscopy systems is considered. The system board of MECOS-Ts2 systems makes it possible to use a wide variety of equipment components and software from different manufacturers.

  12. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
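
    The NAWK-style approach described above (continuously scanning communications logs for suspicious patterns) can be sketched minimally in Python; the patterns and log format here are hypothetical illustrations, not the actual ASNS configuration:

    ```python
    import re

    # Hypothetical alert patterns; the real ASNS patterns and telephone-
    # system log format are not given in the source.
    ALERT_PATTERNS = [re.compile(r"LOGIN FAILED"), re.compile(r"TRUNK BUSY")]

    def scan(lines):
        """Yield each log line matching any alert pattern, in the spirit
        of NAWK's pattern-action scanning."""
        for line in lines:
            if any(p.search(line) for p in ALERT_PATTERNS):
                yield line

    alerts = list(scan(["12:00 CALL COMPLETE", "12:01 LOGIN FAILED port 7"]))
    ```

    In a continuous-monitoring deployment, `scan` would be fed a live tail of the log rather than a fixed list, with matches forwarded to a notification channel.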

  13. Automated Microfluidics for Genomics

    DTIC Science & Technology

    2001-10-25

    Abstract--The Genomation Laboratory at the University of Washington is developing an automated fluid handling system called "Acapella" to prepare... Photonic Systems, Inc. (Redmond, WA), an automated submicroliter fluid sample preparation system called ACAPELLA is being developed. Reactions such... technology include minimal residual disease quantification and sample preparation for DNA. Preliminary work on the ACAPELLA is presented in [4][5]. This

  14. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low-cost automated systems can provide air traffic and aviation weather advisory information at high-density uncontrolled airports. The system was designed to enhance the 'see and be seen' rule of flight, and pilots who used the system preferred it over the self-announcement system presently used at uncontrolled airports.

  15. Automating Index Preparation

    DTIC Science & Technology

    1987-03-23

    Automating Index Preparation. Pehong Chen, Michael A. Harrison. Computer Science Division, University of California, Berkeley, CA 94720. March 23, 1987... Abstract: Index preparation is a tedious and time-consuming task. In this paper we indicate how the indexing process can be automated in a way which... identified and analyzed. Specifically, we describe a framework for placing index commands in the document and a general purpose index processor which

  16. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  17. Metrology automation reliability

    NASA Astrophysics Data System (ADS)

    Chain, Elizabeth E.

    1996-09-01

    At Motorola's MOS-12 facility, automated measurements on 200-mm diameter wafers proceed in a hands-off 'load-and-go' mode requiring only wafer loading, measurement recipe loading, and a 'run' command for processing. Upon completion of all sample measurements, the data is uploaded to the factory's data collection software system via a SECS II interface, eliminating the requirement of manual data entry. The scope of in-line measurement automation has been extended to the entire metrology scheme, from job file generation to measurement and data collection. Data analysis and comparison to part specification limits is also carried out automatically. Successful integration of automated metrology into the factory measurement system requires that automated functions, such as autofocus and pattern recognition algorithms, display a high degree of reliability. In the 24-hour factory, reliability data can be collected automatically on every part measured. This reliability data is then uploaded to the factory data collection software system at the same time as the measurement data. Analysis of the metrology reliability data permits improvements to be made as needed, and provides an accurate accounting of automation reliability. This reliability data has so far been collected for the CD-SEM (critical dimension scanning electron microscope) metrology tool, and examples are presented. This analysis method can be applied to such automated in-line measurements as CD, overlay, particle, and film thickness measurements.
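
    The per-part reliability accounting described above can be sketched as a simple aggregation; the function names and record shape here are assumptions for illustration, not the MOS-12 data format:

    ```python
    from collections import Counter

    def reliability(records):
        """records: iterable of (automated_function, succeeded) pairs,
        e.g. ('autofocus', True), one per part measured.
        Returns the per-function success fraction."""
        ok, total = Counter(), Counter()
        for func, succeeded in records:
            total[func] += 1
            ok[func] += 1 if succeeded else 0
        return {func: ok[func] / total[func] for func in total}
    ```

    Tracking these fractions over time is what permits the improvement cycle the abstract mentions: a drop in, say, the pattern-recognition success rate flags an automation problem before it distorts the measurement data.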

  18. Automated Groundwater Screening

    SciTech Connect

    Taylor, Glenn A.; Collard, Leonard B.

    2005-10-31

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.
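
    The daughter extension can be illustrated with a minimal sketch, assuming (hypothetically) that a parent's effective screening factor adds in its significant daughters' factors weighted by branching fraction, as under secular equilibrium; the actual NCRP-based calculation is more involved:

    ```python
    def effective_screening_factor(parent_sf, daughters):
        """daughters: list of (branching_fraction, daughter_sf) pairs.
        Hypothetical combination rule: daughter ingrowth contributes in
        proportion to its branching fraction."""
        return parent_sf + sum(frac * sf for frac, sf in daughters)

    def passes_screen(concentration, parent_sf, daughters, criterion=1.0):
        """A nuclide screens out if its projected concentration times the
        effective screening factor stays below the screening criterion."""
        return concentration * effective_screening_factor(parent_sf, daughters) < criterion
    ```

    Folding the daughters into a single effective factor is what lets a customer stop tracking each daughter separately, as the abstract suggests.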

  19. Elements of EAF automation processes

    NASA Astrophysics Data System (ADS)

    Ioana, A.; Constantin, N.; Dragna, E. C.

    2017-01-01

    Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the electrical EAF automation system and the thermal EAF automation system. Applying these automation schemes results in a marked reduction in the specific consumption of electrical energy, increased EAF productivity, higher quality of the produced steel, and greater durability of the furnace's structural elements.

  20. Automated Telerobotic Inspection Of Surfaces

    NASA Technical Reports Server (NTRS)

    Balaram, J.; Prasad, K. Venkatesh

    1996-01-01

    Method of automated telerobotic inspection of surfaces undergoing development. Apparatus implementing method includes video camera that scans over surfaces to be inspected, in manner of mine detector. Images of surfaces compared with reference images to detect flaws. Developed for inspecting external structures of Space Station Freedom for damage from micrometeorites and debris from prior artificial satellites. On Earth, applied to inspection for damage, missing parts, contamination, and/or corrosion on interior surfaces of pipes or exterior surfaces of bridges, towers, aircraft, and ships.

  1. Approaches to automated protein crystal harvesting

    PubMed Central

    Deller, Marc C.; Rupp, Bernhard

    2014-01-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  2. Automated fully-stressed design with NASTRAN

    NASA Technical Reports Server (NTRS)

    Wallerstein, D. V.; Haggenmacher, G. W.

    1976-01-01

    An automated strength sizing capability is described. The technique determines the distribution of material among the elements of a structural model. The sizing is based on either a fully stressed design or a scaled feasible fully stressed design. Results obtained from the application of the strength sizing to the structural sizing of a composite material wing box using material strength allowables are presented. These results demonstrate the rapid convergence of the structural sizes to a usable design.

  3. PLAN: Shared Automated Circulation System in California.

    ERIC Educational Resources Information Center

    Kershner, Lois

    1983-01-01

    Background information about Peninsula Library Automated Network member libraries and description of the circulation system of choice include basic components of this cooperative effort: Joint Powers Agreement and organizational structure; jurisdiction responsibilities and financial planning; database and policy areas requiring joint decision;…

  4. Automated mesoscale winds determined from satellite imagery

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A new automated technique for extracting mesoscale fields from GOES visible/infrared satellite imagery was developed. Quality control parameters were defined to allow objective editing of the wind fields. The system can produce cloud wind estimates equivalent or superior to those from the time-consuming manual methods used on various interactive meteorological processing systems. Analysis of automated mesoscale cloud winds for a test case yields an estimated random error of one meter per second and produces both regional and mesoscale vector wind field structure and divergence patterns that are consistent in time and highly correlated with subsequent severe thunderstorm development.
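
    One common automated cloud-wind technique is to track cloud features between successive images by maximizing cross-correlation over candidate shifts; the abstract does not state the exact matching algorithm used, so the sketch below is illustrative only:

    ```python
    import numpy as np

    def best_displacement(prev, curr, search=3):
        """Return the (dy, dx) pixel shift of `prev`, within +/-`search`
        pixels, that best correlates with `curr`. Displacement divided by
        the image interval gives a cloud-motion wind estimate."""
        best, best_score = (0, 0), -np.inf
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
                score = np.corrcoef(shifted.ravel(), curr.ravel())[0, 1]
                if score > best_score:
                    best, best_score = (dy, dx), score
        return best
    ```

    The correlation peak value itself is a natural quality-control parameter: low-peak vectors can be objectively edited out of the wind field, in the spirit of the abstract.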

  5. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned among the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, a need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential in meeting the 1987 technology readiness date for the space station.

  6. Automated telescope scheduling

    NASA Astrophysics Data System (ADS)

    Johnston, Mark D.

    1988-08-01

    With the ever increasing level of automation of astronomical telescopes the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on these systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling ground-based telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  7. Automated telescope scheduling

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    With the ever increasing level of automation of astronomical telescopes the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on these systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling ground-based telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.
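
    The core scheduling problem can be illustrated with a toy greedy scheduler; the AI-based system described above handles far richer constraints (orbital visibility, instrument configuration, coordinated telescopes) than this sketch, whose names and data shapes are hypothetical:

    ```python
    # Toy greedy scheduler: at each time slot, take the highest-priority
    # observation whose visibility window covers the slot.
    def schedule(slots, observations):
        """observations: (name, priority, set-of-visible-slots) triples."""
        plan, done = {}, set()
        for t in slots:
            candidates = [(prio, name) for name, prio, visible in observations
                          if t in visible and name not in done]
            if candidates:
                prio, name = max(candidates)
                plan[t] = name
                done.add(name)
        return plan
    ```

    Greedy choices like this can strand high-priority targets whose windows close early, which is one reason real observatory schedulers turn to constraint-satisfaction and search techniques instead.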

  8. Success with an automated computer control system

    NASA Astrophysics Data System (ADS)

    Roberts, M. L.; Moore, T. L.

    1991-05-01

    LLNL has successfully implemented a distributed computer control system for automated operation of an FN tandem accelerator. The control system software utilized is the Thaumaturgic Automated Control Logic (TACL) written by the Continuous Electron Beam Accelerator Facility and co-developed with LLNL. Using TACL, accelerator components are controlled through CAMAC using a two-tiered structure. Analog control and measurement are at 12- or 16-bit precision as appropriate. Automated operation has been implemented for several nuclear analytical techniques including hydrogen depth profiling and accelerator mass spectrometry. An additional advantage of TACL lies in its expansion capabilities. Without disturbing existing control definitions and algorithms, additional control algorithms and display functions can be implemented quickly.

  9. Automated gas chromatography

    DOEpatents

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  10. Automation in optics manufacturing

    NASA Astrophysics Data System (ADS)

    Pollicove, Harvey M.; Moore, Duncan T.

    1991-01-01

    The optics industry has not followed the lead of the machining and electronics industries in applying advances in computer aided engineering (CAE), computer assisted manufacturing (CAM), automation, or quality management techniques. Automation based on computer integrated manufacturing (CIM) and flexible machining systems (FMS) has been widely implemented in these industries. Optics continues to rely on standalone equipment that preserves the highly skilled, labor-intensive optical fabrication systems developed in the 1940s. This paper describes development initiatives at the Center for Optics Manufacturing that will create computer integrated manufacturing technology and support processes for the optical industry.

  11. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program data base with methods for browsing the designs stored; a system for graphical specification of designs including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  12. Automating the CMS DAQ

    SciTech Connect

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.
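
    The pattern of automatic error recovery with operator fallback can be sketched as follows; the names and interfaces here are hypothetical illustrations, not the actual CMS Run Control software:

    ```python
    def handle_error(error, recoveries, notify_operator):
        """Run the matching recovery routine if the error signature is
        recognized; otherwise hand the operator context-sensitive
        guidance instead of a bare alarm."""
        action = recoveries.get(error)
        if action is not None:
            return action()          # automatic recovery, no operator action
        return notify_operator(error)  # fall back to guided manual handling

    # Hypothetical example: automatically resync electronics hit by a
    # single-event upset rather than paging the shift crew.
    recoveries = {"single_event_upset": lambda: "resynced front-end electronics"}
    ```

    Routing the common, well-understood failures through the automatic path is what lets data-taking efficiency stay high even as the failure rate grows with luminosity.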

  13. Automated Library System Specifications.

    DTIC Science & Technology

    1986-06-01

    Automated Library System Specifications. Prepared by Mary B. Bonnett, Army Library Management Office, Office of the Assistant Chief of Staff for Information Management, Alexandria, VA, June 1986. Unclassified.

  14. Automated HMC assembly

    SciTech Connect

    Blazek, R.J.

    1993-08-01

    An automated gold wire bonder was characterized for bonding 1-mil gold wires to gallium-arsenide (GaAs) monolithic microwave integrated circuits (MMICs) which are used in microwave radar transmitter-receiver (T/R) modules. Acceptable gold wire bond test results were obtained for the fragile, 5-mil-thick GaAs MMICs with gold-metallized bond pads; and average wire bond pull strengths, shear strengths, and failure modes were determined. An automated aluminum wire bonder was modified to be used as a gold wire bonder so that a wedge bond capability was available for GaAs MMICs in addition to the gold ball bond capability.

  15. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objective of the NASA/UCF Automated Knowledge Generation Project was the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is in the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).

  16. Altering users' acceptance of automation through prior automation exposure.

    PubMed

    Bekier, Marek; Molesworth, Brett R C

    2016-08-22

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  17. RCrane: semi-automated RNA model building.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  18. The 3D Euler solutions using automated Cartesian grid generation

    NASA Technical Reports Server (NTRS)

    Melton, John E.; Enomoto, Francis Y.; Berger, Marsha J.

    1993-01-01

    Viewgraphs on 3-dimensional Euler solutions using automated Cartesian grid generation are presented. Topics covered include: computational fluid dynamics (CFD) and the design cycle; Cartesian grid strategy; structured body fit; grid generation; prolate spheroid; and ONERA M6 wing.
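
    The Cartesian grid strategy (start from a coarse uniform grid and recursively subdivide cells near the geometry) can be sketched in 2D; the intersection predicate and size threshold below are placeholders, not the actual grid generator's tests:

    ```python
    def refine(cell, near_body, min_size):
        """cell = (x, y, size) of a square. Subdivide cells flagged by
        `near_body` quadtree-style until they reach `min_size`; cells away
        from the body stay coarse."""
        x, y, size = cell
        if size <= min_size or not near_body(cell):
            return [cell]
        half = size / 2.0
        children = []
        for dx in (0.0, half):
            for dy in (0.0, half):
                children += refine((x + dx, y + dy, half), near_body, min_size)
        return children
    ```

    Because the grid never conforms to the body, this avoids the labor of structured body-fitted grid generation, at the cost of special cut-cell treatment where the geometry crosses cells.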

  19. A technique for recording polycrystalline structure and orientation during in situ deformation cycles of rock analogues using an automated fabric analyser.

    PubMed

    Peternell, M; Russell-Head, D S; Wilson, C J L

    2011-05-01

    Two in situ plane-strain deformation experiments on norcamphor and natural ice using synchronous recording of crystal c-axis orientations have been performed with an automated fabric analyser and a newly developed sample press and deformation stage. Without interrupting the deformation experiment, c-axis orientations are determined for each pixel in a 5 × 5 mm sample area at a spatial resolution of 5 μm/pixel. In the case of norcamphor, changes in microstructures and associated crystallographic information, at a strain rate of ∼2 × 10⁻⁵ s⁻¹, were recorded for the first time during a complete in situ deformation-cycle experiment that consisted of an annealing, deformation and post-deformation annealing path. In the case of natural ice, slower external strain rates (∼1 × 10⁻⁶ s⁻¹) enabled the investigation of small changes in the polycrystal aggregate's crystallography and microstructure for small amounts of strain. The technical setup and first results from the experiments are presented. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.

  20. Human Factors In Aircraft Automation

    NASA Technical Reports Server (NTRS)

    Billings, Charles

    1995-01-01

    Report presents survey of state of art in human factors in automation of aircraft operation. Presents examination of aircraft automation and effects on flight crews in relation to human error and aircraft accidents.

  1. Automated cognome construction and semi-automated hypothesis generation.

    PubMed

    Voytek, Jessica B; Voytek, Bradley

    2012-06-30

    Modern neuroscientific research stands on the shoulders of countless giants. PubMed alone contains more than 21 million peer-reviewed articles with 40-50,000 more published every month. Understanding the human brain, cognition, and disease will require integrating facts from dozens of scientific fields spread amongst millions of studies locked away in static documents, making any such integration daunting, at best. The future of scientific progress will be aided by bridging the gap between the millions of published research articles and modern databases such as the Allen brain atlas (ABA). To that end, we have analyzed the text of over 3.5 million scientific abstracts to find associations between neuroscientific concepts. From the literature alone, we show that we can blindly and algorithmically extract a "cognome": relationships between brain structure, function, and disease. We demonstrate the potential of data-mining and cross-platform data-integration with the ABA by introducing two methods for semi-automated hypothesis generation. By analyzing statistical "holes" and discrepancies in the literature we can find understudied or overlooked research paths. That is, we have added a layer of semi-automation to a part of the scientific process itself. This is an important step toward fundamentally incorporating data-mining algorithms into the scientific method in a manner that is generalizable to any scientific or medical field. Copyright © 2012 Elsevier B.V. All rights reserved.
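
    The core of such literature mining, counting how often concepts co-occur within the same abstract, can be sketched minimally; real systems like the one described add term normalization, synonym handling, and statistical comparison against chance:

    ```python
    from collections import Counter
    from itertools import combinations

    def cooccurrence(abstracts, terms):
        """Count single-term and pairwise within-abstract occurrences
        for a fixed vocabulary of lowercase terms."""
        term_counts, pair_counts = Counter(), Counter()
        for text in abstracts:
            present = {t for t in terms if t in text.lower()}
            term_counts.update(present)
            pair_counts.update(combinations(sorted(present), 2))
        return term_counts, pair_counts
    ```

    Comparing observed pair counts with what the individual term frequencies predict is one way to surface both strong associations and the understudied "holes" the abstract describes.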

  2. Automated recombinant protein expression screening in Escherichia coli.

    PubMed

    Busso, Didier; Stierlé, Matthieu; Thierry, Jean-Claude; Moras, Dino

    2008-01-01

    To fit the requirements of structural genomics programs, new as well as classical methods have been adapted to automation. This chapter describes the automated procedure developed within the Structural Biology and Genomics Platform, Strasbourg for performing recombinant protein expression screening in Escherichia coli. The procedure consists of parallel competent cells transformation, cell plating, and liquid culture inoculation, implemented for up to 96 samples at a time.

  3. Automated Management Of Documents

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1995-01-01

    Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.

  4. Guide to Library Automation.

    ERIC Educational Resources Information Center

    Toohill, Barbara G.

    Directed toward librarians and library administrators who wish to procure automated systems or services for their libraries, this guide offers practical suggestions, advice, and methods for determining requirements, estimating costs and benefits, writing specifications, procuring systems, negotiating contracts, and installing systems. The advice…

  5. Automated Management Of Documents

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1995-01-01

    Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.

  6. Microcontroller for automation application

    NASA Technical Reports Server (NTRS)

    Cooper, H. W.

    1975-01-01

    The description of a microcontroller currently being developed for automation application was given. It is basically an 8-bit microcomputer with a 40K byte random access memory/read only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  7. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  8. Automated Estimating System (AES)

    SciTech Connect

    Holder, D.A.

    1989-09-01

    This document describes Version 3.1 of the Automated Estimating System, a personal computer-based software package designed to aid in the creation, updating, and reporting of project cost estimates for the Estimating and Scheduling Department of the Martin Marietta Energy Systems Engineering Division. Version 3.1 of the Automated Estimating System is capable of running in a multiuser environment across a token ring network. The token ring network makes possible services and applications that will more fully integrate all aspects of information processing, provides a central area for large data bases to reside, and allows access to the data base by multiple users. Version 3.1 of the Automated Estimating System also has been enhanced to include an Assembly pricing data base that may be used to retrieve cost data into an estimate. A WBS Title File program has also been included in Version 3.1. The WBS Title File program allows for the creation of a WBS title file that has been integrated with the Automated Estimating System to provide WBS titles in update mode and in reports. This provides for consistency in WBS titles and provides the capability to display WBS titles on reports generated at a higher WBS level.

  9. Automated Essay Scoring

    ERIC Educational Resources Information Center

    Dikli, Semire

    2006-01-01

    The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e., word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…

  10. Automated Microbial Genome Annotation

    SciTech Connect

    Land, Miriam

    2009-05-29

    Miriam Land of the DOE Joint Genome Institute at Oak Ridge National Laboratory gives a talk on the current state and future challenges of moving toward automated microbial genome annotation at the "Sequencing, Finishing, Analysis in the Future" meeting in Santa Fe, NM.

  11. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  12. Automated Tendering and Purchasing.

    ERIC Educational Resources Information Center

    DeZorzi, James M.

    1980-01-01

    The Middlesex County Board of Education in Hyde Park (Ontario) has developed an automated tendering/purchasing system for ordering standard items that has reduced by 80 percent the time required for tendering, evaluating, awarding, and ordering items. (Author/MLF)

  13. Automated conflict resolution issues

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  14. ATC automation concepts

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1990-01-01

    Information on the design of human-centered tools for terminal area air traffic control (ATC) is given in viewgraph form. Information is given on payoffs and products, guidelines, ATC as a team process, automation tools for ATC, and the traffic management advisor.

  15. Automated Administrative Data Bases

    NASA Technical Reports Server (NTRS)

    Marrie, M. D.; Jarrett, J. R.; Reising, S. A.; Hodge, J. E.

    1984-01-01

    Improved productivity and more effective response to information requirements for internal management, NASA Centers, and Headquarters resulted from using automated techniques. Modules developed to provide information on manpower, RTOPS, full time equivalency, and physical space reduced duplication, increased communication, and saved time. There is potential for greater savings by sharing and integrating with those who have the same requirements.

  16. Automating Small Libraries.

    ERIC Educational Resources Information Center

    Swan, James

    1996-01-01

    Presents a four-phase plan for small libraries strategizing for automation: inventory and weeding, data conversion, implementation, and enhancements. Other topics include selecting a system, MARC records, compatibility, ease of use, industry standards, searching capabilities, support services, system security, screen displays, circulation modules,…

  17. Automated Lumber Processing

    Treesearch

    Powsiri Klinkhachorn; J. Moody; Philip A. Araman

    1995-01-01

    For the past few decades, researchers have devoted time and effort to apply automation and modern computer technologies towards improving the productivity of traditional industries. To be competitive, one must streamline operations and minimize production costs, while maintaining an acceptable margin of profit. This paper describes the effort of one such endeavor...

  18. Personnel Department Automation.

    ERIC Educational Resources Information Center

    Wilkinson, David

    In 1989, the Austin Independent School District's Office of Research and Evaluation was directed to monitor the automation of personnel information and processes in the district's Department of Personnel. Earlier, a study committee appointed by the Superintendent during the 1988-89 school year identified issues related to Personnel Department…

  19. Automated Accounting. Instructor Guide.

    ERIC Educational Resources Information Center

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  20. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  1. Validating Automated Speaking Tests

    ERIC Educational Resources Information Center

    Bernstein, Jared; Van Moere, Alistair; Cheng, Jian

    2010-01-01

    This paper presents evidence that supports the valid use of scores from fully automatic tests of spoken language ability to indicate a person's effectiveness in spoken communication. The paper reviews the constructs, scoring, and the concurrent validity evidence of "facility-in-L2" tests, a family of automated spoken language tests in Spanish,…

  2. Automated EEG acquisition

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Hillman, C. E., Jr.

    1977-01-01

    Automated self-contained portable device can be used by technicians with minimal training. Data acquired from patient at remote site are transmitted to centralized interpretation center using conventional telephone equipment. There, diagnostic information is analyzed, and results are relayed back to remote site.

  3. Automated Inadvertent Intruder Application

    SciTech Connect

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-15

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. 
The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
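
    The dose arithmetic this abstract describes — inventory times a pathway transfer factor times a dose conversion factor, summed over radionuclides and then over pathways — might be sketched as below. The nuclide names, factor values, and function names are illustrative assumptions, not values from the SRS analysis.

    ```python
    # Hypothetical sketch of the algebraic intruder-dose calculation described
    # above: dose for one exposure pathway is radionuclide inventory times a
    # pathway transfer factor times a dose conversion factor (DCF), summed
    # over nuclides; scenario dose sums its pathways. All numbers are
    # placeholders for illustration only.

    def pathway_dose(inventory_ci, transfer, dcf_mrem_per_ci):
        """Dose (mrem) for one exposure pathway, summed over radionuclides."""
        return sum(inventory_ci[n] * transfer[n] * dcf_mrem_per_ci[n]
                   for n in inventory_ci)

    def scenario_dose(pathway_doses):
        """Total dose for a scenario (e.g. agriculture) over its pathways."""
        return sum(pathway_doses.values())

    inventory = {"Cs-137": 2.0, "Sr-90": 1.0}      # Ci, placeholder
    transfer  = {"Cs-137": 0.1, "Sr-90": 0.05}     # unitless, placeholder
    dcf       = {"Cs-137": 50.0, "Sr-90": 130.0}   # mrem/Ci, placeholder

    doses = {"ingestion": pathway_dose(inventory, transfer, dcf)}
    total = scenario_dose(doses)                   # 16.5 mrem
    ```

    Automating exactly this kind of cell-by-cell spreadsheet arithmetic in a single checked function is what eliminates the copy-paste errors the abstract mentions.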

  4. Instrumentation Automation for Concrete Structures; Report 1: Instrumentation Automation Techniques

    DTIC Science & Technology

    1986-12-01

    to be wired to the nearest remote box. However, due to limited buffer memory, large data collection via remote front ends can be slow and cumbersome.

  5. An Automated Biological Dosimetry System

    NASA Astrophysics Data System (ADS)

    Lorch, T.; Bille, J.; Frieben, M.; Stephan, G.

    1986-04-01

    The scoring of structural chromosome aberrations in peripheral human blood lymphocytes can be used in biological dosimetry to estimate the radiation dose which an individual has received. Especially the dicentric chromosome is a rather specific indicator of exposure to ionizing radiation. For statistical reasons, in the low dose range a great number of cells must be analysed, which is a very tedious task. The resulting high cost of a biological dose estimation limits the application of this method to cases of suspected irradiation for which physical dosimetry is not possible or not sufficient. Therefore an automated system has been designed to do the major part of the routine work. It uses a standard light microscope with motorized scanning stage, a Plumbicon TV-camera, a real-time hardware preprocessor, a binary and a grey level image buffer system. All computations are performed by a very powerful multi-microprocessor system (POLYP) based on a MIMD architecture. The task of the automated system can be split into finding the metaphases (see Figure 1) at low microscope magnification and scoring dicentrics at high magnification. The metaphase finding part has been completed and is now in routine use giving good results. The dicentric scoring part is still under development.

  6. Multifunction automated crawling system (MACS)

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Yoseph; Backes, Paul G.; Joffe, Benjamin

    1996-11-01

    Nondestructive evaluation instruments and sensors are becoming smaller, with enhanced computer-controlled capability, and increasingly use commercially available hardware and software. Further, robotic instruments are being developed to serve as mobility platforms, allowing automation of the inspection process. This combination of miniaturized sensing and robotics technology enables hybrid miniature technology solutions for identified aircraft inspection needs. Integration of inspection and robotics technologies benefits from the use of a standard computing platform. JPL investigated the application of telerobotic technology to inspection of aircraft structures using capabilities that were developed for space exploration. A miniature crawler that can travel on the surface of aircraft using suction cups for adherence was developed and is called the multifunction automated crawling system (MACS). MACS is an operational tool for rapid large-area inspection of aircraft, with a relatively large platform for carrying a payload of miniature inspection instruments. The capability of MACS and the trend towards autonomous inspection crawlers are reviewed and discussed in this paper.

  7. Automating spectral measurements

    NASA Astrophysics Data System (ADS)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is a mounting need to automate data acquisition (DAQ) from spectrophotometers. This need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  8. Automated Conflict Resolution For Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2005-01-01

    The ability to detect and resolve conflicts automatically is considered to be an essential requirement for the next generation air traffic control system. While systems for automated conflict detection have been used operationally by controllers for more than 20 years, automated resolution systems have so far not reached the level of maturity required for operational deployment. Analytical models and algorithms for automated resolution have not yet been demonstrated over a wide enough range of traffic conditions to show that they can handle the complete spectrum of conflict situations encountered in actual operations. The resolution algorithm described in this paper was formulated to meet the performance requirements of the Automated Airspace Concept (AAC). The AAC, which was described in a recent paper [1], is a candidate for the next generation air traffic control system. The AAC's performance objectives are to increase safety and airspace capacity and to accommodate user preferences in flight operations to the greatest extent possible. In the AAC, resolution trajectories are generated by an automation system on the ground and sent to the aircraft autonomously via data link. The algorithm generating the trajectories must take into account the performance characteristics of the aircraft and the route structure of the airway system, and must be capable of resolving all types of conflicts for properly equipped aircraft without requiring supervision and approval by a controller. Furthermore, the resolution trajectories should be compatible with the clearances, vectors and flight plan amendments that controllers customarily issue to pilots in resolving conflicts. The algorithm described herein, although formulated specifically to meet the needs of the AAC, provides a generic engine for resolving conflicts. Thus, it can be incorporated into any operational concept that requires a method for automated resolution, including concepts for autonomous air-to-air resolution.

  9. RCrane: semi-automated RNA model building

    SciTech Connect

    Keating, Kevin S.; Pyle, Anna Marie

    2012-08-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  10. Automated generation of weld path trajectories.

    SciTech Connect

    Sizemore, John M.; Hinman-Sweeney, Elaine Marie; Ames, Arlo Leroy

    2003-06-01

    AUTOmated GENeration of Control Programs for Robotic Welding of Ship Structure (AUTOGEN) is software that automates the planning and compiling of control programs for robotic welding of ship structure. The software works by evaluating computer representations of the ship design and the manufacturing plan. Based on this evaluation, AUTOGEN internally identifies and appropriately characterizes each weld. Then it constructs the robot motions necessary to accomplish the welds and determines for each the correct assignment of process control values. AUTOGEN generates these robot control programs completely without manual intervention or edits except to correct wrong or missing input data. Most ship structure assemblies are unique or at best manufactured only a few times. Accordingly, the high cost inherent in all previous methods of preparing complex control programs has made robot welding of ship structures economically unattractive to the U.S. shipbuilding industry. AUTOGEN eliminates the cost of creating robot control programs. With programming costs eliminated, capitalization of robots to weld ship structures becomes economically viable. Robot welding of ship structures will result in reduced ship costs, uniform product quality, and enhanced worker safety. Sandia National Laboratories and Northrop Grumman Ship Systems worked with the National Shipbuilding Research Program to develop a means of automated path and process generation for robotic welding. This effort resulted in the AUTOGEN program, which has successfully demonstrated automated path generation and robot control. Although the current implementation of AUTOGEN is optimized for welding applications, the path and process planning capability has applicability to a number of industrial applications, including painting, riveting, and adhesive delivery.

  11. Millisecond single-molecule localization microscopy combined with convolution analysis and automated image segmentation to determine protein concentrations in complexly structured, functional cells, one cell at a time.

    PubMed

    Wollman, Adam J M; Leake, Mark C

    2015-01-01

    We present a single-molecule tool called the CoPro (concentration of proteins) method that uses millisecond imaging with convolution analysis, automated image segmentation and super-resolution localization microscopy to generate robust estimates for protein concentration in different compartments of single living cells, validated using realistic simulations of complex multiple compartment cell types. We demonstrate its utility experimentally on model Escherichia coli bacteria and Saccharomyces cerevisiae budding yeast cells, and use it to address the biological question of how signals are transduced in cells. Cells in all domains of life dynamically sense their environment through signal transduction mechanisms, many involving gene regulation. The glucose sensing mechanism of S. cerevisiae is a model system for studying gene regulatory signal transduction. It uses the multi-copy expression inhibitor of the GAL gene family, Mig1, to repress unwanted genes in the presence of elevated extracellular glucose concentrations. We fluorescently labelled Mig1 molecules with green fluorescent protein (GFP) via chromosomal integration at physiological expression levels in living S. cerevisiae cells, in addition to the RNA polymerase protein Nrd1 with the fluorescent protein reporter mCherry. Using CoPro we make quantitative estimates of Mig1 and Nrd1 protein concentrations in the cytoplasm and nucleus compartments on a cell-by-cell basis under physiological conditions. These estimates indicate a ∼4-fold shift towards higher values in the concentration of diffusive Mig1 in the nucleus if the external glucose concentration is raised, whereas equivalent levels in the cytoplasm shift to smaller values with a relative change an order of magnitude smaller. This compares with Nrd1 which is not involved directly in glucose sensing, and which is almost exclusively localized in the nucleus under high and low external glucose levels. CoPro facilitates time-resolved quantification of

  12. Towards automated biomedical ontology harmonization.

    PubMed

    Uribe, Gustavo A; Lopez, Diego M; Blobel, Bernd

    2014-01-01

    The use of biomedical ontologies is increasing, especially in the context of health systems interoperability. Ontologies are key pieces to understand the semantics of information exchanged. However, given the diversity of biomedical ontologies, it is essential to develop tools that support harmonization processes amongst them. Several algorithms and tools have been proposed by computer scientists to partially support ontology harmonization. However, these tools face several problems, especially in the biomedical domain where ontologies are large and complex. In the harmonization process, matching is a basic task. This paper explains the different ontology harmonization processes, analyzes existing matching tools, and proposes a prototype of an ontology harmonization service. The results demonstrate that there are many open issues in the field of biomedical ontology harmonization, such as: overcoming structural discrepancies between ontologies; the lack of semantic algorithms to automate the process; the low matching efficiency of existing algorithms; and the use of domain and top level ontologies in the matching process.

  13. Automated optical assembly

    NASA Astrophysics Data System (ADS)

    Bala, John L.

    1995-08-01

    Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs using precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The increased cost savings inherent in the utilization of optical-grade polymers outweigh almost every advantage of using glass for high volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to successfully compete with the low-cost labor employed in the Far East, as well as to produce an equivalent product.

  14. Automated macromolecular crystallization screening

    DOEpatents

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.
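
    The screen-and-redesign loop in this patent abstract — random mixes, incubation, imaging, scoring, then a new design based on expected crystal suitability — can be sketched as follows. The component names, scores, and the specific heuristic of weighting components by the scores of the mixes they appeared in are illustrative assumptions; the patent does not specify the redesign rule.

    ```python
    import random

    def initial_design(components, n_mixes, k=3, seed=0):
        """Round 1: a random selection of reagent components per mix,
        as in the patent's first embodiment."""
        rng = random.Random(seed)
        return [rng.sample(components, k) for _ in range(n_mixes)]

    def redesign(components, mixes, scores, n_mixes, k=3, seed=1):
        """Later rounds: weight each component by the image-analysis
        scores of the mixes it appeared in, then sample mixes from the
        better-scoring half. (Hypothetical heuristic for illustration.)"""
        weight = {c: 0.0 for c in components}
        for mix, s in zip(mixes, scores):
            for c in mix:
                weight[c] += s
        ranked = sorted(components, key=lambda c: weight[c], reverse=True)
        pool = ranked[:max(k, len(ranked) // 2)]
        rng = random.Random(seed)
        return [rng.sample(pool, k) for _ in range(n_mixes)]

    components = ["PEG-4000", "NaCl", "HEPES", "MgCl2", "glycerol", "citrate"]
    round1 = initial_design(components, n_mixes=4)
    scores = [0.8, 0.1, 0.6, 0.2]   # pretend crystal-suitability scores
    round2 = redesign(components, round1, scores, n_mixes=4)
    ```

    The point of the closed loop is that each round of imaging feeds the next round's reagent design, so the screen converges on chemistry that produces diffraction-quality crystals.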

  15. The automated command transmission

    NASA Astrophysics Data System (ADS)

    Inoue, Y.; Satoh, S.

    A technique for automated command transmission (ACT) to geostationary satellites is presented. The system is intended to ease the command center workload. The ACT system determines the relation of commands to on-board units, connects the telemetry with on-board units, defines the control path on the spacecraft, identifies the correspondence of back-up units to primary units, and ascertains sunlight or eclipse conditions. The system also stores the addresses of the satellite and command decoders, the ID and content of the mission command sequence, group and inhibit codes, and a listing of all available commands, and restricts transmitted data to a valid command sequence. Telemetry supplies data for automated problem correction. All other mission operations are terminated during system recovery data processing after a crash. The ACT system is intended for use with the GMS spacecraft.

  16. Automated gas chromatography

    DOEpatents

    Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.

    1999-07-13

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.
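
    The detect-and-report step this patent describes — turning a detected chromatographic peak into a concentration and passing it to the control system — might look like the sketch below. The calibration slope, compound names, and action threshold are assumptions for illustration, not values from the patent.

    ```python
    # Hypothetical sketch of the monitoring step described above: an
    # integrated chromatographic peak area is converted to a concentration
    # via a linear calibration, and readings above an action limit are
    # flagged to process control. All constants are placeholders.

    def concentration_ppm(peak_area, slope=0.004, intercept=0.0):
        """Linear calibration: concentration (ppm) from integrated peak area."""
        return slope * peak_area + intercept

    def control_action(readings_ppm, limit_ppm=0.5):
        """Return the compounds whose concentration exceeds the action limit."""
        return [name for name, c in readings_ppm.items() if c > limit_ppm]

    peaks = {"toluene": 180.0, "benzene": 40.0}        # integrated peak areas
    readings = {n: concentration_ppm(a) for n, a in peaks.items()}
    alarms = control_action(readings)                   # ["toluene"] at 0.72 ppm
    ```

    In a continuous near real-time monitor like the one claimed, this calculation would run once per chromatographic cycle, i.e. roughly once a minute.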

  17. Automated Pollution Control

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Patterned after the Cassini Resource Exchange (CRE), Sholtz and Associates established the Automated Credit Exchange (ACE), an Internet-based system that automates the auctioning of "pollution credits" in Southern California. An early challenge of the Jet Propulsion Laboratory's Cassini mission was allocating the spacecraft's resources. To support the decision-making process, the CRE was developed. The system removes the need for the science instrument manager to know the individual instruments' requirements for the spacecraft resources. Instead, by utilizing principles of exchange, the CRE induces the instrument teams to reveal their requirements. In doing so, they arrive at an efficient allocation of spacecraft resources by trading among themselves. A Southern California RECLAIM air pollution credit trading market has been set up using the same bartering methods utilized in the Cassini mission in order to help companies keep pollution and costs down.

  18. Automated assembly in space

    NASA Technical Reports Server (NTRS)

    Srivastava, Sandanand; Dwivedi, Suren N.; Soon, Toh Teck; Bandi, Reddy; Banerjee, Soumen; Hughes, Cecilia

    1989-01-01

    The installation of robots and their use for assembly in space will create an exciting and promising future for the U.S. space program. Assembly in space is complicated and error prone, and it is not feasible unless the various parts and modules are suitably designed for automation. Guidelines are developed for part design and for precise assembly. Major design problems associated with automated assembly are considered, and solutions to these problems are evaluated in the guidelines format. Methods for gripping and for part feeding are developed with regard to the absence of gravity in space. Guidelines for part orientation, adjustments, compliance, and various assembly constructions are discussed. Design modifications of various fasteners and fastening methods are also investigated.

  19. Terminal automation system maintenance

    SciTech Connect

    Coffelt, D.; Hewitt, J.

    1997-01-01

    Nothing has improved petroleum product loading in recent years more than terminal automation systems. The presence of terminal automation systems (TAS) at loading racks has increased operational efficiency and safety and enhanced their accounting and management capabilities. However, like all finite systems, they occasionally malfunction or fail. Proper servicing and maintenance can minimize this. And in the unlikely event a TAS breakdown does occur, prompt and effective troubleshooting can reduce its impact on terminal productivity. To accommodate around-the-clock loading at racks, increasingly unattended by terminal personnel, TAS maintenance, servicing and troubleshooting has become increasingly demanding. It has also become increasingly important. After 15 years of trial and error at petroleum and petrochemical storage and transfer terminals, a number of successful troubleshooting programs have been developed. These include 24-hour "help hotlines," internal (terminal company) and external (supplier) support staff, and "layered" support. These programs are described.

  20. Automated chemiluminescence immunoassay measurements

    NASA Astrophysics Data System (ADS)

    Khalil, Omar S.; Mattingly, G. P.; Genger, K.; Mackowiak, J.; Butler, J.; Pepe, C.; Zurek, T. F.; Abunimeh, N.

    1993-06-01

    Chemiluminescence (CL) detection offers potential for high sensitivity immunoassays (CLIAs). Several approaches were attempted to automate CL measurements, including the use of photographic film, clear microtitration plates, and magnetic separation. We describe a photon-counting detection apparatus that performs CLIA measurements. The CL detector moves toward a disposable reaction vessel to create a light-tight seal and then triggers and integrates a CL signal. The capture uses antibody-coated polystyrene microparticles. A porous matrix, which is part of a disposable reaction tray, entraps the microparticle-captured reaction product. The CL signal emanating from the immune complex immobilized by the porous matrix is detected. The detection system is part of a fully automated immunoassay analyzer. Methods of achieving high sensitivities are discussed.

  1. Automated Chromosome Breakage Assessment

    NASA Technical Reports Server (NTRS)

    Castleman, Kenneth

    1985-01-01

    An automated karyotyping machine was built at JPL in 1972. It does computerized karyotyping, but it has some hardware limitations. The image processing hardware that was available at a reasonable price in 1972 was marginal, at best, for this job. In the meantime, NASA has developed an interest in longer term spaceflights and an interest in using chromosome breakage studies as a dosimeter for radiation or perhaps other damage that might occur to the tissues. This uses circulating lymphocytes as a physiological dosimeter looking for chromosome breakage on long-term spaceflights. For that reason, we have reactivated the automated karyotyping work at JPL. An update on that work, and a description of where it appears to be headed is presented.

  3. Automated Assembly Center (AAC)

    NASA Technical Reports Server (NTRS)

    Stauffer, Robert J.

    1993-01-01

    The objectives of this project are as follows: to integrate advanced assembly and assembly support technology under a comprehensive architecture; to implement automated assembly technologies in the production of high-visibility DOD weapon systems; and to document the improved cost, quality, and lead time. This will enhance the production of DOD weapon systems by utilizing the latest commercially available technologies combined into a flexible system that will be able to readily incorporate new technologies as they emerge. Automated assembly encompasses the following areas: product data, process planning, information management policies and framework, three schema architecture, open systems communications, intelligent robots, flexible multi-ability end effectors, knowledge-based/expert systems, intelligent workstations, intelligent sensor systems, and PDES/PDDI data standards.

  4. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.

    1984-01-01

    The purpose of phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.

  5. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.

  6. Automated RSO Stability Analysis

    NASA Astrophysics Data System (ADS)

    Johnson, T.

    2016-09-01

    A methodology for assessing the attitude stability of a Resident Space Object (RSO) using visual magnitude data is presented and then scaled to run in an automated fashion across the entire satellite catalog. Results obtained by applying the methodology to the Commercial Space Operations Center (COMSpOC) catalog are presented and summarized, identifying objects whose stability has changed. We also examine the timeline for detecting the transition from stable to unstable attitude.

  7. Automated Nitrocellulose Analysis

    DTIC Science & Technology

    1978-12-01

    …is acceptable. (4) As would be expected from the theory of osmosis, a high saline content in the dialysis recipient stream (countersolution) is of… Keywords: automated analysis; dialysis; glyceryl… …Technicon AutoAnalyzer, involves aspiration of a stirred nitrocellulose suspension, dialysis against 9 percent saline, and hydrolysis with 5N sodium…

  8. Cavendish Balance Automation

    NASA Technical Reports Server (NTRS)

    Thompson, Bryan

    2000-01-01

    This is the final report for a project carried out to modify a manual commercial Cavendish balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf, manually operated Cavendish balance to allow automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in the study of the effects of superconducting materials on the local gravitational field strength, to determine whether the strength of gravitational fields can be reduced. A Cavendish balance was chosen because it is a fairly simple piece of equipment for measuring the gravitational constant, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) all the components necessary to hold and automate the Cavendish balance in a cryostat were designed; engineering drawings were made of custom parts to be fabricated, and other off-the-shelf parts were procured; (2) software was written in LabView to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing; (3) software was written to take the data collected from the Cavendish balance and reduce it to give a value for the gravitational constant; (4) the components of the system were assembled and fitted to a cryostat, and the LabView hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling, was assembled; and (5) the system was operated for a number of periods, and the data were collected and reduced to give an average value for the gravitational constant.
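The data-reduction step can be illustrated with the standard small-angle torsion-balance formula. The observables below are hypothetical, and corrections a real reduction would apply (for example, the counter-torque from the far attracting mass, and damping) are neglected.

```python
import math

def gravitational_constant(theta, r, d, M, T):
    """Reduce torsion-balance observables to G via the small-angle formula
    G = 4*pi^2 * d * theta * r^2 / (M * T^2), where theta is the equilibrium
    deflection (rad), r the large-to-small mass separation, d the lever arm,
    M the large mass, and T the torsion oscillation period. Corrections for
    the far mass and damping are deliberately omitted in this sketch."""
    return 4 * math.pi**2 * d * theta * r**2 / (M * T**2)

# Hypothetical observables: 1.17 mrad deflection, r = d = 5 cm,
# large mass 1.5 kg, oscillation period 240 s.
G = gravitational_constant(theta=1.17e-3, r=0.05, d=0.05, M=1.5, T=240.0)
```

Averaging G over many automated deflection cycles, as the report describes, reduces the random error contributed by reading the tiny equilibrium angle.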

  9. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  10. Automated measurement of Drosophila wings.

    PubMed

    Houle, David; Mezey, Jason; Galpern, Paul; Carter, Ashley

    2003-12-11

    Many studies in evolutionary biology and genetics are limited by the rate at which phenotypic information can be acquired. The wings of Drosophila species are a favorable target for automated analysis because of the many interesting questions in evolution and development that can be addressed with them, and because of their simple structure. We have developed an automated image analysis system (WINGMACHINE) that measures the positions of all the veins and the edges of the wing blade of Drosophilid flies. A video image is obtained with the aid of a simple suction device that immobilizes the wing of a live fly. Low-level processing is used to find the major intersections of the veins. High-level processing then optimizes the fit of an a priori B-spline model of wing shape. WINGMACHINE allows the measurement of 1 wing per minute, including handling, imaging, analysis, and data editing. The repeatabilities of 12 vein intersections averaged 86% in a sample of flies of the same species and sex. Comparison of 2400 wings of 25 Drosophilid species shows that wing shape is quite conservative within the group, but that almost all taxa are diagnosably different from one another. Wing shape retains some phylogenetic structure, although some species have shapes very different from closely related species. The WINGMACHINE system facilitates artificial selection experiments on complex aspects of wing shape. We selected on an index which is a function of 14 separate measurements of each wing. After 14 generations, we achieved a 15 S.D. difference between up and down-selected treatments. WINGMACHINE enables rapid, highly repeatable measurements of wings in the family Drosophilidae. Our approach to image analysis may be applicable to a variety of biological objects that can be represented as a framework of connected lines.
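As a rough illustration of the a priori B-spline model mentioned above, here is a uniform cubic B-spline segment evaluator. WINGMACHINE's actual spline representation and fitting procedure are not reproduced here; this only shows the curve primitive such a model is built from.

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Evaluate one segment of a uniform cubic B-spline at t in [0, 1]
    from four consecutive control points (points given as tuples)."""
    b0 = (1 - t) ** 3 / 6.0
    b1 = (3 * t**3 - 6 * t**2 + 4) / 6.0
    b2 = (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0
    b3 = t**3 / 6.0
    # The basis functions sum to 1 (partition of unity), so the point is
    # a convex combination of the control points.
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# With identical control points the curve collapses to that point:
pt = cubic_bspline_point((2.0, 3.0), (2.0, 3.0), (2.0, 3.0), (2.0, 3.0), 0.4)
```

Fitting such a model to a wing image then amounts to optimizing the control-point positions so the spline tracks the detected vein and margin edges.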

  11. Enhancing the usability and performance of structured association mapping algorithms using automation, parallelization, and visualization in the GenAMap software system

    PubMed Central

    2012-01-01

    Background: Structured association mapping is proving to be a powerful strategy to find genetic polymorphisms associated with disease. However, these algorithms are often distributed as command line implementations that require expertise and effort to customize and put into practice. Because of the difficulty required to use these cutting-edge techniques, geneticists often revert to simpler, less powerful methods. Results: To make structured association mapping more accessible to geneticists, we have developed an automatic processing system called Auto-SAM. Auto-SAM enables geneticists to run structured association mapping algorithms automatically, using parallelization. Auto-SAM includes algorithms to discover gene-networks and find population structure. Auto-SAM can also run popular association mapping algorithms, in addition to five structured association mapping algorithms. Conclusions: Auto-SAM is available through GenAMap, a front-end desktop visualization tool. GenAMap and Auto-SAM are implemented in JAVA; binaries for GenAMap can be downloaded from http://sailing.cs.cmu.edu/genamap. PMID:22471660

  12. Automated large-scale file preparation, docking, and scoring: Evaluation of ITScore and STScore using the 2012 Community Structure-Activity Resource Benchmark

    PubMed Central

    Grinter, Sam Z.; Yan, Chengfei; Huang, Sheng-You; Jiang, Lin; Zou, Xiaoqin

    2013-01-01

    In this study, we use the recently released 2012 Community Structure-Activity Resource (CSAR) Dataset to evaluate two knowledge-based scoring functions, ITScore and STScore, and a simple force-field-based potential (VDWScore). The CSAR Dataset contains 757 compounds, most with known affinities, and 57 crystal structures. With the help of the script files for docking preparation, we use the full CSAR Dataset to evaluate the performances of the scoring functions on binding affinity prediction and active/inactive compound discrimination. The CSAR subset that includes crystal structures is used as well, to evaluate the performances of the scoring functions on binding mode and affinity predictions. Within this structure subset, we investigate the importance of accurate ligand and protein conformational sampling and find that the binding affinity predictions are less sensitive to non-native ligand and protein conformations than the binding mode predictions. We also find the full CSAR Dataset to be more challenging in making binding mode predictions than the subset with structures. The script files used for preparing the CSAR Dataset for docking, including scripts for canonicalization of the ligand atoms, are offered freely to the academic community. PMID:23656179
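A simple force-field vdW potential of the kind mentioned above can be sketched as a pairwise 12-6 Lennard-Jones sum. The abstract does not give VDWScore's actual functional form or parameters, so everything below, including the sigma/epsilon values, is an illustrative assumption.

```python
def lj_pair(r, sigma=3.5, epsilon=0.1):
    """12-6 Lennard-Jones energy for one atom pair at distance r.
    sigma and epsilon here are placeholder values, not VDWScore's."""
    s6 = (sigma / r) ** 6
    return 4 * epsilon * (s6 * s6 - s6)

def vdw_score(pairs):
    """Sum pairwise vdW energies over protein-ligand atom pairs;
    lower (more negative) totals indicate a better steric fit."""
    return sum(lj_pair(r, s, e) for r, s, e in pairs)

# The 12-6 well bottom sits at r = sigma * 2**(1/6), with depth -epsilon.
r_min = 3.5 * 2 ** (1 / 6)
score = vdw_score([(r_min, 3.5, 0.1), (r_min, 3.5, 0.1)])
```

Because each pair term blows up steeply as r shrinks below sigma, such a score is very sensitive to clashes in non-native poses, which is consistent with the observation that binding mode prediction suffers more from conformational error than affinity prediction does.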

  13. Automation in biological crystallization.

    PubMed

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  14. Automation in biological crystallization

    PubMed Central

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  15. Autonomy, Automation, and Systems

    NASA Astrophysics Data System (ADS)

    Turner, Philip R.

    1987-02-01

    Aerospace industry interest in autonomy and automation, given fresh impetus by the national goal of establishing a Space Station, is becoming a major item of research and technology development. The promise of new technology arising from research in Artificial Intelligence (AI) has focused much attention on its potential in autonomy and automation. These technologies can improve performance in autonomous control functions that involve planning, scheduling, and fault diagnosis of complex systems. There are, however, many aspects of system and subsystem design in an autonomous system that impact AI applications, but do not directly involve AI technology. Development of a system control architecture, establishment of an operating system within the design, providing command and sensory data collection features appropriate to automated operation, and the use of design analysis tools to support system engineering are specific examples of major design issues. Aspects such as these must also receive attention and technology development support if we are to implement complex autonomous systems within the realistic limitations of mass, power, cost, and available flight-qualified technology that are all-important to a flight project.

  16. Automating existing stations

    SciTech Connect

    Little, J.E.

    1986-09-01

    The task was to automate 20 major compressor stations along ANR Pipeline Co.'s Southeastern and Southwestern pipelines in as many months. Meeting this schedule required standardized hardware and software design. Working with Bristol Babcock Co., ANR came up with an off-the-shelf station automation package suitable for a variety of compressor stations. The project involved 148 engines totaling 488,880 hp in the 20 stations. ANR Pipeline developed software for these engines and compressors, including horsepower prediction and efficiency. The system places processor ''intelligence'' at each station and engine to monitor and control operations. The station processor receives commands from the company's gas dispatch center at Detroit and informs dispatchers of alarms, conditions, and decisions it makes. The automation system is controlled by the Detroit center through a central communications network. Operating orders from the center are sent to the station processor, which obeys them using the most efficient means of operation at the station's disposal. In a malfunction, a control and communications backup system takes over. Commands and information are transmitted directly between the center and the individual compressor stations. Stations receive their orders based on throughput, with suction and discharge pressure overrides. Additionally, a discharge temperature override protects pipeline coatings.

  17. Automation of optical tweezers

    NASA Astrophysics Data System (ADS)

    Hsieh, Tseng-Ming; Chang, Bo-Jui; Hsu, Long

    2000-07-01

    Optical tweezers are a newly developed instrument that makes possible the manipulation of microscopic particles under a microscope. In this paper, we present the automation of an optical tweezers system, which consists of a modified optical tweezers, equipped with two motorized actuators to deflect a 1 W argon laser beam, and a computer control system including a joystick. The trapping of a single bead and of a group of Lactobacillus acidophilus was shown separately. With the aid of the joystick and two auxiliary cursors superimposed on the real-time image of a trapped bead, we demonstrated the simple and convenient operation of the automated optical tweezers. By steering the joystick and then pressing a button on it, we assign a new location for the trapped bead to move to. The increment of the motion, 0.04 μm for a 20X objective, is negligible. With a fast computer for image processing, the manipulation of the trapped bead is smooth and accurate. The automation of the optical tweezers is also programmable. This technique may be applied to accelerate DNA hybridization in a gene chip. The combination of the modified optical tweezers with the computer control system provides a tool for precise manipulation of microparticles in many scientific fields.

  18. Development and implementation of industrialized, fully automated high throughput screening systems

    PubMed Central

    2003-01-01

    Automation has long been a resource for high-throughput screening at Bristol-Myers Squibb. However, with growing deck sizes and decreasing time lines, a new generation of more robust, supportable automated systems was necessary for accomplishing high-throughput screening goals. Implementation of this new generation of automated systems required numerous decisions concerning hardware, software and the value of in-house automation expertise. This project has resulted in fast, flexible, industrialized automation systems with a strong in-house support structure that we believe meets our current high-throughput screening requirements and will continue to meet them well into the future. PMID:18924614

  19. Automation model of sewerage rehabilitation planning.

    PubMed

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds and is tedious and time-consuming. This paper proposes an automation model that plans optimal sewerage rehabilitation strategies for the sewer system by integrating image processing, clustering technology, optimization, and visualization display. First, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Second, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction and even collapse. Finally, the results from the automation model can be visualized in a geographic information system in which essential information on the sewer system and the sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.
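The co-occurrence feature extraction step can be sketched as follows: build a gray-level co-occurrence matrix (GLCM) for one pixel offset and derive a texture feature from it. This is a minimal illustration, not the paper's implementation, and the tiny test image is hypothetical.

```python
from collections import Counter

def glcm(image, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one offset (dx, dy):
    the probability of each ordered pair of gray levels appearing at that
    spatial displacement. image is a list of equal-length rows of ints."""
    rows, cols = len(image), len(image[0])
    counts = Counter()
    for y in range(rows):
        for x in range(cols):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < cols and 0 <= y2 < rows:
                counts[(image[y][x], image[y2][x2])] += 1
    total = sum(counts.values())
    return {pair: n / total for pair, n in counts.items()}

def contrast(p):
    """Contrast feature: large when co-occurring gray levels differ a lot,
    as they do across a crack edge in an inspection frame."""
    return sum(prob * (i - j) ** 2 for (i, j), prob in p.items())

# A vertically striped patch has high horizontal contrast:
p = glcm([[0, 3, 0, 3],
          [0, 3, 0, 3]])
```

Features such as contrast, energy, and homogeneity computed over several offsets form the vector that the classification network compares against the databank of typical failures.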

  20. HADES PC network: an automated data entry system

    SciTech Connect

    Hegemann, D.L.

    1986-09-12

    Mound's Health Physics section is faced with an increasing need to store and retrieve radiological data. This need has been addressed by the Health Physics Automated Data Entry System (HADES) which assumed a full production status on April 1, 1986. Mound's Technical Computer Support group implemented HADES in a series of phases which allowed high priority needs to be immediately supported. As a result of the system's personal computer-based structure, additional capabilities such as automated data acquisition were easily brought on-line. Since its inception in the first quarter of 1984, HADES has matured into a cost-efficient automated data acquisition system for Mound's Health Physics section.

  1. Study of Automated Module Fabrication for Lightweight Solar Blanket Utilization

    NASA Technical Reports Server (NTRS)

    Gibson, C. E.

    1979-01-01

    Cost-effective automated techniques for accomplishing the titled purpose, based on existing in-house capability, are described. As a measure of the considered automation, the production of a 50 kilowatt solar array blanket, exclusive of support and deployment structure, within an eight-month fabrication period was used. Solar cells considered for this blanket were 2 x 4 x .02 cm wrap-around cells and 2 x 2 x .005 cm and 3 x 3 x .005 cm standard bar-contact thin cells, all with welded contacts. Existing fabrication processes are described, the rationale for each process is given, and the capability for further automation is discussed.

  2. Automation and Robotics for Space-Based Systems, 1991

    NASA Technical Reports Server (NTRS)

    Williams, Robert L., II (Editor)

    1992-01-01

    The purpose of this in-house workshop was to assess the state-of-the-art of automation and robotics for space operations from an LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.

  3. AUTOMATED INADVERTENT INTRUDER APPLICATION

    SciTech Connect

    Koffman, L.; Lee, P.; Cook, J.; Wilhite, E.

    2007-05-29

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. 
The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
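The pathway dose algebra described above amounts to a sum of radionuclide concentrations multiplied by dose conversion factors. The sketch below illustrates that shape of calculation only; the nuclides, concentrations, and factors are hypothetical placeholders, not SRS performance-assessment values.

```python
def scenario_dose(inventory, dcf):
    """Total dose for one intruder scenario: sum over radionuclides of
    concentration times a pathway-summed dose conversion factor.
    All names and numbers used with this sketch are hypothetical."""
    return sum(conc * dcf[nuclide] for nuclide, conc in inventory.items())

# Hypothetical inventory (pCi/g) and DCFs (mrem/yr per pCi/g):
dose = scenario_dose({"Cs-137": 2.0, "Sr-90": 1.5},
                     {"Cs-137": 0.8, "Sr-90": 0.4})
```

Moving this kind of repeated algebra from per-cell spreadsheet formulas into one audited function is exactly the class of copy-paste error the application was built to eliminate.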

  4. Automated follicle analysis in ovarian ultrasound

    NASA Astrophysics Data System (ADS)

    Krivanek, Anthony; Liang, Weidong; Sarty, Gordon E.; Pierson, Roger A.; Sonka, Milan

    1998-06-01

    For women undergoing assisted reproductive therapy, ovarian ultrasound has become an invaluable tool for monitoring the growth and assessing the physiological status of individual follicles. Measurements of the size and shape of follicles are the primary means of evaluation by physicians. Currently, follicle wall segmentation is achieved by manual tracing, which is time-consuming and susceptible to inter-operator variation. We are introducing a completely automated method of follicle wall isolation which provides faster, more consistent analysis. Our automated method is a 4-step process which employs watershed segmentation and a knowledge-based graph search algorithm which utilizes a priori information about follicle structure for inner and outer wall detection. The automated technique was tested on 36 ultrasonographic images of women's ovaries. Five images from this set were omitted due to poor image quality. Validation of the remaining 31 ultrasound images against manually traced borders has shown an average rms error of 0.61 +/- 0.40 mm for inner border and 0.61 +/- 0.31 mm for outer border detection. Quantitative comparison of the computer-defined borders and the user-defined borders supports the accuracy of our automated method of follicle analysis.
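The core of a graph-search border detector can be approximated by a dynamic-programming search for the least-cost path of one row per column through a cost image, moving at most one row between adjacent columns. This is a minimal sketch; the a priori follicle-shape constraints of the actual knowledge-based method are not modeled here.

```python
def min_cost_border(cost):
    """Least-cost left-to-right border through a cost matrix: pick one row
    per column, allowing row moves of at most +/-1 between columns.
    Low cost would correspond to strong wall-edge evidence in the image."""
    rows, cols = len(cost), len(cost[0])
    best = [row[:] for row in cost]           # accumulated path cost
    back = [[0] * cols for _ in range(rows)]  # backpointers
    for c in range(1, cols):
        for r in range(rows):
            cost_prev, pr = min((best[p][c - 1], p)
                                for p in (r - 1, r, r + 1) if 0 <= p < rows)
            best[r][c] += cost_prev
            back[r][c] = pr
    # Trace back from the cheapest cell in the last column.
    r = min(range(rows), key=lambda i: best[i][cols - 1])
    path = [r]
    for c in range(cols - 1, 0, -1):
        r = back[r][c]
        path.append(r)
    return path[::-1]

border = min_cost_border([[9, 9, 9],
                          [1, 9, 1],
                          [9, 1, 9]])
```

In practice the polar-unwrapped follicle image plays the role of the cost matrix, so the recovered path corresponds to a closed inner or outer wall contour.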

  5. Automated Proactive Fault Isolation: A Key to Automated Commissioning

    SciTech Connect

    Katipamula, Srinivas; Brambley, Michael R.

    2007-07-31

    In this paper, we present a generic model for automated continuous commissioning and then delve in detail into one of its processes, proactive testing for fault isolation, which is key to automating commissioning. The automated commissioning process uses passive observation-based fault detection and diagnostic techniques, followed by automated proactive testing for fault isolation, automated fault evaluation, and automated reconfiguration of controls, together keeping equipment continuously controlled and running as intended. Only when hard failures occur or a physical replacement is required does the process require human intervention, and then sufficient information is provided by the automated commissioning system to target manual maintenance where it is needed. We then focus on fault isolation by presenting detailed logic that can be used to automatically isolate faults in valves, a common component in HVAC systems, as an example of how automated proactive fault isolation can be accomplished. We conclude the paper with a discussion of how this approach to isolating faults can be applied to other common HVAC components and their automated commissioning, and a summary of the paper's key conclusions.

  6. Networks Of Executive Controllers For Automation

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Cheeseman, Peter C.

    1988-01-01

    Paper discusses principal issues to be resolved in development of autonomous executive-controller shell for Space Station. Shell represents major increase in complexity of automated systems. More-complex control tasks require system that deals with different goals requiring sequences of tasks that change state of system world in complex ways. Requires integration of all functions. Applications include space station communications, tracking, life support, data processing support, navigation, and control of thermal and structural subsystems.

  8. Automation of the aircraft design process

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1974-01-01

    The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.

  9. Utility of real-time prospective motion correction (PROMO) on 3D T1-weighted imaging in automated brain structure measurements

    PubMed Central

    Watanabe, Keita; Kakeda, Shingo; Igata, Natsuki; Watanabe, Rieko; Narimatsu, Hidekuni; Nozaki, Atsushi; Dan Rettmann; Abe, Osamu; Korogi, Yukunori

    2016-01-01

    PROspective MOtion correction (PROMO) can prevent motion artefacts. The aim of this study was to determine whether brain structure measurements of motion-corrected images with PROMO were reliable and equivalent to conventional images without motion artefacts. The following T1-weighted images were obtained in healthy subjects: (A) resting scans with and without PROMO and (B) two types of motion scans (“side-to-side” and “nodding” motions) with and without PROMO. The total gray matter volumes and cortical thicknesses were significantly decreased in motion scans without PROMO as compared to the resting scans without PROMO (p < 0.05). Conversely, Bland–Altman analysis indicated no bias between motion scans with PROMO, which have good image quality, and resting scans without PROMO. In addition, there was no bias between resting scans with and without PROMO. The use of PROMO facilitated more reliable brain structure measurements in subjects moving during data acquisition. PMID:27917950

  10. Utility of real-time prospective motion correction (PROMO) on 3D T1-weighted imaging in automated brain structure measurements

    NASA Astrophysics Data System (ADS)

    Watanabe, Keita; Kakeda, Shingo; Igata, Natsuki; Watanabe, Rieko; Narimatsu, Hidekuni; Nozaki, Atsushi; Dan Rettmann; Abe, Osamu; Korogi, Yukunori

    2016-12-01

    PROspective MOtion correction (PROMO) can prevent motion artefacts. The aim of this study was to determine whether brain structure measurements of motion-corrected images with PROMO were reliable and equivalent to conventional images without motion artefacts. The following T1-weighted images were obtained in healthy subjects: (A) resting scans with and without PROMO and (B) two types of motion scans (“side-to-side” and “nodding” motions) with and without PROMO. The total gray matter volumes and cortical thicknesses were significantly decreased in motion scans without PROMO as compared to the resting scans without PROMO (p < 0.05). Conversely, Bland-Altman analysis indicated no bias between motion scans with PROMO, which have good image quality, and resting scans without PROMO. In addition, there was no bias between resting scans with and without PROMO. The use of PROMO facilitated more reliable brain structure measurements in subjects moving during data acquisition.

  11. Numerical analysis of stiffened shells of revolution. Volume 4: Engineer's program manual for STARS-2S shell theory automated for rotational structures - 2 (statics) digital computer program

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.; Ogilvie, P.

    1973-01-01

    The engineering programming information for the digital computer program for analyzing shell structures is presented. The program is designed to permit small changes such as altering the geometry or a table size to fit the specific requirements. Each major subroutine is discussed and the following subjects are included: (1) subroutine description, (2) pertinent engineering symbols and the FORTRAN coded counterparts, (3) subroutine flow chart, and (4) subroutine FORTRAN listing.

  12. Assessment of the Molecular Expression and Structure of Gangliosides in Brain Metastasis of Lung Adenocarcinoma by an Advanced Approach Based on Fully Automated Chip-Nanoelectrospray Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zamfir, Alina D.; Serb, Alina; Vukeli, Željka; Flangea, Corina; Schiopu, Catalin; Fabris, Dragana; Kalanj-Bognar, Svjetlana; Capitan, Florina; Sisu, Eugen

    2011-12-01

    Gangliosides (GGs), sialic acid-containing glycosphingolipids, are known to be involved in the invasive/metastatic behavior of brain tumor cells. Development of modern methods for determination of the variations in GG expression and structure during neoplastic cell transformation is a priority in the field of biomedical analysis. In this context, we report here on the first optimization and application of chip-based nanoelectrospray (NanoMate robot) mass spectrometry (MS) for the investigation of gangliosides in a secondary brain tumor. In our work a native GG mixture extracted and purified from brain metastasis of lung adenocarcinoma was screened by NanoMate robot coupled to a quadrupole time-of-flight MS. A native GG mixture from an age-matched healthy brain tissue, sampled and analyzed under identical conditions, served as a control. Comparative MS analysis demonstrated an evident dissimilarity in GG expression in the two tissue types. Brain metastasis is characterized by many species having a reduced N-acetylneuraminic acid (Neu5Ac) content, however, modified by fucosylation or O-acetylation such as Fuc-GM4, Fuc-GM3, di- O-Ac-GM1, O-Ac-GM3. In contrast, healthy brain tissue is dominated by longer structures exhibiting from mono- to hexasialylated sugar chains. Also, significant differences in ceramide composition were discovered. By tandem MS using collision-induced dissociation at low energies, brain metastasis-associated GD3 (d18:1/18:0) species as well as an uncommon Fuc-GM1 (d18:1/18:0) detected in the normal brain tissue could be structurally characterized. The novel protocol was able to provide a reliable compositional and structural characterization with high analysis pace and at a sensitivity situated in the fmol range.

  13. Automated method for relating regional pulmonary structure and function: integration of dynamic multislice CT and thin-slice high-resolution CT

    NASA Astrophysics Data System (ADS)

    Tajik, Jehangir K.; Kugelmass, Steven D.; Hoffman, Eric A.

    1993-07-01

    We have developed a method utilizing x-ray CT for relating pulmonary perfusion to global and regional anatomy, allowing for detailed study of structure to function relationships. A thick slice, high temporal resolution mode is used to follow a bolus contrast agent for blood flow evaluation and is fused with a high spatial resolution, thin slice mode to obtain structure- function detail. To aid analysis of blood flow, we have developed a software module, for our image analysis package (VIDA), to produce the combined structure-function image. Color coded images representing blood flow, mean transit time, regional tissue content, regional blood volume, regional air content, etc. are generated and imbedded in the high resolution volume image. A text file containing these values along with a voxel's 3-D coordinates is also generated. User input can be minimized to identifying the location of the pulmonary artery from which the input function to a blood flow model is derived. Any flow model utilizing one input and one output function can be easily added to a user selectable list. We present examples from our physiologic based research findings to demonstrate the strengths of combining dynamic CT and HRCT relative to other scanning modalities to uniquely characterize pulmonary normal and pathophysiology.
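
The mean transit time mentioned above is conventionally derived from the bolus time–density curve as its first moment (indicator-dilution principle). A minimal sketch with a hypothetical curve, not the actual VIDA module:

```python
# Mean transit time (MTT) from a time-density curve via the first moment
# of the indicator-dilution curve: MTT = sum(t*c) / sum(c).
# The curve below is hypothetical, not data from the study.

def mean_transit_time(times, concentrations):
    total = sum(concentrations)
    return sum(t * c for t, c in zip(times, concentrations)) / total

times   = [0, 1, 2, 3, 4, 5, 6]        # seconds after bolus arrival
density = [0, 5, 20, 35, 25, 10, 5]    # contrast enhancement (HU)
mtt = mean_transit_time(times, density)
```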

  14. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  15. NASA space station automation: AI-based technology review

    NASA Technical Reports Server (NTRS)

    Firschein, O.; Georgeff, M. P.; Park, W.; Neumann, P.; Kautz, W. H.; Levitt, K. N.; Rom, R. J.; Poggio, A. A.

    1985-01-01

    Research and Development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

  16. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Automated Commercial Environment (ACE) Simplified Entry: Modification of Participant Selection Criteria and... (NCAP) test concerning the simplified entry functionality in the Automated Commercial Environment (ACE...) National Customs Automation Program (NCAP) test concerning Automated Commercial Environment (ACE...

  17. World-wide distribution automation systems

    SciTech Connect

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation, feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of the system, and distribution management systems.

  18. Numerical analysis of stiffened shells of revolution. Volume 2: Users' manual for STAR-02S - shell theory automated for rotational structures - 2 (statics), digital computer program

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    A procedure for the structural analysis of stiffened shells of revolution is presented. A digital computer program based on the Love-Reissner first order shell theory was developed. The computer program can analyze orthotropic thin shells of revolution, subjected to unsymmetric distributed loading or concentrated line loads, as well as thermal strains. The geometrical shapes of the shells which may be analyzed are described. The shell wall cross section can be a sheet, sandwich, or reinforced sheet or sandwich. General stiffness input options are also available.

  19. Automating CPM-GOMS

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the methods available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the
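
The scheduling that CPM-GOMS performs ultimately rests on a critical-path computation over the dependency graph of operators, which is what the PERT charts visualize. A minimal sketch; the operator names and durations below are illustrative, not Apex's:

```python
# Critical-path computation over a DAG of operators (the "CPM" in
# CPM-GOMS). Durations (ms) and the operator graph are hypothetical.

def critical_path(durations, deps):
    """Longest finish time over a dependency DAG (recursive with memoization)."""
    finish = {}
    def t(op):
        if op not in finish:
            finish[op] = durations[op] + max((t(d) for d in deps.get(op, [])), default=0)
        return finish[op]
    return max(t(op) for op in durations)

durations = {"perceive": 100, "cognize": 50, "move_mouse": 300, "click": 100}
deps = {"cognize": ["perceive"], "move_mouse": ["cognize"], "click": ["move_mouse"]}
total = critical_path(durations, deps)  # serial chain: 100 + 50 + 300 + 100
```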

  1. Automation for optics manufacturing

    NASA Astrophysics Data System (ADS)

    Pollicove, Harvey M.; Moore, Duncan T.

    1990-11-01

    The optics industry has not followed the lead of the machining and electronics industries in applying advances in computer aided engineering (CAE), computer assisted manufacturing (CAM), automation, or quality management techniques. Automation based on computer integrated manufacturing (CIM) and flexible machining systems (FMS) has been widely implemented in these industries. Optics continues to rely on standalone equipment that preserves the highly skilled, labor-intensive optical fabrication systems developed in the 1940s. This paper describes development initiatives at the Center for Optics Manufacturing that will create computer integrated manufacturing technology and support processes for the optical industry.

  2. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    2000-01-01

    An automated propellant blending apparatus and method that uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation is discussed. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.
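
The empirical profile of binder precipitation versus countersolvent–solvent ratio described above can be used to locate the cloud point by interpolation. A minimal sketch under a simple threshold definition of the cloud point; the profile values and threshold are hypothetical:

```python
# Locate the cloud point on an empirical precipitation profile by linear
# interpolation: the countersolvent:solvent ratio at which the
# precipitated binder fraction first exceeds a threshold. Values are
# hypothetical, not the patented process parameters.

def cloud_point(ratios, precipitated, threshold=0.05):
    pts = list(zip(ratios, precipitated))
    for (r0, p0), (r1, p1) in zip(pts, pts[1:]):
        if p0 < threshold <= p1:
            # linear interpolation between the bracketing points
            return r0 + (threshold - p0) * (r1 - r0) / (p1 - p0)
    return None

ratios       = [0.0, 0.2, 0.4, 0.6, 0.8]     # countersolvent:solvent
precipitated = [0.0, 0.0, 0.02, 0.30, 0.90]  # fraction of binder
cp = cloud_point(ratios, precipitated)
```

Countersolvent addition parameters would then be set just below this interpolated ratio.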

  3. [Automated anesthesia record system].

    PubMed

    Zhu, Tao; Liu, Jin

    2005-12-01

    Based on a client/server architecture, automated anesthesia record system software running under the Windows operating system and networks has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0 and SQL Server. The system manages the patient's information throughout anesthesia. It collects and integrates data from several kinds of medical equipment, such as monitors, infusion pumps and anesthesia machines, automatically and in real time. The system then generates the anesthesia sheets automatically. The record system makes the anesthesia record more accurate and complete and can raise the anesthesiologist's working efficiency.

  4. Automated fiber pigtailing machine

    DOEpatents

    Strand, Oliver T.; Lowry, Mark E.

    1999-01-01

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems.
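
Automated fiber-to-device alignment is commonly implemented as a search that maximizes coupled optical power. The sketch below is a generic coordinate hill climb against a Gaussian coupling model standing in for a real power-meter reading; it is not the AFPM's actual algorithm:

```python
# Fiber alignment as a coordinate hill climb that maximizes coupled
# power, refining the step size toward sub-micron resolution. The
# Gaussian coupling model and peak location are hypothetical.
import math

def coupled_power(x, y, peak=(3.2, -1.7), w=5.0):
    dx, dy = x - peak[0], y - peak[1]
    return math.exp(-(dx * dx + dy * dy) / (w * w))

def align(x, y, step=2.0, min_step=0.05):
    while step >= min_step:
        best = coupled_power(x, y)
        moved = False
        for nx, ny in ((x + step, y), (x - step, y), (x, y + step), (x, y - step)):
            if coupled_power(nx, ny) > best:
                x, y, best, moved = nx, ny, coupled_power(nx, ny), True
        if not moved:
            step /= 2  # no improving move: refine the search step
    return x, y

fx, fy = align(0.0, 0.0)  # converges near the coupling peak
```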

  5. Automated fiber pigtailing machine

    DOEpatents

    Strand, O.T.; Lowry, M.E.

    1999-01-05

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems. 26 figs.

  6. Automated wire preparation system

    NASA Astrophysics Data System (ADS)

    McCulley, Deborah J.

    The first step toward an automated wire harness facility for the aerospace industry has been taken by implementing the Wire Vektor 2000 into the wire harness preparation area. An overview of the Wire Vektor 2000 is given, including the facilities for wire cutting, marking, and transporting, for wire end processing, and for system control. Production integration in the Wire Vektor 2000 system is addressed, considering the hardware/software debug system and the system throughput. The manufacturing changes that have to be made in implementing the Wire Vektor 2000 are discussed.

  7. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    1999-01-01

    An automated propellant blending apparatus and method uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.

  8. Automated image analysis of intra-tumoral and peripheral endocrine organ vascular bed regression using 'Fibrelength' as a novel structural biomarker.

    PubMed

    Hargreaves, Adam; Bigley, Alison; Price, Shirley; Kendrew, Jane; Barry, Simon T

    2017-02-10

    The study of vascular modulation has received a great deal of attention in recent years as knowledge has increased around the role of angiogenesis within disease contexts such as cancer. Despite rapidly expanding insights into the molecular processes involved and the concomitant generation of a number of anticancer vascular modulating chemotherapeutics, techniques used in the measurement of structural vascular change have advanced more modestly, particularly with regard to the preclinical quantification of off-target vascular regression within systemic, notably endocrine, blood vessels. Such changes translate into a number of major clinical side effects and there remains a need for improved preclinical screening and analysis. Here we present the generation of a novel structural biomarker, which can be incorporated into a number of contemporary image analysis platforms and used to compare tumour versus systemic host tissue vascularity. By contrasting the measurements obtained, the preclinical efficacy of vascular modulating chemotherapies can be evaluated in light of the predicted therapeutic window. Copyright © 2017 John Wiley & Sons, Ltd.

  9. VSDMIP 1.5: an automated structure- and ligand-based virtual screening platform with a PyMOL graphical user interface

    NASA Astrophysics Data System (ADS)

    Cabrera, Álvaro Cortés; Gil-Redondo, Rubén; Perona, Almudena; Gago, Federico; Morreale, Antonio

    2011-09-01

    A graphical user interface (GUI) for our previously published virtual screening (VS) and data management platform VSDMIP (Gil-Redondo et al. J Comput Aided Mol Design, 23:171-184, 2009) that has been developed as a plugin for the popular molecular visualization program PyMOL is presented. In addition, a ligand-based VS module (LBVS) has been implemented that complements the already existing structure-based VS (SBVS) module and can be used in those cases where the receptor's 3D structure is not known or for pre-filtering purposes. This updated version of VSDMIP is placed in the context of similar available software and its LBVS and SBVS capabilities are tested here on a reduced set of the Directory of Useful Decoys database. Comparison of results from both approaches confirms the trend found in previous studies that LBVS outperforms SBVS. We also show that by combining LBVS and SBVS, and using a cluster of 100 modern processors, it is possible to perform complete VS studies of several million molecules in less than a month. As the main processes in VSDMIP are 100% scalable, more powerful processors and larger clusters would notably decrease this time span. The plugin is distributed under an academic license upon request from the authors.
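
A common way to realize the LBVS step described above is to rank library molecules by fingerprint similarity to a known active, e.g. with the Tanimoto coefficient; the descriptors VSDMIP actually uses may differ, and the fingerprints below are toy bit sets:

```python
# Ligand-based ranking by Tanimoto similarity of binary fingerprints,
# a standard LBVS scheme shown for illustration (not necessarily the
# descriptors VSDMIP uses). Fingerprints are toy sets of "on" bits.

def tanimoto(fp_a, fp_b):
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

query = {1, 4, 7, 9, 12}          # fingerprint of a known active
library = {
    "mol_a": {1, 4, 7, 9, 12, 15},  # near-duplicate of the query
    "mol_b": {1, 4, 20, 31},
    "mol_c": {2, 5, 8},
}
ranked = sorted(library, key=lambda m: tanimoto(query, library[m]), reverse=True)
```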

  10. VSDMIP 1.5: an automated structure- and ligand-based virtual screening platform with a PyMOL graphical user interface.

    PubMed

    Cabrera, Álvaro Cortés; Gil-Redondo, Rubén; Perona, Almudena; Gago, Federico; Morreale, Antonio

    2011-09-01

    A graphical user interface (GUI) for our previously published virtual screening (VS) and data management platform VSDMIP (Gil-Redondo et al. J Comput Aided Mol Design, 23:171-184, 2009) that has been developed as a plugin for the popular molecular visualization program PyMOL is presented. In addition, a ligand-based VS module (LBVS) has been implemented that complements the already existing structure-based VS (SBVS) module and can be used in those cases where the receptor's 3D structure is not known or for pre-filtering purposes. This updated version of VSDMIP is placed in the context of similar available software and its LBVS and SBVS capabilities are tested here on a reduced set of the Directory of Useful Decoys database. Comparison of results from both approaches confirms the trend found in previous studies that LBVS outperforms SBVS. We also show that by combining LBVS and SBVS, and using a cluster of ~100 modern processors, it is possible to perform complete VS studies of several million molecules in less than a month. As the main processes in VSDMIP are 100% scalable, more powerful processors and larger clusters would notably decrease this time span. The plugin is distributed under an academic license upon request from the authors. © Springer Science+Business Media B.V. 2011

  11. The Center for Optimized Structural Studies (COSS) platform for automation in cloning, expression, and purification of single proteins and protein-protein complexes.

    PubMed

    Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos

    2014-06-01

    Expression in Escherichia coli represents the simplest and most cost-effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry, where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel, and subsequently screen in small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top-ranking targets for large-scale purification. In the establishment of our pipeline, emphasis was put on streamlining the processes so that they can be easily, but not necessarily, automated. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows for easy adoption, either partially or entirely, by any average hypothesis-driven laboratory in a manual or robot-assisted manner.
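
The selection step that picks top-ranking constructs can be sketched as a weighted ranking of small-scale screen readouts. The weights, field names, and values below are hypothetical, not the COSS scoring system itself:

```python
# Rank expression constructs by a soluble-expression score and keep the
# best for scale-up. Weights and screen readouts are hypothetical.

def score(c):
    # weight soluble yield most, reward purity, penalize aggregation
    return (3 * c["soluble_mg_per_l"]
            + c["purity_pct"] / 10
            - 20 * c["aggregated_fraction"])

screens = {
    "frag_1-120":  {"soluble_mg_per_l": 8.0, "purity_pct": 90, "aggregated_fraction": 0.10},
    "frag_30-250": {"soluble_mg_per_l": 2.0, "purity_pct": 95, "aggregated_fraction": 0.05},
    "frag_1-250":  {"soluble_mg_per_l": 0.5, "purity_pct": 60, "aggregated_fraction": 0.60},
}
top = sorted(screens, key=lambda name: score(screens[name]), reverse=True)[:2]
```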

  12. Automated System Marketplace 1995: The Changing Face of Automation.

    ERIC Educational Resources Information Center

    Barry, Jeff; And Others

    1995-01-01

    Discusses trends in the automated system marketplace with specific attention to online vendors and their customers: academic, public, school, and special libraries. Presents vendor profiles; tables and charts on computer systems and sales; and sidebars that include a vendor source list and the differing views on procuring an automated library…

  13. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    EPA Pesticide Factsheets

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for
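
Watershed delineation from a DEM, as AGWA performs it, begins by assigning each cell a flow direction; the classic D8 rule sends each cell toward its steepest-descent neighbor. A minimal sketch on a toy grid (illustrative, not AGWA's implementation):

```python
# D8 flow direction on a toy DEM grid: each cell drains toward its
# steepest-descent neighbor, the first step in delineating a watershed
# from a Digital Elevation Model. Elevations are hypothetical.

def d8_direction(dem, r, c):
    """Return (dr, dc) of the steepest downhill neighbor, or None at a pit."""
    best, steepest = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = (dr * dr + dc * dc) ** 0.5  # diagonals are longer
                slope = (dem[r][c] - dem[rr][cc]) / dist
                if slope > steepest:
                    best, steepest = (dr, dc), slope
    return best

dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 3]]
flow = d8_direction(dem, 1, 1)  # center cell drains toward the low corner
```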

  14. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom (SSF) systems are complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. To use AA technology for the augmentation of system management functions requires a development model which consists of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.

  15. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the next years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. But the integration of various equipment in a production plant calls for unified handling of data flow and interfaces. Only agile vision systems can meet these conflicting demands: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  16. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer now considered to be an environmental hazard. Manual BP is also subject to measurement error on the part of the physician/nurse and patient-related anxiety which can result in poor quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension.
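
The AOBP protocol above amounts to averaging a series of automated readings and applying the < 135/85 mm Hg cut point. A minimal sketch with hypothetical readings; discarding the first reading before averaging is a common device convention assumed here, not a rule stated in the abstract:

```python
# Average automated office BP readings and apply the 135/85 mm Hg cut
# point mentioned in the text. Readings are hypothetical; dropping the
# first (often elevated) reading is an assumed device convention.

def aobp_mean(readings, discard_first=True):
    kept = readings[1:] if discard_first and len(readings) > 1 else readings
    n = len(kept)
    mean_sys = sum(s for s, d in kept) / n
    mean_dia = sum(d for s, d in kept) / n
    return mean_sys, mean_dia

readings = [(148, 92), (136, 86), (134, 84), (132, 84), (130, 82)]
mean_sys, mean_dia = aobp_mean(readings)
elevated = mean_sys >= 135 or mean_dia >= 85
```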

  17. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; hide

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime-mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver, in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about half an hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  18. Automating quantum experiment control

    NASA Astrophysics Data System (ADS)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.

  19. A Demonstration of Automated DNA Sequencing.

    ERIC Educational Resources Information Center

    Latourelle, Sandra; Seidel-Rogol, Bonnie

    1998-01-01

    Details a simulation that employs a paper-and-pencil model to demonstrate the principles behind automated DNA sequencing. Discusses the advantages of automated sequencing as well as the chemistry of automated DNA sequencing. (DDR)

  20. Automating measurement from standard radiographs

    NASA Astrophysics Data System (ADS)

    Harris, Adam I.; Dori, Dov; Sheinkop, Mitchell; Berkson, Eric; Haralick, Robert M.

    1993-04-01

    A radiographic examination is an obligatory portion of an orthopaedic examination. Techniques such as computed tomography lend themselves easily to computerized analysis, but both expense and radiation hazards prohibit their routine use in orthopaedic practice. Standard radiographs provide a significant amount of information for the orthopaedic surgeon, who makes many measurements and assessments from them. A major problem is that these measurements are performed by hand, most often by the operating surgeon, who may not be completely objective. To overcome this, and to alleviate the burden of manual measurements that must be made by trained professionals, we have initiated a program to automate certain radiographic measurements. The technique involves digitizing standard radiographs, from which features are extracted and identified. This poses a challenge: structures such as soft tissues (muscle and bowel) markedly decrease the signal-to-noise ratio of the image. The work discusses modeling of the soft-tissue structures in order to enhance detection and identification of bone landmarks. These are the anchors for standard measurements, which in turn have clinical utility.

  1. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry; Riedesel, Joel; Myers, Chris; Miller, William; Jones, Ellen F.; Freeman, Kenneth; Walsh, Richard; Walls, Bryan K.; Weeks, David J.; Bechtel, Robert T.

    1992-01-01

    Autonomous power-distribution system includes power-control equipment and automation equipment. System automatically schedules connection of power to loads and reconfigures itself when it detects fault. Potential terrestrial applications include optimization of consumption of power in homes, power supplies for autonomous land vehicles and vessels, and power supplies for automated industrial processes.

  2. Robotics/Automated Systems Technicians.

    ERIC Educational Resources Information Center

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  3. Migration monitoring with automated technology

    Treesearch

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  5. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
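
The ATA idea summarized above can be illustrated with a toy example. The item bank, content-area quotas, and information values below are all invented, and an exhaustive search stands in for a real mixed-integer programming solver; this is a sketch of the selection objective, not the article's method.

```python
from itertools import combinations

# Hypothetical item bank: (item_id, content_area, information at the target ability).
bank = [
    (1, "algebra", 0.9), (2, "algebra", 0.4), (3, "geometry", 0.7),
    (4, "geometry", 0.6), (5, "algebra", 0.8), (6, "geometry", 0.3),
]

def assemble(bank, n_items=4, per_area={"algebra": 2, "geometry": 2}):
    """Pick the n_items maximizing summed information while meeting the
    content-area quotas (brute force standing in for a MIP solver)."""
    best, best_info = None, -1.0
    for form in combinations(bank, n_items):
        counts = {}
        for _, area, _ in form:
            counts[area] = counts.get(area, 0) + 1
        if counts != per_area:          # content specification not met
            continue
        info = sum(i for _, _, i in form)
        if info > best_info:
            best, best_info = form, info
    return [item_id for item_id, _, _ in best], best_info

ids, info = assemble(bank)
print(ids, round(info, 2))  # the most informative form meeting the quotas
```

In a real ATA system the same objective and constraints would be handed to a mixed-integer solver, which scales to banks of thousands of items.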

  6. Opening up Library Automation Software

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  8. Translation: Aids, Robots, and Automation.

    ERIC Educational Resources Information Center

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  9. Automated Circulation. SPEC Kit 43.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    Of the 64 libraries responding to a 1978 Association of Research Libraries (ARL) survey, 37 indicated that they used automated circulation systems; half of these were commercial systems, and most were batch-process or combination batch process and online. Nearly all libraries without automated systems cited lack of funding as the reason for not…

  10. Classification of Automated Search Traffic

    NASA Astrophysics Data System (ADS)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
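
As a rough illustration of separating human from automated query traffic, the sketch below computes a few hypothetical session features (query rate, click ratio, inter-query regularity) and applies invented threshold rules; the paper's actual feature set and trained classifiers are far richer than this.

```python
# Hypothetical features inspired by the two feature classes described:
# a physical model of human interaction (clicks, pacing) and behavioral
# patterns of automation (metronomic query timing). Thresholds are invented.
def session_features(timestamps, clicks):
    """timestamps: query arrival times in seconds; clicks: result clicks."""
    n = len(timestamps)
    span = (max(timestamps) - min(timestamps)) or 1e-9
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "rate": n / span,                                    # queries per second
        "click_ratio": clicks / n,                           # humans usually click
        "gap_spread": (max(gaps) - min(gaps)) if gaps else 0.0,  # bots are regular
    }

def is_automated(feat, rate_max=0.5, click_min=0.1, spread_min=0.2):
    """Flag sessions that query too fast, or never click while pacing regularly."""
    return feat["rate"] > rate_max or (
        feat["click_ratio"] < click_min and feat["gap_spread"] < spread_min)

human = session_features([0, 7, 19, 40], clicks=3)
bot = session_features([0, 1, 2, 3, 4, 5, 6, 7], clicks=0)
print(is_automated(human), is_automated(bot))
```

A real system would feed many such features into binary and multiclass classifiers rather than hand-tuned thresholds.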

  11. Automated spectral classification and the GAIA project

    NASA Technical Reports Server (NTRS)

    Lasala, Jerry; Kurtz, Michael J.

    1995-01-01

    Two-dimensional spectral types for each of the stars observed in the Global Astrometric Interferometer for Astrophysics (GAIA) mission would provide additional information for galactic structure and stellar evolution studies, as well as helping in the identification of unusual objects and populations. The classification of the large quantity of generated spectra requires that automated techniques be implemented. Approaches for automatic classification are reviewed, and a metric-distance method is discussed. In tests, the metric-distance method produced spectral types with mean errors comparable to those of human classifiers working at similar resolution. Data and equipment requirements for an automated classification survey are discussed. A program of auxiliary observations is proposed to yield spectral types and radial velocities for the GAIA-observed stars.
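
A metric-distance classifier of the kind discussed can be sketched as a nearest-template search: assign each spectrum the type of the closest template under some distance. The spectra, spectral types, and Euclidean metric below are illustrative assumptions, not the survey's actual data or distance measure.

```python
# Toy metric-distance spectral classification: 4-bin "spectra" and invented types.
def metric_distance_type(spectrum, templates):
    """Return the template type at smallest Euclidean distance to spectrum."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(templates, key=lambda t: dist(spectrum, templates[t]))

templates = {
    "A0": [1.0, 0.8, 0.3, 0.1],   # blue-heavy continuum (invented)
    "G2": [0.6, 0.7, 0.7, 0.5],   # solar-like (invented)
    "M5": [0.2, 0.4, 0.8, 0.9],   # red-heavy (invented)
}
print(metric_distance_type([0.55, 0.72, 0.65, 0.5], templates))
```

With real survey spectra the templates would come from classified standards, and the mean classification error can be compared against human classifiers, as in the tests reported above.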

  12. Automation software for a materials testing laboratory

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.; Bonacuse, Peter J.

    1986-01-01

    A comprehensive software system for automating much of the experimental process has recently been completed at the Lewis Research Center's high-temperature fatigue and structures laboratory. The system was designed to support experiment definition and conduct, results analysis and archiving, and report generation activities. This was accomplished through the design and construction of several software systems, as well as through the use of several commercially available software products, all operating on a local, distributed minicomputer system. Experimental capabilities currently supported in an automated fashion include both isothermal and thermomechanical fatigue and deformation testing capabilities. The future growth and expansion of this system will be directed toward providing multiaxial test control, enhanced thermomechanical test control, and higher test frequency (hundreds of hertz).

  14. Automation impact study of Army Training Management

    SciTech Connect

    Sanquist, T.F.; Schuller, C.R.; McCallum, M.C.; Underwood, J.A.; Bettin, P.J.; King, J.L.; Melber, B.D.; Hostick, C.J.; Seaver, D.A.

    1988-01-01

    The main objectives of this impact study were to identify the potential cost savings associated with automated Army Training Management (TM), and to perform a cost-benefit analysis for an Army-wide automated TM system. A subsidiary goal was to establish baseline data for an independent evaluation of a prototype Integrated Training Management System (ITMS), to be tested in the fall of 1988. A structured analysis of TM doctrine was performed for comparison with empirical data gathered in a job analysis survey of selected units of the 9ID (MTZ) at Ft. Lewis, Washington. These observations will be extended to other units in subsequent surveys. The survey data concerning staffing levels and amount of labor expended on eight distinct TM tasks were analyzed in a cost effectiveness model. The main results of the surveys and cost effectiveness modelling are summarized. 18 figs., 47 tabs.

  15. Flexible automation of cell culture and tissue engineering tasks.

    PubMed

    Knoll, Alois; Scherer, Torsten; Poggendorf, Iris; Lütkemeyer, Dirk; Lehmann, Jürgen

    2004-01-01

    Until now, the predominant use cases of industrial robots have been routine handling tasks in the automotive industry. In biotechnology and tissue engineering, in contrast, only very few tasks have been automated with robots. New developments in robot platform and robot sensor technology, however, make it possible to automate plants that largely depend on human interaction with the production process, e.g., for material and cell culture fluid handling, transportation, operation of equipment, and maintenance. In this paper we present a robot system that lends itself to automating routine tasks in biotechnology but also has the potential to automate other production facilities that are similar in process structure. After motivating the design goals, we describe the system and its operation, illustrate sample runs, and give an assessment of the advantages. We conclude this paper by giving an outlook on possible further developments.

  16. Towards automated screening of two-dimensional crystals.

    PubMed

    Cheng, Anchi; Leung, Albert; Fellmann, Denis; Quispe, Joel; Suloway, Christian; Pulokas, James; Abeyrathne, Priyanka D; Lam, Joseph S; Carragher, Bridget; Potter, Clinton S

    2007-12-01

    Screening trials to determine the presence of two-dimensional (2D) protein crystals suitable for three-dimensional structure determination using electron crystallography is a very labor-intensive process. Methods compatible with fully automated screening have been developed for the process of crystal production by dialysis and for producing negatively stained grids of the resulting trials. Further automation via robotic handling of the EM grids, and semi-automated transmission electron microscopic imaging and evaluation of the trial grids is also possible. We, and others, have developed working prototypes for several of these tools and tested and evaluated them in a simple screen of 24 crystallization conditions. While further development of these tools is certainly required for a turn-key system, the goal of fully automated screening appears to be within reach.

  17. Automated segmentation of 3D anatomical structures on CT images by using a deep convolutional network based on end-to-end learning approach

    NASA Astrophysics Data System (ADS)

    Zhou, Xiangrong; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2017-02-01

    We have proposed an end-to-end learning approach that trains a deep convolutional neural network (CNN) for automatic CT image segmentation, accomplishing a voxel-wise multiple classification that directly maps each voxel of a 3D CT image to an anatomical label. The novelties of our proposed method were (1) transforming anatomical structure segmentation on 3D CT images into a majority voting over the results of 2D semantic image segmentation on a number of 2D slices from different image orientations, and (2) using "convolution" and "deconvolution" networks to achieve the conventional "coarse recognition" and "fine extraction" functions, integrated into a compact all-in-one deep CNN for CT image segmentation. The advantage compared with previous works was its capability to accomplish real-time image segmentation on 2D slices of arbitrary CT-scan range (e.g. body, chest, abdomen) and to produce correspondingly sized output. In this paper, we propose an improvement of our approach by adding an organ localization module to limit the CT image range for training and testing the deep CNNs. A database consisting of 240 3D CT scans with human-annotated ground truth was used for training (228 cases) and testing (the remaining 12 cases). We applied the improved method to segment the pancreas and left kidney regions, respectively. The preliminary results showed that segmentation accuracy improved significantly (the Jaccard index increased by 34% for the pancreas and 8% for the kidney over our previous results). The effectiveness and usefulness of the proposed improvement for CT image segmentation were confirmed.
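
The majority-voting step described in novelty (1) can be sketched per voxel: each voxel takes the label that most of the orientation-wise 2D segmentations agree on. The organ labels and the three flattened label lists below are invented stand-ins for the 2D CNN outputs.

```python
from collections import Counter

# Fuse per-orientation 2D segmentations into one 3D labeling by majority vote.
# Each list holds one invented label per voxel (flattened for simplicity).
def vote(axial, coronal, sagittal):
    fused = []
    for labels in zip(axial, coronal, sagittal):
        fused.append(Counter(labels).most_common(1)[0][0])  # modal label wins
    return fused

axial    = ["liver", "liver", "kidney", "bg"]
coronal  = ["liver", "bg",    "kidney", "bg"]
sagittal = ["bg",    "liver", "liver",  "bg"]
print(vote(axial, coronal, sagittal))
```

In the actual method the three inputs are dense label maps produced by the convolution/deconvolution network on axial, coronal, and sagittal slices of the same scan.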

  18. Automated Detection and Segmentation of Vascular Structures of Skin Lesions Seen in Dermoscopy, with an application to Basal Cell Carcinoma Classification.

    PubMed

    Kharazmi, Pegah; AlJasser, Mohammed I; Lui, Harvey; Wang, Z Jane; Lee, Tim K

    2016-12-08

    Blood vessels are important biomarkers in skin lesions, both diagnostically and clinically. Detection and quantification of cutaneous blood vessels provide critical information toward lesion diagnosis and assessment. In this paper, a novel framework for detection and segmentation of cutaneous vasculature from dermoscopy images is presented, and the extracted vascular features are explored for skin cancer classification. Given a dermoscopy image, we segment vascular structures of the lesion by first decomposing the image, using independent component analysis, into melanin and hemoglobin components. This eliminates the effect of pigmentation on the visibility of blood vessels. Using k-means clustering, the hemoglobin component is then clustered into normal, pigmented, and erythema regions. Shape filters are then applied to the erythema cluster at different scales, and a vessel mask is generated by global thresholding. Segmentation sensitivity and specificity of 90% and 86% were achieved on a set of 500,000 manually segmented pixels provided by an expert. To further demonstrate the superiority of the proposed method, based on the segmentation results we defined and extracted vascular features for lesion diagnosis in basal cell carcinoma (BCC). On a data set of 659 lesions (299 BCC and 360 non-BCC), a set of 12 vascular features was extracted from the final vessel images of the lesions and fed into a random forest classifier. Compared with several other state-of-the-art methods, the proposed method achieves the best performance of 96.5% AUC in differentiating BCC from benign lesions using only the extracted vascular features.
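
The clustering stage of the pipeline (partitioning the hemoglobin component into normal, pigmented, and erythema regions) can be illustrated with a toy one-dimensional k-means over hemoglobin-channel intensities; the values, initial centers, and three-cluster interpretation below are assumptions for illustration only.

```python
# Minimal 1-D k-means standing in for clustering the hemoglobin component
# into three regions. All intensities and initial centers are invented.
def kmeans_1d(values, centers, iters=20):
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            # assign each intensity to its nearest center
            i = min(range(len(centers)), key=lambda k: abs(v - centers[k]))
            clusters[i].append(v)
        # recompute each center as its cluster mean (keep old center if empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

values = [0.1, 0.15, 0.12, 0.5, 0.55, 0.48, 0.9, 0.92, 0.88]
centers, clusters = kmeans_1d(values, centers=[0.0, 0.5, 1.0])
print([round(c, 2) for c in centers])
```

In the real framework the erythema cluster would then be passed through multi-scale shape filters and globally thresholded to produce the vessel mask.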

  19. The historical development and basis of human factors guidelines for automated systems in aeronautical operations

    NASA Technical Reports Server (NTRS)

    Ciciora, J. A.; Leonard, S. D.; Johnson, N.; Amell, J.

    1984-01-01

    In order to derive general design guidelines for automated systems, a study was conducted on the utilization and acceptance of existing automated systems as currently employed in several commercial fields. Four principal study areas were investigated by means of structured interviews and, in some cases, questionnaires. The study areas were aviation, both scheduled airline and general commercial aviation; process control and factory applications; office automation; and automation in the power industry. The results of over eighty structured interviews were analyzed and responses categorized as various human factors issues for use by both designers and users of automated equipment. These guidelines address such items as general physical features of automated equipment; personnel orientation, acceptance, and training; and both personnel and system reliability.

  20. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

    Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectroscopy (ESI/MS), capillary electrophoresis (CE), and biological assays, where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and a volume of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by sequentially flowing 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is loaded onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a

  1. Automating the analytical laboratory via the Chemical Analysis Automation paradigm

    SciTech Connect

    Hollen, R.; Rzeszutko, C.

    1997-10-01

    To address the need for standardization within the analytical chemistry laboratories of the nation, the Chemical Analysis Automation (CAA) program within the US Department of Energy, Office of Science and Technology's Robotic Technology Development Program is developing laboratory sample analysis systems that will automate the environmental chemical laboratories. The current laboratory automation paradigm consists of islands-of-automation that do not integrate into a system architecture. Thus, today the chemist must perform most aspects of environmental analysis manually using instrumentation that generally cannot communicate with other devices in the laboratory. CAA is working towards a standardized and modular approach to laboratory automation based upon the Standard Analysis Method (SAM) architecture. Each SAM system automates a complete chemical method. The building block of a SAM is known as the Standard Laboratory Module (SLM). The SLM, either hardware or software, automates a subprotocol of an analysis method and can operate as a standalone or as a unit within a SAM. The CAA concept allows the chemist to easily assemble an automated analysis system, from sample extraction through data interpretation, using standardized SLMs without the worry of hardware or software incompatibility or the necessity of generating complicated control programs. A Task Sequence Controller (TSC) software program schedules and monitors the individual tasks to be performed by each SLM configured within a SAM. The chemist interfaces with the operation of the TSC through the Human Computer Interface (HCI), a logical, icon-driven graphical user interface. The CAA paradigm has successfully been applied in automating EPA SW-846 Methods 3541/3620/8081 for the analysis of PCBs in a soil matrix utilizing commercially available equipment in tandem with SLMs constructed by CAA.
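
The SAM/SLM/TSC architecture lends itself to a brief sketch: each Standard Laboratory Module automates one subprotocol, and a Task Sequence Controller runs and monitors them in order. Every module name, behavior, and value below is invented for illustration; real SLMs are hardware or software units with their own interfaces.

```python
# Invented SLMs: each function automates one subprotocol of the method.
def extract(sample):   return sample + ":extracted"
def cleanup(sample):   return sample + ":cleaned"
def analyze(sample):   return sample + ":PCB=0.42ppm"   # fabricated reading

class TaskSequenceController:
    """Schedules configured SLMs in sequence and logs each step (TSC sketch)."""
    def __init__(self, slms):
        self.slms = slms
    def run(self, sample):
        log = []
        for slm in self.slms:              # run each SLM, monitoring progress
            sample = slm(sample)
            log.append((slm.__name__, sample))
        return sample, log

result, log = TaskSequenceController([extract, cleanup, analyze]).run("soil-7")
print(result)
```

The modularity shown here mirrors the CAA goal: swapping one SLM for another reconfigures the automated method without rewriting the control program.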

  2. Automated Electrostatics Environmental Chamber

    NASA Technical Reports Server (NTRS)

    Calle, Carlos; Lewis, Dean C.; Buchanan, Randy K.; Buchanan, Aubri

    2005-01-01

    The Mars Electrostatics Chamber (MEC) is an environmental chamber designed primarily to create atmospheric conditions like those at the surface of Mars to support experiments on electrostatic effects in the Martian environment. The chamber is equipped with a vacuum system, a cryogenic cooling system, an atmospheric-gas replenishing and analysis system, and a computerized control system that can be programmed by the user and that provides both automation and options for manual control. The control system can be set to maintain steady Mars-like conditions or to impose temperature and pressure variations of a Mars diurnal cycle at any given season and latitude. In addition, the MEC can be used in other areas of research because it can create steady or varying atmospheric conditions anywhere within the wide temperature, pressure, and composition ranges between the extremes of Mars-like and Earth-like conditions.

  3. Health care automation companies.

    PubMed

    1995-12-01

    Health care automation companies: card transaction processing/EFT/EDI-capable banks; claims auditing/analysis; claims processors/clearinghouses; coding products/services; computer hardware; computer networking/LAN/WAN; consultants; data processing/outsourcing; digital dictation/transcription; document imaging/optical disk storage; executive information systems; health information networks; hospital/health care information systems; interface engines; laboratory information systems; managed care information systems; patient identification/credit cards; pharmacy information systems; POS terminals; radiology information systems; software--claims related/computer-based patient records/home health care/materials management/supply ordering/physician practice management/translation/utilization review/outcomes; telecommunications products/services; telemedicine/teleradiology; value-added networks.

  4. Automated synthetic scene generation

    NASA Astrophysics Data System (ADS)

    Givens, Ryan N.

    Physics-based simulations generate synthetic imagery to help organizations anticipate system performance of proposed remote sensing systems. However, manually constructing synthetic scenes which are sophisticated enough to capture the complexity of real-world sites can take days to months depending on the size of the site and desired fidelity of the scene. This research, sponsored by the Air Force Research Laboratory's Sensors Directorate, successfully developed an automated approach to fuse high-resolution RGB imagery, lidar data, and hyperspectral imagery and then extract the necessary scene components. The method greatly reduces the time and money required to generate realistic synthetic scenes and developed new approaches to improve material identification using information from all three of the input datasets.

  5. Automating Frame Analysis

    SciTech Connect

    Sanfilippo, Antonio P.; Franklin, Lyndsey; Tratz, Stephen C.; Danielson, Gary R.; Mileson, Nicholas D.; Riensche, Roderick M.; McGrath, Liam

    2008-04-01

    Frame Analysis has come to play an increasingly strong role in the study of social movements in Sociology and Political Science. While significant steps have been made in providing a theory of frames and framing, a systematic characterization of the frame concept is still largely lacking, and there are no recognized criteria and methods that can be used to identify and marshal frame evidence reliably and in a time- and cost-effective manner. Consequently, current Frame Analysis work is still too reliant on manual annotation and subjective interpretation. The goal of this paper is to present an approach to the representation, acquisition and analysis of frame evidence which leverages Content Analysis, Information Extraction and Semantic Search methods to provide a systematic treatment of Frame Analysis and automate frame annotation.

  6. Automated mapping system patented

    NASA Astrophysics Data System (ADS)

    A patent on a satellite system dubbed Mapsat, which would be able to map the earth from space and would thereby reduce the time and cost of mapping on a smaller scale, has been issued to the U.S. Geological Survey.The Mapsat concept, invented by Alden F. Colvocoresses, a research cartographer at the USGS National Center, is based on Landsat technology but uses sensors that acquire higher-resolution image data in either a stereo or monoscopic mode. Stereo data can be processed relatively simply with automation to produce images for interpretation or to produce maps. Monoscopic and multispectral data can be processed in a computer to derive information on earth resources. Ground control, one of the most expensive phases of mapping, could be kept to a minimum.

  7. Automated Analysis Workstation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Information from NASA Tech Briefs of work done at Langley Research Center and the Jet Propulsion Laboratory assisted DiaSys Corporation in manufacturing their first product, the R/S 2000. Since then, the R/S 2000 and R/S 2003 have followed. Recently, DiaSys released their fourth workstation, the FE-2, which automates the process of making and manipulating wet-mount preparation of fecal concentrates. The time needed to read the sample is decreased, permitting technologists to rapidly spot parasites, ova and cysts, sometimes carried in the lower intestinal tract of humans and animals. Employing the FE-2 is non-invasive, can be performed on an out-patient basis, and quickly provides confirmatory results.

  8. [From automation to robotics].

    PubMed

    1985-01-01

    The introduction of automation into the biology laboratory seems unavoidable. But at what cost, if a new machine must be purchased for every new application? Fortunately, the same image processing techniques, belonging to a theoretical framework called Mathematical Morphology, may be used in visual inspection tasks both in the car industry and in the biology lab. Since the market for industrial robotics applications is much larger than the market for biomedical applications, the price of image processing devices drops, sometimes to less than the price of complete microscope equipment. The power of the image processing methods of Mathematical Morphology will be illustrated by various examples, such as automatic silver grain counting in autoradiography, determination of HLA genotype, electrophoretic gel analysis, automatic screening of cervical smears... Thus several heterogeneous applications may share the same image processing device, provided there is a separate, dedicated workstation for each of them.

  9. Robust automated knowledge capture.

    SciTech Connect

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications and, in particular, the individual characteristics that underlie adaptive thinking.

  10. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
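
    The digit-by-digit decimal-to-binary conversion the patent describes can be illustrated with binary-coded decimal (BCD), one plausible encoding; the patent abstract does not specify the exact scheme, so treat this as an assumption.

```python
def digit_to_bcd(d):
    """Encode one decimal digit as a 4-bit binary-coded-decimal string."""
    return format(d, "04b")

def encode_word(decimal_word):
    """Convert a decimal data word digit by digit, as the terminals do."""
    return [digit_to_bcd(int(c)) for c in decimal_word]

# A data buffer holds the converted word, one entry per digit,
# until the multiplexer transfers it to the computer.
buffer = encode_word("409")
print(buffer)  # ['0100', '0000', '1001']
```

    Each keyboard terminal would feed such a buffer, and the multiplexer would poll the buffers in turn.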

  11. Berkeley automated supernova search

    SciTech Connect

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.
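
    The core comparison step, flagging pixels in a new galaxy image that are significantly brighter than the stored reference, can be sketched as follows. This is a crude stand-in for the minicomputer's real-time pipeline, which also had to handle registration, noise, and calibration; the images and threshold here are toy values.

```python
def detect_transient(new_img, ref_img, threshold):
    """Return (x, y) positions where the new image exceeds the
    stored reference by more than a brightness threshold."""
    hits = []
    for y, (row_new, row_ref) in enumerate(zip(new_img, ref_img)):
        for x, (n, r) in enumerate(zip(row_new, row_ref)):
            if n - r > threshold:
                hits.append((x, y))
    return hits

ref = [[10, 10], [10, 10]]
new = [[10, 55], [10, 10]]   # bright new source at (1, 0)
print(detect_transient(new, ref, 20))  # [(1, 0)]
```

    In practice the search compared whole CCD frames per galaxy, so the reference data store and the per-frame loop were the performance-critical pieces.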

  12. Automated Cognome Construction and Semi-automated Hypothesis Generation

    PubMed Central

    Voytek, Jessica B.; Voytek, Bradley

    2012-01-01

    Modern neuroscientific research stands on the shoulders of countless giants. PubMed alone contains more than 21 million peer-reviewed articles with 40–50,000 more published every month. Understanding the human brain, cognition, and disease will require integrating facts from dozens of scientific fields spread amongst millions of studies locked away in static documents, making any such integration daunting, at best. The future of scientific progress will be aided by bridging the gap between the millions of published research articles and modern databases such as the Allen Brain Atlas (ABA). To that end, we have analyzed the text of over 3.5 million scientific abstracts to find associations between neuroscientific concepts. From the literature alone, we show that we can blindly and algorithmically extract a “cognome”: relationships between brain structure, function, and disease. We demonstrate the potential of data-mining and cross-platform data-integration with the ABA by introducing two methods for semi-automated hypothesis generation. By analyzing statistical “holes” and discrepancies in the literature we can find understudied or overlooked research paths. That is, we have added a layer of semi-automation to a part of the scientific process itself. This is an important step toward fundamentally incorporating data-mining algorithms into the scientific method in a manner that is generalizable to any scientific or medical field. PMID:22584238
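
    The association-mining step, finding which neuroscientific concepts co-occur across millions of abstracts, reduces at its simplest to pairwise co-occurrence counting. The term list and example abstracts below are invented; the actual cognome used a far larger vocabulary and corpus.

```python
from itertools import combinations
from collections import Counter

# Hypothetical concept vocabulary -- illustrative only.
TERMS = {"hippocampus", "memory", "dopamine", "parkinson"}

def cooccurrences(abstracts):
    """Count how often each pair of terms appears in the same abstract."""
    pairs = Counter()
    for text in abstracts:
        found = sorted(t for t in TERMS if t in text.lower())
        pairs.update(combinations(found, 2))
    return pairs

docs = ["Hippocampus lesions impair memory.",
        "Dopamine loss underlies Parkinson disease.",
        "Memory consolidation involves the hippocampus."]
print(cooccurrences(docs))
```

    Statistical "holes" are then pairs whose observed count is far below what the individual term frequencies would predict, which is where the semi-automated hypothesis generation comes in.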

  13. Automated data collection based on RoboDiff at the ESRF beamline MASSIF-1

    SciTech Connect

    Nurizzo, Didier; Guichard, Nicolas; McSweeney, Sean; Theveneau, Pascal; Guijarro, Matias; Svensson, Olof; Mueller-Dieckmann, Christoph; Leonard, Gordon; Bowler, Matthew W.

    2016-07-27

    The European Synchrotron Radiation Facility has a long-standing history in the automation of experiments in Macromolecular Crystallography. MASSIF-1 (Massively Automated Sample Screening and evaluation Integrated Facility), a beamline constructed as part of the ESRF Upgrade Phase I program, has been open to the external user community since July 2014 and offers a unique completely automated data collection service to both academic and industrial structural biologists.

  14. Programmable Automated Welding System (PAWS)

    NASA Technical Reports Server (NTRS)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  15. Automating Ontological Annotation with WordNet

    SciTech Connect

    Sanfilippo, Antonio P.; Tratz, Stephen C.; Gregory, Michelle L.; Chappell, Alan R.; Whitney, Paul D.; Posse, Christian; Paulson, Patrick R.; Baddeley, Bob L.; Hohimer, Ryan E.; White, Amanda M.

    2006-01-22

    Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet has not been designed as an ontology, and while it can be easily turned into one, the result of doing this would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.
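
    The platform's central move, collapsing WordNet's tens of thousands of synonym sets into a manageable set of concept classes and then assigning those classes to words in text, can be illustrated with a toy lookup. The class names and word lists below are invented, not the platform's actual WordNet-derived ontology, and a real annotator would add word sense disambiguation for ambiguous tokens.

```python
# Toy concept-class inventory -- illustrative only.
CONCEPT_CLASSES = {
    "person": {"scientist", "analyst", "pilot"},
    "artifact": {"telescope", "computer", "aircraft"},
    "event": {"meeting", "launch", "attack"},
}

def annotate(tokens):
    """Assign each known token its concept class; unknown tokens get None."""
    lookup = {w: c for c, ws in CONCEPT_CLASSES.items() for w in ws}
    return [(t, lookup.get(t)) for t in tokens]

print(annotate(["analyst", "telescope", "launch", "the"]))
# [('analyst', 'person'), ('telescope', 'artifact'), ('launch', 'event'), ('the', None)]
```

    The hard parts the abstract identifies, choosing the class inventory and disambiguating senses, are exactly what the flat lookup glosses over.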

  16. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
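
    The identification-tree idea, matching threat rules against elements of a data flow diagram and attaching mitigation advice, can be sketched as rule matching over flow records. The element fields, the single rule, and the mitigation text are invented for illustration; AutSEC's actual trees and the Microsoft DFD element taxonomy are richer.

```python
# Hypothetical data flow diagram: each record is one flow between elements.
DFD = [
    {"source": "browser", "sink": "web_server", "encrypted": False},
    {"source": "web_server", "sink": "database", "encrypted": True},
]

def identify_threats(flows):
    """Match a (single, toy) identification rule against each flow and
    pair every hit with mitigation advice."""
    threats = []
    for f in flows:
        if not f["encrypted"]:
            threats.append((f["source"], f["sink"], "eavesdropping",
                            "mitigation: add TLS on this channel"))
    return threats

for t in identify_threats(DFD):
    print(t)
```

    AutSEC additionally weighs specification requirements and cost before recommending a mitigation, which this sketch omits.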

  17. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  18. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The Automated Engineering Design (AED) system is reviewed. It consists of a high-level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for the development of problem- and user-oriented languages. Software production phases are diagrammed, and factors which inhibit effective documentation are evaluated.

  19. Utility of multispectral imaging in automated quantitative scoring of immunohistochemistry

    PubMed Central

    Fiore, Christopher; Bailey, Dyane; Conlon, Niamh; Wu, Xiaoqiu; Martin, Neil; Fiorentino, Michelangelo; Finn, Stephen; Fall, Katja; Andersson, Swen-Olof; Andren, Ove; Loda, Massimo; Flavin, Richard

    2012-01-01

    Background: Automated scanning devices and image analysis software provide a means to overcome the limitations of manual semiquantitative scoring of immunohistochemistry. Common drawbacks to automated imaging systems include an inability to classify tissue type and an inability to segregate cytoplasmic and nuclear staining. Methods: Immunohistochemistry for the membranous marker α-catenin, the cytoplasmic marker stathmin and the nuclear marker Ki-67 was performed on tissue microarrays (TMA) of archival formalin-fixed paraffin-embedded tissue comprising 471 (α-catenin and stathmin) and 511 (Ki-67) cases of prostate adenocarcinoma. These TMA were quantitatively analysed using two commercially available automated image analysers, the Ariol SL-50 system and the Nuance system from CRi. Both systems use brightfield microscopy for automated, unbiased and standardised quantification of immunohistochemistry, while the Nuance system has spectral deconvolution capabilities. Results: Overall concordance between scores from both systems was excellent (r=0.90; 0.83–0.95). The software associated with the multispectral imager allowed accurate automated classification of tissue type into epithelial glandular structures and stroma, and a single-step segmentation of staining into cytoplasmic or nuclear compartments allowing independent evaluation of these areas. The Nuance system, however, was not able to distinguish reliably between tumour and non-tumour tissue. In addition, variance in the labour and time required for analysis between the two systems was also noted. Conclusion: Despite limitations, this study suggests some beneficial role for the use of a multispectral imaging system in automated analysis of immunohistochemistry. PMID:22447914
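
    The concordance figure (r=0.90) is a Pearson correlation between the two systems' scores. As a minimal sketch, the coefficient can be computed directly from paired scores; the score vectors below are invented, not the study's data.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired scores from the two analysers.
ariol = [1.0, 2.0, 3.0, 4.0]
nuance = [1.1, 1.9, 3.2, 3.8]
print(round(pearson_r(ariol, nuance), 3))
```

    The 0.83–0.95 range the study reports alongside r is a confidence interval for the coefficient, which this point-estimate sketch does not compute.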

  20. Automated shell theory for rotating structures (ASTROS)

    NASA Technical Reports Server (NTRS)

    Foster, B. J.; Thomas, J. M.

    1971-01-01

    A computer program for analyzing axisymmetric shells with inertial forces caused by rotation about the shell axis is developed by revising the STARS II shell program. The basic capabilities of the STARS II shell program, such as the treatment of branched shells, stiffened wall construction, and thermal gradients, are retained.