Science.gov

Sample records for pcgc automated structure

  1. Revised users manual, Pulverized Coal Gasification or Combustion: 2-dimensional (87-PCGC-2): Final report, Volume 2. [87-PCGC-2

    SciTech Connect

    Smith, P.J.; Smoot, L.D.; Brewster, B.S.

    1987-12-01

    A two-dimensional, steady-state model for describing a variety of reactive and non-reactive flows, including pulverized coal combustion and gasification, is presented. Recent code revisions and additions are described. The model, referred to as 87-PCGC-2, is applicable to cylindrical axi-symmetric systems. Turbulence is accounted for in both the fluid mechanics equations and the combustion scheme. Radiation from gases, walls, and particles is taken into account using either a flux method or discrete ordinates method. The particle phase is modeled in a Lagrangian framework, such that mean paths of particle groups are followed. Several multi-step coal devolatilization schemes are included along with a heterogeneous reaction scheme that allows for both diffusion and chemical reaction. Major gas-phase reactions are modeled assuming local instantaneous equilibrium, and thus the reaction rates are limited by the rate of turbulent mixing. A NOx finite-rate chemistry submodel is included which integrates chemical kinetics and the statistics of the turbulence. The gas phase is described by elliptic partial differential equations that are solved by an iterative line-by-line technique. Under-relaxation is used to achieve numerical stability. The generalized nature of the model allows for calculation of isothermal fluid mechanics, gaseous combustion, droplet combustion, particulate combustion and various mixtures of the above, including combustion of coal-water and coal-oil slurries. Both combustion and gasification environments are permissible. User information and theory are presented, along with sample problems. 106 refs.
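
    The line-by-line elliptic solve with under-relaxation mentioned above is a standard CFD building block. A minimal sketch, assuming a 2D Laplace model problem rather than the actual 87-PCGC-2 transport equations: each sweep solves one grid line implicitly with the Thomas (tridiagonal) algorithm and blends the update into the old field through an under-relaxation factor alpha.

        import numpy as np

        def tdma(a, b, c, d):
            """Thomas algorithm for a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
            n = len(d)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):
                m = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / m
                dp[i] = (d[i] - a[i] * dp[i - 1]) / m
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        def line_by_line(phi, alpha=0.7, sweeps=200):
            """Under-relaxed line-by-line iteration for Laplace's equation;
            boundary values of phi are held fixed."""
            ny, nx = phi.shape
            for _ in range(sweeps):
                for j in range(1, ny - 1):              # solve each interior line implicitly
                    n = nx - 2
                    a, b, c = -np.ones(n), 4.0 * np.ones(n), -np.ones(n)
                    d = phi[j - 1, 1:-1] + phi[j + 1, 1:-1]
                    d[0] += phi[j, 0]
                    d[-1] += phi[j, -1]
                    new = tdma(a, b, c, d)
                    phi[j, 1:-1] += alpha * (new - phi[j, 1:-1])   # under-relaxation
            return phi

        phi = np.zeros((20, 20))
        phi[0, :] = 1.0                                 # heated wall boundary condition
        phi = line_by_line(phi)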

  2. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  3. Automated Characterization Of Vibrations Of A Structure

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Yam, Yeung; Mettler, Edward; Hadaegh, Fred Y.; Milman, Mark H.; Scheid, Robert E.

    1992-01-01

    Automated method of characterizing dynamical properties of large flexible structure yields estimates of modal parameters used by robust control system to stabilize structure and minimize undesired motions. Based on extraction of desired modal and control-design data from responses of structure to known vibrational excitations. Applicable to terrestrial structures where vibrations are important - aircraft, buildings, bridges, cranes, and drill strings.

  4. Automated structure solution with the PHENIX suite

    SciTech Connect

    Terwilliger, Thomas C; Zwart, Peter H; Afonine, Pavel V; Grosse-Kunstleve, Ralf W

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  5. Automated Structure Solution with the PHENIX Suite

    SciTech Connect

    Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Thomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  6. REdiii: a pipeline for automated structure solution.

    PubMed

    Bohn, Markus Frederik; Schiffer, Celia A

    2015-05-01

    High-throughput crystallographic approaches require integrated software solutions to minimize the need for manual effort. REdiii is a system that allows fully automated crystallographic structure solution by integrating existing crystallographic software into an adaptive and partly autonomous workflow engine. The program can be initiated after collecting the first frame of diffraction data and is able to perform processing, molecular-replacement phasing, chain tracing, ligand fitting and refinement without further user intervention. Preset values for each software component allow efficient progress with high-quality data and known parameters. The adaptive workflow engine can determine whether some parameters require modifications and choose alternative software strategies in case the preconfigured solution is inadequate. This integrated pipeline is targeted at providing a comprehensive and efficient approach to screening for ligand-bound co-crystal structures while minimizing repetitiveness and allowing a high-throughput scientific discovery process. PMID:25945571

  7. REdiii: a pipeline for automated structure solution

    PubMed Central

    Bohn, Markus-Frederik; Schiffer, Celia A.

    2015-01-01

    High-throughput crystallographic approaches require integrated software solutions to minimize the need for manual effort. REdiii is a system that allows fully automated crystallographic structure solution by integrating existing crystallographic software into an adaptive and partly autonomous workflow engine. The program can be initiated after collecting the first frame of diffraction data and is able to perform processing, molecular-replacement phasing, chain tracing, ligand fitting and refinement without further user intervention. Preset values for each software component allow efficient progress with high-quality data and known parameters. The adaptive workflow engine can determine whether some parameters require modifications and choose alternative software strategies in case the preconfigured solution is inadequate. This integrated pipeline is targeted at providing a comprehensive and efficient approach to screening for ligand-bound co-crystal structures while minimizing repetitiveness and allowing a high-throughput scientific discovery process. PMID:25945571

  8. Pricing Structures for Automated Library Consortia.

    ERIC Educational Resources Information Center

    Machovec, George S.

    1993-01-01

    Discusses the development of successful pricing algorithms for cooperative library automation projects. Highlights include desirable characteristics of pricing measures, including simplicity and the ability to allow for system growth; problems with transaction-based systems; and a review of the pricing strategies of seven library consortia.…

  9. Automating the determination of 3D protein structure

    SciTech Connect

    Rayl, K.D.

    1993-12-31

    The creation of an automated method for determining 3D protein structure would be invaluable to the field of biology and presents an interesting challenge to computer science. Unfortunately, given the current level of protein knowledge, a completely automated solution method is not yet feasible; therefore, our group has decided to integrate existing databases and theories to create a software system that assists X-ray crystallographers in specifying a particular protein structure. By breaking the problem of determining overall protein structure into small subproblems, we hope to come closer to solving a novel structure by solving each component. By generating necessary information for structure determination, this method provides the first step toward designing a program to determine protein conformation automatically.

  10. The Phenix Software for Automated Determination of Macromolecular Structures

    PubMed Central

    Adams, Paul D.; Afonine, Pavel V.; Bunkóczi, Gábor; Chen, Vincent B.; Echols, Nathaniel; Headd, Jeffrey J.; Hung, Li-Wei; Jain, Swati; Kapral, Gary J.; Grosse Kunstleve, Ralf W.; McCoy, Airlie J.; Moriarty, Nigel W.; Oeffner, Robert D.; Read, Randy J.; Richardson, David C.; Richardson, Jane S.; Terwilliger, Thomas C.; Zwart, Peter H.

    2011-01-01

    X-ray crystallography is a critical tool in the study of biological systems. It is able to provide information that has been a prerequisite to understanding the fundamentals of life. It is also a method that is central to the development of new therapeutics for human disease. Significant time and effort are required to determine and optimize many macromolecular structures because of the need for manual interpretation of complex numerical data, often using many different software packages, and the repeated use of interactive three-dimensional graphics. The Phenix software package has been developed to provide a comprehensive system for macromolecular crystallographic structure solution with an emphasis on automation. This has required the development of new algorithms that minimize or eliminate subjective input in favour of built-in expert-systems knowledge, the automation of procedures that are traditionally performed by hand, and the development of a computational framework that allows a tight integration between the algorithms. The application of automated methods is particularly appropriate in the field of structural proteomics, where high throughput is desired. Features in Phenix for the automation of experimental phasing with subsequent model building, molecular replacement, structure refinement and validation are described and examples given of running Phenix from both the command line and graphical user interface. PMID:21821126

  11. Automated output-only dynamic identification of civil engineering structures

    NASA Astrophysics Data System (ADS)

    Rainieri, C.; Fabbrocino, G.

    2010-04-01

    Modal-based damage detection algorithms are well-known techniques for structural health assessment, but they are not commonly used due to the lack of automated modal identification and tracking procedures. Development of such procedures is not a trivial task, since traditional modal identification requires extensive interaction from an expert user. Nevertheless, computational efforts have to be carefully considered. While fast on-line data processing is crucial for systems that vary quickly in time (such as a rocket burning fuel), many vibration-based condition monitoring applications operate at much longer time scales, leaving satisfactory time steps for on-line data analysis. Moreover, promising results in the field of automated modal identification have been achieved recently. In the present paper, a literature review on this topic is presented and recent developments concerning fully automated output-only modal identification procedures are described. Some case studies are also reported in order to validate the approach. They are characterized by different levels of complexity in terms of mode coupling, dynamic interaction effects and level of vibration. Advantages and drawbacks of the proposed approach are pointed out with reference to available experimental results. The final objective is the implementation of a fully automated system for vibration-based structural health monitoring of civil engineering structures and the identification of adequate requirements for sensor number and layout, record duration and hardware characteristics able to ensure a reliable, low-cost health assessment of constructions. Results of applying the proposed methodology to modal parameter estimation in operational conditions and during ground motions induced by the recent L'Aquila earthquake are finally presented and discussed.
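
    For the output-only (ambient vibration) setting the abstract describes, the simplest automated identification step is picking spectral peaks as candidate natural frequencies. A minimal sketch on synthetic acceleration data (illustrative only; production operational modal analysis pipelines use more robust estimators such as stochastic subspace identification):

        import numpy as np
        from scipy.signal import welch, find_peaks

        fs = 100.0                                      # sampling rate, Hz (assumed)
        t = np.arange(0, 600, 1 / fs)
        acc = (np.sin(2 * np.pi * 1.8 * t) + 0.5 * np.sin(2 * np.pi * 5.6 * t)
               + 0.8 * np.random.randn(t.size))         # synthetic ambient response

        f, pxx = welch(acc, fs=fs, nperseg=4096)        # power spectral density
        peaks, _ = find_peaks(pxx, prominence=pxx.max() * 0.05)
        print("candidate natural frequencies (Hz):", f[peaks])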

  12. Automated assembly of a tetrahedral truss structure using machine vision

    NASA Technical Reports Server (NTRS)

    Doggett, William R.

    1992-01-01

    The Automated Structures Assembly Laboratory is a unique facility at NASA Langley Research Center used to investigate the robotic assembly of truss structures. Two special-purpose end-effectors have been used to assemble 102 truss members and 12 panels into an 8-meter diameter structure. One end-effector is dedicated to truss member insertion, while a second end-effector is used to install panels. Until recently, the robot motions required to construct the structure were developed iteratively using the facility hardware. Recent work at Langley has resulted in a compact machine vision system capable of providing position information relative to targets on the structure. Use of the vision system to guide the robot from an approach point 10 to 18 inches from the structure, offsetting model inaccuracies, permits robot motion based on calculated points as a first step toward use of preplanned paths from an automated path planner. This paper presents recent work at Langley highlighting the application of the machine vision system during truss member insertion.

  13. Automated Low-Cost Photogrammetry for Flexible Structure Monitoring

    NASA Astrophysics Data System (ADS)

    Wang, C. H.; Mills, J. P.; Miller, P. E.

    2012-07-01

    Structural monitoring requires instruments which can provide high precision and accuracy, reliable measurements at good temporal resolution and rapid processing speeds. Long-term campaigns and flexible structures are regarded as two of the most challenging subjects in monitoring engineering structures. Long-term monitoring in civil engineering is generally considered to be labour-intensive and financially expensive, and it can take significant effort to arrange the necessary human resources, transportation and equipment maintenance. When dealing with flexible structure monitoring, it is of paramount importance that any monitoring equipment used is able to carry out rapid sampling. Low-cost, automated, photogrammetric techniques therefore have the potential to become routinely viable for monitoring non-rigid structures. This research aims to provide a photogrammetric solution for long-term flexible structural monitoring purposes. The automated approach was achieved using low-cost imaging devices (mobile phones) to replace traditional image acquisition stations and substantially reduce the equipment costs. A self-programmed software package was developed to deal with the hardware-software integration and system operation. In order to evaluate the performance of this low-cost monitoring system, a shaking table experiment was undertaken. Different network configurations and target sizes were used to determine the best configuration. A large quantity of image data was captured by four DSLR cameras and four mobile phone cameras respectively. These image data were processed using photogrammetric techniques to calculate the final results for the system evaluation.

  14. Automated analysis of fundamental features of brain structures.

    PubMed

    Lancaster, Jack L; McKay, D Reese; Cykowski, Matthew D; Martinez, Michael J; Tan, Xi; Valaparla, Sunil; Zhang, Yi; Fox, Peter T

    2011-12-01

    Automated image analysis of the brain should include measures of fundamental structural features such as size and shape. We used principal axes (P-A) measurements to measure overall size and shape of brain structures segmented from MR brain images. The rationale was that quantitative volumetric studies of brain structures would benefit from shape standardization as had been shown for whole brain studies. P-A analysis software was extended to include controls for variability in position and orientation to support individual structure spatial normalization (ISSN). The rationale was that ISSN would provide a bias-free means to remove elementary sources of a structure's spatial variability in preparation for more detailed analyses. We studied nine brain structures (whole brain, cerebral hemispheres, cerebellum, brainstem, caudate, putamen, hippocampus, inferior frontal gyrus, and precuneus) from the 40-brain LPBA40 atlas. This paper provides the first report of anatomical positions and principal axes orientations within a standard reference frame, in addition to "shape/size related" principal axes measures, for the nine brain structures from the LPBA40 atlas. Analysis showed that overall size (mean volume) for internal brain structures was preserved using shape standardization while variance was reduced by more than 50%. Shape standardization provides increased statistical power for between-group volumetric studies of brain structures compared to volumetric studies that control only for whole brain size. To test ISSN's ability to control for spatial variability of brain structures we evaluated the overlap of 40 regions of interest (ROIs) in a standard reference frame for the nine different brain structures before and after processing. Standardizations of orientation or shape were ineffective when not combined with position standardization. The greatest reduction in spatial variability was seen for combined standardizations of position, orientation and shape.
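
    The principal-axes (P-A) measures described reduce to an eigendecomposition of the voxel-coordinate covariance of a segmented structure; a minimal sketch, assuming a binary 3D mask as input:

        import numpy as np

        def principal_axes(mask):
            """Centroid, axis orientations and axis lengths of a binary 3D mask."""
            coords = np.argwhere(mask)              # voxel coordinates of the structure
            centroid = coords.mean(axis=0)
            cov = np.cov((coords - centroid).T)
            evals, evecs = np.linalg.eigh(cov)      # ascending eigenvalues
            order = evals.argsort()[::-1]
            return centroid, evecs[:, order], 2.0 * np.sqrt(evals[order])

        mask = np.zeros((40, 40, 40), dtype=bool)
        mask[10:30, 15:25, 18:22] = True            # toy blob standing in for a structure
        c, axes, lengths = principal_axes(mask)
        print(c, lengths)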

  15. A screened automated structural search with semiempirical methods

    NASA Astrophysics Data System (ADS)

    Ota, Yukihiro; Ruiz-Barragan, Sergi; Machida, Masahiko; Shiga, Motoyuki

    2016-03-01

    We developed an interface program between a program suite for an automated search of chemical reaction pathways, GRRM, and a program package of semiempirical methods, MOPAC. A two-step structural search is proposed as an application of this interface program. A screening test is first performed by semiempirical calculations. Subsequently, a reoptimization procedure is done by ab initio or density functional calculations. We apply this approach to ion adsorption on cellulose. The computational efficiency is also shown for a GRRM search. The interface program is suitable for the structural search of large molecular systems for which semiempirical methods are applicable.

  16. Towards Automated Structure-Based NMR Resonance Assignment

    NASA Astrophysics Data System (ADS)

    Jang, Richard; Gao, Xin; Li, Ming

    We propose a general framework for solving the structure-based NMR backbone resonance assignment problem. The core is a novel 0-1 integer programming model that can start from a complete or partial assignment, generate multiple assignments, and model not only the assignment of spins to residues, but also pairwise dependencies consisting of pairs of spins to pairs of residues. It is still a challenge for automated resonance assignment systems to perform the assignment directly from spectra without any manual intervention. To test the feasibility of this for structure-based assignment, we integrated our system with our automated peak picking and sequence-based resonance assignment system to obtain an assignment for the protein TM1112 with 91% recall and 99% precision without manual intervention. Since using a known structure has the potential to allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data, we work towards the goal of automated structure-based assignment using only such labeled data. Our system reduced the assignment error of Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which to our knowledge is the most error-tolerant method for this problem, fivefold on average. By using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for Ubiquitin, where the type prediction accuracy is 83%, we achieved 91% assignment accuracy, compared to the 59% accuracy that was obtained without correcting for typing errors.
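
    The core combinatorial problem, assigning spin systems to residues so that unary compatibility and pairwise adjacency scores are maximized, can be conveyed with a brute-force toy instance (the scores below are hypothetical and this is not the authors' 0-1 integer programming model):

        from itertools import permutations

        # hypothetical compatibility scores: score[spin][residue]
        score = [[0.9, 0.1, 0.2],
                 [0.2, 0.8, 0.3],
                 [0.1, 0.3, 0.7]]
        # hypothetical pairwise term rewarding adjacent spins on adjacent residues
        adjacent_spins = {(0, 1), (1, 2)}

        def total(assignment):
            s = sum(score[i][r] for i, r in enumerate(assignment))
            for i, j in adjacent_spins:
                if abs(assignment[i] - assignment[j]) == 1:
                    s += 0.5
            return s

        best = max(permutations(range(3)), key=total)
        print("spin -> residue:", best, "score:", total(best))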

  17. A telerobotic system for automated assembly of large space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Wise, Marion A.

    1989-01-01

    Future space missions such as polar platforms and antennas are anticipated to require large truss structures as their primary support system. During the past several years considerable research has been conducted to develop hardware and construction techniques suitable for astronaut assembly of truss structures in space. A research program has recently been initiated to develop the technology and to demonstrate the potential for automated in-space assembly of large erectable structures. The initial effort will be focused on automated assembly of a tetrahedral truss composed of 2-meter members. The facility is designed as a ground-based system to permit evaluation of assembly concepts and was not designed for space qualification. The system is intended to be used as a tool from which more sophisticated procedures and operations can be developed. The facility description includes a truss structure, motion bases and a robot arm equipped with an end effector. The description also covers the computer control systems that monitor and control the operations of the assembly facility.

  18. Towards automated crystallographic structure refinement with phenix.refine.

    PubMed

    Afonine, Pavel V; Grosse-Kunstleve, Ralf W; Echols, Nathaniel; Headd, Jeffrey J; Moriarty, Nigel W; Mustyakimov, Marat; Terwilliger, Thomas C; Urzhumtsev, Alexandre; Zwart, Peter H; Adams, Paul D

    2012-04-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods. PMID:22505256

  19. Towards automated crystallographic structure refinement with phenix.refine

    PubMed Central

    Afonine, Pavel V.; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W.; Mustyakimov, Marat; Terwilliger, Thomas C.; Urzhumtsev, Alexandre; Zwart, Peter H.; Adams, Paul D.

    2012-01-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods. PMID:22505256

  20. Automated extraction of chemical structure information from digital raster images

    PubMed Central

    Park, Jungkap; Rosania, Gus R; Shedden, Kerby A; Nguyen, Mandee; Lyu, Naesung; Saitou, Kazuhiro

    2009-01-01

    Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed. But their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper aims to provide critical reviews for these systems and also report our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams in research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy on extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links to scientific research articles.
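
    The bond-line recognition step that such systems perform can be approximated with edge detection followed by a probabilistic Hough transform; a minimal OpenCV sketch (not ChemReader's actual algorithm), assuming a scanned diagram in a hypothetical file diagram.png:

        import cv2
        import numpy as np

        img = cv2.imread("diagram.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input file
        if img is None:
            raise SystemExit("diagram.png not found")
        edges = cv2.Canny(img, 50, 150)
        # probabilistic Hough transform: candidate bond segments
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=30,
                                minLineLength=15, maxLineGap=3)
        for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
            print("bond candidate:", (x1, y1), "->", (x2, y2))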

  1. Verification Test of Automated Robotic Assembly of Space Truss Structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1995-01-01

    A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level and continued development at an enhanced level is warranted.

  2. pmx: Automated protein structure and topology generation for alchemical perturbations.

    PubMed

    Gapsys, Vytautas; Michielssens, Servaas; Seeliger, Daniel; de Groot, Bert L

    2015-02-15

    Computational protein design requires methods to accurately estimate free energy changes in protein stability or binding upon an amino acid mutation. From the different approaches available, molecular dynamics-based alchemical free energy calculations are unique in their accuracy and solid theoretical basis. The challenge in using these methods lies in the need to generate hybrid structures and topologies representing two physical states of a system. A custom-made hybrid topology may prove useful for a particular mutation of interest; however, a high-throughput mutation analysis calls for a more general approach. In this work, we present an automated procedure to generate hybrid structures and topologies for the amino acid mutations in all commonly used force fields. The described software is compatible with the Gromacs simulation package. The mutation libraries are readily supported for five force fields, namely Amber99SB, Amber99SB*-ILDN, OPLS-AA/L, Charmm22*, and Charmm36. PMID:25487359
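
    A hybrid topology pairs each perturbed atom's state-A and state-B parameters so that a coupling parameter lambda can interpolate between the two physical states. A schematic sketch of that idea (illustrative data model only; pmx's actual classes and the Ala-to-Ser values below are assumptions, not taken from the paper):

        from dataclasses import dataclass

        @dataclass
        class HybridAtom:
            name: str
            type_a: str       # atom type in physical state A (wild type)
            type_b: str       # atom type in physical state B (mutant); 'DUM' = dummy
            charge_a: float
            charge_b: float

            def charge(self, lam: float) -> float:
                """Linear interpolation of the charge along the alchemical path."""
                return (1.0 - lam) * self.charge_a + lam * self.charge_b

        # e.g. an Ala -> Ser mutation introduces a hydroxyl; the extra atom is a
        # dummy in state A (names and values hypothetical)
        og = HybridAtom("OG", "DUM", "OH", 0.0, -0.65)
        print(og.charge(0.0), og.charge(1.0))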

  3. pmx: Automated protein structure and topology generation for alchemical perturbations

    PubMed Central

    Gapsys, Vytautas; Michielssens, Servaas; Seeliger, Daniel; de Groot, Bert L

    2015-01-01

    Computational protein design requires methods to accurately estimate free energy changes in protein stability or binding upon an amino acid mutation. From the different approaches available, molecular dynamics-based alchemical free energy calculations are unique in their accuracy and solid theoretical basis. The challenge in using these methods lies in the need to generate hybrid structures and topologies representing two physical states of a system. A custom-made hybrid topology may prove useful for a particular mutation of interest; however, a high-throughput mutation analysis calls for a more general approach. In this work, we present an automated procedure to generate hybrid structures and topologies for the amino acid mutations in all commonly used force fields. The described software is compatible with the Gromacs simulation package. The mutation libraries are readily supported for five force fields, namely Amber99SB, Amber99SB*-ILDN, OPLS-AA/L, Charmm22*, and Charmm36. PMID:25487359

  4. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    PubMed

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of the supervised methods. To this end, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Considering the non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
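
    The non-structured GMM variant of such a pipeline amounts to unsupervised clustering of voxel-wise multiparametric intensity vectors; a minimal scikit-learn sketch on synthetic features (not the BRATS data):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # rows = voxels, columns = MR parameters (e.g. T1, T1c, T2, FLAIR)
        rng = np.random.default_rng(0)
        features = np.vstack([rng.normal(m, 0.3, size=(500, 4))
                              for m in (0.0, 1.0, 2.0, 3.0)])

        gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
        labels = gmm.fit_predict(features)          # unsupervised tissue classes
        print(np.bincount(labels))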

  5. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    PubMed Central

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of the supervised methods. To this end, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Considering the non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453

  6. Automated optimum design of wing structures. Deterministic and probabilistic approaches

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1982-01-01

    The automated optimum design of airplane wing structures subjected to multiple behavior constraints is described. The structural mass of the wing is considered the objective function. The maximum stress, wing tip deflection, root angle of attack, and flutter velocity during the pull up maneuver (static load), the natural frequencies of the wing structure, and the stresses induced in the wing structure due to landing and gust loads are suitably constrained. Both deterministic and probabilistic approaches are used for finding the stresses induced in the airplane wing structure due to landing and gust loads. A wing design is represented by a uniform beam with a cross section in the form of a hollow symmetric double wedge. The airfoil thickness and chord length are the design variables, and a graphical procedure is used to find the optimum solutions. A supersonic wing design is represented by finite elements. The thicknesses of the skin and the web and the cross sectional areas of the flanges are the design variables, and nonlinear programming techniques are used to find the optimum solution.
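
    The constrained sizing problem, minimizing structural mass subject to stress and deflection limits on thickness and chord, can be sketched with a generic nonlinear programming solver. All load models and limit values below are hypothetical placeholders, not the paper's wing formulation:

        import numpy as np
        from scipy.optimize import minimize

        def mass(x):                       # x = [airfoil thickness t, chord c] (m)
            t, c = x
            return 2700.0 * 10.0 * 0.1 * t * c          # toy: density * span * area factor

        def max_stress(x):                 # toy load model (Pa)
            t, c = x
            return 4.0e5 / (t * c**2)

        def tip_deflection(x):             # toy stiffness model (m)
            t, c = x
            return 1.0e-3 / (t**3 * c)

        cons = [{"type": "ineq", "fun": lambda x: 250e6 - max_stress(x)},    # stress limit
                {"type": "ineq", "fun": lambda x: 0.5 - tip_deflection(x)}]  # deflection limit
        res = minimize(mass, x0=[0.05, 1.0], bounds=[(0.01, 0.3), (0.5, 3.0)],
                       constraints=cons)
        print(res.x, res.fun)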

  7. Semi-Automated Discovery of Application Session Structure

    SciTech Connect

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level, the structure of user-initiated sessions comprised of groups of related connections, remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
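
    The Poisson reasoning can be illustrated directly: the probability that an unrelated background connection arrives within a gap t of the previous one is 1 - exp(-rate * t), so improbably short gaps suggest the connections belong to one session. A toy sketch of that test (not the paper's full mining algorithm):

        import math

        def sessionize(timestamps, rate, p_threshold=0.05):
            """Group sorted connection start times into sessions: a gap short enough
            to be unlikely under background Poisson arrivals (rate per second)
            keeps the connection in the current session."""
            sessions, current = [], [timestamps[0]]
            for prev, now in zip(timestamps, timestamps[1:]):
                gap = now - prev
                p_chance = 1.0 - math.exp(-rate * gap)   # P(unrelated arrival <= gap)
                if p_chance < p_threshold:
                    current.append(now)
                else:
                    sessions.append(current)
                    current = [now]
            sessions.append(current)
            return sessions

        print(sessionize([0.0, 0.4, 0.9, 60.0, 60.2], rate=0.05))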

  8. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    12 CFR Part 360, Appendix D (Banks and Banking; Federal Deposit Insurance Corporation Regulations and Statements of General Policy; Resolution and Receivership Rules): Sweep/Automated Credit Account File Structure.

  9. Development of a machine vision system for automated structural assembly

    NASA Technical Reports Server (NTRS)

    Sydow, P. Daniel; Cooper, Eric G.

    1992-01-01

    Research is being conducted at the LaRC to develop a telerobotic assembly system designed to construct large space truss structures. This research program was initiated within the past several years, and a ground-based test-bed was developed to evaluate and expand the state of the art. Test-bed operations currently use predetermined ('taught') points for truss structural assembly. Total dependence on the use of taught points for joint receptacle capture and strut installation is neither robust nor reliable enough for space operations. Therefore, a machine vision sensor guidance system is being developed to locate and guide the robot to a passive target mounted on the truss joint receptacle. The vision system hardware includes a miniature video camera, passive targets mounted on the joint receptacles, target illumination hardware, and an image processing system. Discrimination of the target from background clutter is accomplished through standard digital processing techniques. Once the target is identified, a pose estimation algorithm is invoked to determine the location, in three-dimensional space, of the target relative to the robot's end-effector. Preliminary test results of the vision system in the Automated Structural Assembly Laboratory with a range of lighting and background conditions indicate that it is fully capable of successfully identifying joint receptacle targets throughout the required operational range. Controlled optical bench test results indicate that the system can also provide the pose estimation accuracy to define the target position.
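
    Estimating the target's pose from its imaged features is a perspective-n-point problem; a minimal OpenCV sketch with hypothetical target geometry and camera intrinsics (not the facility's actual calibration):

        import numpy as np
        import cv2

        # known 3D positions of the target's feature points (metres, target frame)
        object_pts = np.array([[0, 0, 0], [0.05, 0, 0],
                               [0.05, 0.05, 0], [0, 0.05, 0]], dtype=np.float64)
        # corresponding pixel locations found by the image-processing stage
        image_pts = np.array([[320, 240], [380, 242],
                              [378, 300], [318, 298]], dtype=np.float64)
        K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # intrinsics

        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
        print("target position relative to camera (m):", tvec.ravel())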

  10. Automated web service composition supporting conditional branch structures

    NASA Astrophysics Data System (ADS)

    Wang, Pengwei; Ding, Zhijun; Jiang, Changjun; Zhou, Mengchu

    2014-01-01

    The creation of value-added services by automatic composition of existing ones is gaining significant momentum as the potential silver bullet in service-oriented architecture. However, service composition faces two aspects of difficulties. First, users' needs present such characteristics as diversity, uncertainty and personalisation; second, the existing services run in a real-world environment that is highly complex and dynamically changing. These difficulties may cause the emergence of nondeterministic choices in the process of service composition, which goes beyond what existing automated service composition techniques can handle. According to most of the existing methods, the process model of a composite service includes sequence constructs only. This article presents a method to introduce conditional branch structures into the process model of a composite service when needed, in order to satisfy users' diverse and personalised needs and adapt to the dynamic changes of the real-world environment. UML activity diagrams are used to represent dependencies in the composite service. Two types of user preferences, ignored by previous work, are considered in this article, and a simple programming-language-style expression is adopted to describe them. Two different algorithms are presented to deal with different situations. A real-life case is provided to illustrate the proposed concepts and methods.
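
    The idea of a process model whose conditional branches are selected by runtime context can be sketched with a small data structure (illustrative only; the paper's method is based on UML activity diagrams, and the service names below are hypothetical):

        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Branch:
            condition: Callable[[dict], bool]   # guard evaluated on runtime context
            steps: List[str]                    # service names to invoke if taken

        @dataclass
        class CompositeService:
            sequence: List[str]
            branches: List[Branch] = field(default_factory=list)

            def plan(self, context: dict) -> List[str]:
                """Flatten the model into the invocation order for this context."""
                steps = list(self.sequence)
                for b in self.branches:
                    if b.condition(context):
                        steps += b.steps
                return steps

        svc = CompositeService(
            sequence=["search_flights", "book_flight"],
            branches=[Branch(lambda c: c["needs_hotel"], ["search_hotels", "book_hotel"])])
        print(svc.plan({"needs_hotel": True}))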

  11. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  12. Automated construction of lightweight, simple, field-erected structures

    NASA Technical Reports Server (NTRS)

    Leonard, R. S.

    1980-01-01

    The feasibility of automation of construction processes which could result in mobile construction robots is examined. The construction of a large photovoltaic power plant with a peak power output of 100 MW is demonstrated. The reasons to automate the construction process, a conventional construction scenario as the reference for evaluation, and a list of potential cost benefits using robots are presented. The technical feasibility of using robots to construct SPS ground stations is addressed.

  13. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    SciTech Connect

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.
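
    The weighted-consensus idea can be pictured as evidence types voting on candidate gene structure elements with configurable weights; a toy sketch (the weights and candidates are hypothetical, not EVM's actual scoring):

        # evidence source -> weight (configurable, by analogy with EVM's weights file)
        weights = {"ab_initio": 1.0, "protein_alignment": 5.0, "transcript": 10.0}

        # candidate exon (contig, start, end) -> evidence sources supporting it
        support = {
            ("chr1", 1200, 1350): ["ab_initio", "transcript"],
            ("chr1", 1180, 1350): ["ab_initio"],
            ("chr1", 1200, 1420): ["protein_alignment"],
        }

        def consensus_score(sources):
            return sum(weights[s] for s in sources)

        best = max(support, key=lambda exon: consensus_score(support[exon]))
        print("consensus exon:", best, "score:", consensus_score(support[best]))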

  14. Concurrent combined verification: reducing false positives in automated NMR structure verification through the evaluation of multiple challenge control structures.

    PubMed

    Golotvin, Sergey S; Pol, Rostislav; Sasaki, Ryan R; Nikitina, Asya; Keyes, Philip

    2012-06-01

    Automated structure verification using (1)H NMR data or a combination of (1)H and heteronuclear single-quantum correlation (HSQC) data is gaining more interest as a routine application for qualitative evaluation of large compound libraries produced by synthetic chemistry. The goal of this automated software method is to identify a manageable subset of compounds and data that require human review. In practice, the automated method will flag structure and data combinations that exhibit some inconsistency (i.e. strange chemical shifts, conflicts in multiplicity, or overestimated and underestimated integration values) and validate those that appear consistent. One drawback of this approach is that no automated system can guarantee that all passing structures are indeed correct structures. The major reason for this is that approaches using only (1)H or even (1)H and HSQC spectra often do not provide sufficient information to properly distinguish between similar structures. Therefore, current implementations of automated structure verification systems allow, in principle, false positive results. Presented in this work is a method that greatly reduces the probability of an automated validation system passing incorrect structures (i.e. false positives). This novel method was applied to automatically validate 127 non-proprietary compounds from several commercial sources. Presented also is the impact of this approach on false positive and false negative results. PMID:22549844

  15. Exploring representations of protein structure for automated remote homology detection and mapping of protein structure space

    PubMed Central

    2014-01-01

    Background Due to rapid sequencing of genomes, there are now millions of deposited protein sequences with no known function. Fast sequence-based comparisons allow detecting close homologs for a protein of interest to transfer functional information from the homologs to the given protein. Sequence-based comparison cannot detect remote homologs, in which evolution has adjusted the sequence while largely preserving structure. Structure-based comparisons can detect remote homologs but most methods for doing so are too expensive to apply at a large scale over structural databases of proteins. Recently, fragment-based structural representations have been proposed that allow fast detection of remote homologs with reasonable accuracy. These representations have also been used to obtain linearly-reducible maps of protein structure space. It has been shown, as additionally supported by analysis in this paper, that such maps preserve functional co-localization of the protein structure space. Methods Inspired by a recent application of the Latent Dirichlet Allocation (LDA) model for conducting structural comparisons of proteins, we propose higher-order LDA-obtained topic-based representations of protein structures to provide an alternative route for remote homology detection and organization of the protein structure space in few dimensions. Various techniques based on natural language processing are proposed and employed to aid the analysis of topics in the protein structure domain. Results We show that a topic-based representation is just as effective as a fragment-based one at automated detection of remote homologs and organization of protein structure space. We conduct a detailed analysis of the information content in the topic-based representation, showing that topics have semantic meaning. The fragment-based and topic-based representations are also shown to allow prediction of superfamily membership. Conclusions This work opens exciting avenues in designing novel

  16. Automated frequency domain system identification of a large space structure

    NASA Technical Reports Server (NTRS)

    Yam, Y.; Bayard, D. S.; Hadaegh, F. Y.; Mettler, E.; Milman, M. H.

    1989-01-01

    This paper presents the development and experimental results of an automated on-orbit system identification method for large flexible spacecraft that yields estimated quantities to support on-line design and tuning of robust high-performance control systems. The procedure consists of applying an input to the plant, obtaining an output, and then conducting nonparametric identification to yield the spectral estimate of the system transfer function. A parametric model is determined by curve fitting the spectral estimate to a rational transfer function. The identification method has been demonstrated experimentally in the Large Spacecraft Control Laboratory at JPL.
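
    The nonparametric step, estimating the transfer-function spectrum from excitation and response records, is commonly computed as H(f) = Puy(f)/Puu(f); a minimal SciPy sketch on a synthetic second-order plant (not the JPL facility's dynamics):

        import numpy as np
        from scipy.signal import csd, welch, lfilter

        fs = 100.0
        u = np.random.randn(100_000)                 # excitation input
        y = lfilter([0.2], [1.0, -1.6, 0.8], u)      # toy resonant "plant" response

        f, puu = welch(u, fs=fs, nperseg=2048)       # input auto-spectrum
        _, puy = csd(u, y, fs=fs, nperseg=2048)      # input-output cross-spectrum
        H1 = puy / puu                               # nonparametric transfer function estimate
        print("resonance near %.2f Hz" % f[np.argmax(np.abs(H1))])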

  17. Exploiting structure similarity in refinement: automated NCS and target-structure restraints in BUSTER

    SciTech Connect

    Smart, Oliver S. Womack, Thomas O.; Flensburg, Claus; Keller, Peter; Paciorek, Włodek; Sharff, Andrew; Vonrhein, Clemens; Bricogne, Gérard

    2012-04-01

    Local structural similarity restraints (LSSR) provide a novel method for exploiting NCS or structural similarity to an external target structure. Two examples are given where BUSTER re-refinement of PDB entries with LSSR produces marked improvements, enabling further structural features to be modelled. Maximum-likelihood X-ray macromolecular structure refinement in BUSTER has been extended with restraints facilitating the exploitation of structural similarity. The similarity can be between two or more chains within the structure being refined, thus favouring NCS, or to a distinct ‘target’ structure that remains fixed during refinement. The local structural similarity restraints (LSSR) approach considers all distances less than 5.5 Å between pairs of atoms in the chain to be restrained. For each, the difference from the distance between the corresponding atoms in the related chain is found. LSSR applies a restraint penalty on each difference. A functional form that reaches a plateau for large differences is used to avoid the restraints distorting parts of the structure that are not similar. Because LSSR are local, there is no need to separate out domains. Some restraint pruning is still necessary, but this has been automated. LSSR have been available to academic users of BUSTER since 2009 with the easy-to-use -autoncs and @@target target.pdb options. The use of LSSR is illustrated in the re-refinement of PDB entries http://scripts.iucr.org/cgi-bin/cr.cgi?rm, where -target enables the correct ligand-binding structure to be found, and http://scripts.iucr.org/cgi-bin/cr.cgi?rm, where -autoncs contributes to the location of an additional copy of the cyclic peptide ligand.
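
    The plateauing restraint can be illustrated with a Geman-McClure-style robust penalty, an assumed functional form used here only for illustration (the abstract does not give BUSTER's actual expression): quadratic growth for small distance differences, saturation for large ones, so dissimilar regions are not distorted.

        import numpy as np

        def plateau_penalty(delta, scale=0.3):
            """Robust penalty on distance differences delta (Angstroms):
            ~ (delta/scale)^2 near zero, approaching 1 for |delta| >> scale
            (Geman-McClure form, assumed for illustration)."""
            r2 = (delta / scale) ** 2
            return r2 / (1.0 + r2)

        deltas = np.array([0.05, 0.3, 1.0, 5.0])
        print(plateau_penalty(deltas))   # small differences penalized, large ones capped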

  18. Finite element based electrostatic-structural coupled analysis with automated mesh morphing

    SciTech Connect

    Owen, Steven J.; Zhulin, V.I.; Ostergaard, D.F.

    2000-02-29

    A co-simulation tool based on finite element principles has been developed to solve coupled electrostatic-structural problems. An automated mesh morphing algorithm has been employed to update the field mesh after structural deformation. The co-simulation tool has been successfully applied to the hysteretic behavior of a MEMS switch.

  19. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score.

    PubMed

    Huang, Yuanpeng Janet; Mao, Binchen; Xu, Fei; Montelione, Gaetano T

    2015-08-01

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD-NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases (15)N-(1)H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD-NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta. PMID:26081575

  20. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    NASA Astrophysics Data System (ADS)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  1. Texture analysis for automated classification of geologic structures

    USGS Publications Warehouse

    Shankar, V.; Rodriguez, J.J.; Gettings, M.E.

    2006-01-01

    Texture present in aeromagnetic anomaly images offers an abundance of useful geological information for discriminating between rock types, but current analysis of such images still relies on tedious human interpretation. This study is believed to be the first effort to quantitatively assess the performance of texture-based digital image analysis for this geophysical exploration application. We computed several texture measures and determined the best subset using automated feature selection techniques. Pattern classification experiments measured the ability of various texture measures to automatically predict rock types. The classification accuracy was significantly better than a priori probability and prior weights-of-evidence results. The accuracy rates and choice of texture measures that minimize the error rate are reported. © 2006 IEEE.
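
    Texture measures of this kind are commonly derived from gray-level co-occurrence statistics. The sketch below, a loose illustration rather than the authors' feature set, computes three classic co-occurrence features (contrast, energy, homogeneity) of the sort that a feature-selection stage would prune before classification; the quantization level and the two synthetic test textures are arbitrary.

        import numpy as np

        def glcm_features(img, levels=8):
            """Quantize an image, build a horizontal gray-level co-occurrence
            matrix, and return contrast, energy and homogeneity features."""
            x = (img - img.min()) / (img.max() - img.min() + 1e-12)
            q = np.minimum((x * levels).astype(int), levels - 1)
            glcm = np.zeros((levels, levels))
            for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
                glcm[a, b] += 1                  # count horizontal neighbour pairs
            p = glcm / glcm.sum()
            i, j = np.indices(p.shape)
            contrast = ((i - j) ** 2 * p).sum()
            energy = (p ** 2).sum()
            homogeneity = (p / (1.0 + abs(i - j))).sum()
            return np.array([contrast, energy, homogeneity])

        rng = np.random.default_rng(0)
        banded = rng.normal(5.0, 0.5, (64, 64)).cumsum(axis=1)  # smooth gradient
        speckled = rng.uniform(0.0, 10.0, (64, 64))             # incoherent noise
        for name, img in [("banded", banded), ("speckled", speckled)]:
            print(name, np.round(glcm_features(img), 3))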

  2. Automated hexahedral meshing of anatomic structures using deformable registration.

    PubMed

    Grosland, Nicole M; Bafna, Ritesh; Magnotta, Vincent A

    2009-02-01

    This work introduces a novel method of automating the process of patient-specific finite element (FE) model development using a mapped mesh technique. The objective is to map a predefined mesh (template) of high quality directly onto a new bony surface (target) definition, thereby yielding a similar mesh with minimal user interaction. To bring the template mesh into correspondence with the target surface, a deformable registration technique based on the FE method has been adopted. The procedure has been made hierarchical allowing several levels of mesh refinement to be used, thus reducing the time required to achieve a solution. Our initial efforts have focused on the phalanx bones of the human hand. Mesh quality metrics, such as element volume and distortion were evaluated. Furthermore, the distance between the target surface and the final mapped mesh were measured. The results have satisfactorily proven the applicability of the proposed method. PMID:18688764

  3. Development and verification testing of automation and robotics for assembly of space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  4. About the automated pattern creation of 3D jacquard double needle bed warp knitted structures

    NASA Astrophysics Data System (ADS)

    Renkens, W.; Kyosev, Y.

    2016-07-01

    Three-dimensional structures can be produced on jacquard warp knitting machines with a double needle bed. This work presents theoretical considerations on the modelling and simulation of such structures, and then describes a method for obtaining production parameters from the simulation data. The analysis demonstrates that automated pattern creation of 3D structures is not always possible and that not every mathematical solution of the problem is knittable.

  5. Automated Detection of Eruptive Structures for Solar Eruption Prediction

    NASA Astrophysics Data System (ADS)

    Georgoulis, Manolis K.

    2012-07-01

    The problem of data processing and assimilation for solar eruption prediction is, for contemporary solar physics, more pressing than the problem of data acquisition. Although critical solar data, such as the coronal magnetic field, are still not routinely available, space-based observatories deliver diverse, high-quality information at such a high rate that a manual or semi-manual processing becomes meaningless. We discuss automated data analysis methods and explain, using basic physics, why some of them are unlikely to advance eruption prediction. From this finding we also understand why solar eruption prediction is likely to remain inherently probabilistic. We discuss some promising eruption prediction measures and report on efforts to adapt them for use with high-resolution, high-cadence photospheric and coronal data delivered by the Solar Dynamics Observatory. Concluding, we touch on the problem of physical understanding and synthesis of different results: combining different measures inferred by different data sets is a yet-to-be-done exercise that, however, presents our best opportunity of realizing benefits in solar eruption prediction via a meaningful, targeted assimilation of solar data.

  6. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    PubMed

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. PMID:26227870
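
    A NUS schedule is just the list of indirect-dimension increments actually acquired; measuring roughly a third of them is what yields the two- to three-fold instrument-time savings quoted above. The sketch below generates a density-weighted random schedule biased toward early increments, where the signal is strongest. It is a generic illustration only; published samplers such as Poisson-gap schedules are more principled, and this is not the sampler used in the cited work.

        import math
        import random

        def nus_schedule(n_total, fraction, seed=0):
            """Pick ~fraction of n_total increments, weighting early points more
            heavily (weighted sampling without replacement via random keys)."""
            random.seed(seed)
            k = int(n_total * fraction)
            keyed = []
            for i in range(n_total):
                w = math.cos(0.5 * math.pi * i / n_total) + 1e-6  # decaying weight
                keyed.append((random.random() ** (1.0 / w), i))
            return sorted(i for _, i in sorted(keyed, reverse=True)[:k])

        sched = nus_schedule(256, 1.0 / 3.0)
        print(len(sched), "of 256 increments; first ten:", sched[:10])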

  7. Automated search of natively folded protein fragments for high-throughput structure determination in structural genomics.

    PubMed Central

    Kuroda, Y.; Tani, K.; Matsuo, Y.; Yokoyama, S.

    2000-01-01

    Structural genomic projects envision almost routine protein structure determinations, which are currently imaginable only for small proteins with molecular weights below 25,000 Da. For larger proteins, structural insight can be obtained by breaking them into small segments of amino acid sequences that can fold into native structures, even when isolated from the rest of the protein. Such segments are autonomously folding units (AFU) and have sizes suitable for fast structural analyses. Here, we propose to expand an intuitive procedure often employed for identifying biologically important domains to an automatic method for detecting putative folded protein fragments. The procedure is based on the recognition that large proteins can be regarded as a combination of independent domains conserved among diverse organisms. We thus have developed a program that reorganizes the output of BLAST searches and detects regions with a large number of similar sequences. To automate the detection process, it is reduced to a simple geometrical problem of recognizing rectangular shaped elevations in a graph that plots the number of similar sequences at each residue of a query sequence. We used our program to quantitatively corroborate the premise that segments with conserved sequences correspond to domains that fold into native structures. We applied our program to a test data set composed of 99 amino acid sequences containing 150 segments with structures listed in the Protein Data Bank, and thus known to fold into native structures. Overall, the fragments identified by our program have an almost 50% probability of forming a native structure, and comparable results are observed with sequences containing domain linkers classified in SCOP. Furthermore, we verified that our program identifies AFU in libraries from various organisms, and we found a significant number of AFU candidates for structural analysis, covering an estimated 5 to 20% of the genomic databases. Altogether, these
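
    The "rectangular elevation" detection can be pictured as finding long runs of residues whose hit coverage stays high. A minimal sketch of that step, with an illustrative threshold and minimum length rather than the paper's calibrated values:

        def candidate_afus(coverage, min_len=30, frac=0.5):
            """Return (start, end) runs where the per-residue count of similar
            sequences stays above frac * max(coverage) for >= min_len residues."""
            cut = frac * max(coverage)
            runs, start = [], None
            for pos, c in enumerate(coverage):
                if c >= cut and start is None:
                    start = pos
                elif c < cut and start is not None:
                    if pos - start >= min_len:
                        runs.append((start, pos))
                    start = None
            if start is not None and len(coverage) - start >= min_len:
                runs.append((start, len(coverage)))
            return runs

        # toy profile: two conserved domains joined by a poorly covered linker
        profile = [90] * 120 + [8] * 25 + [60] * 150 + [5] * 20
        print(candidate_afus(profile))               # -> [(0, 120), (145, 295)]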

  8. Automated Finite Element Modeling of Wing Structures for Shape Optimization

    NASA Technical Reports Server (NTRS)

    Harvey, Michael Stephen

    1993-01-01

    The displacement formulation of the finite element method is the most general and most widely used technique for structural analysis of airplane configurations. Modern structural synthesis techniques based on the finite element method have reached a certain maturity in recent years, and large airplane structures can now be optimized with respect to sizing type design variables for many load cases subject to a rich variety of constraints including stress, buckling, frequency, stiffness and aeroelastic constraints (Refs. 1-3). These structural synthesis capabilities use gradient based nonlinear programming techniques to search for improved designs. For these techniques to be practical a major improvement was required in the computational cost of finite element analyses (needed repeatedly in the optimization process). Thus, associated with the progress in structural optimization, a new perspective of structural analysis has emerged, namely, structural analysis specialized for design optimization application, or what is known as "design oriented structural analysis" (Ref. 4). This discipline includes approximation concepts and methods for obtaining behavior sensitivity information (Ref. 1), all needed to make the optimization of large structural systems (modeled by thousands of degrees of freedom and thousands of design variables) practical and cost effective.

  9. Development of a machine vision guidance system for automated assembly of space structures

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Sydow, P. Daniel

    1992-01-01

    The topics are presented in viewgraph form and include: automated structural assembly robot vision; machine vision requirements; vision targets and hardware; reflective efficiency; target identification; pose estimation algorithms; triangle constraints; truss node with joint receptacle targets; end-effector mounted camera and light assembly; vision system results from optical bench tests; and future work.

  10. Automated on-orbit frequency domain identification for large space structures

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.; Hadaegh, F. Y.; Yam, Y.; Scheid, R. E.; Mettler, E.; Milman, M. H.

    1991-01-01

    Recent experiences in the field of flexible structure control in space have indicated a need for on-orbit system identification to support robust control redesign to avoid in-flight instabilities and maintain high spacecraft performance. This paper highlights an automated frequency domain system identification methodology recently developed to fulfill this need. The methodology is focused to support (1) the estimation of system quantities useful for robust control analysis and design; (2) experiment design tailored to performing system identification in a typically constrained on-orbit environment; and (3) the automation of operations to reduce 'human in the loop' requirements.
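
    A textbook building block of such frequency-domain identification is the empirical frequency response estimate, the ratio of cross- to auto-spectral density between excitation and response, whose peaks locate the flexible modes. The sketch below applies this generic estimator to one simulated, lightly damped mode; it illustrates the principle only and is not the specific on-orbit methodology of the paper.

        import numpy as np
        from scipy import signal

        fs = 100.0                                   # sample rate [Hz]
        t = np.arange(0, 200, 1 / fs)
        rng = np.random.default_rng(1)
        u = rng.normal(size=t.size)                  # random excitation record

        # toy flexible mode: 2 Hz resonance at 2% damping, simulated response
        wn, zeta = 2 * np.pi * 2.0, 0.02
        sys = signal.TransferFunction([wn ** 2], [1, 2 * zeta * wn, wn ** 2])
        _, y, _ = signal.lsim(sys, u, t)

        f, puu = signal.welch(u, fs, nperseg=2048)   # input auto-spectrum
        _, puy = signal.csd(u, y, fs, nperseg=2048)  # input-output cross-spectrum
        H = puy / puu                                # empirical frequency response
        print("identified mode near %.2f Hz" % f[np.argmax(np.abs(H))])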

  11. Automated dynamic analytical model improvement for damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J. S.; Berman, A.

    1985-01-01

    A method is described to improve a linear nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat the complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.

  12. SCOPmap: Automated assignment of protein structures to evolutionary superfamilies

    PubMed Central

    Cheek, Sara; Qi, Yuan; Krishna, S Sri; Kinch, Lisa N; Grishin, Nick V

    2004-01-01

    Background Inference of remote homology between proteins is very challenging and remains a prerogative of an expert. Thus a significant drawback to the use of evolutionary-based protein structure classifications is the difficulty in assigning new proteins to unique positions in the classification scheme with automatic methods. To address this issue, we have developed an algorithm to map protein domains to an existing structural classification scheme and have applied it to the SCOP database. Results The general strategy employed by this algorithm is to combine the results of several existing sequence and structure comparison tools applied to a query protein of known structure in order to find the homologs already classified in SCOP database and thus determine classification assignments. The algorithm is able to map domains within newly solved structures to the appropriate SCOP superfamily level with ~95% accuracy. Examples of correctly mapped remote homologs are discussed. The algorithm is also capable of identifying potential evolutionary relationships not specified in the SCOP database, thus helping to make it better. The strategy of the mapping algorithm is not limited to SCOP and can be applied to any other evolutionary-based classification scheme as well. SCOPmap is available for download. Conclusion The SCOPmap program is useful for assigning domains in newly solved structures to appropriate superfamilies and for identifying evolutionary links between different superfamilies. PMID:15598351

  13. MemProtMD: Automated Insertion of Membrane Protein Structures into Explicit Lipid Membranes

    PubMed Central

    Stansfeld, Phillip J.; Goose, Joseph E.; Caffrey, Martin; Carpenter, Elisabeth P.; Parker, Joanne L.; Newstead, Simon; Sansom, Mark S.P.

    2015-01-01

    Summary There has been exponential growth in the number of membrane protein structures determined. Nevertheless, these structures are usually resolved in the absence of their lipid environment. Coarse-grained molecular dynamics (CGMD) simulations enable insertion of membrane proteins into explicit models of lipid bilayers. We have automated the CGMD methodology, enabling membrane protein structures to be identified upon their release into the PDB and embedded into a membrane. The simulations are analyzed for protein-lipid interactions, identifying lipid binding sites, and revealing local bilayer deformations plus molecular access pathways within the membrane. The coarse-grained models of membrane protein/bilayer complexes are transformed to atomistic resolution for further analysis and simulation. Using this automated simulation pipeline, we have analyzed a number of recently determined membrane protein structures to predict their locations within a membrane, their lipid/protein interactions, and the functional implications of an enhanced understanding of the local membrane environment of each protein. PMID:26073602

  14. Fully automated high-quality NMR structure determination of small 2H-enriched proteins

    PubMed Central

    Tang, Yuefeng; Schneider, William M.; Shen, Yang; Raman, Srivatsan; Inouye, Masayori; Baker, David; Roth, Monica J.

    2010-01-01

    Determination of high-quality small protein structures by nuclear magnetic resonance (NMR) methods generally requires acquisition and analysis of an extensive set of structural constraints. The process generally demands extensive backbone and sidechain resonance assignments, and weeks or even months of data collection and interpretation. Here we demonstrate rapid and high-quality protein NMR structure generation using CS-Rosetta with a perdeuterated protein sample made at a significantly reduced cost using new bacterial culture condensation methods. Our strategy provides the basis for a high-throughput approach for routine, rapid, high-quality structure determination of small proteins. As an example, we demonstrate the determination of a high-quality 3D structure of a small 8 kDa protein, E. coli cold shock protein A (CspA), using <4 days of data collection and fully automated data analysis methods together with CS-Rosetta. The resulting CspA structure is highly converged and in excellent agreement with the published crystal structure, with a backbone RMSD value of 0.5 Å, an all atom RMSD value of 1.2 Å to the crystal structure for well-defined regions, and RMSD value of 1.1 Å to crystal structure for core, non-solvent exposed sidechain atoms. Cross validation of the structure with 15N- and 13C-edited NOESY data obtained with a perdeuterated 15N, 13C-enriched 13CH3 methyl protonated CspA sample confirms that essentially all of these independently-interpreted NOE-based constraints are already satisfied in each of the 10 CS-Rosetta structures. By these criteria, the CS-Rosetta structure generated by fully automated analysis of data for a perdeuterated sample provides an accurate structure of CspA. This represents a general approach for rapid, automated structure determination of small proteins by NMR. PMID:20734145

  15. From bacterial to human dihydrouridine synthase: automated structure determination

    SciTech Connect

    Whelan, Fiona; Jenkins, Huw T.; Griffiths, Samuel C.; Byrne, Robert T.; Dodson, Eleanor J.; Antson, Alfred A.

    2015-06-30

    The crystal structure of a human dihydrouridine synthase, an enzyme associated with lung cancer, with 18% sequence identity to a T. maritima enzyme, has been determined at 1.9 Å resolution by molecular replacement after extensive molecular remodelling of the template. The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr_rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer.

  16. Reduced complexity structural modeling for automated airframe synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1987-01-01

    A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.
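
    The fully stressed design rule behind such rapid estimates resizes every member by the ratio of its working stress to the allowable stress, with a minimum-gauge floor, and repeats until the sizes stabilize. A minimal sketch on a hypothetical two-bar assembly (material, load and gauge numbers are illustrative):

        import numpy as np

        E, P = 70e9, 1.0e5                  # modulus [Pa], applied load [N]
        sigma_allow, a_min = 250e6, 1e-6    # allowable stress [Pa], minimum gauge [m^2]
        L = np.array([1.0, 2.0])            # bar lengths [m]
        A = np.array([1e-3, 1e-3])          # starting areas [m^2]

        for it in range(60):
            u = P / (E * (A / L).sum())     # displacement of the shared node
            sigma = E * u / L               # member stress magnitudes
            A_new = np.maximum(A * sigma / sigma_allow, a_min)  # stress-ratio resize
            if np.allclose(A_new, A, rtol=1e-8):
                break
            A = A_new

        print(it, "iterations; stresses [MPa]:", np.round(sigma / 1e6, 1))

    For this geometry the shorter bar ends up fully stressed while the longer one falls to minimum gauge, which is the expected fixed point of the stress-ratio iteration.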

  17. Application of a hierarchical structure stochastic learning automaton

    NASA Technical Reports Server (NTRS)

    Neville, R. G.; Chrystall, M. S.; Mars, P.

    1979-01-01

    A hierarchical structure automaton was developed using a two-state stochastic learning automaton (SLA) in a time-shared model. Application of the hierarchical SLA to systems with multidimensional, multimodal performance criteria is described. Results of experiments performed with the hierarchical SLA, using a performance index with a superimposed noise component of ±δ distributed uniformly over the surface, are discussed.
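
    A two-state SLA of this kind maintains a probability for each action and nudges it after every reward or penalty response from the environment. The sketch below implements a standard linear reward-penalty update against a toy random environment; the gains and penalty probabilities are illustrative, not those of the reported experiments.

        import random

        random.seed(2)
        penalty = [0.6, 0.2]        # environment: penalty probability per action
        p = [0.5, 0.5]              # action probability vector
        a_r, b_p = 0.05, 0.02       # reward and penalty learning gains

        for _ in range(5000):
            act = 0 if random.random() < p[0] else 1
            if random.random() < penalty[act]:   # penalty: shrink p(act)
                p[act] *= (1.0 - b_p)
            else:                                # reward: grow p(act)
                p[act] += a_r * (1.0 - p[act])
            p[1 - act] = 1.0 - p[act]

        print("P(better action) ->", round(p[1], 2))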

  18. From bacterial to human dihydrouridine synthase: automated structure determination

    PubMed Central

    Whelan, Fiona; Jenkins, Huw T.; Griffiths, Samuel C.; Byrne, Robert T.; Dodson, Eleanor J.; Antson, Alfred A.

    2015-01-01

    The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr_rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer. PMID:26143927

  19. Automating the parallel processing of fluid and structural dynamics calculations

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

    The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.

  1. Three-dimensional visualization for evaluating automated, geomorphic pattern-recognition analyses of crustal structures

    NASA Astrophysics Data System (ADS)

    Foley, M. G.

    1991-02-01

    We are developing and applying a suite of automated remote geologic analysis (RGA) methods at Pacific Northwest Laboratory (PNL) for extracting structural and tectonic patterns from digital models of topography and other spatially registered geophysical data. In analyzing a map area, the geologist employs a variety of spatial representations (e.g., topographic maps; oblique, vertical and vertical stereographic aerial photographs; satellite-sensor images) in addition to actual field observations to provide a basis for recognizing features (patterns) diagnostic or suggestive of various geologic and geomorphic features. We intend that our automated analyses of digital models of elevation use the same photogeologic pattern-recognition methods as the geologist's; otherwise there is no direct basis for manually evaluating results of the automated analysis. Any system for automating geologic analysis should extend the geologist's pattern-recognition abilities and quantify them, rather than replace them. This requirement means that results of automated structural pattern-recognition analyses must be evaluated by geologists using the same method that would be employed in manual field checking: visual examination of the three-dimensional relationships among rocks, erosional patterns, and identifiable structures. Interactive computer-graphics in quantitative (i.e., spatially registered), simulated three-dimensional perspective and stereo are thus critical to the integration and interpretation of topography, imagery, point data, RGA-identified fracture/fault planes, stratigraphy, contoured geophysical data, nonplanar surfaces, boreholes, and three-dimensional zones (e.g., crush zones at fracture intersections). This graphical interaction presents the megabytes of digital geologic and geophysical data to the geologist in the same spatial format that field observations would take, permitting direct evaluation of RGA methods and results.

  2. Three-dimensional visualization for evaluating automated, geomorphic pattern-recognition analyses of crustal structures

    SciTech Connect

    Foley, M.G.

    1991-02-01

    We are developing and applying a suite of automated remote geologic analysis (RGA) methods at Pacific Northwest Laboratory (PNL) for extracting structural and tectonic patterns from digital models of topography and other spatially registered geophysical data. In analyzing a map area, the geologist employs a variety of spatial representations (e.g., topographic maps; oblique, vertical and vertical stereographic aerial photographs; satellite-sensor images) in addition to actual field observations to provide a basis for recognizing features (patterns) diagnostic or suggestive of various geologic and geomorphic features. We intend that our automated analyses of digital models of elevation use the same photogeologic pattern-recognition methods as the geologist's; otherwise there is no direct basis for manually evaluating results of the automated analysis. Any system for automating geologic analysis should extend the geologist's pattern-recognition abilities and quantify them, rather than replace them. This requirement means that results of automated structural pattern-recognition analyses must be evaluated by geologists using the same method that would be employed in manual field checking: visual examination of the three-dimensional relationships among rocks, erosional patterns, and identifiable structures. Interactive computer-graphics in quantitative (i.e., spatially registered), simulated three-dimensional perspective and stereo are thus critical to the integration and interpretation of topography, imagery, point data, RGA-identified fracture/fault planes, stratigraphy, contoured geophysical data, nonplanar surfaces, boreholes, and three-dimensional zones (e.g., crush zones at fracture intersections). This graphical interaction presents the megabytes of digital geologic and geophysical data to the geologist in the same spatial format that field observations would take, permitting direct evaluation of RGA methods and results. 5 refs., 2 figs.

  3. An automated procedure for covariation-based detection of RNA structure

    SciTech Connect

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs.
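
    Covariation analysis rests on a simple statistical signal: alignment columns that pair in the structure change together, so a pair score such as mutual information is high for them while it is zero for columns that are merely conserved. An illustrative sketch (the program described above uses its own scoring and also records counter-evidence, which is omitted here):

        import math
        from collections import Counter

        WC = {"AU", "UA", "GC", "CG", "GU", "UG"}    # allowed base pairings

        def mutual_info(ci, cj):
            """Covariation score between two alignment columns."""
            n = len(ci)
            fi, fj, fij = Counter(ci), Counter(cj), Counter(zip(ci, cj))
            return sum((c / n) * math.log2((c / n) / (fi[a] / n * fj[b] / n))
                       for (a, b), c in fij.items())

        def covarying_pairs(aln, mi_cut=0.5, wc_cut=0.8):
            cols = list(zip(*aln))
            hits = []
            for i in range(len(cols)):
                for j in range(i + 3, len(cols)):     # skip near-diagonal pairs
                    wc = sum(a + b in WC for a, b in zip(cols[i], cols[j]))
                    if (mutual_info(cols[i], cols[j]) > mi_cut
                            and wc / len(aln) > wc_cut):
                        hits.append((i, j))
            return hits

        aln = ["GGCAAAGCC", "GCCAAAGGC", "GACAAAGUC", "CGCAAAGCG"]
        print(covarying_pairs(aln))                   # -> [(0, 8), (1, 7)]

    Note that columns 2 and 6 of the toy alignment are complementary in every sequence yet are rejected: perfect conservation carries no covariation signal, which is exactly why the approach depends on sequence variation across organisms.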

  4. Revealing biological information using data structuring and automated learning.

    PubMed

    Mohorianu, Irina; Moulton, Vincent

    2010-11-01

    The intermediary steps between a biological hypothesis, concretized in the input data, and meaningful results, validated using biological experiments, commonly employ bioinformatics tools. Starting with storage of the data and ending with a statistical analysis of the significance of the results, every step in a bioinformatics analysis has been intensively studied and the resulting methods and models patented. This review summarizes the bioinformatics patents that have been developed mainly for the study of genes, and points out the universal applicability of bioinformatics methods to other related studies such as RNA interference. More specifically, we overview the steps undertaken in the majority of bioinformatics analyses, highlighting, for each, various approaches that have been developed to reveal details from different perspectives. First we consider data warehousing, the first task that has to be performed efficiently, optimizing the structure of the database, in order to facilitate both the subsequent steps and the retrieval of information. Next, we review data mining, which occupies the central part of most bioinformatics analyses, presenting patents concerning differential expression, unsupervised and supervised learning. Last, we discuss how networks of interactions of genes or other players in the cell may be created, which help draw biological conclusions and have been described in several patents. PMID:21288193

  5. Knowledge structure representation and automated updates in intelligent information management systems

    NASA Technical Reports Server (NTRS)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  6. Automated protein motif generation in the structure-based protein function prediction tool ProMOL.

    PubMed

    Osipovitch, Mikhail; Lambrecht, Mitchell; Baker, Cameron; Madha, Shariq; Mills, Jeffrey L; Craig, Paul A; Bernstein, Herbert J

    2015-12-01

    ProMOL, a plugin for the PyMOL molecular graphics system, is a structure-based protein function prediction tool. ProMOL includes a set of routines for building motif templates that are used for screening query structures for enzyme active sites. Previously, each motif template was generated manually and required supervision in the optimization of parameters for sensitivity and selectivity. We developed an algorithm and workflow for the automation of motif building and testing routines in ProMOL. The algorithm uses a set of empirically derived parameters for optimization and requires little user intervention. The automated motif generation algorithm was first tested in a performance comparison with a set of manually generated motifs based on identical active sites from the same 112 PDB entries. The two sets of motifs were equally effective in identifying alignments with homologs and in rejecting alignments with unrelated structures. A second set of 296 active site motifs were generated automatically, based on Catalytic Site Atlas entries with literature citations, as an expansion of the library of existing manually generated motif templates. The new motif templates exhibited comparable performance to the existing ones in terms of hit rates against native structures, homologs with the same EC and Pfam designations, and randomly selected unrelated structures with a different EC designation at the first EC digit, as well as in terms of RMSD values obtained from local structural alignments of motifs and query structures. This research is supported by NIH grant GM078077. PMID:26573864
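
    At its core, screening a query against an active-site motif template means superimposing a small set of template atoms onto candidate atoms and accepting matches below an RMSD cutoff. A generic sketch of that matching step follows; the coordinates, the 2.0 Å cutoff and the use of Kabsch superposition are illustrative choices, and ProMOL's actual templates and matching rules are richer.

        import numpy as np

        def kabsch_rmsd(P, Q):
            """RMSD of matched point sets after optimal rigid superposition."""
            P = P - P.mean(0)
            Q = Q - Q.mean(0)
            U, S, Vt = np.linalg.svd(P.T @ Q)
            d = np.sign(np.linalg.det(U @ Vt))       # guard against reflections
            R = U @ np.diag([1.0, 1.0, d]) @ Vt
            return float(np.sqrt(((P @ R - Q) ** 2).sum(1).mean()))

        rng = np.random.default_rng(5)
        motif = np.array([[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [1.9, 3.3, 0.0]])
        th = 0.7                                     # arbitrary rotation angle
        Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
                       [np.sin(th),  np.cos(th), 0.0],
                       [0.0,         0.0,        1.0]])
        hit = motif @ Rz + [5.0, -2.0, 7.0] + rng.normal(0, 0.1, motif.shape)
        decoy = rng.uniform(-8.0, 8.0, motif.shape)
        for name, cand in [("hit", hit), ("decoy", decoy)]:
            r = kabsch_rmsd(motif, cand)
            print(name, round(r, 2), "match" if r < 2.0 else "no match")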

  7. COMPUTER AUTOMATED STUDY OF THE STRUCTURE-MUTAGENICITY RELATIONSHIPS OF NON-FUSED-RING NITROARENES AND RELATED COMPOUNDS

    EPA Science Inventory

    A quantitative structure-activity analysis of the mutagenicity of non-fused ring nitroaromatic compounds is reported. The analysis is performed on the basis of substructural fragment descriptors according to a recently developed methodology acronymed CASE (Computer Automated Stru...

  8. Automation of three-dimensional structured mesh generation for turbomachinery blade passages

    NASA Technical Reports Server (NTRS)

    Ascoli, Edward P.; Prueger, George H.

    1995-01-01

    Hybrid tools have been developed which greatly reduce the time required to generate three-dimensional structured CFD meshes for turbomachinery blade passages. RAGGS, an existing Rockwell proprietary, general purpose mesh generation and visualization system, provides the starting point and framework for tool development. Utilities which manipulate and interface with RAGGS tools have been developed to (1) facilitate blade geometry inputs from point or CAD representations, (2) automate auxiliary surface creation, and (3) streamline and automate edge, surface, and subsequent volume mesh generation from minimal inputs. The emphasis of this approach has been to maintain all the functionality of the general purpose mesh generator while simultaneously eliminating the bulk of the repetitive and tedious manual steps in the mesh generation process. Using this approach, mesh generation cycle times have been reduced from the order of days down to the order of hours.

  9. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images.

    PubMed

    Khansari, Maziyar M; O'Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-07-01

    The conjunctiva is a densely vascularized mucus membrane covering the sclera of the eye with a unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least squares regression and Fisher linear discriminant analysis. Conjunctival images between groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method's discrimination rates were higher than those determined by human observers. The method allowed sensitive and rapid discrimination by assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692
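
    Fisher linear discriminant analysis projects each feature vector onto the single direction that best separates the two groups relative to their within-group scatter. The sketch below runs it on synthetic two-dimensional features that merely stand in for the OLS-derived microvasculature measures, which are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(3)
        # synthetic per-image feature vectors for two groups
        healthy = rng.normal([2.0, 1.0], 0.4, size=(40, 2))
        disease = rng.normal([2.6, 1.5], 0.4, size=(40, 2))

        m1, m2 = healthy.mean(0), disease.mean(0)
        Sw = (np.cov(healthy.T) * (len(healthy) - 1)
              + np.cov(disease.T) * (len(disease) - 1))   # within-group scatter
        w = np.linalg.solve(Sw, m1 - m2)                   # Fisher direction
        thresh = w @ (m1 + m2) / 2                         # midpoint threshold

        acc = ((healthy @ w > thresh).mean() + (disease @ w <= thresh).mean()) / 2
        print("toy discrimination accuracy: %.2f" % acc)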

  10. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    PubMed Central

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

    The conjunctiva is a densely vascularized mucus membrane covering the sclera of the eye with a unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least squares regression and Fisher linear discriminant analysis. Conjunctival images between groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method’s discrimination rates were higher than those determined by human observers. The method allowed sensitive and rapid discrimination by assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692

  11. Automated detection of structural alerts (chemical fragments) in (eco)toxicology

    PubMed Central

    Lepailleur, Alban; Poezevara, Guillaume; Bureau, Ronan

    2013-01-01

    This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches in in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners were developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for the definition of its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data. PMID:24688706

  12. A two-level structure for advanced space power system automation

    NASA Technical Reports Server (NTRS)

    Loparo, Kenneth A.; Chankong, Vira

    1990-01-01

    The tasks to be carried out during the three-year project period are: (1) performing extensive simulation using existing mathematical models to build a specific knowledge base of the operating characteristics of space power systems; (2) carrying out the necessary basic research on hierarchical control structures, real-time quantitative algorithms, and decision-theoretic procedures; (3) developing a two-level automation scheme for fault detection and diagnosis, maintenance and restoration scheduling, and load management; and (4) testing and demonstration. The outlines of the proposed system structure that served as a master plan for this project, work accomplished, concluding remarks, and ideas for future work are also addressed.

  13. A fully automated trabecular bone structural analysis tool based on T2*-weighted magnetic resonance imaging.

    PubMed

    Kraiger, Markus; Martirosian, Petros; Opriessnig, Peter; Eibofner, Frank; Rempp, Hansjoerg; Hofer, Michael; Schick, Fritz; Stollberger, Rudolf

    2012-03-01

    One major source affecting the precision of bone structure analysis in quantitative magnetic resonance imaging (qMRI) is inter- and intraoperator variability, inherent in delineating and tracing regions of interest along longitudinal studies. In this paper an automated analysis tool, featuring bone marrow segmentation, region of interest generation, and characterization of cancellous bone of articular joints is presented. In evaluation studies conducted at the knee joint the novel analysis tool significantly decreased the standard error of measurement and improved the sensitivity in detecting minor structural changes. It further eliminated the need for time-consuming user interaction, thereby increasing reproducibility. PMID:21862288

  14. Automated Real-Space Refinement of Protein Structures Using a Realistic Backbone Move Set

    PubMed Central

    Haddadian, Esmael J.; Gong, Haipeng; Jha, Abhishek K.; Yang, Xiaojing; DeBartolo, Joe; Hinshaw, James R.; Rice, Phoebe A.; Sosnick, Tobin R.; Freed, Karl F.

    2011-01-01

    Crystals of many important biological macromolecules diffract to limited resolution, rendering accurate model building and refinement difficult and time-consuming. We present a torsional optimization protocol that is applicable to many such situations and combines Protein Data Bank-based torsional optimization with real-space refinement against the electron density derived from crystallography or cryo-electron microscopy. Our method converts moderate- to low-resolution structures at initial (e.g., backbone trace only) or late stages of refinement to structures with increased numbers of hydrogen bonds, improved crystallographic R-factors, and superior backbone geometry. This automated method is applicable to DNA-binding and membrane proteins of any size and will aid studies of structural biology by improving model quality and saving considerable effort. The method can be extended to improve NMR and other structures. Our backbone score and its sequence profile provide an additional standard tool for evaluating structural quality. PMID:21843481

  15. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It also promotes the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to insure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  16. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    SciTech Connect

    Fenglei Li

    2006-08-09

    The purposes of our research were: (1) To develop an economical, easy to use, automated, high throughput system for large scale protein crystallization screening. (2) To develop a new protein crystallization method with high screening efficiency, low protein consumption and complete compatibility with high throughput screening systems. (3) To determine the structure of lactate dehydrogenase complexed with NADH by x-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated that large scale protein crystallization screening can be performed in a high throughput manner at low cost and with easy operation. The overall system integrates liquid dispensing, crystallization and detection and serves as a whole solution to protein crystallization screening. The system can dispense protein and multiple different precipitants at nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible light detector for detecting protein crystallization screening results. This detection scheme has the capability of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high throughput and non-destructive manner. The entire system from liquid dispensing, crystallization to crystal detection is essentially parallel, high throughput and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption and compatibility with automation and high throughput. In this crystallization method, a gas permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for a given amount of protein. In addition

  17. PDB_REDO: automated re-refinement of X-ray structure models in the PDB.

    PubMed

    Joosten, Robbie P; Salzemann, Jean; Bloch, Vincent; Stockinger, Heinz; Berglund, Ann-Charlott; Blanchet, Christophe; Bongcam-Rudloff, Erik; Combet, Christophe; Da Costa, Ana L; Deleage, Gilbert; Diarena, Matteo; Fabbretti, Roberto; Fettahi, Géraldine; Flegel, Volker; Gisel, Andreas; Kasam, Vinod; Kervinen, Timo; Korpelainen, Eija; Mattila, Kimmo; Pagni, Marco; Reichstadt, Matthieu; Breton, Vincent; Tickle, Ian J; Vriend, Gert

    2009-06-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimental X-ray data as well as in terms of geometric quality. The re-refinement protocol uses TLS models to describe concerted atom movement. The resulting structure models are made available through the PDB_REDO databank (http://www.cmbi.ru.nl/pdb_redo/). Grid computing techniques were used to overcome the computational requirements of this endeavour. PMID:22477769

  18. New tissue priors for improved automated classification of subcortical brain structures on MRI.

    PubMed

    Lorio, S; Fresard, S; Adaszewski, S; Kherif, F; Chowdhury, R; Frackowiak, R S; Ashburner, J; Helms, G; Weiskopf, N; Lutti, A; Draganski, B

    2016-04-15

    Despite the constant improvement of algorithms for automated brain tissue classification, the accurate delineation of subcortical structures using magnetic resonance images (MRI) data remains challenging. The main difficulties arise from the low gray-white matter contrast of iron rich areas in T1-weighted (T1w) MRI data and from the lack of adequate priors for basal ganglia and thalamus. The most recent attempts to obtain such priors were based on cohorts with limited size that included subjects in a narrow age range, failing to account for age-related gray-white matter contrast changes. Aiming to improve the anatomical plausibility of automated brain tissue classification from T1w data, we have created new tissue probability maps for subcortical gray matter regions. Supported by atlas-derived spatial information, raters manually labeled subcortical structures in a cohort of healthy subjects using magnetization transfer saturation and R2* MRI maps, which feature optimal gray-white matter contrast in these areas. After assessment of inter-rater variability, the new tissue priors were tested on T1w data within the framework of voxel-based morphometry. The automated detection of gray matter in subcortical areas with our new probability maps was more anatomically plausible compared to the one derived with currently available priors. We provide evidence that the improved delineation compensates age-related bias in the segmentation of iron rich subcortical regions. The new tissue priors, allowing robust detection of basal ganglia and thalamus, have the potential to enhance the sensitivity of voxel-based morphometry in both healthy and diseased brains. PMID:26854557

  19. New tissue priors for improved automated classification of subcortical brain structures on MRI☆

    PubMed Central

    Lorio, S.; Fresard, S.; Adaszewski, S.; Kherif, F.; Chowdhury, R.; Frackowiak, R.S.; Ashburner, J.; Helms, G.; Weiskopf, N.; Lutti, A.; Draganski, B.

    2016-01-01

    Despite the constant improvement of algorithms for automated brain tissue classification, the accurate delineation of subcortical structures using magnetic resonance images (MRI) data remains challenging. The main difficulties arise from the low gray-white matter contrast of iron rich areas in T1-weighted (T1w) MRI data and from the lack of adequate priors for basal ganglia and thalamus. The most recent attempts to obtain such priors were based on cohorts with limited size that included subjects in a narrow age range, failing to account for age-related gray-white matter contrast changes. Aiming to improve the anatomical plausibility of automated brain tissue classification from T1w data, we have created new tissue probability maps for subcortical gray matter regions. Supported by atlas-derived spatial information, raters manually labeled subcortical structures in a cohort of healthy subjects using magnetization transfer saturation and R2* MRI maps, which feature optimal gray-white matter contrast in these areas. After assessment of inter-rater variability, the new tissue priors were tested on T1w data within the framework of voxel-based morphometry. The automated detection of gray matter in subcortical areas with our new probability maps was more anatomically plausible compared to the one derived with currently available priors. We provide evidence that the improved delineation compensates age-related bias in the segmentation of iron rich subcortical regions. The new tissue priors, allowing robust detection of basal ganglia and thalamus, have the potential to enhance the sensitivity of voxel-based morphometry in both healthy and diseased brains. PMID:26854557

  20. Automated measurement of CT noise in patient images with a novel structure coherence feature.

    PubMed

    Chun, Minsoo; Choi, Young Hun; Kim, Jong Hyo

    2015-12-01

    While the assessment of CT noise constitutes an important task for the optimization of scan protocols in clinical routine, the majority of noise measurements in practice still rely on manual operation, hence limiting their efficiency and reliability. This study presents an algorithm for the automated measurement of CT noise in patient images with a novel structure coherence feature. The proposed algorithm consists of a four-step procedure including subcutaneous fat tissue selection, the calculation of structure coherence feature, the determination of homogeneous ROIs, and the estimation of the average noise level. In an evaluation with 94 CT scans (16 517 images) of pediatric and adult patients along with the participation of two radiologists, ROIs were placed on a homogeneous fat region at 99.46% accuracy, and the agreement of the automated noise measurements with the radiologists' reference noise measurements (PCC = 0.86) was substantially higher than the within and between-rater agreements of noise measurements (PCCwithin = 0.75, PCCbetween = 0.70). In addition, the absolute noise level measurements matched closely the theoretical noise levels generated by a reduced-dose simulation technique. Our proposed algorithm has the potential to be used for examining the appropriateness of radiation dose and the image quality of CT protocols for research purposes as well as clinical routine. PMID:26561914
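
    The overall recipe (restrict attention to a tissue mask, keep only the most homogeneous ROIs, then average their standard deviations) can be sketched compactly. In this sketch homogeneity is judged by plain local standard deviation, a stand-in for the paper's structure coherence feature, and all sizes and fractions are illustrative; on a structure-free toy image the homogeneity selection biases the estimate slightly low.

        import numpy as np

        def noise_estimate(img, mask, roi=16, keep_frac=0.1):
            """Tile the masked tissue with ROIs, keep the most homogeneous ones
            (lowest SD), and report their mean SD as the noise level."""
            sds = []
            for r in range(0, img.shape[0] - roi + 1, roi):
                for c in range(0, img.shape[1] - roi + 1, roi):
                    if mask[r:r + roi, c:c + roi].all():  # ROI fully inside tissue
                        sds.append(img[r:r + roi, c:c + roi].std())
            sds = np.sort(np.array(sds))
            keep = max(1, int(len(sds) * keep_frac))      # most homogeneous ROIs
            return float(sds[:keep].mean())

        rng = np.random.default_rng(4)
        fat = -100.0 + rng.normal(0.0, 12.0, (256, 256))  # toy fat region [HU]
        mask = np.ones_like(fat, dtype=bool)
        print("estimated noise SD: %.1f HU" % noise_estimate(fat, mask))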

  1. Automated measurement of CT noise in patient images with a novel structure coherence feature

    NASA Astrophysics Data System (ADS)

    Chun, Minsoo; Choi, Young Hun; Kim, Jong Hyo

    2015-12-01

    While the assessment of CT noise constitutes an important task for the optimization of scan protocols in clinical routine, the majority of noise measurements in practice still rely on manual operation, hence limiting their efficiency and reliability. This study presents an algorithm for the automated measurement of CT noise in patient images with a novel structure coherence feature. The proposed algorithm consists of a four-step procedure including subcutaneous fat tissue selection, the calculation of structure coherence feature, the determination of homogeneous ROIs, and the estimation of the average noise level. In an evaluation with 94 CT scans (16 517 images) of pediatric and adult patients along with the participation of two radiologists, ROIs were placed on a homogeneous fat region at 99.46% accuracy, and the agreement of the automated noise measurements with the radiologists’ reference noise measurements (PCC = 0.86) was substantially higher than the within and between-rater agreements of noise measurements (PCCwithin = 0.75, PCCbetween = 0.70). In addition, the absolute noise level measurements matched closely the theoretical noise levels generated by a reduced-dose simulation technique. Our proposed algorithm has the potential to be used for examining the appropriateness of radiation dose and the image quality of CT protocols for research purposes as well as clinical routine.

  2. Automated assembly of large space structures using an expert system executive

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1993-01-01

    NASA LaRC has developed a unique testbed for investigating the practical problems associated with the assembly of large space structures using robotic manipulators. The testbed is an interdisciplinary effort which considers the full spectrum of assembly problems from the design of mechanisms to the development of software. This paper will describe the automated structures assembly testbed and its operation, detail the expert system executive and its development, and discuss the planned system evolution. Emphasis will be placed on the expert system development of the program executive. The executive program must be capable of directing and reliably performing complex assembly tasks with the flexibility to recover from realistic system errors. By employing an expert system, information pertaining to the operation of the system was encapsulated concisely within a knowledge base. This led to a substantial reduction in code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.

  3. An expert system executive for automated assembly of large space truss structures

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1993-01-01

    Langley Research Center developed a unique test bed for investigating the practical problems associated with the assembly of large space truss structures using robotic manipulators. The test bed is the result of an interdisciplinary effort that encompasses the full spectrum of assembly problems - from the design of mechanisms to the development of software. The automated structures assembly test bed and its operation are described, the expert system executive and its development are detailed, and the planned system evolution is discussed. Emphasis is on the expert system implementation of the program executive. The executive program must direct and reliably perform complex assembly tasks with the flexibility to recover from realistic system errors. The employment of an expert system permits information that pertains to the operation of the system to be encapsulated concisely within a knowledge base. This consolidation substantially reduced code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.

  4. PAP-LMPCR for improved, allele-specific footprinting and automated chromatin fine structure analysis

    PubMed Central

    Ingram, R.; Gao, C.; LeBon, J.; Liu, Q.; Mayoral, R. J.; Sommer, S. S.; Hoogenkamp, M.; Riggs, A. D.; Bonifer, C.

    2008-01-01

    The analysis of chromatin fine structure and transcription factor occupancy of differentially expressed genes by in vivo footprinting and ligation-mediated-PCR (LMPCR) is a powerful tool to understand the impact of chromatin on gene expression. However, as with all PCR-based techniques, the accuracy of the experiments has often been reduced by sequence similarities and the presence of GC-rich or repeat sequences, and some sequences are completely refractory to analysis. Here we describe a novel method, pyrophosphorolysis activated polymerization LMPCR or PAP-LMPCR, which is capable of generating accurate and reproducible footprints specific for individual alleles and can read through sequences previously not accessible for analysis. In addition, we have adapted this technique for automation, thus enabling the simultaneous and rapid analysis of chromatin structure at many different genes. PMID:18208840

  5. Automating gene library synthesis by structure-based combinatorial protein engineering: examples from plant sesquiterpene synthases.

    PubMed

    Dokarry, Melissa; Laurendon, Caroline; O'Maille, Paul E

    2012-01-01

    Structure-based combinatorial protein engineering (SCOPE) is a homology-independent recombination method to create multiple crossover gene libraries by assembling defined combinations of structural elements ranging from single mutations to domains of protein structure. SCOPE was originally inspired by DNA shuffling, which mimics recombination during meiosis, where mutations from parental genes are "shuffled" to create novel combinations in the resulting progeny. DNA shuffling utilizes sequence identity between parental genes to mediate template-switching events (the annealing and extension of one parental gene fragment on another) in PCR reassembly reactions to generate crossovers and hence recombination between parental genes. In light of the conservation of protein structure and degeneracy of sequence, SCOPE was developed to enable the "shuffling" of distantly related genes with no requirement for sequence identity. The central principle involves the use of oligonucleotides to encode for crossover regions to choreograph template-switching events during PCR assembly of gene fragments to create chimeric genes. This approach was initially developed to create libraries of hybrid DNA polymerases from distantly related parents, and later developed to create a combinatorial mutant library of sesquiterpene synthases to explore the catalytic landscapes underlying the functional divergence of related enzymes. This chapter presents a simplified protocol of SCOPE that can be integrated with different mutagenesis techniques and is suitable for automation by liquid-handling robots. Two examples are presented to illustrate the application of SCOPE to create gene libraries using plant sesquiterpene synthases as the model system. In the first example, we outline how to create an active-site library as a series of complex mixtures of diverse mutants. In the second example, we outline how to create a focused library as an array of individual clones to distil minimal combinations of

  6. Automated assignment of MS/MS cleavable cross-links in protein 3D-structure analysis.

    PubMed

    Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H; Schäfer, Mathias; Sinz, Andrea

    2015-01-01

    CID-MS/MS cleavable cross-linkers hold an enormous potential for an automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The created characteristic fragment ion patterns can easily be used for an automated assignment and discrimination of cross-linked products. To date, there are only a few software solutions available that make use of these properties, but none allows for an automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows an automatic screening of characteristic fragment ions, considering static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for a correct assignment of cross-links and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access for an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at www.StavroX.com . PMID:25261217

  7. A script for automated 3-dimentional structure generation and conformer search from 2- dimentional chemical drawing.

    PubMed

    Ishikawa, Yoshinobu

    2013-01-01

    Building 3-dimensional (3D) molecules is the starting point in molecular modeling. Conformer search and identification of a global energy minimum structure are often performed computationally during spectral analysis of data from NMR, IR, and VCD or during rational drug design through ligand-based, structure-based, and QSAR approaches. I herein report a convenient script that allows for automated building of 3D structures and conformer searching from 2-dimensional (2D) drawing of chemical structures. With this Bash shell script, which runs on Mac OS X and the Linux platform, the tasks are consecutively and iteratively executed without a 3D molecule builder via the command line interface of the free (academic) software OpenBabel, Balloon, and MOPAC2012. A large number of 2D chemical drawing files can be processed simultaneously, and the script functions with stereoisomers. Semi-empirical quantum chemical calculation ensures reliable ranking of the generated conformers on the basis of energy. In addition to an energy-sorted list of file names of the conformers, their Gaussian input files are provided for ab initio and density functional theory calculations to predict rigorous electronic energies, structures, and properties. This script is freely available to all scientists. PMID:24391363
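
    The first stage of such a pipeline can be approximated in a few lines. The sketch below drives only the OpenBabel step (obabel's --gen3d flag builds an initial 3D geometry from a 2D file); the Balloon conformer search and MOPAC2012 ranking are left as indicated. The drawings/ folder and file extensions are hypothetical, and this is a Python paraphrase of what the author implements as a Bash script.

```python
import subprocess
from pathlib import Path

# Hypothetical folder of 2D chemical drawings (e.g. MDL .mol files).
for mol2d in Path("drawings").glob("*.mol"):
    out = mol2d.with_suffix(".sdf")
    # obabel's --gen3d flag derives an initial 3D geometry from 2D input;
    # in the full pipeline this would be followed by a Balloon conformer
    # search and semi-empirical energy ranking with MOPAC2012.
    subprocess.run(["obabel", str(mol2d), "-O", str(out), "--gen3d"], check=True)
```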

  8. A Script for Automated 3-Dimentional Structure Generation and Conformer Search from 2- Dimentional Chemical Drawing

    PubMed Central

    Ishikawa, Yoshinobu

    2013-01-01

    Building 3-dimensional (3D) molecules is the starting point in molecular modeling. Conformer search and identification of a global energy minimum structure are often performed computationally during spectral analysis of data from NMR, IR, and VCD or during rational drug design through ligand-based, structure-based, and QSAR approaches. I herein report a convenient script that allows for automated building of 3D structures and conformer searching from 2-dimensional (2D) drawing of chemical structures. With this Bash shell script, which runs on Mac OS X and the Linux platform, the tasks are consecutively and iteratively executed without a 3D molecule builder via the command line interface of the free (academic) software OpenBabel, Balloon, and MOPAC2012. A large number of 2D chemical drawing files can be processed simultaneously, and the script functions with stereoisomers. Semi-empirical quantum chemical calculation ensures reliable ranking of the generated conformers on the basis of energy. In addition to an energy-sorted list of file names of the conformers, their Gaussian input files are provided for ab initio and density functional theory calculations to predict rigorous electronic energies, structures, and properties. This script is freely available to all scientists. PMID:24391363

  9. Automated structure modeling of large protein assemblies using crosslinks as distance restraints.

    PubMed

    Ferber, Mathias; Kosinski, Jan; Ori, Alessandro; Rashid, Umar J; Moreno-Morcillo, María; Simon, Bernd; Bouvier, Guillaume; Batista, Paulo Ricardo; Müller, Christoph W; Beck, Martin; Nilges, Michael

    2016-06-01

    Crosslinking mass spectrometry is increasingly used for structural characterization of multisubunit protein complexes. Chemical crosslinking captures conformational heterogeneity, which typically results in conflicting crosslinks that cannot be satisfied in a single model, making detailed modeling a challenging task. Here we introduce an automated modeling method dedicated to large protein assemblies ('XL-MOD' software is available at http://aria.pasteur.fr/supplementary-data/x-links) that (i) uses a form of spatial restraints that realistically reflects the distribution of experimentally observed crosslinked distances; (ii) automatically deals with ambiguous and/or conflicting crosslinks and identifies alternative conformations within a Bayesian framework; and (iii) allows subunit structures to be flexible during conformational sampling. We demonstrate our method by testing it on known structures and available crosslinking data. We also crosslinked and modeled the 17-subunit yeast RNA polymerase III at atomic resolution; the resulting model agrees remarkably well with recently published cryoelectron microscopy structures and provides additional insights into the polymerase structure. PMID:27111507
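
    The core idea of restraining models with a realistic distance distribution rather than a hard cutoff can be sketched as a likelihood score over crosslinked residue pairs. The log-normal form and its parameters below are illustrative assumptions for a lysine-reactive crosslinker, not XL-MOD's calibrated potential, which additionally handles ambiguous and conflicting crosslinks in a Bayesian framework.

```python
import numpy as np

def crosslink_loglik(coords_a, coords_b, mu=2.9, sigma=0.35):
    """Log-likelihood of crosslinked residue pairs under a log-normal
    distance distribution (median about exp(mu) = 18 angstroms; these
    parameters are illustrative, not XL-MOD's)."""
    d = np.linalg.norm(coords_a - coords_b, axis=1)  # C-alpha distances
    return np.sum(-np.log(d * sigma * np.sqrt(2 * np.pi))
                  - (np.log(d) - mu) ** 2 / (2 * sigma ** 2))

# Toy usage: three crosslinked residue pairs in a candidate model.
a = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0]])
b = np.array([[15.0, 0, 0], [10, 18, 0], [0, 10, 22]])
print(crosslink_loglik(a, b))  # higher is better; compare across models
```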

  10. Automated torso organ segmentation from 3D CT images using structured perceptron and dual decomposition

    NASA Astrophysics Data System (ADS)

    Nimura, Yukitaka; Hayashi, Yuichiro; Kitasaka, Takayuki; Mori, Kensaku

    2015-03-01

    This paper presents a method for torso organ segmentation from abdominal CT images using structured perceptron and dual decomposition. Many methods have been proposed to enable automated extraction of organ regions from volumetric medical images; however, their empirical parameters must be adjusted to obtain precise organ regions. This paper proposes an organ segmentation method using structured output learning. Our method utilizes a graphical model and binary features which represent the relationship between voxel intensities and organ labels. We also optimize the weights of the graphical model by structured perceptron and estimate the best organ label for a given image by dynamic programming and dual decomposition. The experimental results showed that the proposed method can extract organ regions automatically using structured output learning. The error of organ label estimation was 4.4%. The DICE coefficients of left lung, right lung, heart, liver, spleen, pancreas, left kidney, right kidney, and gallbladder were 0.91, 0.95, 0.77, 0.81, 0.74, 0.08, 0.83, 0.84, and 0.03, respectively.
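
    The structured perceptron at the heart of this approach has a compact generic form: decode with the current weights, then update by the feature difference between the true and predicted labelings. The sketch below uses a toy 1-D "image" and brute-force decoding; in the paper the decoding step is the dynamic-programming/dual-decomposition inference over voxel graphs, and the feature map here is invented for illustration.

```python
import numpy as np
from itertools import product

def features(x, y):
    """Joint feature map for a 1-D 'image' x and binary label sequence y:
    intensity mass under label 1, label-1 count (acts as a learned
    threshold), and neighbour agreement (smoothness)."""
    return np.array([
        sum(xi for xi, yi in zip(x, y) if yi == 1),
        float(sum(y)),
        float(sum(a == b for a, b in zip(y, y[1:]))),
    ])

def decode(x, w):
    """Brute-force argmax over label sequences; the paper replaces this
    with dynamic programming plus dual decomposition."""
    return max(product((0, 1), repeat=len(x)),
               key=lambda y: w @ features(x, y))

def train(data, epochs=10):
    w = np.zeros(3)
    for _ in range(epochs):
        for x, y_true in data:
            y_hat = decode(x, w)
            if y_hat != y_true:
                w += features(x, y_true) - features(x, y_hat)  # perceptron update
    return w

x, y_true = [0.1, 0.2, 0.9, 0.8], (0, 0, 1, 1)
w = train([(x, y_true)])
print(decode(x, w))  # recovers (0, 0, 1, 1)
```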

  11. A structural study of cyanotrichite from Dachang by conventional and automated electron diffraction

    NASA Astrophysics Data System (ADS)

    Ventruti, Gennaro; Mugnaioli, Enrico; Capitani, Giancarlo; Scordari, Fernando; Pinto, Daniela; Lausi, Andrea

    2015-09-01

    The crystal structure of cyanotrichite, with general formula Cu4Al2(SO4)(OH)12·2H2O, from the Dachang deposit (China) was studied by means of conventional transmission electron microscopy, automated electron diffraction tomography (ADT) and synchrotron X-ray powder diffraction (XRPD). ADT revealed the presence of two different cyanotrichite-like phases. The same phases were also recognized in the XRPD pattern, allowing the indexing of all peaks and leading, after refinement, to the following cell parameters: (1) a = 12.417(2) Å, b = 2.907(1) Å, c = 10.157(1) Å and β = 98.12(1)°; (2) a = 12.660(2) Å, b = 2.897(1) Å, c = 10.162(1) Å and β = 92.42(1)°. Only for the former phase, labeled cyanotrichite-98, was a partial structure, corresponding to the [Cu4Al2(OH)12]2+ cluster, obtained ab initio by direct methods in space group C2/m on the basis of electron diffraction data. Geometric and charge-balance considerations allowed the complete structure model of the cyanotrichite-98 phase to be derived. The sulfate group and water molecule are statistically disordered over two possible positions, while keeping the average structure consistent with the C-centering symmetry, in agreement with ADT results.

  12. Automated identification of RNA 3D modules with discriminative power in RNA structural alignments.

    PubMed

    Theis, Corinna; Höner Zu Siederdissen, Christian; Hofacker, Ivo L; Gorodkin, Jan

    2013-12-01

    Recent progress in predicting RNA structure is moving towards filling the 'gap' in 2D RNA structure prediction where, for example, predicted internal loops often form non-canonical base pairs. This is increasingly recognized with the steady increase of known RNA 3D modules. There is a general interest in matching structural modules known from one molecule to other molecules for which the 3D structure is not yet known. We have created a pipeline, metaRNAmodules, which completely automates the extraction of putative modules from the FR3D database and their mapping to Rfam alignments to obtain comparative evidence. Subsequently, the modules, initially represented by a graph, are turned into models for the RMDetect program, which allows their discriminative power to be tested using real and randomized Rfam alignments. An initial extraction of 22 495 3D modules in all PDB files results in 977 internal loop and 17 hairpin modules with clear discriminatory power. Many of these modules describe only minor variants of each other. Indeed, mapping of the modules onto Rfam families results in 35 unique locations in 11 different families. The metaRNAmodules pipeline source for the internal loop modules is available at http://rth.dk/resources/mrm. PMID:24005040

  13. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    NASA Astrophysics Data System (ADS)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe and a transducer, the latter measuring the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations $\sigma(z) = \sigma_{c0} + \sum_1^n \left[\sigma_n(z) + a_n z + b_n z^2\right]$ and $\mathrm{d}\sigma/\mathrm{d}z = \sum_1^n \left[\mathrm{d}\sigma_n(z)/\mathrm{d}z + Fr_n(z)\right]$, where $\sigma_{c0}$ and $\sigma_n$ are the plastic deformation stresses for the surface and the $n$th soil structure (e.g. soil crust, layer, horizon or void), respectively, and $Fr_n(z)\,\mathrm{d}z$ is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance, $\mathrm{d}z$, through the $n$th layer. Both $\sigma_n(z)$ and $Fr_n(z)$ are related to soil structure. They determine the form of $\sigma(z)$ measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the
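
    A minimal numerical reading of the layered model above: each structure contributes a step plus a polynomial trend to $\sigma(z)$, and a pore shows up as an interval where only the surface term survives. All layer boundaries and stress values below are illustrative, and the friction term $Fr_n(z)$ is omitted for brevity.

```python
import numpy as np

def pr_profile(z, layers, sigma_c0=0.05):
    """Evaluate sigma(z) = sigma_c0 + sum_n [sigma_n(z) + a_n z + b_n z^2]
    with step-function layer stresses sigma_n(z) (illustrative units: MPa, mm)."""
    sigma = np.full_like(z, sigma_c0)
    for z0, z1, s_n, a_n, b_n in layers:
        inside = (z >= z0) & (z < z1)
        sigma[inside] += s_n + a_n * z[inside] + b_n * z[inside] ** 2
    return sigma

z = np.linspace(0, 30, 301)  # depth in mm
layers = [
    (0, 2, 0.40, 0.00, 0.0),    # biological crust
    (2, 10, 0.10, 0.02, 0.0),   # compacted layer with a linear trend
    (10, 12, 0.00, 0.00, 0.0),  # pore: zero deformation stress
    (12, 30, 0.15, 0.01, 0.0),
]
sigma = pr_profile(z, layers)
print(sigma[110])  # at z = 11 mm, inside the pore, only sigma_c0 remains (0.05)
```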

  14. Automated mutual exclusion rules discovery for structured observational codes in echocardiography reporting

    PubMed Central

    Forsberg, Thomas A.; Sevenster, Merlijn; Bieganski, Szymon; Bhagat, Puran; Kanasseril, Melvin; Jia, Yugang; Spencer, Kirk T.

    2015-01-01

    Structured reporting in medicine has been argued to support and enhance machine-assisted processing and communication of pertinent information. Retrospective studies showed that structured echocardiography reports, constructed through point-and-click selection of finding codes (FCs), contain pair-wise contradictory FCs (e.g., “No tricuspid regurgitation” and “Severe regurgitation”), downgrading report quality and reliability. In a prospective study, contradictions were detected automatically using an extensive rule set that encodes mutual exclusion patterns between FCs. Rule creation is a labor- and knowledge-intensive task that could benefit from automation. We propose a machine-learning approach to discover mutual exclusion rules in a corpus of 101,211 structured echocardiography reports through semantic and statistical analysis. Ground truth is derived from the extensive prospectively evaluated rule set. On the unseen test set, the F-measure (0.439) and above-chance AUC (0.885) show that our approach can potentially support the manual rule creation process. Our methods discovered previously unknown rules per expert review. PMID:26958191
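
    A statistical core of such rule discovery can be sketched as a co-occurrence screen: flag pairs of finding codes that are each individually frequent yet never appear together in the same report. This is a simplified stand-in for the paper's combined semantic and statistical analysis, and the finding codes in the toy corpus are invented.

```python
from itertools import combinations
from collections import Counter

def candidate_exclusions(reports, min_support=50, max_cooc=0):
    """Flag finding-code pairs that are each frequent but (almost) never
    co-occur in the same report -- candidate mutual exclusion rules."""
    freq = Counter(fc for r in reports for fc in r)
    cooc = Counter()
    for r in reports:
        cooc.update(combinations(sorted(set(r)), 2))
    return [(a, b) for a, b in combinations(sorted(freq), 2)
            if freq[a] >= min_support and freq[b] >= min_support
            and cooc[(a, b)] <= max_cooc]

# Toy corpus of structured reports (sets of finding codes).
reports = [{"TR_none", "MV_normal"}] * 60 + [{"TR_severe", "MV_normal"}] * 55
print(candidate_exclusions(reports))  # [('TR_none', 'TR_severe')]
```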

  15. Automated metric characterization of urban structure using building decomposition from very high resolution imagery

    NASA Astrophysics Data System (ADS)

    Heinzel, Johannes; Kemper, Thomas

    2015-03-01

    Classification approaches for urban areas are mostly of a qualitative and semantic nature. They produce interpreted classes similar to those from land cover and land use classifications. As a complement to those classes, quantitative measures directly derived from the image could lead to a metric characterization of the urban area. While these metrics lack qualitative interpretation, they provide objective measures of the urban structures. Such quantitative measures are especially important in rapidly growing cities since, besides the growth in area, they can provide structural information for specific areas and detect changes. Rustenburg, which serves as the test area for the present study, is amongst the fastest growing cities in South Africa. It reveals a heterogeneous face of housing and building structures reflecting social and/or economic differences often linked to the spatial distribution of industrial and local mining sites. Up-to-date coverage with aerial photographs is provided by aerial surveys at regular intervals, and recent satellite systems also provide imagery with suitable resolution. Using such a set of very high resolution images, a fully automated algorithm has been developed which outputs metric classes by systematically combining important measures of building structure. The measurements are obtained by decomposing buildings directly from the imagery using methods from mathematical morphology. The decomposed building objects serve as the basis for the computation of grid statistics. Finally, a systematic combination of the single features leads to combined metric classes. For the dominant urban structures, verification results indicate an overall accuracy of at least 80% at the single feature level and 70% for the combined classes.

  16. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    NASA Astrophysics Data System (ADS)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beam lines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  17. E-novo: an automated workflow for efficient structure-based lead optimization.

    PubMed

    Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit

    2009-07-01

    An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and are subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities was used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol using published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set using a hand-built binding model. Least squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries. PMID:19552372

  18. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    SciTech Connect

    Girolamo, D.; Yuan, F. G.; Girolamo, L.

    2015-03-31

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damages by propagating ultrasonic waves normal to the surface. However, they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high-resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed to process the wave field and to estimate spatially dependent wavenumber values related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high-fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative to overcome the limitations of conventional NDE technologies.
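
    The SWEA step can be approximated as a per-pixel spectral-energy map: Fourier-transform each scan point's time trace and integrate the energy in a band around the excitation frequency, so that energy trapped by damage stands out spatially. The sketch below is that approximation on synthetic data, not the authors' full algorithm, and all frequencies and dimensions are illustrative.

```python
import numpy as np

def band_energy_map(wavefield, fs, f_lo, f_hi):
    """Per-pixel spectral energy in [f_lo, f_hi] Hz of a scanned wavefield
    with shape (nt, ny, nx); trapped standing waves appear as locally
    elevated energy."""
    nt = wavefield.shape[0]
    spec = np.fft.rfft(wavefield, axis=0)
    freqs = np.fft.rfftfreq(nt, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.sum(np.abs(spec[band]) ** 2, axis=0)

# Toy wavefield: a 100 kHz oscillation confined to a 'damaged' patch.
fs, nt = 1e6, 256
t = np.arange(nt) / fs
wf = 1e-3 * np.random.randn(nt, 64, 64)
wf[:, 30:34, 30:34] += np.sin(2 * np.pi * 1e5 * t)[:, None, None]
emap = band_energy_map(wf, fs, 9e4, 1.1e5)
print(emap[32, 32] > 10 * emap.mean())  # the damage patch stands out: True
```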

  19. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    NASA Astrophysics Data System (ADS)

    Girolamo, D.; Girolamo, L.; Yuan, F. G.

    2015-03-01

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damages by propagating ultrasonic waves normal to the surface. However, they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high-resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed to process the wave field and to estimate spatially dependent wavenumber values related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high-fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative to overcome the limitations of conventional NDE technologies.

  20. Automated Assignment of MS/MS Cleavable Cross-Links in Protein 3D-Structure Analysis

    NASA Astrophysics Data System (ADS)

    Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H.; Schäfer, Mathias; Sinz, Andrea

    2015-01-01

    CID-MS/MS cleavable cross-linkers hold an enormous potential for an automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The created characteristic fragment ion patterns can easily be used for an automated assignment and discrimination of cross-linked products. To date, there are only a few software solutions available that make use of these properties, but none allows for an automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows an automatic screening of characteristic fragment ions, considering static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for a correct assignment of cross-links and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access for an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at www.StavroX.com .

  1. Automated Structure-Activity Relationship Mining: Connecting Chemical Structure to Biological Profiles.

    PubMed

    Wawer, Mathias J; Jaramillo, David E; Dančík, Vlado; Fass, Daniel M; Haggarty, Stephen J; Shamji, Alykhan F; Wagner, Bridget K; Schreiber, Stuart L; Clemons, Paul A

    2014-06-01

    Understanding the structure-activity relationships (SARs) of small molecules is important for developing probes and novel therapeutic agents in chemical biology and drug discovery. Increasingly, multiplexed small-molecule profiling assays allow simultaneous measurement of many biological response parameters for the same compound (e.g., expression levels for many genes or binding constants against many proteins). Although such methods promise to capture SARs with high granularity, few computational methods are available to support SAR analyses of high-dimensional compound activity profiles. Many of these methods are not generally applicable or reduce the activity space to scalar summary statistics before establishing SARs. In this article, we present a versatile computational method that automatically extracts interpretable SAR rules from high-dimensional profiling data. The rules connect chemical structural features of compounds to patterns in their biological activity profiles. We applied our method to data from novel cell-based gene-expression and imaging assays collected on more than 30,000 small molecules. Based on the rules identified for this data set, we prioritized groups of compounds for further study, including a novel set of putative histone deacetylase inhibitors. PMID:24710340
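
    The flavor of connecting structural features to profile patterns can be sketched as a simple screen: for each binary structural feature (e.g. a fingerprint bit), test whether compounds carrying it cluster tightly in profile space and sit far from the rest. This is a simplified stand-in for the published rule-extraction method, and the toy data and thresholds are invented.

```python
import numpy as np

def sar_rules(features, profiles, min_members=5, effect=0.5):
    """Screen binary structural features for association with
    activity-profile patterns: a feature becomes a candidate rule when
    its carriers cluster tightly and far from non-carriers."""
    rules = []
    for j in range(features.shape[1]):
        members = features[:, j].astype(bool)
        if members.sum() < min_members:
            continue
        centroid = profiles[members].mean(axis=0)
        spread = np.linalg.norm(profiles[members] - centroid, axis=1).mean()
        dist = np.linalg.norm(profiles[~members] - centroid, axis=1).mean()
        if dist - spread > effect:
            rules.append((j, dist - spread))
    return rules

# Toy data: 200 compounds, 30 structural features, 10-D activity profiles;
# feature 3 shifts the biological profile.
rng = np.random.default_rng(7)
features = rng.integers(0, 2, (200, 30))
profiles = rng.normal(0, 1, (200, 10))
profiles[features[:, 3] == 1] += 2.0
print(sar_rules(features, profiles))  # feature 3 emerges as a rule
```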

  2. Application of an automated wireless structural monitoring system for long-span suspension bridges

    SciTech Connect

    Kurata, M.; Lynch, J. P.; Linden, G. W. van der; Hipley, P.; Sheng, L.-H.

    2011-06-23

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  3. Automated cerebellar lobule segmentation with application to cerebellar structural analysis in cerebellar disease.

    PubMed

    Yang, Zhen; Ye, Chuyang; Bogovic, John A; Carass, Aaron; Jedynak, Bruno M; Ying, Sarah H; Prince, Jerry L

    2016-02-15

    The cerebellum plays an important role in both motor control and cognitive function. Cerebellar function is topographically organized and diseases that affect specific parts of the cerebellum are associated with specific patterns of symptoms. Accordingly, delineation and quantification of cerebellar sub-regions from magnetic resonance images are important in the study of cerebellar atrophy and associated functional losses. This paper describes an automated cerebellar lobule segmentation method based on a graph cut segmentation framework. Results from multi-atlas labeling and tissue classification contribute to the region terms in the graph cut energy function and boundary classification contributes to the boundary term in the energy function. A cerebellar parcellation is achieved by minimizing the energy function using the α-expansion technique. The proposed method was evaluated using a leave-one-out cross-validation on 15 subjects including both healthy controls and patients with cerebellar diseases. Based on reported Dice coefficients, the proposed method outperforms two state-of-the-art methods. The proposed method was then applied to 77 subjects to study the region-specific cerebellar structural differences in three spinocerebellar ataxia (SCA) genetic subtypes. Quantitative analysis of the lobule volumes shows distinct patterns of volume changes associated with different SCA subtypes consistent with known patterns of atrophy in these genetic subtypes. PMID:26408861
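
    The energy minimized by α-expansion in such graph-cut frameworks has the standard region-plus-boundary form; as described above, the region potentials come from multi-atlas labeling and tissue classification and the boundary potentials from the boundary classifier. The notation below is the generic formulation, not lifted from the paper:

```latex
E(L) = \sum_{v \in \mathcal{V}} R_v(l_v)
     + \lambda \sum_{(u,v) \in \mathcal{E}} B_{uv}\,\big[\, l_u \neq l_v \,\big]
```

    Here $R_v(l_v)$ scores assigning lobule label $l_v$ to voxel $v$, $B_{uv}$ penalizes label changes across edges the boundary classifier deems unlikely, $[\cdot]$ is the indicator function, and $\lambda$ balances the two terms.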

  4. Automated Foveola Localization in Retinal 3D-OCT Images Using Structural Support Vector Machine Prediction

    PubMed Central

    Liu, Yu-Ying; Ishikawa, Hiroshi; Chen, Mei; Wollstein, Gadi; Schuman, Joel S.; Rehg, James M.

    2013-01-01

    We develop an automated method to determine the foveola location in macular 3D-OCT images in either healthy or pathological conditions. Structural Support Vector Machine (S-SVM) is trained to directly predict the location of the foveola, such that the score at the ground truth position is higher than that at any other position by a margin scaling with the associated localization loss. This S-SVM formulation directly minimizes the empirical risk of localization error, and makes efficient use of all available training data. It deals with the localization problem in a more principled way compared to the conventional binary classifier learning that uses zero-one loss and random sampling of negative examples. A total of 170 scans were collected for the experiment. Our method localized 95.1% of testing scans within the anatomical area of the foveola. Our experimental results show that the proposed method can effectively identify the location of the foveola, facilitating diagnosis around this important landmark. PMID:23285565
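
    The margin-scaling property described above is the standard structural-SVM constraint with margin rescaling: the score of the ground-truth location must dominate every competing location by the localization loss, up to a slack. The notation below is generic, with $\Delta$ taken to be, for example, the distance between a candidate position $y$ and the true foveola position $y^{*}$:

```latex
\forall y \neq y^{*}:\quad
w^{\top}\psi(x, y^{*}) \;\ge\; w^{\top}\psi(x, y) + \Delta(y, y^{*}) - \xi,
\qquad \xi \ge 0
```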

  5. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges

    NASA Astrophysics Data System (ADS)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.

    2011-06-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  6. Structural and Functional Relationships in Glaucoma Using Standard Automated Perimetry and the Humphrey Matrix

    PubMed Central

    Park, Seong Bae; Nam, Yoon Pyo; Sung, Kyung Rim

    2009-01-01

    Purpose To evaluate and compare correlations between structural and functional loss in glaucoma as assessed by optical coherence tomography (OCT), scanning laser polarimetry (GDx VCC, as this was the model used in this study), standard automated perimetry (SAP), and the Humphrey Matrix (Matrix). Methods Ninety glaucomatous eyes identified with SAP and 112 eyes diagnosed using Matrix were independently classified into six subgroups according to mean deviation (MD): S1/M1 (MD > -6 dB), S2/M2 (-12 dB < MD ≤ -6 dB), and S3/M3 (MD ≤ -12 dB). Results Significant correlations were found between structural and functional defects when assessed using OCT and GDx VCC. These correlations were weaker in the Matrix subgroups, especially in the early stages of glaucoma. PMID:19794944

  7. Automated artery-venous classification of retinal blood vessels based on structural mapping method

    NASA Astrophysics Data System (ADS)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2012-03-01

    Retinal blood vessels show morphologic modifications in response to various retinopathies. However, the specific responses exhibited by arteries and veins may provide more precise diagnostic information; e.g., diabetic retinopathy may be detected more accurately from venous dilatation than from average vessel dilatation. In order to analyze vessel-type-specific morphologic modifications, the classification of a vessel network into arteries and veins is required. We previously described a method for the identification and separation of retinal vessel trees, i.e., structural mapping. We therefore propose artery-venous classification based on structural mapping and on the identification of color properties prominent to the vessel types. The mean and standard deviation of the green channel intensity and of the hue channel intensity are analyzed in a region of interest around each centerline pixel of a vessel. Using the vector of color properties extracted from each centerline pixel, the pixel is classified into one of two clusters (artery and vein) obtained by fuzzy C-means clustering. According to the proportion of clustered centerline pixels in a particular vessel, and utilizing the artery-venous crossing property of retinal vessels, each vessel is labeled as an artery or a vein. The classification results are compared with the manually annotated ground truth (gold standard). We applied the proposed method to a dataset of 15 retinal color fundus images, resulting in 88.28% of vessel pixels being correctly classified. The automated classification results match well with the gold standard, suggesting its potential for artery-venous classification and the respective morphology analysis.
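
    The clustering stage can be sketched in a few lines. In the sketch below, k-means stands in for the fuzzy C-means used by the authors (scikit-learn ships no fuzzy C-means), the hue computation is a simple proxy formula, and the patches are synthetic stand-ins for ROIs around centerline pixels.

```python
import numpy as np
from sklearn.cluster import KMeans

def color_features(rgb_patch):
    """Mean/std of green intensity and hue in an ROI around a centerline
    pixel (the 4-D feature vector described above; hue is a simple proxy)."""
    r, g, b = rgb_patch[..., 0], rgb_patch[..., 1], rgb_patch[..., 2]
    hue = np.arctan2(np.sqrt(3) * (g - b), 2 * r - g - b)
    return [g.mean(), g.std(), hue.mean(), hue.std()]

def classify_centerlines(patches):
    """Two-way clustering of centerline pixels; k-means stands in here
    for fuzzy C-means."""
    X = np.array([color_features(p) for p in patches])
    return KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Toy patches: 'arteries' brighter than 'veins' in the green channel.
rng = np.random.default_rng(0)
arteries = [rng.normal(0.8, 0.02, (9, 9, 3)) for _ in range(20)]
veins = [rng.normal(0.3, 0.02, (9, 9, 3)) for _ in range(20)]
print(classify_centerlines(arteries + veins))
# First 20 patches fall in one cluster, last 20 in the other
# (cluster label order may swap between runs).
```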

  8. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler

    PubMed Central

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  9. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    SciTech Connect

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-19

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. They are part of the fully automated robotics systems being developed at the Photon Factory for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch, and data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to further reduce the time required for sample exchange, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds, excluding the time required for pre-cooling and warming up the tongs.

  10. Automated error-tolerant macromolecular structure determination from multidimensional nuclear Overhauser enhancement spectra and chemical shift assignments

    PubMed Central

    Kuszewski, John J.; Thottungal, Robin Augustine; Schwieters, Charles D.; Clore, G. Marius

    2008-01-01

    We report substantial improvements to the previously introduced automated NOE assignment and structure determination protocol known as PASD. The improved protocol includes extensive analysis of input spectral data to create a low-resolution contact map of residues expected to be close in space. This map is used to obtain reasonable initial guesses of NOE assignment likelihoods which are refined during subsequent structure calculations. Information in the contact map about which residues are predicted to not be close in space is applied via conservative repulsive distance restraints which are used in early phases of the structure calculations. In comparison with the previous protocol, the new protocol requires significantly less computation time. We show results of running the new PASD protocol on six proteins and demonstrate that useful assignment and structural information is extracted on proteins of more than 220 residues. We show that useful assignment information can be obtained even in the case in which a unique structure cannot be determined. PMID:18668206

  11. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    PubMed

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728, doi:10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626, doi:10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days. PMID:25190042

  12. Semi-automated processing and routing within indoor structures for emergency response applications

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Lyons, Kyle; Subramanian, Kalpathi; Ribarsky, William

    2010-04-01

    In this work, we propose new automation tools to process 2D building geometry data for effective communication and timely response to critical events in commercial buildings. Given the scale and complexity of commercial buildings, robust and visually rich tools are needed during an emergency. Our data processing pipeline consists of three major components: (1) adjacency graph construction, representing spatial relationships within a building (between hallways, offices, stairways, elevators); (2) identification of elements involved in evacuation routes (hallways, stairways); and (3) 3D building network construction, by connecting the floor elements via stairways and elevators. We have used these tools to process a cluster of five academic buildings. Our automation tools (despite some needed manual processing) show a significant advantage over manual processing (a few minutes vs. 2-4 hours). Designed as a client-server model, our system supports analytical capabilities to determine dynamic routing within a building under constraints (parts of the building blocked during emergencies, for instance). Visualization capabilities are provided for easy interaction with the system, on both desktop (command post) stations and mobile hand-held devices, simulating a command post-responder scenario.
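
    The adjacency-graph and constrained-routing components map naturally onto a standard graph library. The sketch below uses networkx with a hypothetical two-floor building; all node names are invented, and real systems would attach geometry and capacity data to nodes and edges.

```python
import networkx as nx

# Hypothetical two-floor building: nodes are spaces, edges are adjacencies.
G = nx.Graph()
G.add_edges_from([
    ("office_101", "hall_1"), ("office_102", "hall_1"),
    ("hall_1", "stair_A_f1"), ("hall_1", "elevator_f1"),
    ("stair_A_f1", "stair_A_f2"),    # vertical link via stairway
    ("elevator_f1", "elevator_f2"),  # vertical link via elevator
    ("stair_A_f2", "hall_2"), ("elevator_f2", "hall_2"),
    ("hall_2", "office_201"),
])

def evacuation_route(G, src, dst, blocked=()):
    """Shortest route avoiding blocked spaces (during a fire, for
    instance, elevators are typically excluded)."""
    H = G.subgraph(n for n in G if n not in set(blocked))
    return nx.shortest_path(H, src, dst)

print(evacuation_route(G, "office_201", "office_101",
                       blocked=["elevator_f1", "elevator_f2"]))
```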

  13. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision
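
    The multi-tiered idea is simply a cascade of classifiers, each operating on features of increasing specificity, with a candidate accepted only when every tier agrees. The sketch below is a generic two-tier cascade with synthetic stand-ins for the image-derived features, not the paper's trained classifiers.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Tier 1: coarse features (e.g. region size/intensity) propose candidates.
X1 = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y1 = np.r_[np.zeros(100), np.ones(100)]
tier1 = SVC(kernel="rbf").fit(X1, y1)

# Tier 2: finer features (e.g. texture/shape) confirm the target structure.
X2 = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
y2 = np.r_[np.zeros(100), np.ones(100)]
tier2 = SVC(kernel="rbf").fit(X2, y2)

def identify(coarse, fine):
    """Accept a candidate only when both tiers agree (the cascade idea)."""
    return bool(tier1.predict([coarse])[0]) and bool(tier2.predict([fine])[0])

print(identify([3.1, 2.9], rng.normal(2, 1, 5)))  # likely True
```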

  14. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans

    PubMed Central

    Zhan, Mei; Crane, Matthew M.; Entchev, Eugeni V.; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch’ng, QueeLim; Lu, Hang

    2015-01-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision

  15. Automated retrieval of forest structure variables based on multi-scale texture analysis of VHR satellite imagery

    NASA Astrophysics Data System (ADS)

    Beguet, Benoit; Guyon, Dominique; Boukir, Samia; Chehata, Nesrine

    2014-10-01

    The main goal of this study is to design a method to describe the structure of forest stands from Very High Resolution satellite imagery, relying on some typical variables such as crown diameter, tree height, trunk diameter, tree density and tree spacing. The emphasis is placed on automating the identification of the most relevant image features for the forest structure retrieval task, exploiting both spectral and spatial information. Our approach is based on linear regressions between the forest structure variables to be estimated and various spectral and Haralick texture features. The main drawback of this well-known texture representation is its underlying parameters, which are extremely difficult to set due to the spatial complexity of the forest structure. To tackle this major issue, an automated feature selection process is proposed, based on statistical modeling and exploring a wide range of parameter values. It provides texture measures of diverse spatial parameters, hence implicitly inducing a multi-scale texture analysis. A new feature selection technique, which we call Random PRiF, is proposed. It relies on random sampling in feature space and carefully addresses the multicollinearity issue in multiple-linear regression while ensuring accurate prediction of forest variables. Our automated forest variable estimation scheme was tested on Quickbird and Pléiades panchromatic and multispectral images, acquired at different periods on the maritime pine stands of two sites in South-Western France. It outperforms two well-established variable subset selection techniques. It has been successfully applied to identify the best texture features in modeling the five considered forest structure variables. The RMSE of all predicted forest variables is improved by combining multispectral and panchromatic texture features, with various parameterizations, highlighting the potential of a multi-resolution approach for retrieving forest structure
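
    A simplified reading of the random-sampling idea: draw many random feature subsets, fit a linear regression to each, and keep the subset with the best cross-validated score. This is a stand-in for Random PRiF under stated assumptions (it omits the multicollinearity handling), and the toy features and data are invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def random_subset_selection(X, y, k=2, n_draws=300, seed=0):
    """Randomly sample feature subsets and keep the one with the best
    cross-validated R^2 -- a simplified stand-in for random sampling in
    a large, collinear texture-feature space."""
    rng = np.random.default_rng(seed)
    best_idx, best_score = None, -np.inf
    for _ in range(n_draws):
        idx = rng.choice(X.shape[1], size=k, replace=False)
        score = cross_val_score(LinearRegression(), X[:, idx], y, cv=5).mean()
        if score > best_score:
            best_idx, best_score = idx, score
    return sorted(best_idx), best_score

# Toy data: 12 candidate 'texture features', two of which drive the
# forest variable (e.g. crown diameter).
rng = np.random.default_rng(42)
X = rng.normal(size=(120, 12))
y = 2 * X[:, 4] - X[:, 7] + 0.1 * rng.normal(size=120)
print(random_subset_selection(X, y))  # recovers features 4 and 7
```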

  16. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds.

    PubMed

    Farine, Damien R; Firth, Josh A; Aplin, Lucy M; Crates, Ross A; Culina, Antica; Garroway, Colin J; Hinde, Camilla A; Kidd, Lindall R; Milligan, Nicole D; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C

    2015-04-01

    Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission-fusion dynamics, can interact to drive phenotypic structure in animal populations. PMID:26064644
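
    The null-model logic can be sketched as a permutation test on the inferred network: compare the observed attribute assortativity against a distribution obtained by shuffling node attributes. Node permutation is only one simple null; the study's framework also used spatially and socially constrained permutations, and the toy network below is invented.

```python
import numpy as np
import networkx as nx

def assortativity_null(G, attr, n_perm=1000, seed=0):
    """Observed attribute assortativity versus a node-permutation null
    model; returns the observed value and a two-sided p-value."""
    rng = np.random.default_rng(seed)
    obs = nx.attribute_assortativity_coefficient(G, attr)
    labels = [G.nodes[n][attr] for n in G]
    null = []
    for _ in range(n_perm):
        H = G.copy()
        nx.set_node_attributes(H, dict(zip(H.nodes, rng.permutation(labels))), attr)
        null.append(nx.attribute_assortativity_coefficient(H, attr))
    p = np.mean([abs(v) >= abs(obs) for v in null])
    return obs, p

# Toy network with sex-assortative ties.
G = nx.Graph()
for i in range(40):
    G.add_node(i, sex="M" if i < 20 else "F")
rng = np.random.default_rng(1)
for _ in range(150):
    a, b = rng.integers(0, 40, 2)
    same_sex = (a < 20) == (b < 20)
    if a != b and (same_sex or rng.random() < 0.2):
        G.add_edge(int(a), int(b))
print(assortativity_null(G, "sex"))  # positive assortativity, small p
```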

  17. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds

    PubMed Central

    Farine, Damien R.; Firth, Josh A.; Aplin, Lucy M.; Crates, Ross A.; Culina, Antica; Garroway, Colin J.; Hinde, Camilla A.; Kidd, Lindall R.; Milligan, Nicole D.; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C.

    2015-01-01

    Both social and ecological factors influence population processes and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups, from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission–fusion dynamics, can interact to drive phenotypic structure in animal populations. PMID:26064644

  18. AIDA: ab initio domain assembly for automated multi-domain protein structure prediction and domain–domain interaction prediction

    PubMed Central

    Xu, Dong; Jaroszewski, Lukasz; Li, Zhanwen; Godzik, Adam

    2015-01-01

    Motivation: Most proteins consist of multiple domains, independent structural and evolutionary units that are often reshuffled in genomic rearrangements to form new protein architectures. Template-based modeling methods can often detect homologous templates for individual domains, but templates that could be used to model the entire query protein are often not available. Results: We have developed a fast docking algorithm, ab initio domain assembly (AIDA), for assembling multi-domain protein structures, guided by an ab initio folding potential. This approach can be extended to discontinuous domains (i.e. domains with ‘inserted’ domains). When tested on experimentally solved structures of multi-domain proteins, the relative domain positions were accurately found among the top 5000 models in 86% of cases. The AIDA server can use domain assignments provided by the user or predict them from the provided sequence. The latter approach is particularly useful for automated protein structure prediction servers. A blind test consisting of 95 CASP10 targets shows that domain boundaries could be successfully determined for 97% of targets. Availability and implementation: The AIDA package as well as the benchmark sets used here are available for download at http://ffas.burnham.org/AIDA/. Contact: adam@sanfordburnham.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25701568

  19. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    PubMed

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Viksit, Kumar; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study, and comparison with time-of-flight techniques shows that our method performs better. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods. PMID:27425150
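
    The paper's exact dynamic program is not reproduced in the abstract; the sketch below illustrates the general idea under assumed details: treat the displacement map as a score matrix, restrict the arrival-time path to small jumps between neighboring positions so that outliers cannot capture it, and recover the wave speed from the slope of the tracked path.

```python
# Dynamic-programming sketch for tracking a transient wavefront.
import numpy as np

def track_wavefront(D, max_jump=2):
    """D[position, time]: displacement amplitude. Returns arrival index per position."""
    n_pos, n_t = D.shape
    score = np.full((n_pos, n_t), -np.inf)
    parent = np.zeros((n_pos, n_t), dtype=int)
    score[0] = D[0]
    for i in range(1, n_pos):
        for t in range(n_t):
            lo, hi = max(0, t - max_jump), min(n_t, t + max_jump + 1)
            best = int(np.argmax(score[i - 1, lo:hi])) + lo
            score[i, t] = score[i - 1, best] + D[i, t]
            parent[i, t] = best
    path = np.zeros(n_pos, dtype=int)
    path[-1] = int(np.argmax(score[-1]))       # best cumulative score at the end
    for i in range(n_pos - 1, 0, -1):
        path[i - 1] = parent[i, path[i]]       # backtrack the optimal path
    return path

def wave_speed(path, dx, dt):
    # least-squares slope of position vs arrival time gives the speed
    return np.polyfit(path * dt, np.arange(len(path)) * dx, 1)[0]

# synthetic map: the wave arrives 2 samples later at each successive position
D = np.zeros((30, 100))
for i in range(30):
    D[i, 10 + 2 * i] = 1.0
print(wave_speed(track_wavefront(D), dx=1e-3, dt=1e-4))  # ~5 m/s
```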

  20. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  1. Method and system for automated on-chip material and structural certification of MEMS devices

    DOEpatents

    Sinclair, Michael B.; DeBoer, Maarten P.; Smith, Norman F.; Jensen, Brian D.; Miller, Samuel L.

    2003-05-20

    A new approach toward MEMS quality control and materials characterization is provided by a combined test structure measurement and mechanical response modeling approach. Simple test structures are cofabricated with the MEMS devices being produced. These test structures are designed to isolate certain types of physical response, so that measurement of their behavior under applied stress can be easily interpreted as quality control and material properties information.

  2. Automated Feature Extraction and Hydrocode Modeling of Impact Related Structures on Mars: Preliminary Report

    NASA Astrophysics Data System (ADS)

    Plesko, C. S.; Asphaug, E.; Brumby, S. P.; Gisler, G. R.

    2003-07-01

    A systematic, combined modeling and observation effort to correlate Martian impact structures (craters and their regional aftermaths) with the responsible impactors, impact processes, and target geologies.

  3. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data created by the system. This paper reports on progress in the creation of cyberinfrastructure tools that hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secure access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental conditions (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Data interrogation services then interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties, etc.). Sample data interrogation clients include those for the detection of faulty sensors and for automated modal parameter extraction.

  4. Automated wind load characterization of wind turbine structures by embedded model updating

    NASA Astrophysics Data System (ADS)

    Swartz, R. Andrew; Zimmerman, Andrew T.; Lynch, Jerome P.

    2010-04-01

    The continued development of renewable energy resources is vital for the nation to limit its carbon footprint and to enjoy independence in energy production. Key to that effort are reliable generators of renewable energy that are economically competitive with legacy sources. In the area of wind energy, a major contributor to the cost of implementation is large uncertainty regarding the condition of wind turbines in the field, due to a lack of information about loading, dynamic response, and the fatigue life expended by the structure. Under favorable circumstances, this uncertainty leads to overly conservative designs and maintenance schedules. Under unfavorable circumstances, it leads to inadequate maintenance schedules, damage to electrical systems, or even structural failure. Low-cost wireless sensors can provide more certainty for stakeholders by measuring the dynamic response of the structure to loading, estimating the fatigue state of the structure, and extracting loading information from the structural response without the need for an upwind instrumentation tower. This study presents a method for using wireless sensor networks to estimate the spectral properties of the loading on a wind turbine tower based on its measured response and some rudimentary knowledge of its structure. Structural parameters are estimated via model updating in the frequency domain to produce an identification of the system. The updated structural model and the measured output spectra are then used to estimate the input spectra. Laboratory results are presented indicating accurate load characterization.
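
    For a linear system, the relation behind this input estimation step is S_yy(ω) = |H(ω)|² S_uu(ω), so the load spectrum follows by dividing the measured output spectrum by the squared magnitude of the updated model's frequency response. A minimal sketch with a single-degree-of-freedom oscillator standing in for the updated model (all parameters illustrative):

```python
# Output-to-input spectral estimation: S_uu = S_yy / |H|^2.
import numpy as np

def sdof_frf(omega, m=1.0, c=0.05, k=10.0):
    """Frequency response of a single-DOF oscillator (displacement per unit force)."""
    return 1.0 / (k - m * omega**2 + 1j * c * omega)

def estimate_input_psd(S_yy, omega, frf=sdof_frf, eps=1e-12):
    H = frf(omega)
    return S_yy / np.maximum(np.abs(H) ** 2, eps)  # guard near anti-resonances

omega = np.linspace(0.1, 10.0, 500)
true_input = np.ones_like(omega)                 # flat (white) load spectrum
S_yy = np.abs(sdof_frf(omega))**2 * true_input   # simulated measured output PSD
S_uu = estimate_input_psd(S_yy, omega)           # recovers ~1 at every frequency
```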

  5. A hybrid computational-experimental approach for automated crystal structure solution

    NASA Astrophysics Data System (ADS)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  6. Automating crystallographic structure solution and refinement of protein–ligand complexes

    PubMed Central

    Echols, Nathaniel; Moriarty, Nigel W.; Klei, Herbert E.; Afonine, Pavel V.; Bunkóczi, Gábor; Headd, Jeffrey J.; McCoy, Airlie J.; Oeffner, Robert D.; Read, Randy J.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-01-01

    High-throughput drug-discovery and mechanistic studies often require the determination of multiple related crystal structures that only differ in the bound ligands, point mutations in the protein sequence and minor conformational changes. If performed manually, solution and refinement requires extensive repetition of the same tasks for each structure. To accelerate this process and minimize manual effort, a pipeline encompassing all stages of ligand building and refinement, starting from integrated and scaled diffraction intensities, has been implemented in Phenix. The resulting system is able to successfully solve and refine large collections of structures in parallel without extensive user intervention prior to the final stages of model completion and validation. PMID:24419387

  7. Automated Lipid A Structure Assignment from Hierarchical Tandem Mass Spectrometry Data

    NASA Astrophysics Data System (ADS)

    Ting, Ying S.; Shaffer, Scott A.; Jones, Jace W.; Ng, Wailap V.; Ernst, Robert K.; Goodlett, David R.

    2011-05-01

    Infusion-based electrospray ionization (ESI) coupled to multiple-stage tandem mass spectrometry (MSn) is a standard methodology for investigating lipid A structural diversity (Shaffer et al. J. Am. Soc. Mass. Spectrom. 18(6), 1080-1092, 2007). Annotation of these MSn spectra, however, has remained a manual, expert-driven process. In order to keep up with the data acquisition rates of modern instruments, we devised a computational method to annotate lipid A MSn spectra rapidly and automatically, which we refer to as the hierarchical tandem mass spectrometry (HiTMS) algorithm. As a first-pass tool, HiTMS aids expert interpretation of lipid A MSn data by providing the analyst with a set of candidate structures that may then be confirmed or rejected. HiTMS deciphers the signature ions (e.g., A-, Y-, and Z-type ions) and neutral losses of MSn spectra using a species-specific library based on general prior structural knowledge of the given lipid A species under investigation. Candidates are selected by calculating the correlation between theoretical and acquired MSn spectra. At a false discovery rate of less than 0.01, HiTMS correctly assigned 85% of the structures in a library of 133 manually annotated Francisella tularensis subspecies novicida lipid A structures. Additionally, HiTMS correctly assigned 85% of the structures in a smaller library of lipid A species from Yersinia pestis, demonstrating that it may be used across species.
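
    The correlation scoring can be illustrated compactly: bin the theoretical and acquired spectra onto a common m/z grid and correlate the intensity vectors. The binning width and use of Pearson correlation here are assumptions for the sketch, not HiTMS internals.

```python
# Correlation between a theoretical and an acquired spectrum via common binning.
import numpy as np

def bin_spectrum(mz, intensity, lo=100.0, hi=2000.0, width=1.0):
    edges = np.arange(lo, hi + width, width)
    binned, _ = np.histogram(mz, bins=edges, weights=intensity)
    return binned

def spectral_correlation(theo, acquired):
    a, b = bin_spectrum(*theo), bin_spectrum(*acquired)
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

theoretical = (np.array([1403.8, 1360.9, 1222.7]), np.array([1.0, 0.6, 0.3]))
measured    = (np.array([1403.9, 1360.8, 1222.5]), np.array([0.9, 0.7, 0.25]))
print(spectral_correlation(theoretical, measured))  # close to 1 for a good match
```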

  8. Comparison of the bacterial community structure within the equine hindgut and faeces using Automated Ribosomal Intergenic Spacer Analysis (ARISA).

    PubMed

    Sadet-Bourgeteau, S; Philippeau, C; Dequiedt, S; Julliand, V

    2014-12-01

    The horse's hindgut bacterial ecosystem has often been studied using faecal samples. However, few studies have compared the two bacterial ecosystems, and the validity of using faecal samples may be questionable. Hence, the present study aimed to compare the structure of the equine bacterial community in the hindgut (caecum, right ventral colon) and faeces using a fingerprint technique known as Automated Ribosomal Intergenic Spacer Analysis (ARISA). Two DNA extraction methods were also assessed. Intestinal contents and faeces were sampled 3 h after the morning meal from four adult fistulated horses fed meadow hay and pelleted concentrate. Irrespective of the intestinal segment, Principal Component Analysis of ARISA profiles showed a strong individual effect (P<0.0001). However, across the study, faecal bacterial community structure significantly (P<0.001) differed from those of the caecum and colon, while there was no difference between the two hindgut communities. The use of a QIAamp® DNA Stool Mini kit increased the quality of DNA extracted, irrespective of sample type. The differences observed between faecal and hindgut bacterial communities challenge the use of faeces as a representative of hindgut activity. Further investigations are necessary to compare bacterial activity between the hindgut and faeces in order to understand the validity of using faecal samples. PMID:25075719

  9. Automated Quantification of Arbitrary Arm-Segment Structure in Spiral Galaxies

    NASA Astrophysics Data System (ADS)

    Davis, Darren Robert

    This thesis describes a system that, given approximately-centered images of spiral galaxies, produces quantitative descriptions of spiral galaxy structure without the need for per-image human input. This structure information consists of a list of spiral arm segments, each associated with a fitted logarithmic spiral arc and a pixel region. This list-of-arcs representation allows description of arbitrary spiral galaxy structure: the arms do not need to be symmetric, may have forks or bends, and, more generally, may be arranged in any manner with a consistent spiral-pattern center (non-merging galaxies have a sufficiently well-defined center). Such flexibility is important in order to accommodate the myriad structure variations observed in spiral galaxies. From the arcs produced from our method it is possible to calculate measures of spiral galaxy structure such as winding direction, winding tightness, arm counts, asymmetry, or other values of interest (including user-defined measures). In addition to providing information about the spiral arm "skeleton" of each galaxy, our method can enable analyses of brightness within individual spiral arms, since we provide the pixel regions associated with each spiral arm segment. For winding direction, arm tightness, and arm count, comparable information is available (to various extents) from previous efforts; to the extent that such information is available, we find strong correspondence with our output. We also characterize the changes to (and invariances in) our output as a function of modifications to important algorithm parameters. By enabling generation of extensive data about spiral galaxy structure from large-scale sky surveys, our method will enable new discoveries and tests regarding the nature of galaxies and the universe, and will facilitate subsequent work to automatically fit detailed brightness models of spiral galaxies.
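
    The fitted arcs are logarithmic spirals, r = r0 * exp(b*theta), so the fit becomes linear after taking logs and the pitch angle (winding tightness) follows directly from b. A minimal sketch, assuming arm-segment pixels ordered along the arc and a known spiral center:

```python
# Fit a logarithmic spiral arc to an arm segment's pixel coordinates.
import numpy as np

def fit_log_spiral(x, y, cx=0.0, cy=0.0):
    """Fit r = r0*exp(b*theta) around center (cx, cy); returns (r0, b, pitch)."""
    theta = np.unwrap(np.arctan2(y - cy, x - cx))  # unwrap so theta is continuous
    r = np.hypot(x - cx, y - cy)
    b, log_r0 = np.polyfit(theta, np.log(r), 1)
    pitch_angle = np.degrees(np.arctan(b))  # 0 deg = circle; sign = winding direction
    return np.exp(log_r0), b, pitch_angle

# synthetic arm segment generated with r0 = 5, b = 0.2
theta = np.linspace(0.5, 3.0, 80)
x = 5 * np.exp(0.2 * theta) * np.cos(theta)
y = 5 * np.exp(0.2 * theta) * np.sin(theta)
print(fit_log_spiral(x, y))  # recovers r0 ~= 5, b ~= 0.2
```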

  10. A method for automated determination of the crystal structures from X-ray powder diffraction data

    SciTech Connect

    Hofmann, D. W. M.; Kuleshova, L. N.

    2006-05-15

    An algorithm is proposed for determining the crystal structure of compounds. In the framework of this algorithm, X-ray powder diffraction patterns are compared using a new similarity index. Unlike the indices traditionally employed in X-ray powder diffraction analysis, the new similarity index can be applied even in the case of overlapping peaks and large differences in unit cell parameters. The capabilities of the proposed procedure are demonstrated by solving the crystal structures of a number of organic pigments (PY111, PR181, Me-PR170).
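
    The paper's similarity index itself is not reproduced in the abstract; the sketch below conveys the idea of comparing whole patterns rather than matched peak lists, which is what makes overlapping peaks tolerable: broaden both stick patterns and take a normalized overlap integral. The broadening width and functional form are assumptions.

```python
# Whole-pattern similarity between two powder diffraction stick patterns.
import numpy as np

def broadened_pattern(two_theta, intensity, grid, fwhm=0.3):
    sigma = fwhm / 2.3548  # FWHM -> Gaussian sigma
    f = np.zeros_like(grid)
    for pos, height in zip(two_theta, intensity):
        f += height * np.exp(-0.5 * ((grid - pos) / sigma) ** 2)
    return f

def similarity(p1, p2, grid=np.linspace(5, 60, 5500)):
    f = broadened_pattern(*p1, grid)
    g = broadened_pattern(*p2, grid)
    return float(np.dot(f, g) / np.sqrt(np.dot(f, f) * np.dot(g, g)))

pat_a = (np.array([11.2, 17.8, 24.3]), np.array([100.0, 40.0, 60.0]))
pat_b = (np.array([11.3, 17.9, 24.2]), np.array([90.0, 45.0, 55.0]))
print(similarity(pat_a, pat_b))  # near 1 despite small peak shifts
```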

  11. Automated assignment of NMR chemical shifts based on a known structure and 4D spectra.

    PubMed

    Trautwein, Matthias; Fredriksson, Kai; Möller, Heiko M; Exner, Thomas E

    2016-08-01

    Apart from their central role during 3D structure determination of proteins, the backbone chemical shift assignment is the basis for a number of applications, like chemical shift perturbation mapping and studies on the dynamics of proteins. This assignment is not a trivial task even if a 3D protein structure is known, and needs almost as much effort as the assignment for structure prediction if performed manually. We present here a new algorithm based solely on 4D [¹H,¹⁵N]-HSQC-NOESY-[¹H,¹⁵N]-HSQC spectra which is able to assign a large percentage of chemical shifts (73-82%) unambiguously, demonstrated with proteins up to a size of 250 residues. For the remaining residues, the possibilities are narrowed down to a small number of candidate assignments. This is done by comparing distances in the 3D structure to restraints obtained from the peak volumes in the 4D spectrum. Using dead-end elimination, assignments are removed in which at least one of the restraints is violated. Including additional information from chemical shift predictions, a complete unambiguous assignment was obtained for ubiquitin, and 95% of the residues were correctly assigned in the 251-residue N-terminal domain of enzyme I. The program including source code is available at https://github.com/thomasexner/4Dassign . PMID:27484442
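
    A minimal sketch of the dead-end elimination step as described, with details assumed: every peak keeps a set of candidate residues, an observed NOE between two peaks implies a distance bound between their residues in the known structure, and a candidate is eliminated when no surviving candidate of an NOE partner can satisfy the bound.

```python
# Dead-end elimination of assignment candidates that violate distance restraints.
import numpy as np

def dead_end_elimination(candidates, noes, coords, bound=6.0):
    """candidates: {peak: set(residues)}; noes: list of (peak_a, peak_b);
    coords: {residue: xyz of its proton}."""
    changed = True
    while changed:
        changed = False
        for pair in noes:
            for x, y in (pair, pair[::-1]):  # check the restraint both ways
                for res_x in list(candidates[x]):
                    ok = any(np.linalg.norm(coords[res_x] - coords[res_y]) <= bound
                             for res_y in candidates[y] if res_y != res_x)
                    if not ok:
                        candidates[x].discard(res_x)  # violates every option
                        changed = True
    return candidates

coords = {1: np.array([0.0, 0, 0]), 2: np.array([4.0, 0, 0]), 3: np.array([20.0, 0, 0])}
cands = {"p1": {1, 3}, "p2": {2}}
print(dead_end_elimination(cands, [("p1", "p2")], coords))  # p1 -> {1}; 3 is pruned
```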

  12. Combining Structure and Sequence Information Allows Automated Prediction of Substrate Specificities within Enzyme Families

    PubMed Central

    Röttig, Marc; Rausch, Christian; Kohlbacher, Oliver

    2010-01-01

    An important aspect of the functional annotation of enzymes is not only the type of reaction catalysed by an enzyme, but also the substrate specificity, which can vary widely within the same family. In many cases, prediction of family membership and even substrate specificity is possible from enzyme sequence alone, using a nearest neighbour classification rule. However, the combination of structural information and sequence information can improve the interpretability and accuracy of predictive models. The method presented here, Active Site Classification (ASC), automatically extracts the residues lining the active site from one representative three-dimensional structure and the corresponding residues from sequences of other members of the family. From a set of representatives with known substrate specificity, a Support Vector Machine (SVM) can then learn a model of substrate specificity. Applied to a sequence of unknown specificity, the SVM can then predict the most likely substrate. The models can also be analysed to reveal the underlying structural reasons determining substrate specificities and thus yield valuable insights into mechanisms of enzyme specificity. We illustrate the high prediction accuracy achieved on two benchmark data sets and the structural insights gained from ASC by a detailed analysis of the family of decarboxylating dehydrogenases. The ASC web service is available at http://asc.informatik.uni-tuebingen.de/. PMID:20072606
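
    The ASC pipeline reduces, in essence, to encoding active-site residues and training an SVM on examples of known specificity. A toy sketch assuming scikit-learn and a simple one-hot residue encoding (the real method extracts the site residues from a representative 3D structure and maps them across the family alignment):

```python
# SVM on one-hot encoded active-site residues, in the spirit of ASC.
import numpy as np
from sklearn.svm import SVC

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(site_residues):
    vec = np.zeros(len(site_residues) * len(AMINO))
    for i, aa in enumerate(site_residues):
        vec[i * len(AMINO) + AMINO.index(aa)] = 1.0
    return vec

# toy training set: 4 enzymes, 3 active-site positions, 2 substrate classes
sites  = ["HDS", "HDT", "KEA", "KEG"]
labels = ["substrate_A", "substrate_A", "substrate_B", "substrate_B"]
X = np.array([one_hot(s) for s in sites])

clf = SVC(kernel="linear").fit(X, labels)
print(clf.predict([one_hot("HDA")]))  # -> likely 'substrate_A'
```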

  13. Using Structure-Based Organic Chemistry Online Tutorials with Automated Correction for Student Practice and Review

    ERIC Educational Resources Information Center

    O'Sullivan, Timothy P.; Hargaden, Gráinne C.

    2014-01-01

    This article describes the development and implementation of an open-access organic chemistry question bank for online tutorials and assessments at University College Cork and Dublin Institute of Technology. SOCOT (structure-based organic chemistry online tutorials) may be used to supplement traditional small-group tutorials, thereby allowing…

  14. Low-cost impact detection and location for automated inspections of 3D metallic based structures.

    PubMed

    Morón, Carlos; Portilla, Marina P; Somolinos, José A; Morales, Rafael

    2015-01-01

    This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material allows certain details of the impact to be determined automatically by measuring the time delays of acoustic wave propagation throughout the 3D structure. The placement of strategic piezoelectric sensors on the structure, together with an electronic computerized system, has allowed us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection and the task of inspecting the point or zone at which this impact occurs. What is more, the proposed method can be easily integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach. PMID:26029951
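
    A minimal sketch of localization from acoustic time delays, assuming planar geometry, a known wave speed, and a simple grid search rather than whatever solver the authors used: find the point whose predicted arrival-time differences across sensors best match the measured ones.

```python
# Impact localization from time differences of arrival (TDOA) by grid search.
import numpy as np

def locate_impact(sensors, t_arrival, v, xlim=(0, 1), ylim=(0, 1), n=100):
    dt_meas = t_arrival - t_arrival[0]  # delays relative to sensor 0
    best, best_err = None, np.inf
    for x in np.linspace(*xlim, n):
        for y in np.linspace(*ylim, n):
            d = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y)
            dt_pred = (d - d[0]) / v
            err = np.sum((dt_pred - dt_meas) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_pt, v = np.array([0.3, 0.7]), 5000.0   # impact point, wave speed (m/s)
t = np.hypot(*(sensors - true_pt).T) / v    # synthetic arrival times
print(locate_impact(sensors, t, v))         # ~ (0.3, 0.7)
```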

  15. Automated antibody structure prediction using Accelrys tools: Results and best practices

    PubMed Central

    Fasnacht, Marc; Butenhof, Ken; Goupil-Lamy, Anne; Hernandez-Guzman, Francisco; Huang, Hongwei; Yan, Lisa

    2014-01-01

    We describe the methodology and results from our participation in the second Antibody Modeling Assessment experiment. During the experiment we predicted the structure of eleven unpublished antibody Fv fragments. Our prediction methods centered on template-based modeling; potential templates were selected from an antibody database based on their sequence similarity to the target in the framework regions. Depending on the quality of the templates, we constructed models of the antibody framework regions either using a single, chimeric or multiple template approach. The hypervariable loop regions in the initial models were rebuilt by grafting the corresponding regions from suitable templates onto the model. For the H3 loop region, we further refined models using ab initio methods. The final models were subjected to constrained energy minimization to resolve severe local structural problems. The analysis of the models submitted show that Accelrys tools allow for the construction of quite accurate models for the framework and the canonical CDR regions, with RMSDs to the X-ray structure on average below 1 Å for most of these regions. The results show that accurate prediction of the H3 hypervariable loops remains a challenge. Furthermore, model quality assessment of the submitted models show that the models are of quite high quality, with local geometry assessment scores similar to that of the target X-ray structures. Proteins 2014; 82:1583–1598. © 2014 The Authors. Proteins published by Wiley Periodicals, Inc. PMID:24833271

  16. Low-Cost Impact Detection and Location for Automated Inspections of 3D Metallic Based Structures

    PubMed Central

    Morón, Carlos; Portilla, Marina P.; Somolinos, José A.; Morales, Rafael

    2015-01-01

    This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material allows certain details of the impact to be determined automatically by measuring the time delays of acoustic wave propagation throughout the 3D structure. The placement of strategic piezoelectric sensors on the structure, together with an electronic computerized system, has allowed us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection and the task of inspecting the point or zone at which this impact occurs. What is more, the proposed method can be easily integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach. PMID:26029951

  17. Automating unambiguous NOE data usage in NVR for NMR protein structure-based assignments.

    PubMed

    Akhmedov, Murodzhon; Çatay, Bülent; Apaydın, Mehmet Serkan

    2015-12-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is an important technique for determining protein structure in solution. An important problem in protein structure determination using NMR spectroscopy is the mapping of peaks to corresponding amino acids, also known as the assignment problem. Structure-Based Assignment (SBA) is an approach to solving this problem using a template structure that is homologous to the target. Our previously developed approach, Nuclear Vector Replacement-Binary Integer Programming (NVR-BIP), computed the optimal solution for small proteins, but was unable to solve the assignments of large proteins. NVR-Ant Colony Optimization (ACO) extended the applicability of the NVR approach to such proteins. One of the inputs utilized in these approaches is the Nuclear Overhauser Effect (NOE) data. An NOE is an interaction observed between two protons if the protons are located close in space. These protons could be amide protons, protons attached to the alpha-carbon atom in the backbone of the protein, or side-chain protons. NVR only uses backbone protons. In this paper, we reformulate the NVR-BIP model to distinguish the type of proton in the NOE data and use the corresponding proton coordinates in the extended formulation. In addition, the threshold value on interproton distances is set in a standard manner for all proteins by extracting the NOE upper-bound distance information from the data. We also convert NOE intensities into distance thresholds. Our new approach thus handles the NOE data correctly and without manually determined parameters. We accordingly adapt the NVR-ACO solution methodology to these changes. Computational results show that our approaches obtain optimal solutions for small proteins. For large proteins, our ant colony optimization-based approach obtains promising results. PMID:26260854
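
    The intensity-to-distance conversion can be made concrete with the usual 1/r⁶ dependence of the NOE: calibrating against a reference pair of known separation gives r = r_ref * (I_ref/I)^(1/6). A small helper under that assumed form:

```python
# NOE intensity -> interproton distance via the 1/r^6 relation.
def noe_distance(intensity, ref_intensity, ref_distance=2.2):
    """Estimate an interproton distance (Angstrom) from a NOE cross-peak
    intensity, calibrated on a reference pair of known distance."""
    return ref_distance * (ref_intensity / intensity) ** (1.0 / 6.0)

print(noe_distance(intensity=0.02, ref_intensity=1.0))  # weaker peak -> ~4.2 A
```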

  18. Automated polyp measurement based on colon structure decomposition for CT colonography

    NASA Astrophysics Data System (ADS)

    Wang, Huafeng; Li, Lihong C.; Han, Hao; Peng, Hao; Song, Bowen; Wei, Xinzhou; Liang, Zhengrong

    2014-03-01

    Accurate assessment of colorectal polyp size is of great significance for early diagnosis and management of colorectal cancers. Due to the complexity of colon structure, polyps with diverse geometric characteristics grow from different landform surfaces. In this paper, we present a new colon decomposition approach for polyp measurement. We first apply an efficient maximum a posteriori expectation-maximization (MAP-EM) partial volume segmentation algorithm to achieve an effective electronic cleansing of the colon. The global colon structure is then decomposed into different kinds of morphological shapes, e.g., haustral folds or haustral walls. Meanwhile, the polyp location is identified by an automatic computer-aided detection algorithm. By integrating the colon structure decomposition with the computer-aided detection system, a patch volume of colon polyps is extracted. Thus, polyp size assessment can be achieved by finding abnormal protrusions on a relatively uniform morphological surface of the decomposed colon landform. We evaluated our method via physical phantom and clinical datasets. Experimental results demonstrate the feasibility of our method in consistently quantifying polyp volume and, therefore, facilitating characterization for clinical management.

  19. Automated procedure for design of wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1973-01-01

    A pilot computer program was developed for the design of minimum-mass wing structures under flutter, strength, and minimum-gage constraints. The wing structure is idealized by finite elements, and second-order piston theory aerodynamics is used in the flutter calculation. Mathematical programming methods are used for the optimization. Computation times during the design process are reduced by three techniques. First, iterative analysis methods are used to significantly reduce reanalysis times. Second, the number of design variables is kept small by not using a one-to-one correspondence between finite elements and design variables. Third, a technique for using approximate second derivatives with Newton's method for the optimization is incorporated. The program output is compared with previously published results. It is found that some flutter characteristics, such as the flutter speed, can display discontinuous dependence on the design variables (which are the thicknesses of the structural elements). It is concluded that it is undesirable to use such quantities in the formulation of the flutter constraint.

  20. Automated Voxel Model from Point Clouds for Structural Analysis of Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Bitelli, G.; Castellazzi, G.; D'Altri, A. M.; De Miranda, S.; Lambertini, A.; Selvaggi, I.

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of varying complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. In order to achieve this result, a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
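
    The core point-cloud-to-voxel step is compact; the sketch below shows a fixed-resolution occupancy version (the paper's variable-resolution voxels and the filling of the interior volume are not reproduced here):

```python
# Convert a point cloud to a boolean voxel occupancy grid.
import numpy as np

def voxelize(points, voxel_size=0.1):
    """points: (N, 3) array -> occupancy grid plus its world-space origin."""
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True  # occupied where points fall
    return grid, origin

cloud = np.random.rand(10000, 3)  # stand-in for a laser-scan point cloud
grid, origin = voxelize(cloud, voxel_size=0.05)
print(grid.shape, grid.sum())     # occupancy grid, ready for downstream meshing
```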

  1. Automated method for the identification and analysis of vascular tree structures in retinal vessel network

    NASA Astrophysics Data System (ADS)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2011-03-01

    Structural analysis of the retinal vessel network has so far served in the diagnosis of retinopathies and systemic diseases. The retinopathies are known to affect morphologic properties of retinal vessels such as course, shape, caliber, and tortuosity. Whether the arteries and the veins respond to these changes jointly or independently has always been a topic of discussion. However, diseases such as diabetic retinopathy and retinopathy of prematurity have been diagnosed from morphologic changes specific either to arteries or to veins. Thus a method for separating the retinal vessel trees imaged in a two-dimensional color fundus image may assist in artery-vein classification and in the quantitative assessment of morphologic changes particular to arteries or veins. We propose a method based on mathematical morphology and graph search to identify and label the retinal vessel trees, which provides a structural mapping of the vessel network in terms of each individual primary vessel, its branches, and the spatial positions of branching and cross-over points. The method was evaluated on a dataset of 15 fundus images, resulting in an accuracy of 92.87% correctly assigned vessel pixels when compared with manual labeling of the separated vessel trees. Accordingly, the structural mapping method performs well, and we are currently investigating its potential for evaluating characteristic properties specific to arteries or veins.

  2. Automated preliminary design of simplified wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.; Dexter, C. B.; Stein, M.

    1972-01-01

    A simple structural model of an aircraft wing is used to show the effects of strength (stress) and flutter requirements on the design of minimum-weight aircraft-wing structures. The wing is idealized as an isotropic sandwich plate with a variable cover thickness distribution and a variable depth between covers. Plate theory is used for the structural analysis, and piston theory is used for the unsteady aerodynamics in the flutter analysis. Mathematical programming techniques are used to find the minimum-weight cover thickness distribution which satisfies flutter, strength, and minimum-gage constraints. The method of solution, some sample results, and the computer program used to obtain these results are presented. The results indicate that the cover thickness distribution obtained when designing for the strength requirement alone may be quite different from the cover thickness distribution obtained when designing for either the flutter requirement alone or for both the strength and flutter requirements concurrently. This conclusion emphasizes the need for designing for both flutter and strength from the outset.

  3. CHEM-PATH-TRACKER: An automated tool to analyze chemical motifs in molecular structures.

    PubMed

    Ribeiro, João V; Cerqueira, N M F S A; Fernandes, Pedro A; Ramos, Maria J

    2014-07-01

    In this article, we propose a method for locating functionally relevant chemical motifs in protein structures. The chemical motifs can be small groups of residues or protein structure fragments with highly conserved properties that have important biological functions. However, the detection of chemical motifs is rather difficult because they often consist of a set of amino acid residues separated by long, variable regions, and they only come together to form a functional group when the protein is folded into its three-dimensional structure. Furthermore, the assemblage of these residues is often dependent on non-covalent interactions among the constituent amino acids that are difficult to detect or visualize. To simplify the analysis of these chemical motifs and make it accessible to all users, we developed chem-path-tracker. This software is a VMD plug-in that allows the user to highlight and reveal potential chemical motifs with only a few selections. The analysis is based on atom/residue pair distances using a modified version of Dijkstra's algorithm, and it makes it possible to monitor the distances of a large pathway, even during a molecular dynamics simulation. This tool proved to be very useful, fast, and user-friendly in the tests performed. The chem-path-tracker package is distributed as an independent platform and can be found at http://www.fc.up.pt/PortoBioComp/database/doku.php?id=chem-path-tracker. PMID:24775806
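
    The abstract describes pathway analysis via a modified Dijkstra's algorithm over atom/residue pair distances; the modifications are not detailed, so the sketch below uses plain Dijkstra on a distance-weighted residue graph with a contact cutoff (all parameters illustrative):

```python
# Dijkstra shortest path over a residue graph weighted by spatial distance.
import heapq
import numpy as np

def shortest_path(coords, start, goal, cutoff=8.0):
    """coords: {residue: xyz}. Edges connect residues closer than cutoff."""
    dist = {r: np.inf for r in coords}
    prev = {}
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist[u]:
            continue  # stale heap entry
        for v in coords:
            if v == u:
                continue
            w = float(np.linalg.norm(coords[u] - coords[v]))
            if w <= cutoff and d + w < dist[v]:
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (dist[v], v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

residues = {i: np.array([3.0 * i, 0.0, 0.0]) for i in range(6)}
print(shortest_path(residues, 0, 5))  # hops along the chain within the cutoff
```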

  4. PDB2PQR: expanding and upgrading automated preparation of biomolecular structures for molecular simulations

    PubMed Central

    Dolinsky, Todd J.; Czodrowski, Paul; Li, Hui; Nielsen, Jens E.; Jensen, Jan H.; Klebe, Gerhard; Baker, Nathan A.

    2007-01-01

    Real-world observable physical and chemical characteristics are increasingly being calculated from the 3D structures of biomolecules. Methods for calculating pKa values, binding constants of ligands, and changes in protein stability are readily available, but often the limiting step in computational biology is the conversion of PDB structures into formats ready for use with biomolecular simulation software. The continued sophistication and integration of biomolecular simulation methods for systems- and genome-wide studies requires a fast, robust, physically realistic and standardized protocol for preparing macromolecular structures for biophysical algorithms. As described previously, the PDB2PQR web server addresses this need for electrostatic field calculations (Dolinsky et al., Nucleic Acids Research, 32, W665–W667, 2004). Here we report the significantly expanded PDB2PQR that includes the following features: robust standalone command line support, improved pKa estimation via the PROPKA framework, ligand parameterization via PEOE_PB charge methodology, expanded set of force fields and easily incorporated user-defined parameters via XML input files, and improvement of atom addition and optimization code. These features are available through a new web interface (http://pdb2pqr.sourceforge.net/), which offers users a wide range of options for PDB file conversion, modification and parameterization. PMID:17488841

  5. Upper-mantle shear-wave structure under East and Southeast Asia from Automated Multimode Inversion of waveforms

    NASA Astrophysics Data System (ADS)

    Legendre, C. P.; Zhao, L.; Chen, Q.-F.

    2015-10-01

    We present a new Sv-velocity model of the upper mantle under East and Southeast Asia constrained by the inversion of seismic waveforms recorded by broad-band stations. Seismograms from earthquakes that occurred between 1977 and 2012 are collected from about 4786 permanent and temporary stations in the region, whenever and wherever available. Automated Multimode Inversion of surface and multiple-S waveforms is applied to extract structural information from the seismograms, in the form of linear equations with uncorrelated uncertainties. The equations are then solved for the seismic velocity perturbations in the crust and upper mantle with respect to a three-dimensional (3-D) reference model and a realistic crust. Major features of the lithosphere-asthenosphere system in East and Southeast Asia are identified in the resulting model. At lithospheric depths, low velocities can be seen beneath Tibet, whereas high velocities are found beneath cratonic regions, such as the Siberian, North China, Yangtze, Tarim, and Dharwar cratons. A number of microplates are mapped and their interaction with neighbouring plates is discussed. Slabs from the Pacific and Indian Oceans can be seen in the upper mantle. Passive marginal basins and subduction zones are also properly resolved.

  6. Automated transient thermography for the inspection of CFRP structures: experimental results and developed procedures

    NASA Astrophysics Data System (ADS)

    Theodorakeas, P.; Avdelidis, N. P.; Hrissagis, K.; Ibarra-Castanedo, C.; Koui, M.; Maldague, X.

    2011-05-01

    In thermography surveys, the inspector uses the camera to acquire images from the examined part. Common problems are the lack of repeatability when trying to repeat the scanning process, the need to carry the equipment during scanning, and long setting-up times. The aim of this paper is to present transient thermography results on CFRP plates for assessing different types of fabricated defects (impact damage, inclusions for delaminations, etc.), as well as to discuss and present a prototype robotic scanner for applying non-destructive testing (thermographic scanning) to materials and structures. Currently, the scanning process is not automatic. The equipment to be developed will be able to perform thermal NDT scanning on structures, create the appropriate scanning conditions (material thermal excitation), and ensure precision and tracking of the scanning process. A thermographic camera used for image acquisition during the non-destructive inspection will be installed on an x, y, z linear manipulator's end effector and surrounded by the excitation sources (optical lamps) required for transient thermography. In this work, various CFRP samples of different shape, thickness and geometry were investigated using two different thermographic systems in order to compare and evaluate their effectiveness concerning internal defect detectability under different testing conditions.

  7. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. PMID:21357413

  8. Automated structure extraction and XML conversion of life science database flat files.

    PubMed

    Philippi, Stephan; Köhler, Jacob

    2006-10-01

    In the light of the increasing number of biological databases, their integration is a fundamental prerequisite for answering complex biological questions. Database integration, therefore, is an important area of research in bioinformatics. Since most of the publicly available life science databases are still exclusively exchanged by means of proprietary flat files, database integration requires parsers for very different flat file formats. Unfortunately, the development and maintenance of database specific flat file parsers is a nontrivial and time-consuming task, which takes considerable effort in large-scale integration scenarios. This paper introduces heuristically based concepts for automatic structure extraction from life science database flat files. On the basis of these concepts the FlatEx prototype is developed for the automatic conversion of flat files into XML representations. PMID:17044405

  9. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
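
    A minimal GA-plus-DSM sketch in Python rather than an Excel macro (the chromosome encoding, fitness, and operators here are assumptions, not the paper's): a chromosome assigns each element to a cluster, and the fitness rewards dependencies captured inside clusters while penalizing non-dependent pairs grouped together, which discourages one giant cluster.

```python
# Genetic algorithm for clustering a Dependency Structure Matrix (DSM).
import numpy as np

rng = np.random.default_rng(1)

def fitness(chrom, dsm, alpha=0.5):
    inside = chrom[:, None] == chrom[None, :]
    reward = dsm[inside].sum()                        # captured dependencies
    penalty = np.logical_and(inside, dsm == 0).sum()  # slack inside clusters
    return reward - alpha * penalty

def ga_cluster(dsm, n_clusters=3, pop_size=60, generations=300, mut_rate=0.05):
    n = dsm.shape[0]
    pop = rng.integers(0, n_clusters, size=(pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(c, dsm) for c in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        cuts = rng.integers(1, n, size=pop_size // 2)
        kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                         parents[(i + 1) % len(parents)][c:]])
                         for i, c in enumerate(cuts)])            # one-point crossover
        mask = rng.random(kids.shape) < mut_rate                  # mutation: reassign
        kids[mask] = rng.integers(0, n_clusters, size=int(mask.sum()))
        pop = np.vstack([parents, kids])
    return pop[np.argmax([fitness(c, dsm) for c in pop])]

# toy DSM with two blocks of mutually dependent elements
dsm = np.zeros((6, 6))
dsm[:3, :3] = 1
dsm[3:, 3:] = 1
np.fill_diagonal(dsm, 0)
print(ga_cluster(dsm))  # two clean clusters, e.g. [0 0 0 2 2 2]
```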

  10. Endoscopic system for automated high dynamic range inspection of moving periodic structures

    NASA Astrophysics Data System (ADS)

    Hahlweg, Cornelius; Rothe, Hendrik

    2015-09-01

    In the current paper an advanced endoscopic system for high-resolution and high-dynamic-range inspection of periodic structures in rotating machines is presented. We address the system architecture, short-time illumination, special optical problems such as excluding the specular reflex, image processing, forward velocity prediction and metrological image processing. There are several special requirements to be met, such as thermal stability above 100°C, robustness of the image field, illumination in the view direction and the separation of diffuse scatter from metallic surfaces. To find a compromise between image resolution and frame rate, an external sensor system was applied for synchronization with the moving target. The system was originally intended for the inspection of thermal engines, but turned out to be of more general use. Besides the theoretical part and dimensioning issues, practical examples and measurement results are included.

  11. Application of Hadamard spectroscopy to automated structure verification in high-throughput NMR.

    PubMed

    Ruan, Ke; Yang, Shengtian; Van Sant, Karey A; Likos, John J

    2009-08-01

    Combined verification using 1-D proton and HSQC spectra has proved quite successful; the acquisition time of HSQC spectra, however, can be limiting in high-throughput applications. Replacing them with Hadamard HSQC spectra can significantly enhance throughput. We hereby propose a protocol to optimize the grouping of the predicted carbon chemical shifts from the proposed structure and the associated Hadamard frequencies and bandwidths. The resulting Hadamard HSQC spectra compare favorably with their Fourier-transformed counterparts, and have been demonstrated to perform equivalently in combined verification, but with severalfold enhancement in throughput, as illustrated for 21 commercially available molecules and 16 prototypical drug compounds. Further improvement of the verification accuracy can be achieved by cross validation with Hadamard TOCSY, which can be acquired without much sacrifice in throughput. PMID:19496061

  12. Automated segmentation of intramacular layers in Fourier domain optical coherence tomography structural images from normal subjects

    PubMed Central

    Zhang, Xusheng; Yousefi, Siavash; An, Lin

    2012-01-01

    Segmentation of optical coherence tomography (OCT) cross-sectional structural images is important for assisting ophthalmologists in clinical decision making, in terms of both diagnosis and treatment. We present an automatic approach for segmenting intramacular layers in Fourier-domain optical coherence tomography (FD-OCT) images using a searching strategy based on locally weighted gradient extrema, coupled with an error-removing technique based on statistical error estimation. A two-step denoising preprocess in different directions is also employed to suppress random speckle noise while preserving the layer boundaries as intact as possible. The algorithms were tested on FD-OCT volume images obtained from four normal subjects and successfully identified the boundaries of seven physiological layers, consistent with the results based on manual determination of macular OCT images. PMID:22559689
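
    The locally weighted gradient-extrema search can be sketched per A-scan (image column): place the boundary at the strongest axial gradient, weighted toward the boundary found in the neighboring column so the detected surface stays smooth. The Gaussian weighting and the synthetic test image below are assumptions for illustration.

```python
# Column-wise layer boundary search with a neighbor-weighted gradient extremum.
import numpy as np

def find_boundary(bscan, sigma=5.0):
    n_rows, n_cols = bscan.shape
    grad = np.abs(np.diff(bscan.astype(float), axis=0))
    depth = np.zeros(n_cols)
    prev = None
    for c in range(n_cols):
        weight = np.ones(n_rows - 1)
        if prev is not None:  # prefer extrema near the neighbor's boundary
            rows = np.arange(n_rows - 1)
            weight = np.exp(-0.5 * ((rows - prev) / sigma) ** 2)
        prev = int(np.argmax(grad[:, c] * weight))
        depth[c] = prev
    return depth

# synthetic B-scan: a bright layer whose top edge drifts across columns
img = np.zeros((120, 200))
for c in range(200):
    img[60 + int(10 * np.sin(c / 30)):, c] = 1.0
print(find_boundary(img)[:5])  # follows the sinusoidal top boundary (~row 59)
```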

  13. Automated grid generation from models of complex geologic structure and stratigraphy

    SciTech Connect

    Gable, C.; Trease, H.; Cherry, T.

    1996-04-01

    The construction of computational grids which accurately reflect complex geologic structure and stratigraphy for flow and transport models poses a formidable task. Even with an understanding of stratigraphy, material properties and boundary and initial conditions, incorporating these data into a numerical model can be difficult and time consuming. Most GIS tools for representing complex geologic volumes and surfaces are not designed for producing optimal grids for flow and transport computation. We have developed a tool, GEOMESH, for generating finite element grids that maintain the geometric integrity of input volumes, surfaces, and geologic data and produce an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. GEOMESH also satisfies the constraint that the geometric coupling coefficients of the grid are positive for all elements. GEOMESH generates grids for two-dimensional cross sections and three-dimensional regional models, represents faults and fractures, and can incorporate finer grids representing tunnels and well bores. GEOMESH also permits adaptive grid refinement in three dimensions. The tools to glue, merge and insert grids together demonstrate how complex grids can be built from simpler pieces. The resulting grid can be utilized by unstructured finite element or integrated finite difference computational physics codes.

  14. Automated Structure- and Sequence-Based Design of Proteins for High Bacterial Expression and Stability.

    PubMed

    Goldenzweig, Adi; Goldsmith, Moshe; Hill, Shannon E; Gertman, Or; Laurino, Paola; Ashani, Yacov; Dym, Orly; Unger, Tamar; Albeck, Shira; Prilusky, Jaime; Lieberman, Raquel L; Aharoni, Amir; Silman, Israel; Sussman, Joel L; Tawfik, Dan S; Fleishman, Sarel J

    2016-07-21

    Upon heterologous overexpression, many proteins misfold or aggregate, thus resulting in low functional yields. Human acetylcholinesterase (hAChE), an enzyme mediating synaptic transmission, is a typical case of a human protein that necessitates mammalian systems to obtain functional expression. We developed a computational strategy and designed an AChE variant bearing 51 mutations that improved core packing, surface polarity, and backbone rigidity. This variant expressed at ∼2,000-fold higher levels in E. coli compared to wild-type hAChE and exhibited 20°C higher thermostability with no change in enzymatic properties or in the active-site configuration as determined by crystallography. To demonstrate broad utility, we similarly designed four other human and bacterial proteins. Testing at most three designs per protein, we obtained enhanced stability and/or higher yields of soluble and active protein in E. coli. Our algorithm requires only a 3D structure and several dozen sequences of naturally occurring homologs, and is available at http://pross.weizmann.ac.il. PMID:27425410

  15. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  16. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  17. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers in the conduct of their work, and to identify human acceptance difficulties which may accompany the transition to a significantly changing work environment. The improved productivity and communications which result from application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated and these will be explored in detail.

  18. Automated Urinalysis

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Information from NASA Tech Briefs assisted DiaSys Corporation in the development of the R/S 2000, which automates urinalysis, eliminating most manual procedures. An automatic aspirator is inserted into a standard specimen tube, the "Sample" button is pressed, and within three seconds a consistent amount of urine sediment is transferred to a microscope. The instrument speeds up, standardizes, and automates urine analysis and makes it safer. Additional products based on the same technology are anticipated.

  19. Internal Transcribed Spacer 2 (nu ITS2 rRNA) Sequence-Structure Phylogenetics: Towards an Automated Reconstruction of the Green Algal Tree of Life

    PubMed Central

    Buchheim, Mark A.; Keller, Alexander; Koetschan, Christian; Förster, Frank; Merget, Benjamin; Wolf, Matthias

    2011-01-01

    Background Chloroplast-encoded genes (matK and rbcL) have been formally proposed for use in DNA barcoding efforts targeting embryophytes. Extending such a protocol to chlorophytan green algae, though, is fraught with problems including non-homology (matK) and heterogeneity that prevents the creation of a universal PCR toolkit (rbcL). Some have advocated the use of the nuclear-encoded, internal transcribed spacer two (ITS2) as an alternative to the traditional chloroplast markers. However, the ITS2 is broadly perceived to be insufficiently conserved or to be confounded by introgression or biparental inheritance patterns, precluding its broad use in phylogenetic reconstruction or as a DNA barcode. A growing body of evidence has shown that simultaneous analysis of nucleotide data with secondary structure information can overcome at least some of the limitations of ITS2. The goal of this investigation was to assess the feasibility of an automated, sequence-structure approach for analysis of ITS2 data from a large sampling of phylum Chlorophyta. Methodology/Principal Findings Sequences and secondary structures from 591 chlorophycean, 741 trebouxiophycean and 938 ulvophycean algae, all obtained from the ITS2 Database, were aligned using a sequence-structure-specific scoring matrix. Phylogenetic relationships were reconstructed by Profile Neighbor-Joining coupled with a sequence-structure-specific, general time reversible substitution model. Results from analyses of the ITS2 data were robust at multiple nodes and showed considerable congruence with results from published phylogenetic analyses. Conclusions/Significance Our observations on the power of automated, sequence-structure analyses of ITS2 to reconstruct phylum-level phylogenies of the green algae validate this approach to assessing diversity for large sets of chlorophytan taxa. Moreover, our results indicate that objections to the use of ITS2 for DNA barcoding should be weighed against the utility of an automated
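
    A rough Python sketch of the tree-building step described above, using Biopython. Plain neighbor-joining on a precomputed distance matrix stands in for the authors' Profile Neighbor-Joining with a sequence-structure-specific scoring matrix; taxon names and distances are invented.

        # Neighbor-joining from a precomputed distance matrix (illustrative only;
        # the study used Profile Neighbor-Joining with sequence-structure scoring).
        from Bio import Phylo
        from Bio.Phylo.TreeConstruction import DistanceMatrix, DistanceTreeConstructor

        names = ["Chlamydomonas", "Chlorella", "Ulva"]   # invented taxa
        dm = DistanceMatrix(names, matrix=[[0], [0.31, 0], [0.48, 0.42, 0]])
        tree = DistanceTreeConstructor().nj(dm)
        Phylo.draw_ascii(tree)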

  20. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind its design, the nuances pertaining to founding and tuning have largely been empirical, and conveyed from one generation to the next. Post-production assessment for bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future model designs. To establish confidence in the FEA process it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically to collect the data for an EMA, the vibratory response of the structure is measured with the application of accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect due to the non-contact nature of the technique; resulting in higher accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution resulting in a higher confidence EMA. This is

  1. Human brain atlas for automated region of interest selection in quantitative susceptibility mapping: application to determine iron content in deep gray matter structures

    PubMed Central

    Lim, Issel Anne L.; Faria, Andreia V.; Li, Xu; Hsu, Johnny T.C.; Airan, Raag D.; Mori, Susumu; van Zijl, Peter C. M.

    2013-01-01

    The purpose of this paper is to extend the single-subject Eve atlas from Johns Hopkins University, which currently contains diffusion tensor and T1-weighted anatomical maps, by including contrast based on quantitative susceptibility mapping. The new atlas combines a “deep gray matter parcellation map” (DGMPM) derived from a single-subject quantitative susceptibility map with the previously established “white matter parcellation map” (WMPM) from the same subject’s T1-weighted and diffusion tensor imaging data into an MNI coordinate map named the “Everything Parcellation Map in Eve Space,” also known as the “EvePM.” It allows automated segmentation of gray matter and white matter structures. Quantitative susceptibility maps from five healthy male volunteers (30 to 33 years of age) were coregistered to the Eve Atlas with AIR and Large Deformation Diffeomorphic Metric Mapping (LDDMM), and the transformation matrices were applied to the EvePM to produce automated parcellation in subject space. Parcellation accuracy was measured with a kappa analysis for the left and right structures of six deep gray matter regions. For multi-orientation QSM images, the Kappa statistic was 0.85 between automated and manual segmentation, with the inter-rater reproducibility Kappa being 0.89 for the human raters, suggesting “almost perfect” agreement between all segmentation methods. Segmentation seemed slightly more difficult for human raters on single-orientation QSM images, with the Kappa statistic being 0.88 between automated and manual segmentation, and 0.85 and 0.86 between human raters. Overall, this atlas provides a time-efficient tool for automated coregistration and segmentation of quantitative susceptibility data to analyze many regions of interest. These data were used to establish a baseline for normal magnetic susceptibility measurements for over 60 brain structures of 30- to 33-year-old males. Correlating the average susceptibility with age-based iron
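
    A minimal Python sketch of the agreement measure used above: Cohen's kappa between an automated parcellation and a manual segmentation, here via scikit-learn; the per-voxel labels are invented.

        # Cohen's kappa between automated and manual ROI labels (invented data);
        # values above ~0.81 are conventionally read as "almost perfect" agreement.
        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        auto_labels = np.array([1, 1, 2, 2, 3, 3, 0, 0])
        manual_labels = np.array([1, 1, 2, 3, 3, 3, 0, 0])
        print(cohen_kappa_score(auto_labels, manual_labels))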

  2. Semi-automated structural characterisation of high velocity oxy fuel thermally sprayed WC-Co based coatings

    NASA Astrophysics Data System (ADS)

    Fay, M. W.; Han, Y.; McCartney, G.; Korpiola, K.; Brown, P. D.

    2008-08-01

    The application of an automated procedure for the rapid assessment of selected area electron diffraction patterns is described. Comparison with complementary EDX spectra has enabled the thermal decomposition reactions within high velocity oxy-fuel thermally sprayed WC-Co coatings to be investigated.

  3. Automation of analytical isotachophoresis

    NASA Technical Reports Server (NTRS)

    Thormann, Wolfgang

    1985-01-01

    The basic features of automation of analytical isotachophoresis (ITP) are reviewed. Experimental setups consisting of narrow bore tubes which are self-stabilized against thermal convection are considered. Sample detection in free solution is discussed, listing the detector systems presently used or expected to be of potential use in the near future. The combination of a universal detector measuring the evolution of ITP zone structures with detector systems specific to desired components is proposed as a concept of an automated chemical analyzer based on ITP. Possible miniaturization of such an instrument by means of microlithographic techniques is discussed.

  4. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  5. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  6. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  7. Automated dispenser

    SciTech Connect

    Hollen, R.M.; Stalnaker, N.D.

    1989-04-06

    An automated dispenser having a conventional pipette attached to an actuating cylinder through a flexible cable for delivering precise quantities of a liquid through commands from remotely located computer software. The travel of the flexible cable is controlled by adjustable stops and a locking shaft. The pipette can be positioned manually or by the hands of a robot. 1 fig.

  8. Significant reduction in errors associated with nonbonded contacts in protein crystal structures: automated all-atom refinement with PrimeX.

    PubMed

    Bell, Jeffrey A; Ho, Kenneth L; Farid, Ramy

    2012-08-01

    All-atom models are essential for many applications in molecular modeling and computational chemistry. Nonbonded atomic contacts much closer than the sum of the van der Waals radii of the two atoms (clashes) are commonly observed in such models derived from protein crystal structures. A set of 94 recently deposited protein structures in the resolution range 1.5-2.8 Å were analyzed for clashes by the addition of all H atoms to the models followed by optimization and energy minimization of the positions of just these H atoms. The results were compared with the same set of structures after automated all-atom refinement with PrimeX and with nonbonded contacts in protein crystal structures at a resolution equal to or better than 0.9 Å. The additional PrimeX refinement produced structures with reasonable summary geometric statistics and similar R(free) values to the original structures. The frequency of clashes at less than 0.8 times the sum of van der Waals radii was reduced over fourfold compared with that found in the original structures, to a level approaching that found in the ultrahigh-resolution structures. Moreover, severe clashes at less than or equal to 0.7 times the sum of atomic radii were reduced 15-fold. All-atom refinement with PrimeX produced improved crystal structure models with respect to nonbonded contacts and yielded changes in structural details that dramatically impacted on the interpretation of some protein-ligand interactions. PMID:22868759
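
    A minimal Python sketch of the clash criterion described above: flag nonbonded contacts closer than a fraction of the summed van der Waals radii. Radii and coordinates are invented, and a real check would exclude bonded pairs.

        import numpy as np
        from itertools import combinations

        VDW = {"C": 1.70, "N": 1.55, "O": 1.52, "H": 1.10}   # approximate radii, Angstroms

        def clashes(atoms, factor=0.8):
            # report atom pairs closer than factor * (sum of vdW radii)
            bad = []
            for (e1, p1), (e2, p2) in combinations(atoms, 2):
                d = np.linalg.norm(np.asarray(p1) - np.asarray(p2))
                if d < factor * (VDW[e1] + VDW[e2]):
                    bad.append((e1, e2, round(d, 2)))
            return bad

        print(clashes([("C", (0, 0, 0)), ("O", (0, 0, 2.3)), ("H", (0, 1.8, 0))]))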

  9. Works carried out by ZAO NPK Del'fin-Informatika on developing distributed and hybrid structures of technical means for automated control systems of process equipment at thermal power stations

    NASA Astrophysics Data System (ADS)

    Shapiro, V. I.; Chausov, Yu. N.; Borisova, E. V.; Pshenichnikova, O. A.; Tolmachev, A. L.

    2011-10-01

    The field of application for distributed structures of technical means is identified on the basis of experience gained in developing information-computation systems and fully functional automated process control systems. Functions of automated process control systems are identified for which centralized data processing is preferable or necessary in order to support their speed of response and reliability. Experience gained from the development of hybrid systems with centralized and distributed information processing is presented, and the advisability of constructing such systems is shown.

  10. Scaling Out and Evaluation of OBSecAn, an Automated Section Annotator for Semi-Structured Clinical Documents, on a Large VA Clinical Corpus

    PubMed Central

    Tran, Le-Thuy T.; Divita, Guy; Redd, Andrew; Carter, Marjorie E.; Samore, Matthew; Gundlapalli, Adi V.

    2015-01-01

    “Identifying and labeling” (annotating) sections improves the effectiveness of extracting information stored in the free text of clinical documents. OBSecAn, an automated ontology-based section annotator, was developed to identify and label sections of semi-structured clinical documents from the Department of Veterans Affairs (VA). In the first step, the algorithm reads and parses the document to obtain and store information regarding sections into a structure that supports the hierarchy of sections. The second stage detects and corrects errors in the parsed structure. The third stage produces the section annotation output using the final parsed tree. In this study, we present the OBSecAn method, scale it to a million-document corpus, and evaluate its performance in identifying family history sections. We identify high yield sections for this use case from note titles such as primary care and demonstrate a median rate of 99% in correctly identifying a family history section. PMID:26958260

  11. SU-C-9A-02: Structured Noise Index as An Automated Quality Control for Nuclear Medicine: A Two Year Experience

    SciTech Connect

    Nelson, J; Christianson, O; Samei, E

    2014-06-01

    Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection which is subjective and time demanding. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues, reporting issues in a timely manner for efficient correction prior to clinical involvement, all incorporated into an automated effortless workflow, and to characterize the program over a two year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed based on expert observer visual analysis. The metric, termed Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in non-uniformity identification in about 12% of daily flood images. In addition, due to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of the NM performance of gamma camera uniformity. It operates seamlessly across a fleet of multiple camera models. The automated process provides effective workflow within the NM spectra between physicist, technologist, and clinical engineer. The reliability of this process has made it the preferred
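
    A rough Python sketch of the quantity underlying the SNI metric: a 2D noise power spectrum of a flood-field image. The normalization and the SNI scoring itself are simplified assumptions here, not the authors' implementation.

        import numpy as np

        def noise_power_spectrum(flood):
            resid = flood - flood.mean()          # remove the mean (DC) term
            nps = np.abs(np.fft.fftshift(np.fft.fft2(resid))) ** 2
            return nps / resid.size

        flood = np.random.poisson(1000, size=(256, 256)).astype(float)
        nps = noise_power_spectrum(flood)   # structured noise appears as low-frequency peaks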

  12. Evaluating the impact of scoring parameters on the structure of intra-specific genetic variation using RawGeno, an R package for automating AFLP scoring

    PubMed Central

    Arrigo, Nils; Tuszynski, Jarek W; Ehrich, Dorothee; Gerdes, Tommy; Alvarez, Nadir

    2009-01-01

    Background Since the transfer and application of modern sequencing technologies to the analysis of amplified fragment-length polymorphisms (AFLP), evolutionary biologists have included an increasing number of samples and markers in their studies. Although justified in this context, the use of automated scoring procedures may result in technical biases that weaken the power and reliability of further analyses. Results Using a new scoring algorithm, RawGeno, we show that scoring errors – in particular "bin oversplitting" (i.e. when variant sizes of the same AFLP marker are not considered as homologous) and "technical homoplasy" (i.e. when two AFLP markers that differ slightly in size are mistakenly considered as being homologous) – induce a loss of discriminatory power, decrease the robustness of results and, in extreme cases, introduce erroneous information in genetic structure analyses. In the present study, we evaluate several descriptive statistics that can be used to optimize the scoring of the AFLP analysis, and we describe a new statistic, the information content per bin (Ibin) that represents a valuable estimator during the optimization process. This statistic can be computed at any stage of the AFLP analysis without requiring the inclusion of replicated samples. Finally, we show that downstream analyses are not equally sensitive to scoring errors. Indeed, although a reasonable amount of flexibility is allowed during the optimization of the scoring procedure without causing considerable changes in the detection of genetic structure patterns, notable discrepancies are observed when estimating genetic diversities from differently scored datasets. Conclusion Our algorithm appears to perform as well as a commercial program in automating AFLP scoring, at least in the context of population genetics or phylogeographic studies. To our knowledge, RawGeno is the only freely available public-domain software for fully automated AFLP scoring, from electropherogram

  13. Significant reduction in errors associated with nonbonded contacts in protein crystal structures: automated all-atom refinement with PrimeX

    SciTech Connect

    Bell, Jeffrey A.; Ho, Kenneth L.; Farid, Ramy

    2012-08-01

    All-atom models derived from moderate-resolution protein crystal structures contain a high frequency of close nonbonded contacts, independent of the major refinement program used for structure determination. All-atom refinement with PrimeX corrects many of these problematic interactions, producing models that are better suited for use in computational chemistry and related applications. All-atom models are essential for many applications in molecular modeling and computational chemistry. Nonbonded atomic contacts much closer than the sum of the van der Waals radii of the two atoms (clashes) are commonly observed in such models derived from protein crystal structures. A set of 94 recently deposited protein structures in the resolution range 1.5–2.8 Å were analyzed for clashes by the addition of all H atoms to the models followed by optimization and energy minimization of the positions of just these H atoms. The results were compared with the same set of structures after automated all-atom refinement with PrimeX and with nonbonded contacts in protein crystal structures at a resolution equal to or better than 0.9 Å. The additional PrimeX refinement produced structures with reasonable summary geometric statistics and similar Rfree values to the original structures. The frequency of clashes at less than 0.8 times the sum of van der Waals radii was reduced over fourfold compared with that found in the original structures, to a level approaching that found in the ultrahigh-resolution structures. Moreover, severe clashes at less than or equal to 0.7 times the sum of atomic radii were reduced 15-fold. All-atom refinement with PrimeX produced improved crystal structure models with respect to nonbonded contacts and yielded changes in structural details that dramatically impacted on the interpretation of some protein–ligand interactions.

  14. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  15. Developing a Graphical User Interface to Automate the Estimation and Prediction of Risk Values for Flood Protective Structures using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Hasan, M.; Helal, A.; Gabr, M.

    2014-12-01

    In this project, we focus on providing a computer-automated platform for a better assessment of the potential failures and retrofit measures of flood-protecting earth structures, e.g., dams and levees. Such structures play an important role during extreme flooding events as well as during normal operating conditions. Furthermore, they are part of other civil infrastructures such as water storage and hydropower generation. Hence, there is a clear need for accurate evaluation of stability and functionality levels during their service lifetime so that the rehabilitation and maintenance costs are effectively guided. Among condition assessment approaches based on the factor of safety, the limit states (LS) approach utilizes numerical modeling to quantify the probability of potential failures. The parameters for LS numerical modeling include i) geometry and side slopes of the embankment, ii) loading conditions in terms of rate of rising and duration of high water levels in the reservoir, and iii) cycles of rising and falling water levels simulating the effect of consecutive storms throughout the service life of the structure. Sample data regarding the correlations of these parameters are available through previous research studies. We have unified these criteria and extended the risk assessment in terms of loss of life through the implementation of a graphical user interface that automates parameter input, divides the data into training and testing sets, and then feeds them into an Artificial Neural Network (ANN) tool through MATLAB programming. The ANN modeling allows us to predict risk values of flood protective structures based on user feedback quickly and easily. In the future, we expect to fine-tune the software by adding extensive data on variations of parameters.
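
    A minimal Python sketch of the described workflow (split the data, train an ANN, predict risk), with scikit-learn standing in for the MATLAB ANN tool; feature names and data are invented.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor

        X = np.random.rand(200, 3)   # columns: slope geometry, rise rate, storm cycles (invented)
        y = np.random.rand(200)      # risk value per structure (invented)
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X_train, y_train)
        print("held-out R^2:", model.score(X_test, y_test))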

  16. Automated RTOP Management System

    NASA Technical Reports Server (NTRS)

    Hayes, P.

    1984-01-01

    The structure of NASA's Office of Aeronautics and Space Technology electronic information system network from 1983 to 1985 is illustrated. The RTOP automated system takes advantage of existing hardware, software, and expertise, and provides: (1) computerized cover sheet and resources forms; (2) electronic signature and transmission; (3) a data-based information system; (4) graphics; (5) intercenter communications; (6) management information; and (7) text editing. The system is coordinated with Headquarters efforts in codes R,E, and T.

  17. Automated lithocell

    NASA Astrophysics Data System (ADS)

    Englisch, Andreas; Deuter, Armin

    1990-06-01

    Integration and automation have gained more and more ground in modern IC-manufacturing. It is difficult to make a direct calculation of the profit these investments yield. On the other hand, the demands on man, machine and technology have increased enormously of late; it is not difficult to see that only by means of integration and automation can these demands be coped with. Some salient points:
    - The complexity and costs incurred by the equipment and processes have become significantly higher.
    - Owing to the reduction of all dimensions, the tolerances within which the various process steps have to be carried out have become smaller and smaller, and adherence to these tolerances more and more difficult.
    - The cycle time has become more and more important, both for the development and control of new processes and, to a great extent, for a rapid and reliable supply to the customer.
    In order that the products be competitive under these conditions, all sorts of costs have to be reduced and the yield has to be maximized. Therefore, computer-aided control of the equipment and the process, combined with automatic data collection and real-time SPC (statistical process control), has become absolutely necessary for successful IC-manufacturing. Human errors must be eliminated from the execution of the various process steps by automation. The work time set free in this way makes it possible for human creativity to be employed on a larger scale in stabilizing the processes. Besides, computer-aided equipment control can ensure the optimal utilization of the equipment round the clock.

  18. Automated error-tolerant macromolecular structure determination from multidimensional nuclear Overhauser enhancement spectra and chemical shift assignments: improved robustness and performance of the PASD algorithm.

    PubMed

    Kuszewski, John J; Thottungal, Robin Augustine; Clore, G Marius; Schwieters, Charles D

    2008-08-01

    We report substantial improvements to the previously introduced automated NOE assignment and structure determination protocol known as PASD (Kuszewski et al. (2004) J Am Chem Soc 126:6258-6273). The improved protocol includes extensive analysis of input spectral data to create a low-resolution contact map of residues expected to be close in space. This map is used to obtain reasonable initial guesses of NOE assignment likelihoods which are refined during subsequent structure calculations. Information in the contact map about which residues are predicted to not be close in space is applied via conservative repulsive distance restraints which are used in early phases of the structure calculations. In comparison with the previous protocol, the new protocol requires significantly less computation time. We show results of running the new PASD protocol on six proteins and demonstrate that useful assignment and structural information is extracted on proteins of more than 220 residues. We show that useful assignment information can be obtained even in the case in which a unique structure cannot be determined. PMID:18668206

  19. Reduced dimensionality (3,2)D NMR experiments and their automated analysis: implications to high-throughput structural studies on proteins.

    PubMed

    Reddy, Jithender G; Kumar, Dinesh; Hosur, Ramakrishna V

    2015-02-01

    Protein NMR spectroscopy has expanded dramatically over the last decade into a powerful tool for the study of their structure, dynamics, and interactions. The primary requirement for all such investigations is sequence-specific resonance assignment. The demand now is to obtain this information as rapidly as possible and in all types of protein systems, stable/unstable, soluble/insoluble, small/big, structured/unstructured, and so on. In this context, we introduce here two reduced dimensionality experiments – (3,2)D-hNCOcanH and (3,2)D-hNcoCAnH – which enhance the previously described 2D NMR-based assignment methods quite significantly. Both the experiments can be recorded in just about 2-3 h each and hence would be of immense value for high-throughput structural proteomics and drug discovery research. The applicability of the method has been demonstrated using alpha-helical bovine apo calbindin-D9k P43M mutant (75 aa) protein. Automated assignment of this data using AUTOBA has been presented, which enhances the utility of these experiments. The backbone resonance assignments so derived are utilized to estimate secondary structures and the backbone fold using Web-based algorithms. Taken together, we believe that the method and the protocol proposed here can be used for routine high-throughput structural studies of proteins. PMID:25178811

  20. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    To run a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquiring content and images, composing the materials, meeting the sponsoring enterprise brand standards, driving through production and fulfillment, and evaluating results; all processes are currently performed by experienced highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted will be how the complexity of running a targeted campaign is hidden from the user through technologies, all while providing the benefits of a professionally managed campaign.

  1. Automated discovery of structural features of the optic nerve head on the basis of image and genetic data

    NASA Astrophysics Data System (ADS)

    Christopher, Mark; Tang, Li; Fingert, John H.; Scheetz, Todd E.; Abramoff, Michael D.

    2014-03-01

    Evaluation of optic nerve head (ONH) structure is a commonly used clinical technique for both diagnosis and monitoring of glaucoma. Glaucoma is associated with characteristic changes in the structure of the ONH. We present a method for computationally identifying ONH structural features using both imaging and genetic data from a large cohort of participants at risk for primary open angle glaucoma (POAG). Using 1054 participants from the Ocular Hypertension Treatment Study, ONH structure was measured by application of a stereo correspondence algorithm to stereo fundus images. In addition, the genotypes of several known POAG genetic risk factors were considered for each participant. ONH structural features were discovered using both a principal component analysis approach to identify the major modes of variance within structural measurements and a linear discriminant analysis approach to capture the relationship between genetic risk factors and ONH structure. The identified ONH structural features were evaluated based on the strength of their associations with genotype and development of POAG by the end of the OHTS study. ONH structural features with strong associations with genotype were identified for each of the genetic loci considered. Several identified ONH structural features were significantly associated (p < 0.05) with the development of POAG after Bonferroni correction. Further, incorporation of genetic risk status was found to substantially increase performance of early POAG prediction. These results suggest incorporating both imaging and genetic data into ONH structural modeling significantly improves the ability to explain POAG-related changes to ONH structure.
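
    A minimal Python sketch of the two feature-discovery steps named above: PCA for the major modes of structural variance, and LDA to capture structure-genotype relationships; arrays and sizes are invented.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        structure = np.random.rand(1054, 50)       # per-subject ONH measurements (invented)
        genotype = np.random.randint(0, 2, 1054)   # risk-allele carrier status (invented)

        modes = PCA(n_components=5).fit_transform(structure)       # modes of variance
        lda = LinearDiscriminantAnalysis(n_components=1)
        genetic_feature = lda.fit_transform(structure, genotype)   # genotype-linked feature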

  2. Structure tensor based automated detection of macular edema and central serous retinopathy using optical coherence tomography images.

    PubMed

    Hassan, Bilal; Raja, Gulistan; Hassan, Taimur; Usman Akram, M

    2016-04-01

    Macular edema (ME) and central serous retinopathy (CSR) are two macular diseases that affect the central vision of a person if they are left untreated. Optical coherence tomography (OCT) imaging is the latest eye examination technique that shows a cross-sectional region of the retinal layers and that can be used to detect many retinal disorders in an early stage. Many researchers have done clinical studies on ME and CSR and reported significant findings in macular OCT scans. However, this paper proposes an automated method for the classification of ME and CSR from OCT images using a support vector machine (SVM) classifier. Five distinct features (three based on the thickness profiles of the sub-retinal layers and two based on cyst fluids within the sub-retinal layers) are extracted from 30 labeled images (10 ME, 10 CSR, and 10 healthy), and SVM is trained on these. We applied our proposed algorithm on 90 time-domain OCT (TD-OCT) images (30 ME, 30 CSR, 30 healthy) of 73 patients. Our algorithm correctly classified 88 out of 90 subjects with accuracy, sensitivity, and specificity of 97.77%, 100%, and 93.33%, respectively. PMID:27140751
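
    A minimal Python sketch of the classifier described above: an SVM trained on five features per scan to separate ME, CSR, and healthy cases. The kernel choice, feature values, and labels are invented.

        import numpy as np
        from sklearn.svm import SVC

        X_train = np.random.rand(30, 5)    # 3 thickness-profile + 2 cyst-fluid features (invented)
        y_train = np.repeat(["ME", "CSR", "healthy"], 10)
        clf = SVC(kernel="rbf").fit(X_train, y_train)

        X_new = np.random.rand(90, 5)      # features from the 90 TD-OCT test images (invented)
        pred = clf.predict(X_new)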

  3. Identification of new leishmanicidal peptide lead structures by automated real-time monitoring of changes in intracellular ATP.

    PubMed Central

    Luque-Ortega, J Román; Saugar, José M; Chiva, Cristina; Andreu, David; Rivas, Luis

    2003-01-01

    Leishmanicidal drugs interacting stoichiometrically with parasite plasma membrane lipids, thus promoting permeability, have raised significant expectations for Leishmania chemotherapy due to their nil or very low induction of resistance. Inherent in this process is a decrease in intracellular ATP, either wasted by ionic pumps to restore membrane potential or directly leaked through larger membrane lesions caused by the drug. We have adapted a luminescence method for fast automated real-time monitoring of this process, using Leishmania donovani promastigotes transfected with a cytoplasmic luciferase form, previously tested for anti-mitochondrial drugs. The system was first assayed against a set of well-known membrane-active drugs [amphotericin B, nystatin, cecropin A-melittin peptide CA(1-8)M(1-18)], plus two ionophoric polyethers (narasin and salinomycin) not previously tested on Leishmania, then used to screen seven new cecropin A-melittin hybrid peptides. All membrane-active compounds showed a good correlation between inhibition of luminescence and leishmanicidal activity. Induction of membrane permeability was demonstrated by dissipation of membrane potential, SYTOX™ Green influx and membrane damage assessed by electron microscopy, except for the polyethers, where ATP decrease was due to inhibition of its mitochondrial synthesis. Five of the test peptides showed an ED50 around 1 microM on promastigotes. These peptides, with equal or better activity than 26-residue-long CA(1-8)M(1-18), are the shortest leishmanicidal peptides described so far, and validate our luminescence assay as a fast and cheap screening tool for membrane-active compounds. PMID:12864731

  4. Identification of new leishmanicidal peptide lead structures by automated real-time monitoring of changes in intracellular ATP.

    PubMed

    Luque-Ortega, J Román; Saugar, José M; Chiva, Cristina; Andreu, David; Rivas, Luis

    2003-10-01

    Leishmanicidal drugs interacting stoichiometrically with parasite plasma membrane lipids, thus promoting permeability, have raised significant expectations for Leishmania chemotherapy due to their nil or very low induction of resistance. Inherent in this process is a decrease in intracellular ATP, either wasted by ionic pumps to restore membrane potential or directly leaked through larger membrane lesions caused by the drug. We have adapted a luminescence method for fast automated real-time monitoring of this process, using Leishmania donovani promastigotes transfected with a cytoplasmic luciferase form, previously tested for anti-mitochondrial drugs. The system was first assayed against a set of well-known membrane-active drugs [amphotericin B, nystatin, cecropin A-melittin peptide CA(1-8)M(1-18)], plus two ionophoric polyethers (narasin and salinomycin) not previously tested on Leishmania, then used to screen seven new cecropin A-melittin hybrid peptides. All membrane-active compounds showed a good correlation between inhibition of luminescence and leishmanicidal activity. Induction of membrane permeability was demonstrated by dissipation of membrane potential, SYTOX™ Green influx and membrane damage assessed by electron microscopy, except for the polyethers, where ATP decrease was due to inhibition of its mitochondrial synthesis. Five of the test peptides showed an ED50 around 1 microM on promastigotes. These peptides, with equal or better activity than 26-residue-long CA(1-8)M(1-18), are the shortest leishmanicidal peptides described so far, and validate our luminescence assay as a fast and cheap screening tool for membrane-active compounds. PMID:12864731

  5. Both Automation and Paper.

    ERIC Educational Resources Information Center

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  6. Structure-Function Modeling of Optical Coherence Tomography and Standard Automated Perimetry in the Retina of Patients with Autosomal Dominant Retinitis Pigmentosa

    PubMed Central

    Smith, Travis B.; Parker, Maria; Steinkamp, Peter N.; Weleber, Richard G.; Smith, Ning; Wilson, David J.

    2016-01-01

    Purpose To assess relationships between structural and functional biomarkers, including new topographic measures of visual field sensitivity, in patients with autosomal dominant retinitis pigmentosa. Methods Spectral domain optical coherence tomography line scans and hill of vision (HOV) sensitivity surfaces from full-field standard automated perimetry were semi-automatically aligned for 60 eyes of 35 patients. Structural biomarkers were extracted from outer retina b-scans along horizontal and vertical midlines. Functional biomarkers were extracted from local sensitivity profiles along the b-scans and from the full visual field. These included topographic measures of functional transition such as the contour of most rapid sensitivity decline around the HOV, herein called HOV slope for convenience. Biomarker relationships were assessed pairwise by coefficients of determination (R2) from mixed-effects analysis with automatic model selection. Results Structure-function relationships were accurately modeled (conditional R2>0.8 in most cases). The best-fit relationship models and correlation patterns for horizontally oriented biomarkers were different than vertically oriented ones. The structural biomarker with the largest number of significant functional correlates was the ellipsoid zone (EZ) width, followed by the total photoreceptor layer thickness. The strongest correlation observed was between EZ width and HOV slope distance (marginal R2 = 0.85, p < 10^-10). The mean sensitivity defect at the EZ edge was 7.6 dB. Among all functional biomarkers, the HOV slope mean value, HOV slope mean distance, and maximum sensitivity along the b-scan had the largest number of significant structural correlates. Conclusions Topographic slope metrics show promise as functional biomarkers relevant to the transition zone. EZ width is strongly associated with the location of most rapid HOV decline. PMID:26845445
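
    A minimal Python sketch of a pairwise structure-function fit of the kind described above (sensitivity against EZ width, with eyes grouped by patient), using statsmodels; the data frame and column names are invented.

        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.DataFrame({
            "sensitivity": [28.1, 25.3, 14.2, 9.8, 22.7, 18.4],   # dB, from perimetry (invented)
            "ez_width":    [4.1, 3.6, 1.9, 1.2, 3.0, 2.4],        # mm, from OCT b-scans (invented)
            "patient":     ["p1", "p1", "p2", "p2", "p3", "p3"],
        })
        fit = smf.mixedlm("sensitivity ~ ez_width", df, groups=df["patient"]).fit()
        print(fit.summary())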

  7. Automated 3D architecture reconstruction from photogrammetric structure-and-motion: A case study of the One Pilla pagoda, Hanoi, Vietnam

    NASA Astrophysics Data System (ADS)

    To, T.; Nguyen, D.; Tran, G.

    2015-04-01

    The heritage system of Vietnam has declined because of poor conservation conditions. Sustainable development requires firm control, spatial planning, and reasonable investment. Moreover, in the field of Cultural Heritage, automated photogrammetric systems based on Structure-from-Motion (SfM) techniques are widely used. With the potential of high resolution, low cost, a large field of view, easiness, rapidity and completeness, the derivation of 3D metric information from Structure-and-Motion images is receiving great attention. In addition, heritage objects in the form of 3D physical models are recorded not only for documentation issues, but also for historical interpretation, restoration, cultural and educational purposes. The study presents the archaeological documentation of the "One Pilla" pagoda in Hanoi, Vietnam. The data were acquired with a Canon EOS 550D digital camera (CMOS APS-C sensor, 22.3 x 14.9 mm). Camera calibration and orientation were carried out with VisualSFM, CMPMVS (Multi-View Reconstruction) and SURE (Photogrammetric Surface Reconstruction from Imagery) software. The final result is a scaled 3D model of the One Pilla pagoda, displayed in different views in MeshLab software.

  8. Completely automated, highly error-tolerant macromolecular structure determination from multidimensional nuclear overhauser enhancement spectra and chemical shift assignments.

    PubMed

    Kuszewski, John; Schwieters, Charles D; Garrett, Daniel S; Byrd, R Andrew; Tjandra, Nico; Clore, G Marius

    2004-05-26

    The major rate-limiting step in high-throughput NMR protein structure determination involves the calculation of a reliable initial fold, the elimination of incorrect nuclear Overhauser enhancement (NOE) assignments, and the resolution of NOE assignment ambiguities. We present a robust approach to automatically calculate structures with a backbone coordinate accuracy of 1.0-1.5 Å from datasets in which as much as 80% of the long-range NOE information (i.e., between residues separated by more than five positions in the sequence) is incorrect. The current algorithm differs from previously published methods in that it has been expressly designed to ensure that the results from successive cycles are not biased by the global fold of structures generated in preceding cycles. Consequently, the method is highly error tolerant and is not easily funnelled down an incorrect path in either three-dimensional structure or NOE assignment space. The algorithm incorporates three main features: a linear energy function representation of the NOE restraints to allow maximization of the number of simultaneously satisfied restraints during the course of simulated annealing; a method for handling the presence of multiple possible assignments for each NOE cross-peak which avoids local minima by treating each possible assignment as if it were an independent restraint; and a probabilistic method to permit both inactivation and reactivation of all NOE restraints on the fly during the course of simulated annealing. NOE restraints are never removed permanently, thereby significantly reducing the likelihood of becoming trapped in a false minimum of NOE assignment space. The effectiveness of the algorithm is demonstrated using completely automatically peak-picked experimental NOE data from two proteins: interleukin-4 (136 residues) and cyanovirin-N (101 residues). The limits of the method are explored using simulated data on the 56-residue B1 domain of Streptococcal protein G. PMID:15149223
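
    A conceptual Python sketch of two of the ingredients named above: a linear (rather than quadratic) NOE violation energy, and probabilistic on-the-fly activation and deactivation of restraints, with nothing removed permanently. The Boltzmann-style activation rule here is an illustrative assumption, not the published implementation.

        import math, random

        def linear_noe_energy(distance, upper_bound, weight=1.0):
            # linear penalty above the bound, so many restraints can be
            # satisfied simultaneously during annealing
            return weight * max(0.0, distance - upper_bound)

        def resample_active_set(restraints, temperature):
            # probabilistically (de)activate restraints; never delete them
            for r in restraints:
                e = linear_noe_energy(r["distance"], r["upper_bound"])
                r["active"] = random.random() < math.exp(-e / temperature)

        restraints = [{"distance": 4.2, "upper_bound": 5.0, "active": True},
                      {"distance": 7.9, "upper_bound": 5.0, "active": True}]
        resample_active_set(restraints, temperature=2.0)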

  9. The Development of a Tool for Semi-Automated Generation of Structured and Unstructured Grids about Isolated Rotorcraft Blades

    NASA Technical Reports Server (NTRS)

    Shanmugasundaram, Ramakrishnan; Garriz, Javier A.; Samareh, Jamshid A.

    1997-01-01

    The grid generation used to model rotorcraft configurations for Computational Fluid Dynamics (CFD) analysis is highly complicated and time consuming. The highly complex geometry and irregular shapes encountered in entire rotorcraft configurations are typically modeled using overset grids. Another promising approach is to utilize unstructured grid methods. With either approach the majority of time is spent manually setting up the topology. For less complicated geometries such as isolated rotor blades, less time is obviously required. This paper discusses the capabilities of a tool called Rotor blade Optimized Topology Organizer and Renderer(ROTOR) being developed to quickly generate block structured grids and unstructured tetrahedral grids about isolated blades. The key algorithm uses individual airfoil sections to construct a Non-Uniform Rational B-Spline(NURBS) surface representation of the rotor blade. This continuous surface definition can be queried to define the block topology used in constructing a structured mesh around the rotor blade. Alternatively, the surface definition can be used to define the surface patches and grid cell spacing requirements for generating unstructured surface and volume grids. Presently, the primary output for ROTOR is block structured grids using 0-H and H-H topologies suitable for full-potential solvers. This paper will discuss the present capabilities of the tool and highlight future work.

  10. Using a semi-automated filtering process to improve large footprint lidar sub-canopy elevation models and forest structure metrics

    NASA Astrophysics Data System (ADS)

    Fricker, G. A.; Saatchi, S.; Meyer, V.; Gillespie, T.; Sheng, Y.

    2011-12-01

    Quantification of sub-canopy topography and forest structure is important for developing a better understanding of how forest ecosystems function. This study focuses on a three-step method to adapt discrete return lidar (DRL) filtering techniques to Laser Vegetation Imaging Sensor (LVIS) large-footprint lidar (LFL) waveforms to improve the accuracy of both sub-canopy digital elevation models (DEMs), as well as forest structure measurements. The results of the experiment demonstrate that LFL ground surfaces can be effectively filtered using methods adapted from DRL point filtering methods, and the resulting data will produce more accurate digital elevation models, as well as improved estimates of forest structure. The first step quantifies the slope present at the center of each LFL pulse, and the average error expected at each particular degree of slope is modeled. Areas of high terrain slope show consistently more error in LFL ground detection, and empirical relationships between terrain angle and expected LVIS ground detection error are established. These relationships are then used to create an algorithm for LFL ground elevation correction. The second step uses an iterative, expanding window filter to identify outlier points which are not part of the ground surface, as well as manual editing to identify laser pulses which are not at ground level. The semi-automated methods improved the LVIS DEM accuracy significantly by identifying significant outliers in the LVIS point cloud. The final step develops an approach which utilizes both the filtered LFL DEMs, and the modeled error introduced by terrain slope to improve both sub-canopy elevation models, and above ground LFL waveform metrics. DRL and LVIS data from Barro Colorado Island, Panama, and La Selva, Costa Rica were used to develop and test the algorithm. Acknowledgements: Special thanks to Dr. Jim Dilling for providing the DRL lidar data for Barro Colorado Island.
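
    A simplified Python stand-in for the iterative outlier filtering described above: sigma-clip the waveform ground elevations until the inlier set stabilizes. The expanding-window logic and manual editing from the study are omitted, and the threshold is invented.

        import numpy as np

        def sigma_clip_ground(z, k=3.0, max_iter=20):
            mask = np.ones(z.size, dtype=bool)
            for _ in range(max_iter):
                mu, sd = z[mask].mean(), z[mask].std()
                new_mask = np.abs(z - mu) < k * sd
                if np.array_equal(new_mask, mask):   # inlier set has stabilized
                    break
                mask = new_mask
            return mask                              # True = kept as ground

        z = np.r_[np.random.normal(102.0, 0.15, 40), 131.0]   # 40 ground returns + 1 canopy outlier
        print(z[sigma_clip_ground(z)].mean())                 # ~102.0 after clipping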

  11. Automated method for determination of dissolved organic carbon-water distribution constants of structurally diverse pollutants using pre-equilibrium solid-phase microextraction.

    PubMed

    Ripszam, Matyas; Haglund, Peter

    2015-02-01

    Dissolved organic carbon (DOC) plays a key role in determining the environmental fate of semivolatile organic environmental contaminants. The goal of the present study was to develop a method using commercially available hardware to rapidly characterize the sorption properties of DOC in water samples. The resulting method uses negligible-depletion direct immersion solid-phase microextraction (SPME) and gas chromatography-mass spectrometry. Its performance was evaluated using Nordic reference fulvic acid and 40 priority environmental contaminants that cover a wide range of physicochemical properties. Two SPME fibers had to be used to cope with the span of properties, 1 coated with polydimethylsiloxane and 1 coated with polystyrene divinylbenzene polydimethylsiloxane, for nonpolar and semipolar contaminants, respectively. The measured DOC-water distribution constants showed reasonably good reproducibility (standard deviation ≤ 0.32) and good correlation (R(2)  = 0.80) with log octanol-water partition coefficients for nonpolar persistent organic pollutants. The sample pretreatment is limited to filtration, and the method is easy to adjust to different DOC concentrations. These experiments also utilized the latest SPME automation that largely decreases total cycle time (to 20 min or shorter) and increases sample throughput, which is advantageous in cases when many samples of DOC must be characterized or when the determinations must be performed quickly, for example, to avoid precipitation, aggregation, and other changes of DOC structure and properties. The data generated by this method are valuable as a basis for transport and fate modeling studies. PMID:25393710

  12. A linear programming approach to reconstructing subcellular structures from confocal images for automated generation of representative 3D cellular models

    PubMed Central

    Wood, Scott T.; Dean, Brian C.; Dean, Delphine

    2013-01-01

    This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery. PMID:23395283
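
    A compact Python sketch of the fiber-superposition idea above: choose nonnegative weights for candidate fiber templates so their sum best matches the observed image. The paper formulates this as a linear program; non-negative least squares is used here as a closely related stand-in, and all arrays are invented.

        import numpy as np
        from scipy.optimize import nnls

        n_pixels, n_candidates = 1000, 50
        templates = np.random.rand(n_pixels, n_candidates)   # each column = one rendered candidate fiber
        image = np.random.rand(n_pixels)                     # flattened confocal intensity data

        weights, residual = nnls(templates, image)           # min ||templates @ w - image||, w >= 0
        selected = np.nonzero(weights)[0]                    # fibers kept in the representative model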

  13. Use of conditional rule structure to automate clinical decision support: a comparison of artificial intelligence and deterministic programming techniques.

    PubMed

    Friedman, R H; Frank, A D

    1983-08-01

    A rule-based computer system was developed to perform clinical decision-making support within a medical information system, oncology practice, and clinical research. This rule-based system, which has been programmed using deterministic rules, possesses features of generalizability, modularity of structure, convenience in rule acquisition, explainability, and utility for patient care and teaching, features which have been identified as advantages of artificial intelligence (AI) rule-based systems. Formal rules are primarily represented as conditional statements; common conditions and actions are stored in system dictionaries so that they can be recalled at any time to form new decision rules. Important similarities and differences exist in the structure of this system and clinical computer systems utilizing artificial intelligence (AI) production rule techniques. The non-AI rule-based system possesses advantages in cost and ease of implementation. The degree to which significant medical decision problems can be solved by this technique remains uncertain as does whether the more complex AI methodologies will be required. PMID:6352165
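
    A minimal Python sketch of the conditional rule structure described above: conditions and actions stored in dictionaries and composed into if-then rules. The clinical content is invented.

        conditions = {
            "neutropenic": lambda pt: pt["anc"] < 500,
            "febrile": lambda pt: pt["temp_c"] >= 38.3,
        }
        actions = {
            "alert": lambda pt: print("ALERT: febrile neutropenia workup advised"),
        }
        rules = [({"neutropenic", "febrile"}, "alert")]   # IF all conditions THEN action

        def evaluate(patient):
            for needed, action_name in rules:
                if all(conditions[name](patient) for name in needed):
                    actions[action_name](patient)

        evaluate({"anc": 320, "temp_c": 38.6})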

  14. A new methodology for non-contact accurate crack width measurement through photogrammetry for automated structural safety evaluation

    NASA Astrophysics Data System (ADS)

    Jahanshahi, Mohammad R.; Masri, Sami F.

    2013-03-01

    In mechanical, aerospace and civil structures, cracks are important defects that can cause catastrophes if neglected. Visual inspection is currently the predominant method for crack assessment. This approach is tedious, labor-intensive, subjective and highly qualitative. An inexpensive alternative to current monitoring methods is to use a robotic system that could perform autonomous crack detection and quantification. To reach this goal, several image-based crack detection approaches have been developed; however, the crack thickness quantification, which is an essential element for a reliable structural condition assessment, has not been sufficiently investigated. In this paper, a new contact-less crack quantification methodology, based on computer vision and image processing concepts, is introduced and evaluated against a crack quantification approach which was previously developed by the authors. The proposed approach in this study utilizes depth perception to quantify crack thickness and, as opposed to most previous studies, needs no scale attachment to the region under inspection, which makes this approach ideal for incorporation with autonomous or semi-autonomous mobile inspection systems. Validation tests are performed to evaluate the performance of the proposed approach, and the results show that the new proposed approach outperforms the previously developed one.
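
    A minimal Python sketch of depth-based crack quantification under a pinhole-camera assumption: a span of n pixels at depth Z maps to a width of n * Z * p / f, with pixel pitch p and focal length f. All values are invented, and the paper's full perspective treatment is more involved.

        def crack_width_m(n_pixels, depth_m, pixel_pitch_m=4.7e-6, focal_length_m=0.016):
            # back-project a pixel span to metric width at the measured depth
            return n_pixels * depth_m * pixel_pitch_m / focal_length_m

        w = crack_width_m(n_pixels=6, depth_m=1.5)
        print(f"estimated crack width: {w * 1000:.2f} mm")   # about 2.64 mm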

  15. Automated protein NMR resonance assignments.

    PubMed

    Wan, Xiang; Xu, Dong; Slupsky, Carolyn M; Lin, Guohui

    2003-01-01

    NMR resonance peak assignment is one of the key steps in solving an NMR protein structure. The assignment process links resonance peaks to individual residues of the target protein sequence, providing the prerequisite for establishing intra- and inter-residue spatial relationships between atoms. The assignment process is tedious and time-consuming and can take many weeks. Though there exist a number of computer programs to assist the assignment process, many NMR labs are still doing the assignments manually to ensure quality. This paper presents (1) a new scoring system for mapping spin systems to residues, (2) an automated adjacency information extraction procedure from NMR spectra, and (3) a very fast assignment algorithm based on our previously proposed greedy filtering method and a maximum matching algorithm to automate the assignment process. The computational tests on 70 instances of (pseudo) experimental NMR data of 14 proteins demonstrate that the new scoring scheme has much better discerning power with the aid of adjacency information between spin systems simulated across various NMR spectra. Typically, with automated extraction of adjacency information, our method achieves nearly complete assignments for most of the proteins. These results are very promising and suggest that the fast automated assignment algorithm, together with the new scoring scheme and automated adjacency extraction, may be ready for practical use. PMID:16452794
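
    A minimal Python sketch of the mapping step described above: given scores for assigning each spin system to each residue, find the best one-to-one mapping. The Hungarian algorithm stands in for the authors' greedy filtering plus maximum matching, and the score matrix is invented.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        scores = np.random.rand(8, 8)                  # rows: spin systems, cols: residues (invented)
        rows, cols = linear_sum_assignment(-scores)    # negate to maximize total score
        assignment = dict(zip(rows, cols))             # spin system -> residue index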

  16. Space power subsystem automation technology

    NASA Technical Reports Server (NTRS)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  17. Automated External Defibrillator

    MedlinePlus

    An automated external defibrillator (AED) is a portable device that ...

  18. Towards a structural classification of phosphate binding sites in protein-nucleotide complexes: an automated all-against-all structural comparison using geometric matching.

    PubMed

    Brakoulias, Andreas; Jackson, Richard M

    2004-08-01

    A method is described for the rapid comparison of protein binding sites using geometric matching to detect similar three-dimensional structure. The geometric matching detects common atomic features through identification of the maximum common sub-graph or clique. These features are not necessarily evident from sequence or from global structural similarity, giving additional insight into molecular recognition not captured by current sequence or structural classification schemes. Here we use the method to produce an all-against-all comparison of phosphate binding sites in a number of different nucleotide phosphate-binding proteins. The similarity search is combined with clustering of similar sites to allow a preliminary structural classification. Clustering by site similarity yields a classification of the 476 representative local environments, with ten main clusters covering half of them. The similarities make sense in terms of both structural and functional classification schemes. The ten main clusters represent a very limited number of unique structural binding motifs for phosphate. These are the structural P-loop, the di-nucleotide binding motif [FAD/NAD(P)-binding and Rossmann-like fold] and the FAD-binding motif. Similar classification schemes for nucleotide binding proteins have also been arrived at independently by others using different methods. PMID:15211509
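
    The maximum-common-subgraph search can be phrased as a clique problem on a correspondence graph: each node pairs an atom of site A with an atom of site B, and edges join pairs whose intra-site distances agree. A rough sketch with networkx follows; the tolerance and the brute-force clique enumeration are illustrative simplifications of the paper's method.

        import itertools
        import networkx as nx
        import numpy as np

        def common_substructure(A, B, tol=0.5):
            """A, B: (n, 3) coordinate arrays for the atoms of two binding sites."""
            G = nx.Graph()
            pairs = [(i, j) for i in range(len(A)) for j in range(len(B))]
            G.add_nodes_from(pairs)
            for (i, j), (k, l) in itertools.combinations(pairs, 2):
                # Geometric consistency: matched distances must agree within tol.
                if i != k and j != l and abs(
                        np.linalg.norm(A[i] - A[k]) -
                        np.linalg.norm(B[j] - B[l])) < tol:
                    G.add_edge((i, j), (k, l))
            # The largest clique is the maximum geometrically consistent pairing.
            return max(nx.find_cliques(G), key=len)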

  19. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum

    PubMed Central

    Brunger, Axel T.; Das, Debanu; Deacon, Ashley M.; Grant, Joanna; Terwilliger, Thomas C.; Read, Randy J.; Adams, Paul D.; Levitt, Michael; Schröder, Gunnar F.

    2012-01-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence. PMID:22505259

  20. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  1. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time, the method of controlling industrial bioprocesses has changed completely. In this paper, the authors use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems, some milestones are highlighted. Special attention is given to the influence of standards and guidelines on the development of automation systems. PMID:11092132

  2. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  3. Distributed Experiment Automation System

    NASA Astrophysics Data System (ADS)

    Lebedev, Gennadi

    2003-03-01

    A module-based distributed system for controlling and automating scientific experiments was developed. The system divides into five main layers: 1. Data processing and presentation modules. 2. Controllers, which support primary command evaluation, data analysis, and synchronization between device drivers. 3. The data server, which provides real-time data storage and management. 4. Device drivers, which support communication, preliminary signal acquisition, and control of peripheral devices. 5. Utilities: batch processing, logging, execution-error handling, persistent storage and management of experimental data, module and device monitoring, alarm states, and remote-component messaging and notification processing. The system uses networking (the DCOM protocol) for communication between distributed modules. Configuration, module parameters, and data and command links are defined in a scripting file (XML format). This modular structure allows great flexibility and extensibility, as modules can be added and configured as required without any extensive programming.
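
    The XML wiring of modules and links might look like the sketch below; the tag and attribute names are assumptions made for illustration and are not the system's actual schema.

        import xml.etree.ElementTree as ET

        config = ET.fromstring("""
        <experiment>
          <module id="adc1" layer="driver" device="COM3"/>
          <module id="ctrl" layer="controller"/>
          <module id="view" layer="presentation"/>
          <link from="adc1" to="ctrl" type="data"/>
          <link from="ctrl" to="view" type="data"/>
        </experiment>
        """)

        modules = {m.get("id"): m.attrib for m in config.findall("module")}
        links = [(l.get("from"), l.get("to"), l.get("type"))
                 for l in config.findall("link")]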

  4. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  5. Automating the multiprocessing environment

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.

    1989-01-01

    An approach to automating the programming and operation of tree-structured networks of multiprocessor systems is discussed. A conceptual, knowledge-based operating environment is presented, and requirements for two major technology elements are identified as follows: (1) an intelligent information translator is proposed for implementing information transfer between dissimilar hardware and software, thereby enabling independent and modular development of future systems and promoting language-independence of codes and information; (2) a resident system activity manager, which recognizes the system's capabilities and monitors the status of all systems within the environment, is proposed for integrating dissimilar systems into effective parallel processing resources to optimally meet user needs. Finally, key computational capabilities which must be provided before the environment can be realized are identified.

  6. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate chemistries different from those of existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  7. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. PMID:26065792

  8. Management Planning for Workplace Automation.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  9. Automating symbolic analysis with CLIPS

    NASA Technical Reports Server (NTRS)

    Morris, Keith E.

    1990-01-01

    Symbolic analysis is a methodology first applied as an aid in selecting and generating test cases for 'white box' testing of computer software programs. The feasibility of automating this analysis process has recently been demonstrated through the development of a CLIPS-based prototype tool. Symbolic analysis is based on separating the logic flow diagram of a computer program into its basic elements and then systematically examining those elements and their relationships to provide a detailed static analysis of the process that those diagrams represent. The basic logic flow diagram elements are flow structure (connections), predicates (decisions), and computations (actions). The symbolic analysis approach supplies a disciplined step-by-step process to identify all executable program paths and produce a truth table that defines the input and output domains for each path identified. The resulting truth table is the tool that allows software test cases to be generated in a comprehensive manner to achieve total coverage of program paths, input domain, and output domain. Since the manual application of symbolic analysis is extremely labor-intensive and itself error-prone, automation of the process is highly desirable. Earlier attempts at automation, utilizing conventional software approaches, had only limited success. This paper briefly describes the automation problems, the symbolic analysis expert's problem-solving heuristics, the implementation of those heuristics as a CLIPS-based prototype, and the manual augmentation required. A simple application example is also provided for illustration purposes. The paper concludes with a discussion of implementation experiences, automation limitations, usage experiences, and future development suggestions.
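
    The path-enumeration step can be illustrated with a toy logic-flow graph whose predicate outcomes are collected along each root-to-exit path, yielding one truth-table row per executable path. The graph encoding below is an assumption made for illustration, not the CLIPS prototype's representation.

        # Each node maps to (successor, required predicate outcome) edges;
        # None marks edges leaving non-predicate (action/flow) nodes.
        CFG = {
            "start": [("p1", None)],
            "p1":    [("a1", True), ("a2", False)],   # predicate p1 branches
            "a1":    [("exit", None)],
            "a2":    [("exit", None)],
        }

        def paths(node="start", taken=()):
            if node == "exit":
                yield taken                            # one truth-table row
                return
            for succ, outcome in CFG[node]:
                step = ((node, outcome),) if outcome is not None else ()
                yield from paths(succ, taken + step)

        for row in paths():
            print(dict(row))    # {'p1': True} and {'p1': False}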

  10. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New ROBOSIM features, like collision detection and new kinematics simulation methods, are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested, which is intended to be a part of a new knowledge-based multiple-aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First, a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next, the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple-aspect modeling methodology, a fault propagation model of this system is being built and is described.

  11. Automation, Manpower, and Education.

    ERIC Educational Resources Information Center

    Rosenberg, Jerry M.

    Each group in our population will be affected by automation and other forms of technological advancement. This book seeks to identify the needs of these various groups, and to present ways in which educators can best meet them. The author corrects certain prevalent misconceptions concerning manpower utilization and automation. Based on the…

  12. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Sherron, Gene T.

    1982-01-01

    The steps taken toward office automation by the University of Maryland are described. Office automation is defined and some types of word processing systems are described. Policies developed in the writing of a campus plan are listed, followed by a section on procedures adopted to implement the plan. (Author/MLW)

  13. The Automated Office.

    ERIC Educational Resources Information Center

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  14. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  15. Automated drilling draws interest

    SciTech Connect

    Not Available

    1985-05-01

    Interest in subsea technology includes recent purchase of both a British yard and Subsea Technology, a Houston-based BOP manufacturer. In France, key personnel from the former Comex Industries have been acquired and a base reinstalled in Marseille. ACB is also investing heavily, with the Norwegians, in automated drilling programs. These automated drilling programs are discussed.

  16. Library Automation Style Guide.

    ERIC Educational Resources Information Center

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  17. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  18. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  19. Automation, parallelism, and robotics for proteomics.

    PubMed

    Alterovitz, Gil; Liu, Jonathan; Chow, Jijun; Ramoni, Marco F

    2006-07-01

    The speed of the human genome project (Lander, E. S., Linton, L. M., Birren, B., Nusbaum, C. et al., Nature 2001, 409, 860-921) was made possible, in part, by developments in automation of sequencing technologies. Before these technologies, sequencing was a laborious, expensive, and personnel-intensive task. Similarly, automation and robotics are changing the field of proteomics today. Proteomics is defined as the effort to understand and characterize proteins in the categories of structure, function and interaction (Englbrecht, C. C., Facius, A., Comb. Chem. High Throughput Screen. 2005, 8, 705-715). As such, this field nicely lends itself to automation technologies since these methods often require large economies of scale in order to achieve cost and time-saving benefits. This article describes some of the technologies and methods being applied in proteomics in order to facilitate automation within the field as well as in linking proteomics-based information with other related research areas. PMID:16786489

  20. Making the transition to automation

    SciTech Connect

Christenson, D.J.

    1992-10-01

    By 1995, the Bureau of Reclamation's hydropower plant near Hungry Horse, Montana, will be remotely operated from Grand Coulee dam (about 300 miles away) in Washington State. Automation at Hungry Horse will eliminate the need for four full-time power plant operators. Between now and then, a transition plan that offers employees choices for retraining, transferring, or taking early retirement will smooth the transition in reducing from five operators to one. The transition plan also includes the use of temporary employees to offset risks of reducing staff too soon. When completed in 1953, the Hungry Horse structure was the world's fourth largest and fourth highest concrete dam. The arch-gravity structure has a crest length of 2,115 feet; it is 3,565 feet above sea level. The four turbine-generator units in the powerhouse total 284 MW, and supply approximately 1 billion kilowatt-hours of electricity annually to the federal power grid managed by the Bonneville Power Administration. In 1988, Reclamation began to automate operations at many of its hydro plants, and to establish centralized control points. The control center concept will increase efficiency. It also will coordinate water movements and power supply throughout the West. In the Pacific Northwest, the Grand Coulee and Black Canyon plants are automated control centers. Several Reclamation-owned facilities in the Columbia River Basin, including Hungry Horse, will be connected to these centers via microwave and telephone lines. When automation is complete, constant monitoring by computer will replace hourly manual readings and equipment checks. Computers also are expected to increase water use efficiency by 1 to 2 percent by ensuring operation for maximum turbine efficiency. Unit efficiency curves for various heads will be programmed into the system.

  1. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  2. To automate or not to automate : this is the question.

    SciTech Connect

    Cymborowski, M.; Klimecka, M.; Chruszcz, M.; Zimmerman, M.; Shumilin, I.; Borek, D.; Lazarski, K.; Joachimiak, A.; Otwinowski, Z.; Anderson, W.; Minor, W.; Biosciences Division; Univ. of Virginia; Univ. of Texas; Northwestern Univ.; Univ. of Chicago

    2010-06-06

    New protocols and instrumentation significantly boost the outcome of structural biology, which has resulted in significant growth in the number of deposited Protein Data Bank structures. However, even an enormous increase of the productivity of a single step of the structure determination process may not significantly shorten the time between clone and deposition or publication. For example, in a medium size laboratory equipped with the LabDB and HKL-3000 systems, we show that automation of some (and integration of all) steps of the X-ray structure determination pathway is critical for laboratory productivity. Moreover, we show that the lag period after which the impact of a technology change is observed is longer than expected.

  3. Automated analysis in generic groups

    NASA Astrophysics Data System (ADS)

    Fagerholm, Edvard

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings, namely symmetric or asymmetric (leveled) k-linear groups, and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems, identifying different classes of assumptions, and obtain decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption, and outputs either a proof of its generic hardness or an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of what is the minimal number of pairing computations needed to verify structure-preserving signatures. We build an

  4. Automated Fabrication Technologies for High Performance Polymer Composites

    NASA Technical Reports Server (NTRS)

Shuart, M. J.; Johnston, N. J.; Dexter, H. B.; Marchello, J. M.; Grenoble, R. W.

    1998-01-01

    New fabrication technologies are being exploited for building high-performance graphite-fiber-reinforced composite structures. Stitched fiber preforms and resin film infusion have been successfully demonstrated for large composite wing structures. Other automated processes being developed include automated placement of tacky, drapable epoxy towpreg; automated heated-head placement of consolidated ribbon/tape; and vacuum-assisted resin transfer molding. These methods have the potential to yield low-cost, high-performance structures by fabricating composite structures to net shape out-of-autoclave.

  5. Automated fiber placement: Evolution and current demonstrations

    NASA Technical Reports Server (NTRS)

    Grant, Carroll G.; Benson, Vernon M.

    1993-01-01

    The automated fiber placement process has been in development at Hercules since 1980. Fiber placement is being developed specifically for aircraft and other high performance structural applications. Several major milestones have been achieved during process development. These milestones are discussed in this paper. The automated fiber placement process is currently being demonstrated on the NASA ACT program. All demonstration projects to date have focused on fiber placement of transport aircraft fuselage structures. Hercules has worked closely with Boeing and Douglas on these demonstration projects. This paper gives a description of demonstration projects and results achieved.

  6. Monitoring of the physical status of Mars-500 subjects as a model of structuring an automated system in support of the training process in an exploration mission

    NASA Astrophysics Data System (ADS)

    Fomina, Elena; Savinkina, Alexandra; Kozlovskaya, Inesa; Lysova, Nataliya; Angeli, Tomas; Chernova, Maria; Uskov, Konstantin; Kukoba, Tatyana; Sonkin, Valentin; Ba, Norbert

    Physical training sessions aboard the ISS are performed under continuous control from Earth. Every week the instructors give recommendations on how to proceed with the training, considering the results of analysis of the cosmonauts' daily training records and the data of the monthly fitness testing. It is obvious that in very long exploration missions this system of monitoring will be inapplicable. For this reason we ventured to develop an automated system to control the physical training process, using the current ISS locomotion test parameters as the leading criteria. Simulation of an extended exploration mission in the MARS-500 experiment enabled a trial application of the automated system for assessing shifts in the cosmonauts' physical status in response to exercises of varying category and to periods of dismissal from training. Methods. Six subjects spent 520 days in the analog of an interplanetary vehicle at IBMP (Moscow). A variety of training regimens and facilities were used to maintain a high level of physical performance in the subjects. The resistance exercises involved expanders, a strength training device (MDS) and a vibrotraining device (Galileo). The cycling exercises were performed on a bicycle ergometer (VB-3) and on a treadmill with the motor engaged or disengaged. To study the effect of prolonged periods of dismissal from training on physical performance, the training flow was interrupted for a month once in the middle and again at the end of isolation. In addition to the in-flight locomotion test integrated into the automated training control system, the physical status of the subjects was assessed by analysis of the records of the monthly incremental testing on the bicycle ergometer and the MDS. Results. The recommended training regimens maintained high physical performance levels despite the limited motor activity in isolation. According to the locomotion testing, the subjects increased velocity significantly and reduced the physiological

  7. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  8. Space station automation II

    SciTech Connect

    Chiou, W.C.

    1986-01-01

    This book contains the proceedings of a conference on space station automation. Topics include the following: distributed artificial intelligence for space station energy management systems and computer architecture for telerobots in Earth orbit.

  9. Shielded cells transfer automation

    SciTech Connect

    Fisher, J J

    1984-01-01

    Nuclear waste from shielded cells is removed, packaged, and transferred manually in many nuclear facilities. Operators absorb radiation exposure during these operations, limited only through procedural controls. Technological advances in automation using robotics have allowed a production waste-removal operation to be automated to reduce radiation exposure. The robotic system bags waste containers out of a glove box and transfers them to a shielded container. Operators control the system from outside the work area via television cameras. 9 figures.

  10. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  11. Automated data analysis.

    NASA Astrophysics Data System (ADS)

    Teuber, D.

    Automated data analysis assists the astronomer in the decision-making processes applied for extracting astronomical information from data. It is the step between image processing and model interpretation. Tools developed in AI are applied (classification, expert systems). Programming languages and computers are chosen to fulfil the increasing requirements. Expert systems have begun to appear in astronomy. Data banks permit the astronomical community to share the large body of resulting information.

  12. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low-cost automated systems can provide air traffic and aviation weather advisory information at high-density uncontrolled airports. The system was designed to enhance the "see and be seen" rule of flight, and pilots who used the system preferred it over the self-announcement system presently used at uncontrolled airports.

  13. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
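
    A present-day Python analogue of that nawk-style continuous scan might look like the following: tail a log stream and flag suspicious patterns. The log path and patterns are illustrative assumptions, not details of ASNS.

        import re
        import time

        SUSPICIOUS = re.compile(r"(failed login|invalid access code)", re.I)

        with open("/var/log/pbx.log") as log:   # hypothetical switch log
            log.seek(0, 2)                      # start at end of file, like tail -f
            while True:                         # runs continuously, as ASNS did
                line = log.readline()
                if not line:
                    time.sleep(1.0)             # wait for new log entries
                    continue
                if SUSPICIOUS.search(line):
                    print("ALERT:", line.strip())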

  14. Automated imagery orthorectification pilot

    NASA Astrophysics Data System (ADS)

    Slonecker, E. Terrence; Johnson, Brad; McMahon, Joe

    2009-10-01

    Automated orthorectification of raw image products is now possible based on the comprehensive metadata collected by Global Positioning Systems and Inertial Measurement Unit technology aboard aircraft and satellite digital imaging systems, and based on emerging pattern-matching and automated image-to-image and control point selection capabilities in many advanced image processing systems. Automated orthorectification of standard aerial photography is also possible if a camera calibration report and sufficient metadata are available. Orthorectification of historical imagery, for which only limited metadata was available, was also attempted and found to require some user input, creating a semi-automated process that still has significant potential to reduce processing time and expense for the conversion of archival historical imagery into geospatially enabled, digital formats, facilitating preservation and utilization of a vast archive of historical imagery. Over 90 percent of the frames of historical aerial photos used in this experiment were successfully orthorectified to the accuracy of the USGS 100K base map series utilized for the geospatial reference of the archive. The accuracy standard for the 100K series maps is approximately 167 feet (51 meters). The main problems associated with orthorectification failure were cloud cover, shadow and historical landscape change which confused automated image-to-image matching processes. Further research is recommended to optimize automated orthorectification methods and enable broad operational use, especially as related to historical imagery archives.
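
    The automated image-to-image control-point selection step can be sketched with standard feature matching. The fragment below uses OpenCV SIFT features and a RANSAC homography, which is a planar simplification of rigorous orthorectification (a full solution would use the sensor model and a terrain model); the file names are illustrative.

        import cv2
        import numpy as np

        ref = cv2.imread("basemap.tif", cv2.IMREAD_GRAYSCALE)
        src = cv2.imread("historical_frame.tif", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        k1, d1 = sift.detectAndCompute(ref, None)
        k2, d2 = sift.detectAndCompute(src, None)

        # Automated control-point selection: ratio-test-filtered feature matches.
        matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d2, d1, k=2)
        good = [m for m, n in matches if m.distance < 0.7 * n.distance]

        src_pts = np.float32([k2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst_pts = np.float32([k1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC, 5.0)
        warped = cv2.warpPerspective(src, H, ref.shape[::-1])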

  15. Automated Groundwater Screening

    SciTech Connect

Taylor, Glenn A.; Collard, Leonard B.

    2005-10-31

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.
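
    The daughter extension can be read as folding branching-weighted daughter contributions into the parent's screening factor. A toy sketch of that idea follows; the names and numeric values are invented for illustration and are not NCRP data.

        # Illustrative per-nuclide screening factors and decay relationships.
        SCREENING_FACTORS = {"Ge-68": 1.2e-3, "Ga-68": 4.0e-4}
        DAUGHTERS = {"Ge-68": [("Ga-68", 1.0)]}    # (daughter, branching fraction)

        def effective_screening_factor(nuclide):
            # Parent factor plus branching-weighted factors of significant daughters.
            factor = SCREENING_FACTORS[nuclide]
            for daughter, branching in DAUGHTERS.get(nuclide, []):
                factor += branching * SCREENING_FACTORS[daughter]
            return factor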

  16. Automation of Meudon Synoptic Maps

    NASA Astrophysics Data System (ADS)

    Aboudarham, J.; Scholl, I.; Fuller, N.; Fouesneau, M.; Galametz, M.; Gonon, F.; Maire, A.; Leroy, Y.

    2007-05-01

    Thanks to the automatic solar-feature detection developed in the framework of the European EGSO (European Grid of Solar Observations) project, an important part of the automation of Meudon Synoptic Maps has been achieved. Nevertheless, the tracking of these solar structures over time still has to be done to synthesize their evolution during a Carrington rotation. A new approach to tracking filaments, based on image segmentation and the intersection of regions of interest, gives successful results. This is a major step toward a fully automatic building of the Meudon Synoptic Maps of Solar Activity.
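
    Tracking by intersection of regions of interest reduces to matching each filament in the current image to the previous-image region it overlaps most. A minimal sketch, assuming regions are given as sets of (row, col) pixel coordinates:

        def match_regions(prev_regions, curr_regions, min_overlap=0.2):
            """prev_regions, curr_regions: dicts mapping region id -> pixel set."""
            links = {}
            for cid, cur in curr_regions.items():
                best, best_score = None, 0.0
                for pid, prev in prev_regions.items():
                    # Overlap relative to the smaller of the two regions.
                    score = len(cur & prev) / min(len(cur), len(prev))
                    if score > best_score:
                        best, best_score = pid, score
                if best_score >= min_overlap:
                    links[cid] = best   # same filament, tracked to the next image
            return links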

  17. An Automated Control System for Machinery Parts Machining

    NASA Astrophysics Data System (ADS)

    Petreshin, D. I.; Handozhko, A. V.; Fedonin, O. N.

    2016-04-01

    The article deals with the problem of creating an automated system for controlling surface-layer quality characteristics of machinery parts during machining. The structure of the automated system, its operating algorithm, its mathematical support, and operating results are presented. The paper demonstrates the need for a self-learning mode in technological systems to provide set values of surface-layer quality characteristics.

  18. Desperately Seeking Authority Control: Automated Systems Are Not Providing It.

    ERIC Educational Resources Information Center

    Johnston, Sarah Hager

    1990-01-01

    Reports on a survey which assessed automated authority control capabilities of 18 vendors' automated library systems, software, or services. Graphs rank vendors according to overall score, authority record source, format/storage of authority records, database dynamics, matching/linking authority and bibliographic records, syndetic structure,…

  19. Approaches to automated protein crystal harvesting

    PubMed Central

    Deller, Marc C.; Rupp, Bernhard

    2014-01-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  20. Approaches to automated protein crystal harvesting.

    PubMed

    Deller, Marc C; Rupp, Bernhard

    2014-02-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  1. Impact of Automation on Technical Services.

    ERIC Educational Resources Information Center

    Rooks, Dana C.; Thompson, Linda L.

    1988-01-01

    Discusses the impact of automation on library technical services, and the need for library managers to be aware of the issues involved and to plan for future developments. The discussion focuses on the areas of job related concerns of technical staff, organizational structures, recruitment and training, and ergonomic considerations. (CLB)

  2. Automated inspection of bread and loaves

    NASA Astrophysics Data System (ADS)

    Batchelor, Bruce G.

    1993-08-01

    The prospects for building practical automated inspection machines, capable of detecting the following faults in ordinary, everyday loaves, are reviewed: (1) foreign bodies, using X-rays; (2) texture changes, using glancing illumination, mathematical morphology and neural-net learning techniques; and (3) shape deformations, using structured lighting and simple geometry.
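
    For the texture-change case, mathematical morphology offers a compact starting point: a top-hat transform highlights small bright anomalies against the normal crust texture. A rough sketch with OpenCV; the kernel size and threshold are illustrative, not values from the review.

        import cv2

        img = cv2.imread("loaf.png", cv2.IMREAD_GRAYSCALE)
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))

        # Top-hat: image minus its morphological opening, keeping bright
        # features smaller than the structuring element.
        tophat = cv2.morphologyEx(img, cv2.MORPH_TOPHAT, kernel)
        defect_mask = tophat > 40   # candidate texture anomalies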

  3. Improving Acceptance of Automated Counseling Procedures.

    ERIC Educational Resources Information Center

    Johnson, James H.; And Others

    This paper discusses factors that may influence the acceptance of automated counseling procedures by the military. A consensual model of the change process is presented which structures organizational readiness, the change strategy, and acceptance as integrated variables to be considered in a successful installation. A basic introduction to the…

  4. Automated telescope scheduling

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    With the ever-increasing level of automation of astronomical telescopes, the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on those systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for the Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. The Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling ground-based telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  5. Automated telescope scheduling

    NASA Astrophysics Data System (ADS)

    Johnston, Mark D.

    1988-08-01

    With the ever-increasing level of automation of astronomical telescopes, the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on those systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for the Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. The Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling ground-based telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  6. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
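
    The calibration-data pipeline that ACAL automates can be sketched with standard tools: detect a checkerboard target's fiducial marks in a set of images, pair them with their known 3D locations, and fit the camera model. The fragment below uses OpenCV rather than ACAL itself; the file pattern and board size are assumptions.

        import glob

        import cv2
        import numpy as np

        pattern = (9, 6)   # inner-corner grid of the checkerboard target
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

        obj_points, img_points = [], []
        for fname in glob.glob("calib_*.png"):          # calibration target images
            gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_points.append(objp)                 # known 3D fiducial locations
                img_points.append(corners)              # measured 2D image locations

        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None)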

  7. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions among the power subsystem, the central spacecraft computer, and ground flight-support personnel are partitioned. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, a need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology readiness date for the space station.

  8. Automated Factor Slice Sampling.

    PubMed

    Tibbits, Matthew M; Groendyke, Chris; Haran, Murali; Liechty, John C

    2014-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time-consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the "factor slice sampler", a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002
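
    For reference, the univariate slice sampler that the factor slice sampler generalizes can be written compactly. The following is a minimal sketch of slice sampling with stepping-out and shrinkage (Neal's formulation), not the authors' tuned implementation.

        import numpy as np

        def slice_sample(logpdf, x0, w=1.0, n=1000, rng=None):
            """Univariate slice sampler with stepping-out and shrinkage."""
            rng = rng or np.random.default_rng()
            xs, x = [], x0
            for _ in range(n):
                logy = logpdf(x) + np.log(rng.uniform())  # slice level under the curve
                L = x - w * rng.uniform()                 # randomly placed interval
                R = L + w
                while logpdf(L) > logy:                   # step out until outside slice
                    L -= w
                while logpdf(R) > logy:
                    R += w
                while True:                               # sample, shrinking on rejects
                    x1 = rng.uniform(L, R)
                    if logpdf(x1) > logy:
                        x = x1
                        break
                    if x1 < x:
                        L = x1
                    else:
                        R = x1
                xs.append(x)
            return np.array(xs)

        # e.g. draws from a standard normal:
        draws = slice_sample(lambda t: -0.5 * t * t, x0=0.0)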

  9. Automated Factor Slice Sampling

    PubMed Central

    Tibbits, Matthew M.; Groendyke, Chris; Haran, Murali; Liechty, John C.

    2013-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time-consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the “factor slice sampler”, a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002

  10. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program database with methods for browsing the stored designs; a system for graphical specification of designs, including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.