Science.gov

Sample records for pcgc automated structure

  1. Revised users manual, Pulverized Coal Gasification or Combustion: 2-dimensional (87-PCGC-2): Final report, Volume 2. [87-PCGC-2

    SciTech Connect

    Smith, P.J.; Smoot, L.D.; Brewster, B.S.

    1987-12-01

    A two-dimensional, steady-state model for describing a variety of reactive and non-reactive flows, including pulverized coal combustion and gasification, is presented. Recent code revisions and additions are described. The model, referred to as 87-PCGC-2, is applicable to cylindrical axi-symmetric systems. Turbulence is accounted for in both the fluid mechanics equations and the combustion scheme. Radiation from gases, walls, and particles is taken into account using either a flux method or discrete ordinates method. The particle phase is modeled in a Lagrangian framework, such that mean paths of particle groups are followed. Several multi-step coal devolatilization schemes are included along with a heterogeneous reaction scheme that allows for both diffusion and chemical reaction. Major gas-phase reactions are modeled assuming local instantaneous equilibrium, and thus the reaction rates are limited by the turbulent mixing rate. A NOx finite-rate chemistry submodel is included which integrates chemical kinetics and the statistics of the turbulence. The gas phase is described by elliptic partial differential equations that are solved by an iterative line-by-line technique. Under-relaxation is used to achieve numerical stability. The generalized nature of the model allows for calculation of isothermal fluid mechanics, gaseous combustion, droplet combustion, particulate combustion and various mixtures of the above, including combustion of coal-water and coal-oil slurries. Both combustion and gasification environments are permissible. User information and theory are presented, along with sample problems. 106 refs.
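
    The iterative line-by-line solution with under-relaxation mentioned above can be illustrated with a short sketch. The following Python fragment is a generic numerical example, not code from 87-PCGC-2; the grid, boundary handling and relaxation factor alpha are illustrative. It sweeps a 2-D Laplace problem row by row, solving each row implicitly with the Thomas algorithm and under-relaxing the update:

        import numpy as np

        def tdma(a, b, c, d):
            # Thomas algorithm for a tridiagonal system
            # a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i]
            n = len(d)
            cp, dp = np.zeros(n), np.zeros(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):
                m = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / m
                dp[i] = (d[i] - a[i] * dp[i - 1]) / m
            x = np.zeros(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        def solve_laplace(phi, alpha=0.5, n_iter=200):
            # Line-by-line sweep: each interior row is solved implicitly
            # while the neighbouring rows are lagged; alpha under-relaxes
            # the update for numerical stability, as in the abstract.
            ny, nx = phi.shape
            for _ in range(n_iter):
                for j in range(1, ny - 1):
                    n = nx - 2
                    a = np.full(n, -1.0); b = np.full(n, 4.0); c = np.full(n, -1.0)
                    a[0] = c[-1] = 0.0
                    d = phi[j - 1, 1:-1] + phi[j + 1, 1:-1]
                    d[0] += phi[j, 0]; d[-1] += phi[j, -1]
                    phi[j, 1:-1] += alpha * (tdma(a, b, c, d) - phi[j, 1:-1])
            return phi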

  2. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  3. Automated Characterization Of Vibrations Of A Structure

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Yam, Yeung; Mettler, Edward; Hadaegh, Fred Y.; Milman, Mark H.; Scheid, Robert E.

    1992-01-01

    Automated method of characterizing dynamical properties of large flexible structure yields estimates of modal parameters used by robust control system to stabilize structure and minimize undesired motions. Based on extraction of desired modal and control-design data from responses of structure to known vibrational excitations. Applicable to terrestrial structures where vibrations are important - aircraft, buildings, bridges, cranes, and drill strings.

  4. Automated MAD and MIR structure solution.

    PubMed

    Terwilliger, T C; Berendzen, J

    1999-04-01

    Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These criteria have allowed the crystal structure-solution process to be converted into an optimization problem and thereby automated. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic-scale structure determinations. PMID:10089316

  5. Automated MAD and MIR structure solution

    SciTech Connect

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-04-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These criteria have allowed the crystal structure-solution process to be converted into an optimization problem and thereby automated. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic-scale structure determinations.
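
    The scoring idea in the two records above lends itself to a compact sketch: trial heavy-atom partial structures are ranked by a weighted combination of quality criteria, and the search keeps the best-scoring trial. The criteria, weights and perturbation routine below are placeholders for illustration, not SOLVE's actual scoring functions:

        def combined_score(solution, criteria, weights):
            # Weighted sum of criterion scores; SOLVE's real criteria
            # (e.g. agreement with the data, map quality) are more
            # elaborate than this stand-in.
            return sum(w * c(solution) for c, w in zip(criteria, weights))

        def optimize(seed, perturb, criteria, weights, n_trials=500):
            # Greedy stochastic search over trial partial structures;
            # perturb() adds, removes or shifts a heavy-atom site.
            best, best_score = seed, combined_score(seed, criteria, weights)
            for _ in range(n_trials):
                trial = perturb(best)
                score = combined_score(trial, criteria, weights)
                if score > best_score:
                    best, best_score = trial, score
            return best, best_score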

  6. Automated structure solution with the PHENIX suite

    SciTech Connect

    Terwilliger, Thomas C; Zwart, Peter H; Afonine, Pavel V; Grosse-Kunstleve, Ralf W

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate-resolution and good-quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  7. Automated Structure Solution with the PHENIX Suite

    SciTech Connect

    Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Thomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  8. Automating Structural Analysis of Spacecraft Vehicles

    NASA Technical Reports Server (NTRS)

    Hrinda, Glenn A.

    2004-01-01

    A major effort within NASA's vehicle analysis discipline has been to automate structural analysis and sizing optimization during conceptual design studies of advanced spacecraft. Traditional spacecraft structural sizing has involved detailed finite element analysis (FEA) requiring large degree-of-freedom (DOF) finite element models (FEM). Creation and analysis of these models can be time-consuming and limit model size during conceptual designs. The goal is to find an optimal design that meets the mission requirements but produces the lightest structure. A structural sizing tool called HyperSizer has been successfully used in the conceptual design phase of a reusable launch vehicle and planetary exploration spacecraft. The program couples with FEA to enable system level performance assessments and weight predictions including design optimization of material selections and sizing of spacecraft members. The software's analysis capabilities are based on established aerospace structural methods for strength, stability and stiffness that produce adequately sized members and reliable structural weight estimates. The software also helps to identify potential structural deficiencies early in the conceptual design so changes can be made without wasted time. HyperSizer's automated analysis and sizing optimization increases productivity and brings standardization to a systems study. These benefits will be illustrated in examining two different types of conceptual spacecraft designed using the software. A hypersonic air-breathing, single-stage-to-orbit (SSTO), reusable launch vehicle (RLV) will be highlighted as well as an aeroshell for a planetary exploration vehicle used for aerocapture at Mars. By showing the two different types of vehicles, the software's flexibility will be demonstrated with an emphasis on reducing aeroshell structural weight. Member sizes, concepts and material selections will be discussed as well as analysis methods used in optimizing the structure.

  9. The automation of natural product structure elucidation.

    PubMed

    Steinbeck, C

    2001-05-01

    The last two or three years have seen exciting developments in the field of computer-assisted structure elucidation (CASE) with a number of programs becoming commercially or freely available. This was the conditio sine qua non for CASE to be widely applied in the daily work of bench chemists and spectroscopists. A number of promising applications have been published in the area of structure generators, deterministic and stochastic CASE tools and property predictions, including the automatic distinction between natural products and artificial compounds, as well as the determination of 3-D structure from a connection table based on IR spectroscopy. Advancements in coupling techniques between chromatographic and spectroscopic methods demonstrate progress towards a fully automated structure elucidation or identification process starting at the earliest steps of obtaining crude extracts.

  10. Predicting toxicity through a computer automated structure evaluation program

    SciTech Connect

    Klopman, G.

    1985-09-01

    The computer automated structure evaluation program (CASE) has been extended to perform automatic quantitative structure-activity relationships (QSAR). Applications include the carcinogenicity of polycyclic aromatic hydrocarbons and of N-nitrosamines. Agreement with experiment is satisfactory.

  11. Automated detection of glaucoma using structural and non structural features.

    PubMed

    Salam, Anum A; Khalil, Tehmina; Akram, M Usman; Jameel, Amina; Basit, Imran

    2016-01-01

    Glaucoma is a chronic disease often called the "silent thief of sight" as it has no symptoms and, if not detected at an early stage, may cause permanent blindness. Glaucoma progression is preceded by structural changes in the retina which help ophthalmologists detect glaucoma at an early stage and stop its progression. Fundoscopy is one of the biomedical imaging techniques used to analyze the internal structure of the retina. Our proposed technique provides a novel algorithm to detect glaucoma from digital fundus images using a hybrid feature set. This paper proposes a novel combination of structural (cup-to-disc ratio) and non-structural (texture and intensity) features to improve the accuracy of automated diagnosis of glaucoma. The proposed method introduces a suspect class in automated diagnosis in case of any conflict in decision from structural and non-structural features. The evaluation of the proposed algorithm is performed using a local database containing fundus images from 100 patients. This system is designed to refer glaucoma cases from rural areas to specialists, and the motivation behind introducing the suspect class is to ensure high sensitivity of the proposed system. The average sensitivity and specificity of the proposed system are 100% and 87%, respectively. PMID:27652092
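
    The suspect class amounts to a simple arbitration rule between the two feature streams. A minimal sketch, where the class labels and classifier outputs are hypothetical:

        def diagnose(structural_pred, nonstructural_pred):
            # Agreement between the structural (cup-to-disc ratio) and
            # non-structural (texture/intensity) classifiers is accepted;
            # disagreement is flagged for specialist referral, which keeps
            # sensitivity high at the cost of extra referrals.
            if structural_pred == nonstructural_pred:
                return structural_pred      # 'glaucoma' or 'healthy'
            return 'suspect'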

  12. Pricing Structures for Automated Library Consortia.

    ERIC Educational Resources Information Center

    Machovec, George S.

    1993-01-01

    Discusses the development of successful pricing algorithms for cooperative library automation projects. Highlights include desirable characteristics of pricing measures, including simplicity and the ability to allow for system growth; problems with transaction-based systems; and a review of the pricing strategies of seven library consortia.…

  13. Automating the determination of 3D protein structure

    SciTech Connect

    Rayl, K.D.

    1993-12-31

    The creation of an automated method for determining 3D protein structure would be invaluable to the field of biology and presents an interesting challenge to computer science. Unfortunately, given the current level of protein knowledge, a completely automated solution method is not yet feasible; therefore, our group has decided to integrate existing databases and theories to create a software system that assists X-ray crystallographers in specifying a particular protein structure. By breaking the problem of determining overall protein structure into small subproblems, we hope to come closer to solving a novel structure by solving each component. By generating necessary information for structure determination, this method provides the first step toward designing a program to determine protein conformation automatically.

  14. Automated detection and location of structural degradation

    SciTech Connect

    Damiano, B.; Blakeman, E.D.; Phillips, L.D.

    1997-03-01

    The investigation of a diagnostic method for detecting and locating the source of structural degradation in mechanical systems is described in this paper. The diagnostic method uses a mathematical model of the mechanical system to define relationships between system parameters, such as spring rates and damping rates, and measurable spectral features, such as natural frequencies and mode shapes. These model-defined relationships are incorporated into a neural network, which is used to relate measured spectral features to system parameters. The diagnosis of the system's condition is performed by presenting the neural network with measured spectral features and comparing the system parameters estimated by the neural network to previously estimated values. Changes in the estimated system parameters indicate the location and severity of degradation in the mechanical system. The investigation applied the method by using computer-simulated data and data collected from a bench-top mechanical system. The effects of neural network training set size and composition on the accuracy of the model parameter estimates were investigated by using computer-simulated data. The results show that the diagnostic method can be applied to successfully locate and estimate the magnitude of structural changes in a mechanical system. The average error in the estimated spring rate values of the bench-top mechanical system was less than 10%. This degree of accuracy is sufficient to permit the use of this method for detecting and locating structural degradation in mechanical systems.
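
    A minimal sketch of this approach, assuming a toy forward model in place of the paper's mechanical-system model and scikit-learn's MLPRegressor in place of the authors' network: spectra are simulated for known spring rates, the network learns the inverse map, and degradation shows up as a shift in the estimated parameters.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def forward_model(spring_rates, rng):
            # Hypothetical stand-in physics: "natural frequencies" derived
            # from spring rates, plus measurement noise.
            freqs = np.sqrt(spring_rates) / (2 * np.pi)
            return freqs + rng.normal(0.0, 0.01, freqs.shape)

        rng = np.random.default_rng(0)
        K = rng.uniform(0.5, 2.0, size=(2000, 3))          # training spring rates
        X = np.array([forward_model(k, rng) for k in K])   # simulated spectra
        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(X, K)

        healthy = net.predict(forward_model(np.array([1.0, 1.0, 1.0]), rng)[None, :])
        damaged = net.predict(forward_model(np.array([1.0, 0.7, 1.0]), rng)[None, :])
        print("estimated spring-rate change:", damaged - healthy)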

  15. Automated Localization of Multiple Pelvic Bone Structures on MRI.

    PubMed

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2016-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are at present identified manually on MRI to locate reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures automatically. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this paper, we present a model that combines support vector machines and nonlinear regression capturing global and local information to automatically identify the bounding boxes of bone structures on MRI. The model identifies the location of the pelvic bone structures by establishing the association between their relative locations and using local information such as texture features. Results show that the proposed method is able to locate the bone structures of interest accurately (dice similarity index >0.75) in 87-91% of the images. This research aims to enable accurate, consistent, and fully automated localization of bone structures on MRI to facilitate and improve the diagnosis of health conditions such as female POP.

  16. Automated RNA 3D Structure Prediction with RNAComposer.

    PubMed

    Biesiada, Marcin; Purzycka, Katarzyna J; Szachniuk, Marta; Blazewicz, Jacek; Adamiak, Ryszard W

    2016-01-01

    RNAs adopt specific structures to perform their activities and these are critical to virtually all RNA-mediated processes. Because of difficulties in experimentally assessing structures of large RNAs using NMR, X-ray crystallography, or cryo-microscopy, there is currently great demand for new high-resolution 3D structure prediction methods. Recently we reported on RNAComposer, a knowledge-based method for the fully automated RNA 3D structure prediction from a user-defined secondary structure. The RNAComposer method is especially suited for structural biology users. Since our initial report in 2012, both servers, freely available at http://rnacomposer.ibch.poznan.pl and http://rnacomposer.cs.put.poznan.pl, have been visited often. This chapter therefore provides guidance for using RNAComposer and discusses points that should be considered when predicting 3D RNA structure. An application example presents the current scope and limitations of RNAComposer. PMID:27665601

  17. Automated assembly of a tetrahedral truss structure using machine vision

    NASA Technical Reports Server (NTRS)

    Doggett, William R.

    1992-01-01

    The Automated Structures Assembly Laboratory is a unique facility at NASA Langley Research Center used to investigate the robotic assembly of truss structures. Two special-purpose end-effectors have been used to assemble 102 truss members and 12 panels into an 8-meter diameter structure. One end-effector is dedicated to truss member insertion, while a second end-effector is used to install panels. Until recently, the robot motions required to construct the structure were developed iteratively using the facility hardware. Recent work at Langley has resulted in a compact machine vision system capable of providing position information relative to targets on the structure. Use of the vision system to guide the robot from an approach point 10 to 18 inches from the structure, offsetting model inaccuracies, permits robot motion based on calculated points as a first step toward use of preplanned paths from an automated path planner. This paper presents recent work at Langley highlighting the application of the machine vision system during truss member insertion.

  18. Automated Low-Cost Photogrammetry for Flexible Structure Monitoring

    NASA Astrophysics Data System (ADS)

    Wang, C. H.; Mills, J. P.; Miller, P. E.

    2012-07-01

    Structural monitoring requires instruments which can provide high precision and accuracy, reliable measurements at good temporal resolution and rapid processing speeds. Long-term campaigns and flexible structures are regarded as two of the most challenging subjects in monitoring engineering structures. Long-term monitoring in civil engineering is generally considered to be labour-intensive and financially expensive and it can take significant effort to arrange the necessary human resources, transportation and equipment maintenance. When dealing with flexible structure monitoring, it is of paramount importance that any monitoring equipment used is able to carry out rapid sampling. Low cost, automated, photogrammetric techniques therefore have the potential to become routinely viable for monitoring non-rigid structures. This research aims to provide a photogrammetric solution for long-term flexible structural monitoring purposes. The automated approach was achieved using low-cost imaging devices (mobile phones) to replace traditional image acquisition stations and substantially reduce the equipment costs. A self-programmed software package was developed to deal with the hardware-software integration and system operation. In order to evaluate the performance of this low-cost monitoring system, a shaking table experiment was undertaken. Different network configurations and target sizes were used to determine the best configuration. A large quantity of image data was captured by four DSLR cameras and four mobile phone cameras respectively. These image data were processed using photogrammetric techniques to calculate the final results for the system evaluation.

  19. Automated analysis of fundamental features of brain structures.

    PubMed

    Lancaster, Jack L; McKay, D Reese; Cykowski, Matthew D; Martinez, Michael J; Tan, Xi; Valaparla, Sunil; Zhang, Yi; Fox, Peter T

    2011-12-01

    Automated image analysis of the brain should include measures of fundamental structural features such as size and shape. We used principal axes (P-A) measurements to measure overall size and shape of brain structures segmented from MR brain images. The rationale was that quantitative volumetric studies of brain structures would benefit from shape standardization as had been shown for whole brain studies. P-A analysis software was extended to include controls for variability in position and orientation to support individual structure spatial normalization (ISSN). The rationale was that ISSN would provide a bias-free means to remove elementary sources of a structure's spatial variability in preparation for more detailed analyses. We studied nine brain structures (whole brain, cerebral hemispheres, cerebellum, brainstem, caudate, putamen, hippocampus, inferior frontal gyrus, and precuneus) from the 40-brain LPBA40 atlas. This paper provides the first report of anatomical positions and principal axes orientations within a standard reference frame, in addition to "shape/size related" principal axes measures, for the nine brain structures from the LPBA40 atlas. Analysis showed that overall size (mean volume) for internal brain structures was preserved using shape standardization while variance was reduced by more than 50%. Shape standardization provides increased statistical power for between-group volumetric studies of brain structures compared to volumetric studies that control only for whole brain size. To test ISSN's ability to control for spatial variability of brain structures we evaluated the overlap of 40 regions of interest (ROIs) in a standard reference frame for the nine different brain structures before and after processing. Standardizations of orientation or shape were ineffective when not combined with position standardization. The greatest reduction in spatial variability was seen for combined standardizations of position, orientation and shape.
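
    The P-A measures reduce to an eigendecomposition of the voxel-coordinate covariance of each segmented structure: the centroid gives position, the eigenvectors give orientation, and eigenvalue-derived extents give size/shape. A minimal sketch, where the voxel spacing and the extent convention are illustrative:

        import numpy as np

        def principal_axes(mask, spacing=(1.0, 1.0, 1.0)):
            # mask: binary 3-D segmentation of one brain structure.
            coords = np.argwhere(mask) * np.asarray(spacing)  # voxels -> mm
            centroid = coords.mean(axis=0)                    # position
            cov = np.cov((coords - centroid).T)
            evals, evecs = np.linalg.eigh(cov)                # ascending order
            extents = 2.0 * np.sqrt(evals)                    # size/shape measures
            return centroid, evecs, extents                   # evecs: orientation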

  20. A telerobotic system for automated assembly of large space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Wise, Marion A.

    1990-01-01

    Future space missions such as polar platforms and antennas are anticipated to require large truss structures as their primary support system. During the past several years considerable research has been conducted to develop hardware and construction techniques suitable for astronaut assembly of truss structures in space. A research program has recently been initiated to develop the technology and to demonstrate the potential for automated in-space assembly of large erectable structures. The initial effort will be focused on automated assembly of a tetrahedral truss composed of 2-meter members. The facility is designed as a ground based system to permit evaluation of assembly concepts and was not designed for space qualification. The system is intended to be used as a tool from which more sophisticated procedures and operations can be developed. The facility description includes a truss structure, motion bases and a robot arm equipped with an end effector. Other considerations and requirements of the structural assembly describe computer control systems to monitor and control the operations of the assembly facility.

  21. A telerobotic system for automated assembly of large space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Wise, Marion A.

    1989-01-01

    Future space missions such as polar platforms and antennas are anticipated to require large truss structures as their primary support system. During the past several years considerable research has been conducted to develop hardware and construction techniques suitable for astronaut assembly of truss structures in space. A research program has recently been initiated to develop the technology and to demonstrate the potential for automated in-space assembly of large erectable structures. The initial effort will be focused on automated assembly of a tetrahedral truss composed of 2-meter members. The facility is designed as a ground based system to permit evaluation of assembly concepts and was not designed for space qualification. The system is intended to be used as a tool from which more sophisticated procedures and operations can be developed. The facility description includes a truss structure, motion bases and a robot arm equipped with an end effector. Other considerations and requirements of the structural assembly describe computer control systems to monitor and control the operations of the assembly facility.

  22. Verification Test of Automated Robotic Assembly of Space Truss Structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1995-01-01

    A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level and continued development at an enhanced level is warranted.

  23. pmx: Automated protein structure and topology generation for alchemical perturbations.

    PubMed

    Gapsys, Vytautas; Michielssens, Servaas; Seeliger, Daniel; de Groot, Bert L

    2015-02-15

    Computational protein design requires methods to accurately estimate free energy changes in protein stability or binding upon an amino acid mutation. From the different approaches available, molecular dynamics-based alchemical free energy calculations are unique in their accuracy and solid theoretical basis. The challenge in using these methods lies in the need to generate hybrid structures and topologies representing two physical states of a system. A custom-made hybrid topology may prove useful for a particular mutation of interest; however, a high-throughput mutation analysis calls for a more general approach. In this work, we present an automated procedure to generate hybrid structures and topologies for the amino acid mutations in all commonly used force fields. The described software is compatible with the Gromacs simulation package. The mutation libraries are readily supported for five force fields, namely Amber99SB, Amber99SB*-ILDN, OPLS-AA/L, Charmm22*, and Charmm36. PMID:25487359

  24. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    PubMed Central

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of the supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Considering the non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453

  25. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    PubMed

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of the supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Considering the non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
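
    As a sketch of the non-structured GMM variant (the GHMRF adds spatial coupling between neighbouring voxels, which this omits, as does the tissue-probability-map postprocessing), voxelwise unsupervised classification of co-registered multiparametric MR volumes might look like:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def gmm_segment(volumes, n_classes=5):
            # volumes: co-registered 3-D arrays (e.g. T1, T1c, T2, FLAIR);
            # returns an unsupervised label volume.
            X = np.stack([v.ravel() for v in volumes], axis=1)  # voxels x modalities
            gmm = GaussianMixture(n_components=n_classes,
                                  covariance_type='full',
                                  random_state=0).fit(X)
            return gmm.predict(X).reshape(volumes[0].shape)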

  26. Automated optimum design of wing structures. Deterministic and probabilistic approaches

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1982-01-01

    The automated optimum design of airplane wing structures subjected to multiple behavior constraints is described. The structural mass of the wing is considered the objective function. The maximum stress, wing tip deflection, root angle of attack, and flutter velocity during the pull-up maneuver (static load), the natural frequencies of the wing structure, and the stresses induced in the wing structure due to landing and gust loads are suitably constrained. Both deterministic and probabilistic approaches are used for finding the stresses induced in the airplane wing structure due to landing and gust loads. A wing design is represented by a uniform beam with a cross section in the form of a hollow symmetric double wedge. The airfoil thickness and chord length are the design variables, and a graphical procedure is used to find the optimum solutions. A supersonic wing design is represented by finite elements. The thicknesses of the skin and the web and the cross sectional areas of the flanges are the design variables, and nonlinear programming techniques are used to find the optimum solution.

  27. Semi-Automated Discovery of Application Session Structure

    SciTech Connect

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
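
    The Poisson flavour of the analysis can be caricatured as follows: if a host's connections arrived independently at rate lam, small inter-arrival gaps would be rare, so improbably small gaps are evidence that connections belong to one session. This sketch only groups connections; the paper goes further and abstracts the grouped structure into regular expressions. The event format and threshold rule are illustrative stand-ins for the paper's statistical tests:

        import math
        from collections import defaultdict

        def sessionize(events, p=0.001):
            # events: (host, timestamp, service) tuples. Under an
            # exponential gap model, P(gap <= cutoff) = p defines the
            # grouping threshold.
            by_host = defaultdict(list)
            for host, t, service in events:
                by_host[host].append((t, service))
            sessions = []
            for host, conns in by_host.items():
                conns.sort()
                times = [t for t, _ in conns]
                span = times[-1] - times[0]
                lam = len(times) / span if span > 0 else float('inf')
                cutoff = -math.log(1.0 - p) / lam
                current = [conns[0]]
                for prev, nxt in zip(conns, conns[1:]):
                    if nxt[0] - prev[0] <= cutoff:
                        current.append(nxt)
                    else:
                        sessions.append((host, current))
                        current = [nxt]
                sessions.append((host, current))
            return sessions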

  28. An automated method for the analysis of trabecular bone structure.

    PubMed

    Aaron, J E; Johnson, D R; Kanis, J A; Oakley, B A; O'Higgins, P; Paxton, S K

    1992-02-01

    Trabecular structure as well as bone mass is important in studies of bone disease and fracture. An automated method for the direct analysis of two-dimensional trabecular micro-anatomy and its application to human iliac crest bone biopsies is described. Compared with established methods which require expensive equipment and complex software, costs have been reduced and availability increased by using an image analyzer driven by a microcomputer. Routine histological sections are accepted and an editing function enables the removal of artifacts. An elastic window allows field expansion for large specimens. The program enables the rapid assessment of the bone volume and trabecular surface from the intact image, followed by image skeletonization and the deduction of the trabecular length, number, character, and spacing together with the number of trabecular junctions and discontinuities; the trabecular width is calculated indirectly. Images may be stored to disk or printed as permanent records for diagnostic or research purposes.
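
    The pipeline of the abstract (bone area fraction from the intact image, then skeletonization for length, junctions and indirect width) can be approximated with modern open-source tools. The measures below are crude stand-ins for the paper's definitions, and scikit-image replaces the original image analyzer:

        import numpy as np
        from scipy import ndimage
        from skimage.morphology import skeletonize

        def trabecular_measures(bone_mask, pixel_mm=0.01):
            # bone_mask: binary 2-D section, True where bone is present.
            bv_tv = bone_mask.mean()                       # bone area fraction
            skel = skeletonize(bone_mask)
            # junction pixels: skeleton pixels with 3 or more skeletal
            # neighbours in the 3x3 neighbourhood.
            neigh = ndimage.convolve(skel.astype(int), np.ones((3, 3)),
                                     mode='constant')
            junctions = int((((neigh - 1) >= 3) & skel).sum())
            length_mm = skel.sum() * pixel_mm              # trabecular length
            width_mm = bone_mask.sum() / skel.sum() * pixel_mm  # indirect width
            return dict(bv_tv=bv_tv, length_mm=length_mm,
                        junctions=junctions, width_mm=width_mm)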

  29. An automated approach to network features of protein structure ensembles.

    PubMed

    Bhattacharyya, Moitrayee; Bhat, Chanda R; Vishveshwara, Saraswathi

    2013-10-01

    Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from PDB and simulations establishes a need to introduce a standalone-efficient program that assembles network concepts/parameters under one hood in an automated manner. Herein, we discuss the development/application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulation/NMR studies or from multiple X-ray structures. The novelty in network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction of a flexible weighing scheme in terms of residue pairwise cross-correlation/interaction energy in PSN-Ensemble brings in dynamical/chemical knowledge into the network representation. Also, the results are mapped on a graphical display of the structure, allowing an easy access of network analysis to a general biological community. The potential of PSN-Ensemble toward examining structural ensemble is exemplified using MD trajectories of an ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single-static structures of active/inactive states of β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html.

  30. Automated segmentation of tissue structures in optical coherence tomography data.

    PubMed

    Gasca, Fernando; Ramrath, Lukas; Huettmann, Gereon; Schweikard, Achim

    2009-01-01

    Segmentation of optical coherence tomography (OCT) images provides useful information, especially in medical imaging applications. Because OCT images are subject to speckle noise, the identification of structures is complicated. Addressing this issue, two methods for the automated segmentation of arbitrary structures in OCT images are proposed. The methods perform a seeded region growing, applying a model-based analysis of OCT A-scans for the seed's acquisition. The segmentation therefore avoids any user-intervention dependency. The first region-growing algorithm uses an adaptive neighborhood homogeneity criterion based on a model of an OCT intensity course in tissue and a model of speckle noise corruption. It can be applied to an unfiltered OCT image. The second performs region growing on a filtered OCT image applying the local median as a measure for homogeneity in the region. Performance is compared through the quantitative evaluation of artificial data, showing the capabilities of both in terms of structures detected and leakage. The proposed methods were tested on real OCT data in different scenarios and showed promising results for their application in OCT imaging.
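
    The second method's homogeneity rule is easy to sketch: a 4-connected neighbour joins the region when its intensity is close to the local median. The window size and tolerance below are illustrative, and the model-based seed acquisition of the paper is replaced by a user-supplied seed:

        import numpy as np
        from collections import deque

        def region_grow_median(img, seed, tol=0.15, win=5):
            # img: 2-D (filtered) OCT image; seed: (row, col) inside the
            # target structure. Grows while |pixel - local median| <= tol.
            h, w = img.shape
            half = win // 2
            region = np.zeros((h, w), dtype=bool)
            region[seed] = True
            queue = deque([seed])
            while queue:
                y, x = queue.popleft()
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and not region[ny, nx]:
                        patch = img[max(0, ny - half):ny + half + 1,
                                    max(0, nx - half):nx + half + 1]
                        if abs(img[ny, nx] - np.median(patch)) <= tol:
                            region[ny, nx] = True
                            queue.append((ny, nx))
            return region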

  31. Development of a machine vision system for automated structural assembly

    NASA Technical Reports Server (NTRS)

    Sydow, P. Daniel; Cooper, Eric G.

    1992-01-01

    Research is being conducted at the LaRC to develop a telerobotic assembly system designed to construct large space truss structures. This research program was initiated within the past several years, and a ground-based test-bed was developed to evaluate and expand the state of the art. Test-bed operations currently use predetermined ('taught') points for truss structural assembly. Total dependence on the use of taught points for joint receptacle capture and strut installation is neither robust nor reliable enough for space operations. Therefore, a machine vision sensor guidance system is being developed to locate and guide the robot to a passive target mounted on the truss joint receptacle. The vision system hardware includes a miniature video camera, passive targets mounted on the joint receptacles, target illumination hardware, and an image processing system. Discrimination of the target from background clutter is accomplished through standard digital processing techniques. Once the target is identified, a pose estimation algorithm is invoked to determine the location, in three-dimensional space, of the target relative to the robot's end-effector. Preliminary test results of the vision system in the Automated Structural Assembly Laboratory with a range of lighting and background conditions indicate that it is fully capable of successfully identifying joint receptacle targets throughout the required operational range. Controlled optical bench test results indicate that the system can also provide the pose estimation accuracy to define the target position.

  32. Automated identification of elemental ions in macromolecular crystal structures

    SciTech Connect

    Echols, Nathaniel Morshed, Nader; Afonine, Pavel V.; McCoy, Airlie J.; Read, Randy J.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-04-01

    The solvent-picking procedure in phenix.refine has been extended and combined with Phaser anomalous substructure completion and analysis of coordination geometry to identify and place elemental ions. Many macromolecular model-building and refinement programs can automatically place solvent atoms in electron density at moderate-to-high resolution. This process frequently builds water molecules in place of elemental ions, the identification of which must be performed manually. The solvent-picking algorithms in phenix.refine have been extended to build common ions based on an analysis of the chemical environment as well as physical properties such as occupancy, B factor and anomalous scattering. The method is most effective for heavier elements such as calcium and zinc, for which a majority of sites can be placed with few false positives in a diverse test set of structures. At atomic resolution, it is observed that it can also be possible to identify tightly bound sodium and magnesium ions. A number of challenges that contribute to the difficulty of completely automating the process of structure completion are discussed.

  33. Automated web service composition supporting conditional branch structures

    NASA Astrophysics Data System (ADS)

    Wang, Pengwei; Ding, Zhijun; Jiang, Changjun; Zhou, Mengchu

    2014-01-01

    The creation of value-added services by automatic composition of existing ones is gaining significant momentum as the potential silver bullet in service-oriented architecture. However, service composition faces two aspects of difficulties. First, users' needs present such characteristics as diversity, uncertainty and personalisation; second, the existing services run in a real-world environment that is highly complex and dynamically changing. These difficulties may cause the emergence of nondeterministic choices in the process of service composition, which has gone beyond what the existing automated service composition techniques can handle. According to most of the existing methods, the process model of composite service includes sequence constructs only. This article presents a method to introduce conditional branch structures into the process model of composite service when needed, in order to satisfy users' diverse and personalised needs and adapt to the dynamic changes of the real-world environment. UML activity diagrams are used to represent dependencies in composite service. Two types of user preferences that have been ignored by previous work are considered in this article, and a simple programming-language-style expression is adopted to describe them. Two different algorithms are presented to deal with different situations. A real-life case is provided to illustrate the proposed concepts and methods.

  34. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  35. Automated alignment of serial thoracic scans using bone structure descriptors

    NASA Astrophysics Data System (ADS)

    Gavrielides, Marios A.; Petrick, Nicholas; Myers, Kyle J.

    2007-03-01

    In this manuscript we present an automated algorithm for the alignment of thoracic scans using descriptors of bone structures. Bone structures were utilized because they are expected to be less susceptible to sources of errors such as patient positioning and breath hold. The algorithm employed the positioning of ribs relative to the spinal cord along with a description of the scapula. The spinal cord centroid was detected by extracting local maxima of the distance transform followed by point tracing along consecutive slices. Ribs were segmented using adaptive thresholding followed by the watershed algorithm to detach ribs from the vertebra, and by imposing requirements of rib proximity to the lung border. The angles formed between the spinal cord centroid and segmented rib centroids were used to describe rib positioning. Additionally, the length of the scapula was extracted in each slice. A cost function incorporating the difference of features from rib positioning and scapula length between two slices was derived and used to match slices. The method was evaluated on a set of 12 pairs of full and partial CT scans acquired on the same day. Evaluation was based on whether the slices showing a nodule at its maximum diameter in each scan were matched. Full-to-partial and partial-to-full alignment were performed. Results showed that the proposed metric matched nodule slices within an average distance of 1.08 and 1.17 slices from the target for full-to-partial and partial-to-full alignment respectively. These preliminary results are encouraging for using this method as a first step in an overall process of temporally analyzing CT lung nodules.
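
    The matching metric combines the two feature streams per slice. A minimal sketch, assuming each slice is summarized as (rib angles relative to the spinal cord centroid, scapula length) with equal-length angle vectors, and with an illustrative weight:

        import numpy as np

        def slice_cost(fa, fb, w_scap=0.5):
            # fa, fb: (rib_angles, scapula_length) for two slices.
            (angles_a, scap_a), (angles_b, scap_b) = fa, fb
            angle_term = np.abs(np.subtract(angles_a, angles_b)).mean()
            return angle_term + w_scap * abs(scap_a - scap_b)

        def align(full_feats, partial_feats):
            # For each partial-scan slice, pick the full-scan slice of
            # minimum cost (full-to-partial alignment).
            return [min(range(len(full_feats)),
                        key=lambda i: slice_cost(full_feats[i], f))
                    for f in partial_feats]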

  36. Automated construction of lightweight, simple, field-erected structures

    NASA Technical Reports Server (NTRS)

    Leonard, R. S.

    1980-01-01

    The feasibility of automation of construction processes which could result in mobile construction robots is examined. The construction of a large photovoltaic power plant with a peak power output of 100 MW is demonstrated. The reasons to automate the construction process, a conventional construction scenario as the reference for evaluation, and a list of potential cost benefits using robots are presented. The technical feasibility of using robots to construct SPS ground stations is addressed.

  37. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    SciTech Connect

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  38. Automated frequency domain system identification of a large space structure

    NASA Technical Reports Server (NTRS)

    Yam, Y.; Bayard, D. S.; Hadaegh, F. Y.; Mettler, E.; Milman, M. H.

    1989-01-01

    This paper presents the development and experimental results of an automated on-orbit system identification method for large flexible spacecraft that yields estimated quantities to support on-line design and tuning of robust high performance control systems. The procedure consists of applying an input to the plant, obtaining an output, and then conducting nonparametric identification to yield the spectral estimate of the system transfer function. A parametric model is determined by curve fitting the spectral estimate to a rational transfer function. The identification method has been demonstrated experimentally in the Large Spacecraft Control Laboratory at JPL.
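
    The two-step procedure (nonparametric spectral estimate, then a rational-function curve fit) can be sketched with SciPy. The plant below is a toy second-order mode standing in for the laboratory structure, and the linearized least-squares fit (Levy's method) is one simple choice of curve-fitting scheme, not necessarily the one used in the paper:

        import numpy as np
        from scipy import signal

        fs = 100.0
        t = np.arange(0, 60, 1 / fs)
        u = np.random.default_rng(0).standard_normal(t.size)   # excitation
        plant = signal.TransferFunction([1.0], [1.0, 0.4, 16.0])
        _, y, _ = signal.lsim(plant, u, t)

        # Nonparametric step: H(f) = Puy / Puu from Welch estimates.
        f, Puy = signal.csd(u, y, fs=fs, nperseg=1024)
        _, Puu = signal.welch(u, fs=fs, nperseg=1024)
        H = Puy / Puu

        # Parametric step: least-squares fit of b(s)/a(s) with a0 = 1.
        s = 2j * np.pi * f
        nb, na = 1, 3                          # numerator/denominator terms
        B = np.vander(s, nb, increasing=True)
        A = np.vander(s, na, increasing=True)[:, 1:]   # s, s^2 (a0 fixed)
        M = np.hstack([B, -H[:, None] * A])
        coef, *_ = np.linalg.lstsq(M, H, rcond=None)
        b, a = coef[:nb], np.r_[1.0, coef[nb:]]
        print("fitted a(s) coefficients (a0, a1, a2):", a)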

  39. Finite element based electrostatic-structural coupled analysis with automated mesh morphing

    SciTech Connect

    Owen, Steven J.; Zhulin, V.I.; Ostergaard, D.F.

    2000-02-29

    A co-simulation tool based on finite element principles has been developed to solve coupled electrostatic-structural problems. An automated mesh morphing algorithm has been employed to update the field mesh after structural deformation. The co-simulation tool has been successfully applied to the hysteretic behavior of a MEMS switch.

  40. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score.

    PubMed

    Huang, Yuanpeng Janet; Mao, Binchen; Xu, Fei; Montelione, Gaetano T

    2015-08-01

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD-NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases (15)N-(1)H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD-NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta. PMID:26081575

  41. Texture analysis for automated classification of geologic structures

    USGS Publications Warehouse

    Shankar, V.; Rodriguez, J.J.; Gettings, M.E.

    2006-01-01

    Texture present in aeromagnetic anomaly images offers an abundance of useful geological information for discriminating between rock types, but current analysis of such images still relies on tedious, human interpretation. This study is believed to be the first effort to quantitatively assess the performance of texture-based digital image analysis for this geophysical exploration application. We computed several texture measures and determined the best subset using automated feature selection techniques. Pattern classification experiments measured the ability of various texture measures to automatically predict rock types. The classification accuracy was significantly better than a priori probability and prior weights-of-evidence results. The accuracy rates and choice of texture measures that minimize the error rate are reported. © 2006 IEEE.
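
    One concrete way to quantify such texture is grey-level co-occurrence statistics followed by a standard classifier; the feature set and quantization below are illustrative choices, not the measures evaluated by the authors:

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def texture_features(tile, levels=32):
            # tile: 2-D greyscale patch of an aeromagnetic anomaly image.
            q = (tile / tile.max() * (levels - 1)).astype(np.uint8)
            glcm = graycomatrix(q, distances=[1, 2],
                                angles=[0, np.pi / 2],
                                levels=levels, symmetric=True, normed=True)
            props = ('contrast', 'homogeneity', 'energy', 'correlation')
            return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

        # A classifier (e.g. sklearn.ensemble.RandomForestClassifier) can then
        # be fit on [texture_features(t) for t in tiles] against rock-type
        # labels, with feature selection to minimize the error rate.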

  42. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    NASA Astrophysics Data System (ADS)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  3. Development and verification testing of automation and robotics for assembly of space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  4. About the automated pattern creation of 3D jacquard double needle bed warp knitted structures

    NASA Astrophysics Data System (ADS)

    Renkens, W.; Kyosev, Y.

    2016-07-01

    Three-dimensional structures can be produced on jacquard warp knitting machines with a double needle bed. This work presents theoretical considerations on the modelling and simulation of these structures, followed by a method for obtaining production parameters from the simulation data. The analysis demonstrates that automated pattern creation of 3D structures is not always possible and that not all mathematical solutions of the problem are knittable.

  5. Automated discovery of active motifs in multiple RNA secondary structures

    SciTech Connect

    Wang, J.T.L.; Chang, Chia-Yo; Shapiro, B.A.

    1996-12-31

    In this paper we present a method for discovering approximately common motifs (also known as active motifs) in multiple RNA secondary structures. The secondary structures can be represented as ordered trees (i.e., the order among siblings matters). Motifs in these trees are connected subgraphs that can differ in both substitutions and deletions/insertions. The proposed method consists of two steps: (1) find candidate motifs in a small sample of the secondary structures; (2) search all of the secondary structures to determine how frequently these motifs occur (within the allowed approximation) in the secondary structures. To reduce the running time, we develop two optimization heuristics based on sampling and pattern matching techniques. Experimental results obtained by running these algorithms on both generated data and RNA secondary structures show the good performance of the algorithms. To demonstrate the utility of our algorithms, we discuss their applications to conducting the phylogenetic study of RNA sequences obtained from GenBank.
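
    A toy rendering of the two-step procedure: sample candidate motifs from a few structures, then count approximate occurrences across all of them. For brevity, structures are flattened to dot-bracket strings and motifs to substrings (the paper operates on ordered trees); lengths and thresholds are placeholders.

    def edit_distance(a, b):
        """Dynamic-programming edit distance (substitutions + indels)."""
        d = list(range(len(b) + 1))
        for i in range(1, len(a) + 1):
            prev, d[0] = d[0], i
            for j in range(1, len(b) + 1):
                prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1,
                                       prev + (a[i - 1] != b[j - 1]))
        return d[-1]

    def active_motifs(structures, sample_size=3, length=6, max_dist=1, min_freq=2):
        # Step 1: candidate motifs from a small sample of the structures.
        sample = structures[:sample_size]
        cands = {s[i:i + length] for s in sample
                 for i in range(len(s) - length + 1)}
        # Step 2: count structures containing each candidate approximately.
        hits = {}
        for m in cands:
            n = sum(any(edit_distance(m, s[i:i + length]) <= max_dist
                        for i in range(len(s) - length + 1))
                    for s in structures)
            if n >= min_freq:
                hits[m] = n
        return hits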

  6. Automated Finite Element Modeling of Wing Structures for Shape Optimization

    NASA Technical Reports Server (NTRS)

    Harvey, Michael Stephen

    1993-01-01

    The displacement formulation of the finite element method is the most general and most widely used technique for structural analysis of airplane configurations. Modern structural synthesis techniques based on the finite element method have reached a certain maturity in recent years, and large airplane structures can now be optimized with respect to sizing type design variables for many load cases subject to a rich variety of constraints including stress, buckling, frequency, stiffness and aeroelastic constraints (Refs. 1-3). These structural synthesis capabilities use gradient based nonlinear programming techniques to search for improved designs. For these techniques to be practical a major improvement was required in computational cost of finite element analyses (needed repeatedly in the optimization process). Thus, associated with the progress in structural optimization, a new perspective of structural analysis has emerged, namely, structural analysis specialized for design optimization application, or what is known as "design oriented structural analysis" (Ref. 4). This discipline includes approximation concepts and methods for obtaining behavior sensitivity information (Ref. 1), all needed to make the optimization of large structural systems (modeled by thousands of degrees of freedom and thousands of design variables) practical and cost effective.

  7. Development of a machine vision guidance system for automated assembly of space structures

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Sydow, P. Daniel

    1992-01-01

    The topics are presented in viewgraph form and include: automated structural assembly robot vision; machine vision requirements; vision targets and hardware; reflective efficiency; target identification; pose estimation algorithms; triangle constraints; truss node with joint receptacle targets; end-effector mounted camera and light assembly; vision system results from optical bench tests; and future work.

  8. Automated on-orbit frequency domain identification for large space structures

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.; Hadaegh, F. Y.; Yam, Y.; Scheid, R. E.; Mettler, E.; Milman, M. H.

    1991-01-01

    Recent experiences in the field of flexible structure control in space have indicated a need for on-orbit system identification to support robust control redesign to avoid in-flight instabilities and maintain high spacecraft performance. This paper highlights an automated frequency domain system identification methodology recently developed to fulfill this need. The methodology is focused to support (1) the estimation of system quantities useful for robust control analysis and design; (2) experiment design tailored to performing system identification in a typically constrained on-orbit environment; and (3) the automation of operations to reduce 'human in the loop' requirements.

  9. Error tolerant NMR backbone resonance assignment and automated structure generation.

    PubMed

    Alipanahi, Babak; Gao, Xin; Karakoc, Emre; Li, Shuai Cheng; Balbach, Frank; Feng, Guangyu; Donaldson, Logan; Li, Ming

    2011-02-01

    Error tolerant backbone resonance assignment is the cornerstone of the NMR structure determination process. Although a variety of assignment approaches have been developed, none works sufficiently well on noisy fully automatically picked peaks to enable the subsequent automatic structure determination steps. We have designed an integer linear programming (ILP) based assignment system (IPASS) that has enabled fully automatic protein structure determination for four test proteins. IPASS employs probabilistic spin system typing based on chemical shifts and secondary structure predictions. Furthermore, IPASS extracts connectivity information from the inter-residue information and the (automatically picked) (15)N-edited NOESY peaks which are then used to fix reliable fragments. When applied to automatically picked peaks for real proteins, IPASS achieves an average precision and recall of 82% and 63%, respectively. In contrast, the next best method, MARS, achieves an average precision and recall of 77% and 36%, respectively. The assignments generated by IPASS are then fed into our protein structure calculation system, FALCON-NMR, to determine the 3D structures without human intervention. The final models have backbone RMSDs of 1.25 Å, 0.88 Å, 1.49 Å, and 0.67 Å to the reference native structures for proteins TM1112, CASKIN, VRAR, and HACS1, respectively. The web server is publicly available at http://monod.uwaterloo.ca/nmr/ipass.
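
    The one-to-one assignment core can be written as a small ILP; the toy below (using the PuLP modeling library as a stand-in solver) maximizes spin-system/residue compatibility scores but omits IPASS's spin-system typing and sequential-connectivity constraints, so it is a simplified sketch rather than the published formulation.

    from pulp import LpBinary, LpMaximize, LpProblem, LpVariable, lpSum

    def assign_backbone(score):
        """score[s][r] -- compatibility of spin system s with residue r."""
        S, R = len(score), len(score[0])
        prob = LpProblem("backbone_assignment", LpMaximize)
        x = [[LpVariable(f"x_{s}_{r}", cat=LpBinary) for r in range(R)]
             for s in range(S)]
        prob += lpSum(score[s][r] * x[s][r] for s in range(S) for r in range(R))
        for s in range(S):                  # each spin system placed at most once
            prob += lpSum(x[s]) <= 1
        for r in range(R):                  # each residue receives at most one
            prob += lpSum(x[s][r] for s in range(S)) <= 1
        prob.solve()
        return {s: r for s in range(S) for r in range(R)
                if x[s][r].value() and x[s][r].value() > 0.5}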

  10. MemProtMD: Automated Insertion of Membrane Protein Structures into Explicit Lipid Membranes

    PubMed Central

    Stansfeld, Phillip J.; Goose, Joseph E.; Caffrey, Martin; Carpenter, Elisabeth P.; Parker, Joanne L.; Newstead, Simon; Sansom, Mark S.P.

    2015-01-01

    There has been exponential growth in the number of membrane protein structures determined. Nevertheless, these structures are usually resolved in the absence of their lipid environment. Coarse-grained molecular dynamics (CGMD) simulations enable insertion of membrane proteins into explicit models of lipid bilayers. We have automated the CGMD methodology, enabling membrane protein structures to be identified upon their release into the PDB and embedded into a membrane. The simulations are analyzed for protein-lipid interactions, identifying lipid binding sites, and revealing local bilayer deformations plus molecular access pathways within the membrane. The coarse-grained models of membrane protein/bilayer complexes are transformed to atomistic resolution for further analysis and simulation. Using this automated simulation pipeline, we have analyzed a number of recently determined membrane protein structures to predict their locations within a membrane, their lipid/protein interactions, and the functional implications of an enhanced understanding of the local membrane environment of each protein. PMID:26073602

  11. From bacterial to human dihydrouridine synthase: automated structure determination

    SciTech Connect

    Whelan, Fiona Jenkins, Huw T.; Griffiths, Samuel C.; Byrne, Robert T.; Dodson, Eleanor J.; Antson, Alfred A.

    2015-06-30

    The crystal structure of a human dihydrouridine synthase, an enzyme associated with lung cancer, with 18% sequence identity to a T. maritima enzyme, has been determined at 1.9 Å resolution by molecular replacement after extensive molecular remodelling of the template. The reduction of uridine to dihydrouridine at specific positions in tRNA is catalysed by dihydrouridine synthase (Dus) enzymes. Increased expression of human dihydrouridine synthase 2 (hDus2) has been linked to pulmonary carcinogenesis, while its knockdown decreased cancer cell line viability, suggesting that it may serve as a valuable target for therapeutic intervention. Here, the X-ray crystal structure of a construct of hDus2 encompassing the catalytic and tRNA-recognition domains (residues 1–340) determined at 1.9 Å resolution is presented. It is shown that the structure can be determined automatically by phenix.mr-rosetta starting from a bacterial Dus enzyme with only 18% sequence identity and a significantly divergent structure. The overall fold of the human Dus2 is similar to that of bacterial enzymes, but has a larger recognition domain and a unique three-stranded antiparallel β-sheet insertion into the catalytic domain that packs next to the recognition domain, contributing to domain–domain interactions. The structure may inform the development of novel therapeutic approaches in the fight against lung cancer.

  12. Reduced complexity structural modeling for automated airframe synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1987-01-01

    A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.

  13. Application of a hierarchical structure stochastic learning automaton

    NASA Technical Reports Server (NTRS)

    Neville, R. G.; Chrystall, M. S.; Mars, P.

    1979-01-01

    A hierarchical structure automaton was developed using a two-state stochastic learning automaton (SLA) in a time-shared model. Application of the hierarchical SLA to systems with multidimensional, multimodal performance criteria is described. Results of experiments performed with the hierarchical SLA using a performance index with a superimposed noise component of ±δ distributed uniformly over the surface are discussed.
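
    For concreteness, a single two-state SLA with a linear reward-penalty update is sketched below; the reinforcement scheme, step sizes and noisy environment are illustrative assumptions, and the paper's hierarchy would stack such units in a time-shared arrangement.

    import random

    def run_sla(environment, steps=1000, a=0.05, b=0.02):
        """environment(action) -> True on reward, False on penalty (noisy)."""
        p = [0.5, 0.5]                      # action probabilities
        for _ in range(steps):
            act = 0 if random.random() < p[0] else 1
            if environment(act):            # reward: reinforce chosen action
                p[act] += a * (1 - p[act])
            else:                           # penalty: shift away from it
                p[act] -= b * p[act]
            p[1 - act] = 1 - p[act]
        return p

    # e.g. run_sla(lambda act: random.random() < (0.8 if act == 0 else 0.4))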

  14. Automating the parallel processing of fluid and structural dynamics calculations

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

    The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations, that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.
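
    The data-flow idea reduces, in its simplest form, to grouping calculations whose inputs are already available into levels that can run concurrently; the dependency representation below is an assumption for illustration, not the Lewis utilities themselves.

    def parallel_levels(deps):
        """deps maps each calculation to the set of calculations it needs."""
        done, levels, pending = set(), [], dict(deps)
        while pending:
            ready = sorted(n for n, d in pending.items() if d <= done)
            if not ready:
                raise ValueError("cyclic dependency")
            levels.append(ready)            # all of these may run in parallel
            done.update(ready)
            for n in ready:
                del pending[n]
        return levels

    # parallel_levels({"rho": set(), "u": set(), "p": {"rho", "u"}})
    # -> [["rho", "u"], ["p"]]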

  15. Automated protein model building combined with iterative structure refinement.

    PubMed

    Perrakis, A; Morris, R; Lamzin, V S

    1999-05-01

    In protein crystallography, much time and effort are often required to trace an initial model from an interpretable electron density map and to refine it until it best agrees with the crystallographic data. Here, we present a method to build and refine a protein model automatically and without user intervention, starting from diffraction data extending to resolution higher than 2.3 Å and reasonable estimates of crystallographic phases. The method is based on an iterative procedure that describes the electron density map as a set of unconnected atoms and then searches for protein-like patterns. Automatic pattern recognition (model building) combined with refinement, allows a structural model to be obtained reliably within a few CPU hours. We demonstrate the power of the method with examples of a few recently solved structures.

  16. A Highly Flexible, Automated System Providing Reliable Sample Preparation in Element- and Structure-Specific Measurements.

    PubMed

    Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin

    2016-10-01

    Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element- and structure-specific measurements, such automated platforms are not suitable for these analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality has been confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise compared to the manual procedure and as precise as the manual procedure using structure-specific measurements.

  17. Three-dimensional visualization for evaluating automated, geomorphic pattern-recognition analyses of crustal structures

    NASA Astrophysics Data System (ADS)

    Foley, M. G.

    1991-02-01

    We are developing and applying a suite of automated remote geologic analysis (RGA) methods at Pacific Northwest Laboratory (PNL) for extracting structural and tectonic patterns from digital models of topography and other spatially registered geophysical data. In analyzing a map area, the geologist employs a variety of spatial representations (e.g., topographic maps; oblique, vertical and vertical stereographic aerial photographs; satellite-sensor images) in addition to actual field observations to provide a basis for recognizing features (patterns) diagnostic or suggestive of various geologic and geomorphic features. We intend that our automated analyses of digital models of elevation use the same photogeologic pattern-recognition methods as the geologist's; otherwise there is no direct basis for manually evaluating results of the automated analysis. Any system for automating geologic analysis should extend the geologist's pattern-recognition abilities and quantify them, rather than replace them. This requirement means that results of automated structural pattern-recognition analyses must be evaluated by geologists using the same method that would be employed in manual field checking: visual examination of the three-dimensional relationships among rocks, erosional patterns, and identifiable structures. Interactive computer-graphics in quantitative (i.e., spatially registered), simulated three-dimensional perspective and stereo are thus critical to the integration and interpretation of topography, imagery, point data, RGA-identified fracture/fault planes, stratigraphy, contoured geophysical data, nonplanar surfaces, boreholes, and three-dimensional zones (e.g., crush zones at fracture intersections). This graphical interaction presents the megabytes of digital geologic and geophysical data to the geologist in the same spatial format that field observations would take, permitting direct evaluation of RGA methods and results.

  18. Three-dimensional visualization for evaluating automated, geomorphic pattern-recognition analyses of crustal structures

    SciTech Connect

    Foley, M.G.

    1991-02-01

    We are developing and applying a suite of automated remote geologic analysis (RGA) methods at Pacific Northwest Laboratory (PNL) for extracting structural and tectonic patterns from digital models of topography and other spatially registered geophysical data. In analyzing a map area, the geologist employs a variety of spatial representations (e.g., topographic maps; oblique, vertical and vertical stereographic aerial photographs; satellite-sensor images) in addition to actual field observations to provide a basis for recognizing features (patterns) diagnostic or suggestive of various geologic and geomorphic features. We intend that our automated analyses of digital models of elevation use the same photogeologic pattern-recognition methods as the geologist's; otherwise there is no direct basis for manually evaluating results of the automated analysis. Any system for automating geologic analysis should extend the geologist's pattern-recognition abilities and quantify them, rather than replace them. This requirement means that results of automated structural pattern-recognition analyses must be evaluated by geologists using the same method that would be employed in manual field checking: visual examination of the three-dimensional relationships among rocks, erosional patterns, and identifiable structures. Interactive computer-graphics in quantitative (i.e., spatially registered), simulated three-dimensional perspective and stereo are thus critical to the integration and interpretation of topography, imagery, point data, RGA-identified fracture/fault planes, stratigraphy, contoured geophysical data, nonplanar surfaces, boreholes, and three-dimensional zones (e.g., crush zones at fracture intersections). This graphical interaction presents the megabytes of digital geologic and geophysical data to the geologist in the same spatial format that field observations would take, permitting direct evaluation of RGA methods and results. 5 refs., 2 figs.

  19. An automated procedure for covariation-based detection of RNA structure

    SciTech Connect

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs.
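
    Covariation evidence of the kind described above is commonly quantified with mutual information between alignment columns; the sketch below is such a generic stand-in (not the authors' program), with the near-diagonal exclusion and threshold as assumptions.

    import math
    from collections import Counter

    def mutual_information(col_i, col_j):
        """MI between two alignment columns (sequences of bases)."""
        n = len(col_i)
        pi, pj = Counter(col_i), Counter(col_j)
        pij = Counter(zip(col_i, col_j))
        return sum((c / n) * math.log2(c * n / (pi[a] * pj[b]))
                   for (a, b), c in pij.items())

    def candidate_pairings(alignment, threshold=0.3):
        """alignment -- equal-length RNA sequences; returns (i, j, MI)."""
        cols = list(zip(*alignment))
        hits = []
        for i in range(len(cols)):
            for j in range(i + 4, len(cols)):   # skip hairpin-width neighbors
                mi = mutual_information(cols[i], cols[j])
                if mi >= threshold:
                    hits.append((i, j, mi))
        return hits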

  1. Automated determination of fibrillar structures by simultaneous model building and fiber diffraction refinement.

    PubMed

    Potrzebowski, Wojciech; André, Ingemar

    2015-07-01

    For highly oriented fibrillar molecules, three-dimensional structures can often be determined from X-ray fiber diffraction data. However, because of limited information content, structure determination and validation can be challenging. We demonstrate that automated structure determination of protein fibers can be achieved by guiding the building of macromolecular models with fiber diffraction data. We illustrate the power of our approach by determining the structures of six bacteriophage viruses de novo using fiber diffraction data alone and together with solid-state NMR data. Furthermore, we demonstrate the feasibility of molecular replacement from monomeric and fibrillar templates by solving the structure of a plant virus using homology modeling and protein-protein docking. The generated models explain the experimental data to the same degree as deposited reference structures but with improved structural quality. We also developed a cross-validation method for model selection. The results highlight the power of fiber diffraction data as structural constraints. PMID:25961412

  2. Knowledge structure representation and automated updates in intelligent information management systems

    NASA Technical Reports Server (NTRS)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  3. The AUDANA algorithm for automated protein 3D structure determination from NMR NOE data.

    PubMed

    Lee, Woonghee; Petit, Chad M; Cornilescu, Gabriel; Stark, Jaime L; Markley, John L

    2016-06-01

    We introduce AUDANA (Automated Database-Assisted NOE Assignment), an algorithm for determining three-dimensional structures of proteins from NMR data that automates the assignment of 3D-NOE spectra, generates distance constraints, and conducts iterative high temperature molecular dynamics and simulated annealing. The protein sequence, chemical shift assignments, and NOE spectra are the only required inputs. Distance constraints generated automatically from ambiguously assigned NOE peaks are validated during the structure calculation against information from an enlarged version of the freely available PACSY database that incorporates information on protein structures deposited in the Protein Data Bank (PDB). This approach yields robust sets of distance constraints and 3D structures. We evaluated the performance of AUDANA with input data for 14 proteins ranging in size from 6 to 25 kDa that had 27-98 % sequence identity to proteins in the database. In all cases, the automatically calculated 3D structures passed stringent validation tests. Structures were determined with and without database support. In 9/14 cases, database support improved the agreement with manually determined structures in the PDB and in 11/14 cases, database support lowered the r.m.s.d. of the family of 20 structural models.

  4. Automated refinement of macromolecular structures at low resolution using prior information

    PubMed Central

    Kovalevskiy, Oleg; Nicholls, Robert A.; Murshudov, Garib N.

    2016-01-01

    Since the ratio of the number of observations to adjustable parameters is small at low resolution, it is necessary to use complementary information for the analysis of such data. ProSMART is a program that can generate restraints for macromolecules using homologous structures, as well as generic restraints for the stabilization of secondary structures. These restraints are used by REFMAC5 to stabilize the refinement of an atomic model. However, the optimal refinement protocol varies from case to case, and it is not always obvious how to select appropriate homologous structure(s), or other sources of prior information, for restraint generation. After running extensive tests on a large data set of low-resolution models, the best-performing refinement protocols and strategies for the selection of homologous structures have been identified. These strategies and protocols have been implemented in the Low-Resolution Structure Refinement (LORESTR) pipeline. The pipeline performs auto-detection of twinning and selects the optimal scaling method and solvent parameters. LORESTR can either use user-supplied homologous structures, or run an automated BLAST search and download homologues from the PDB. The pipeline executes multiple model-refinement instances using different parameters in order to find the best protocol. Tests show that the automated pipeline improves R factors, geometry and Ramachandran statistics for 94% of the low-resolution cases from the PDB included in the test set. PMID:27710936

  5. Automated protein motif generation in the structure-based protein function prediction tool ProMOL.

    PubMed

    Osipovitch, Mikhail; Lambrecht, Mitchell; Baker, Cameron; Madha, Shariq; Mills, Jeffrey L; Craig, Paul A; Bernstein, Herbert J

    2015-12-01

    ProMOL, a plugin for the PyMOL molecular graphics system, is a structure-based protein function prediction tool. ProMOL includes a set of routines for building motif templates that are used for screening query structures for enzyme active sites. Previously, each motif template was generated manually and required supervision in the optimization of parameters for sensitivity and selectivity. We developed an algorithm and workflow for the automation of motif building and testing routines in ProMOL. The algorithm uses a set of empirically derived parameters for optimization and requires little user intervention. The automated motif generation algorithm was first tested in a performance comparison with a set of manually generated motifs based on identical active sites from the same 112 PDB entries. The two sets of motifs were equally effective in identifying alignments with homologs and in rejecting alignments with unrelated structures. A second set of 296 active site motifs were generated automatically, based on Catalytic Site Atlas entries with literature citations, as an expansion of the library of existing manually generated motif templates. The new motif templates exhibited comparable performance to the existing ones in terms of hit rates against native structures, homologs with the same EC and Pfam designations, and randomly selected unrelated structures with a different EC designation at the first EC digit, as well as in terms of RMSD values obtained from local structural alignments of motifs and query structures. This research is supported by NIH grant GM078077. PMID:26573864
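
    Screening a query against a motif template reduces, at its core, to superposing small sets of active-site coordinates and thresholding the RMSD; the Kabsch-based sketch below illustrates that step only, and the cutoff and inputs are assumptions rather than ProMOL's parameters.

    import numpy as np

    def kabsch_rmsd(P, Q):
        """Minimum RMSD between two (n, 3) coordinate sets after optimal
        superposition (Kabsch algorithm)."""
        P = P - P.mean(axis=0)
        Q = Q - Q.mean(axis=0)
        V, S, Wt = np.linalg.svd(P.T @ Q)
        if np.linalg.det(V @ Wt) < 0:       # avoid an improper rotation
            S = S.copy()
            S[-1] = -S[-1]
        e = (P ** 2).sum() + (Q ** 2).sum() - 2 * S.sum()
        return np.sqrt(max(e, 0.0) / len(P))

    def is_motif_hit(motif_xyz, site_xyz, cutoff=1.0):
        """True if candidate residues match the motif within the cutoff."""
        return kabsch_rmsd(np.asarray(motif_xyz, float),
                           np.asarray(site_xyz, float)) <= cutoff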

  6. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    PubMed Central

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

    The conjunctiva is a densely vascularized mucus membrane covering the sclera of the eye with a unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least square regression and Fisher linear discriminant analysis. Conjunctival images between groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method’s discrimination rates were higher than those determined by human observers. The method allowed sensitive and rapid discrimination by assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692
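
    The discriminant step can be reproduced generically with scikit-learn once vessel features (e.g. statistics from the least-squares fits) have been extracted; the snippet below assumes that extraction is done elsewhere and is not the authors' code.

    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    def dr_discrimination_rate(features, stages, folds=5):
        """features -- (n_subjects, n_features); stages -- DR stage labels.
        Returns the cross-validated discrimination rate."""
        lda = LinearDiscriminantAnalysis()
        return cross_val_score(lda, features, stages, cv=folds).mean()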

  7. Automated detection of structural alerts (chemical fragments) in (eco)toxicology

    PubMed Central

    Lepailleur, Alban; Poezevara, Guillaume; Bureau, Ronan

    2013-01-01

    This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches in in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field which considered a binomial distribution of chemical fragments between two datasets, new data miners were developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for the definition of its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns that can capture contrasts between classes of data. PMID:24688706
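
    The frequency-contrast idea behind Emerging Patterns can be shown in a few lines: flag fragments whose support in the toxic set greatly exceeds their support in the non-toxic set. Molecules are assumed pre-fragmented into string identifiers, and the thresholds are placeholders.

    def structural_alerts(toxic, nontoxic, min_support=0.1, min_growth=3.0):
        """toxic/nontoxic -- lists of per-molecule fragment sets."""
        def support(frag, mols):
            return sum(frag in m for m in mols) / len(mols)
        alerts = {}
        for frag in set().union(*toxic):
            s_tox = support(frag, toxic)
            s_non = support(frag, nontoxic)
            if s_tox >= min_support and s_tox >= min_growth * max(s_non, 1e-9):
                alerts[frag] = (s_tox, s_non)   # an emerging pattern
        return alerts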

  8. Automated segmentation of pulmonary structures in thoracic computed tomography scans: a review

    NASA Astrophysics Data System (ADS)

    van Rikxoort, Eva M.; van Ginneken, Bram

    2013-09-01

    Computed tomography (CT) is the modality of choice for imaging the lungs in vivo. Sub-millimeter isotropic images of the lungs can be obtained within seconds, allowing the detection of small lesions and detailed analysis of disease processes. The high resolution of thoracic CT and the high prevalence of lung diseases require a high degree of automation in the analysis pipeline. The automated segmentation of pulmonary structures in thoracic CT has been an important research topic for over a decade now. This systematic review provides an overview of current literature. We discuss segmentation methods for the lungs, the pulmonary vasculature, the airways, including airway tree construction and airway wall segmentation, the fissures, the lobes and the pulmonary segments. For each topic, the current state of the art is summarized, and topics for future research are identified.

  9. A two-level structure for advanced space power system automation

    NASA Technical Reports Server (NTRS)

    Loparo, Kenneth A.; Chankong, Vira

    1990-01-01

    The tasks to be carried out during the three-year project period are: (1) performing extensive simulation using existing mathematical models to build a specific knowledge base of the operating characteristics of space power systems; (2) carrying out the necessary basic research on hierarchical control structures, real-time quantitative algorithms, and decision-theoretic procedures; (3) developing a two-level automation scheme for fault detection and diagnosis, maintenance and restoration scheduling, and load management; and (4) testing and demonstration. The outlines of the proposed system structure that served as a master plan for this project, work accomplished, concluding remarks, and ideas for future work are also addressed.

  10. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also produce reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  11. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    SciTech Connect

    Li, Fenglei

    2006-08-09

    The purposes of our research were: (1) To develop an economical, easy to use, automated, high throughput system for large scale protein crystallization screening. (2) To develop a new protein crystallization method with high screening efficiency, low protein consumption and complete compatibility with a high throughput screening system. (3) To determine the structure of lactate dehydrogenase complexed with NADH by x-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated that large scale protein crystallization screening can be performed in a high throughput manner with low cost and easy operation. The overall system integrates liquid dispensing, crystallization and detection and serves as a whole solution to protein crystallization screening. The system can dispense protein and multiple different precipitants in nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible light detector for detecting protein crystallization screening results. This detection scheme has the capability of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high throughput and non-destructive manner. The entire system from liquid dispensing, crystallization to crystal detection is essentially parallel, high throughput and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption and compatibility with automation and high throughput. In this crystallization method, a gas permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for a given amount of protein. In addition

  12. Automating crystallographic structure solution and refinement of protein–ligand complexes

    SciTech Connect

    Echols, Nathaniel Moriarty, Nigel W. Klei, Herbert E.; Afonine, Pavel V.; Bunkóczi, Gábor; McCoy, Airlie J.; Oeffner, Robert D.; Read, Randy J.; Adams, Paul D.

    2014-01-01

    A software system for automated protein–ligand crystallography has been implemented in the Phenix suite. This significantly reduces the manual effort required in high-throughput crystallographic studies. High-throughput drug-discovery and mechanistic studies often require the determination of multiple related crystal structures that only differ in the bound ligands, point mutations in the protein sequence and minor conformational changes. If performed manually, solution and refinement requires extensive repetition of the same tasks for each structure. To accelerate this process and minimize manual effort, a pipeline encompassing all stages of ligand building and refinement, starting from integrated and scaled diffraction intensities, has been implemented in Phenix. The resulting system is able to successfully solve and refine large collections of structures in parallel without extensive user intervention prior to the final stages of model completion and validation.

  13. New tissue priors for improved automated classification of subcortical brain structures on MRI.

    PubMed

    Lorio, S; Fresard, S; Adaszewski, S; Kherif, F; Chowdhury, R; Frackowiak, R S; Ashburner, J; Helms, G; Weiskopf, N; Lutti, A; Draganski, B

    2016-04-15

    Despite the constant improvement of algorithms for automated brain tissue classification, the accurate delineation of subcortical structures using magnetic resonance images (MRI) data remains challenging. The main difficulties arise from the low gray-white matter contrast of iron rich areas in T1-weighted (T1w) MRI data and from the lack of adequate priors for basal ganglia and thalamus. The most recent attempts to obtain such priors were based on cohorts with limited size that included subjects in a narrow age range, failing to account for age-related gray-white matter contrast changes. Aiming to improve the anatomical plausibility of automated brain tissue classification from T1w data, we have created new tissue probability maps for subcortical gray matter regions. Supported by atlas-derived spatial information, raters manually labeled subcortical structures in a cohort of healthy subjects using magnetization transfer saturation and R2* MRI maps, which feature optimal gray-white matter contrast in these areas. After assessment of inter-rater variability, the new tissue priors were tested on T1w data within the framework of voxel-based morphometry. The automated detection of gray matter in subcortical areas with our new probability maps was more anatomically plausible compared to the one derived with currently available priors. We provide evidence that the improved delineation compensates age-related bias in the segmentation of iron rich subcortical regions. The new tissue priors, allowing robust detection of basal ganglia and thalamus, have the potential to enhance the sensitivity of voxel-based morphometry in both healthy and diseased brains. PMID:26854557
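
    Conceptually, the priors enter a standard per-voxel Bayes rule: posterior tissue probability ∝ intensity likelihood × spatial prior. The sketch below assumes Gaussian class likelihoods and prior maps already registered to the image; it is a schematic, not the voxel-based morphometry implementation referenced above.

    import numpy as np

    def classify_tissues(image, means, sds, priors):
        """image -- (X, Y, Z) intensities; means/sds -- per-class Gaussian
        parameters; priors -- (n_classes, X, Y, Z) tissue probability maps."""
        lik = np.stack([np.exp(-0.5 * ((image - m) / s) ** 2) / s
                        for m, s in zip(means, sds)])
        post = lik * priors                  # Bayes numerator, class by class
        post /= post.sum(axis=0, keepdims=True)
        return post.argmax(axis=0)           # most probable tissue per voxel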

  14. New tissue priors for improved automated classification of subcortical brain structures on MRI

    PubMed Central

    Lorio, S.; Fresard, S.; Adaszewski, S.; Kherif, F.; Chowdhury, R.; Frackowiak, R.S.; Ashburner, J.; Helms, G.; Weiskopf, N.; Lutti, A.; Draganski, B.

    2016-01-01

    Despite the constant improvement of algorithms for automated brain tissue classification, the accurate delineation of subcortical structures using magnetic resonance images (MRI) data remains challenging. The main difficulties arise from the low gray-white matter contrast of iron rich areas in T1-weighted (T1w) MRI data and from the lack of adequate priors for basal ganglia and thalamus. The most recent attempts to obtain such priors were based on cohorts with limited size that included subjects in a narrow age range, failing to account for age-related gray-white matter contrast changes. Aiming to improve the anatomical plausibility of automated brain tissue classification from T1w data, we have created new tissue probability maps for subcortical gray matter regions. Supported by atlas-derived spatial information, raters manually labeled subcortical structures in a cohort of healthy subjects using magnetization transfer saturation and R2* MRI maps, which feature optimal gray-white matter contrast in these areas. After assessment of inter-rater variability, the new tissue priors were tested on T1w data within the framework of voxel-based morphometry. The automated detection of gray matter in subcortical areas with our new probability maps was more anatomically plausible compared to the one derived with currently available priors. We provide evidence that the improved delineation compensates age-related bias in the segmentation of iron rich subcortical regions. The new tissue priors, allowing robust detection of basal ganglia and thalamus, have the potential to enhance the sensitivity of voxel-based morphometry in both healthy and diseased brains. PMID:26854557

  15. Automated clustering of ensembles of alternative models in protein structure databases.

    PubMed

    Domingues, Francisco S; Rahnenführer, Jörg; Lengauer, Thomas

    2004-06-01

    Experimentally determined protein structures have been classified in different public databases according to their structural and evolutionary relationships. Frequently, alternative structural models, determined using X-ray crystallography or NMR spectroscopy, are available for a protein. These models can present significant structural dissimilarity. Currently there is no classification available for these alternative structures. In order to classify them, we developed STRuster, an automated method for clustering ensembles of structural models according to their backbone structure. The method is based on the calculation of carbon alpha (Calpha) distance matrices. Two filters are applied in the calculation of the dissimilarity measure in order to identify both large and small (but significant) backbone conformational changes. The resulting dissimilarity value is used for hierarchical clustering and partitioning around medoids (PAM). Hierarchical clustering reflects the hierarchy of similarities between all pairs of models, while PAM groups the models into the 'optimal' number of clusters. The method has been applied to cluster the structures in each SCOP species level and can be easily applied to any other sets of conformers. The results are available at: http://bioinf.mpi-sb.mpg.de/projects/struster/. PMID:15319469
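
    The backbone comparison can be sketched directly: build a Cα distance matrix per model, take a matrix difference as the dissimilarity, and cluster hierarchically. The mean-absolute-difference measure and cutoff below are placeholders for STRuster's filtered measure, and the PAM step is omitted.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def ca_distance_matrix(coords):
        """coords -- (n_residues, 3) C-alpha coordinates of one model."""
        return np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)

    def cluster_models(models, cutoff=1.0):
        """Hierarchical clustering of alternative models of one protein."""
        n = len(models)
        mats = [ca_distance_matrix(m) for m in models]
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                D[i, j] = D[j, i] = np.abs(mats[i] - mats[j]).mean()
        Z = linkage(squareform(D), method="average")
        return fcluster(Z, t=cutoff, criterion="distance")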

  16. Automated measurement of CT noise in patient images with a novel structure coherence feature

    NASA Astrophysics Data System (ADS)

    Chun, Minsoo; Choi, Young Hun; Hyo Kim, Jong

    2015-12-01

    While the assessment of CT noise constitutes an important task for the optimization of scan protocols in clinical routine, the majority of noise measurements in practice still rely on manual operation, hence limiting their efficiency and reliability. This study presents an algorithm for the automated measurement of CT noise in patient images with a novel structure coherence feature. The proposed algorithm consists of a four-step procedure including subcutaneous fat tissue selection, the calculation of the structure coherence feature, the determination of homogeneous ROIs, and the estimation of the average noise level. In an evaluation with 94 CT scans (16 517 images) of pediatric and adult patients along with the participation of two radiologists, ROIs were placed on a homogeneous fat region at 99.46% accuracy, and the agreement of the automated noise measurements with the radiologists’ reference noise measurements (PCC = 0.86) was substantially higher than the within- and between-rater agreements of noise measurements (PCC_within = 0.75, PCC_between = 0.70). In addition, the absolute noise level measurements matched closely the theoretical noise levels generated by a reduced-dose simulation technique. Our proposed algorithm has the potential to be used for examining the appropriateness of radiation dose and the image quality of CT protocols for research purposes as well as clinical routine.
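
    The four steps can be approximated as follows, using a structure-tensor coherence measure as an assumed proxy for the paper's feature; the HU window for subcutaneous fat, ROI size and coherence threshold are all placeholders.

    import numpy as np

    def coherence(patch):
        """Structure-tensor coherence in [0, 1]; ~0 for homogeneous patches."""
        gy, gx = np.gradient(patch.astype(float))
        jxx, jyy, jxy = (gx * gx).mean(), (gy * gy).mean(), (gx * gy).mean()
        return np.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2) / (jxx + jyy + 1e-9)

    def ct_noise_level(image, fat_hu=(-150, -50), win=16, max_coh=0.2):
        """Average standard deviation over homogeneous fat ROIs of one slice."""
        stds = []
        for y in range(0, image.shape[0] - win, win):
            for x in range(0, image.shape[1] - win, win):
                roi = image[y:y + win, x:x + win]
                if fat_hu[0] < roi.mean() < fat_hu[1] and coherence(roi) < max_coh:
                    stds.append(roi.std())
        return float(np.mean(stds)) if stds else float("nan")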

  17. Automated assembly of large space structures using an expert system executive

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1993-01-01

    NASA LaRC has developed a unique testbed for investigating the practical problems associated with the assembly of large space structures using robotic manipulators. The testbed is an interdisciplinary effort which considers the full spectrum of assembly problems from the design of mechanisms to the development of software. This paper will describe the automated structures assembly testbed and its operation, detail the expert system executive and its development, and discuss the planned system evolution. Emphasis will be placed on the expert system development of the program executive. The executive program must be capable of directing and reliably performing complex assembly tasks with the flexibility to recover from realistic system errors. By employing an expert system, information pertaining to the operation of the system was encapsulated concisely within a knowledge base. This led to a substantial reduction in code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.

  18. An expert system executive for automated assembly of large space truss structures

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1993-01-01

    Langley Research Center developed a unique test bed for investigating the practical problems associated with the assembly of large space truss structures using robotic manipulators. The test bed is the result of an interdisciplinary effort that encompasses the full spectrum of assembly problems - from the design of mechanisms to the development of software. The automated structures assembly test bed and its operation are described, the expert system executive and its development are detailed, and the planned system evolution is discussed. Emphasis is on the expert system implementation of the program executive. The executive program must direct and reliably perform complex assembly tasks with the flexibility to recover from realistic system errors. The employment of an expert system permits information that pertains to the operation of the system to be encapsulated concisely within a knowledge base. This consolidation substantially reduced code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.

  19. Integrating edge detection and fuzzy connectedness for automated segmentation of anatomical branching structures.

    PubMed

    Skoura, Angeliki; Nuzhnaya, Tatyana; Megalooikonomou, Vasileios

    2014-01-01

    Image segmentation algorithms are critical components of medical image analysis systems. This paper presents a novel and fully automated methodology for segmenting anatomical branching structures in medical images. It is a hybrid approach which integrates the Canny edge detection to obtain a preliminary boundary of the structure and the fuzzy connectedness algorithm to handle efficiently the discontinuities of the returned edge map. To ensure efficient localisation of weak branches, the fuzzy connectedness framework is applied in a sliding window mode and using a voting scheme the optimal connection point is estimated. Finally, the image regions are labelled as tissue or background using a locally adaptive thresholding technique. The proposed methodology is applied and evaluated in segmenting ductal trees visualised in clinical X-ray galactograms and vasculature visualised in angiograms. The experimental results demonstrate the effectiveness of the proposed approach achieving high scores of detection rate and accuracy among state-of-the-art segmentation techniques.
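
    The fuzzy-connectedness half of the hybrid can be written compactly as a best-path propagation in which a path is as strong as its weakest affinity link; the intensity-based affinity and seed handling below are generic assumptions, and the paper's Canny edge map, sliding-window application and voting scheme are omitted.

    import heapq
    import numpy as np

    def fuzzy_connectedness(img, seeds, sigma=10.0):
        """Connectivity map: strength of the best path from any seed pixel."""
        conn = np.zeros(img.shape)
        heap = []
        for s in seeds:
            conn[s] = 1.0
            heapq.heappush(heap, (-1.0, s))
        while heap:
            neg, (y, x) = heapq.heappop(heap)
            strength = -neg
            if strength < conn[y, x]:
                continue                      # stale heap entry
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]:
                    aff = np.exp(-(float(img[ny, nx]) - float(img[y, x])) ** 2
                                 / (2 * sigma ** 2))
                    cand = min(strength, aff)   # weakest link along the path
                    if cand > conn[ny, nx]:
                        conn[ny, nx] = cand
                        heapq.heappush(heap, (-cand, (ny, nx)))
        return conn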

  1. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... exists, then a null value for the Sweep/Automated Credit Account Identifiers should be provided, but the... be a null value. The Sweep/Automated Credit Account Identifier may be composed of more than...

  2. Automating gene library synthesis by structure-based combinatorial protein engineering: examples from plant sesquiterpene synthases.

    PubMed

    Dokarry, Melissa; Laurendon, Caroline; O'Maille, Paul E

    2012-01-01

    Structure-based combinatorial protein engineering (SCOPE) is a homology-independent recombination method to create multiple crossover gene libraries by assembling defined combinations of structural elements ranging from single mutations to domains of protein structure. SCOPE was originally inspired by DNA shuffling, which mimics recombination during meiosis, where mutations from parental genes are "shuffled" to create novel combinations in the resulting progeny. DNA shuffling utilizes sequence identity between parental genes to mediate template-switching events (the annealing and extension of one parental gene fragment on another) in PCR reassembly reactions to generate crossovers and hence recombination between parental genes. In light of the conservation of protein structure and degeneracy of sequence, SCOPE was developed to enable the "shuffling" of distantly related genes with no requirement for sequence identity. The central principle involves the use of oligonucleotides to encode for crossover regions to choreograph template-switching events during PCR assembly of gene fragments to create chimeric genes. This approach was initially developed to create libraries of hybrid DNA polymerases from distantly related parents, and later developed to create a combinatorial mutant library of sesquiterpene synthases to explore the catalytic landscapes underlying the functional divergence of related enzymes. This chapter presents a simplified protocol of SCOPE that can be integrated with different mutagenesis techniques and is suitable for automation by liquid-handling robots. Two examples are presented to illustrate the application of SCOPE to create gene libraries using plant sesquiterpene synthases as the model system. In the first example, we outline how to create an active-site library as a series of complex mixtures of diverse mutants. In the second example, we outline how to create a focused library as an array of individual clones to distil minimal combinations of

  3. Automated torso organ segmentation from 3D CT images using structured perceptron and dual decomposition

    NASA Astrophysics Data System (ADS)

    Nimura, Yukitaka; Hayashi, Yuichiro; Kitasaka, Takayuki; Mori, Kensaku

    2015-03-01

    This paper presents a method for torso organ segmentation from abdominal CT images using structured perceptron and dual decomposition. Many methods have been proposed to enable automated extraction of organ regions from volumetric medical images; however, their empirical parameters must be adjusted to obtain precise organ regions. This paper proposes an organ segmentation method using structured output learning. Our method utilizes a graphical model and binary features which represent the relationship between voxel intensities and organ labels. We optimize the weights of the graphical model by structured perceptron and estimate the best organ label for a given image by dynamic programming and dual decomposition. The experimental result revealed that the proposed method can extract organ regions automatically using structured output learning. The error of organ label estimation was 4.4%. The DICE coefficients of left lung, right lung, heart, liver, spleen, pancreas, left kidney, right kidney, and gallbladder were 0.91, 0.95, 0.77, 0.81, 0.74, 0.08, 0.83, 0.84, and 0.03, respectively.
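
    The learning step reduces to the standard structured perceptron update, with inference supplied in the paper by dynamic programming plus dual decomposition (a placeholder here); the feature-map and decoder signatures below are assumptions.

    import numpy as np

    def structured_perceptron(examples, phi, decode, n_features, epochs=10):
        """examples -- (volume, true_labeling) pairs; phi(x, y) -> feature
        vector; decode(x, w) -> argmax_y of w . phi(x, y)."""
        w = np.zeros(n_features)
        for _ in range(epochs):
            for x, y in examples:
                y_hat = decode(x, w)          # inference (dual decomposition)
                if y_hat != y:                # mistake-driven update
                    w += phi(x, y) - phi(x, y_hat)
        return w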

  4. A structural study of cyanotrichite from Dachang by conventional and automated electron diffraction

    NASA Astrophysics Data System (ADS)

    Ventruti, Gennaro; Mugnaioli, Enrico; Capitani, Giancarlo; Scordari, Fernando; Pinto, Daniela; Lausi, Andrea

    2015-09-01

    The crystal structure of cyanotrichite, with general formula Cu4Al2(SO4)(OH)12·2H2O, from the Dachang deposit (China) was studied by means of conventional transmission electron microscopy, automated electron diffraction tomography (ADT) and synchrotron X-ray powder diffraction (XRPD). ADT revealed the presence of two different cyanotrichite-like phases. The same phases were also recognized in the XRPD pattern, allowing all peaks to be indexed and leading, after refinement, to the following cell parameters: (1) a = 12.417(2) Å, b = 2.907(1) Å, c = 10.157(1) Å and β = 98.12(1)°; (2) a = 12.660(2) Å, b = 2.897(1) Å, c = 10.162(1) Å and β = 92.42(1)°. Only for the former phase, labeled cyanotrichite-98, was a partial structure, corresponding to the [Cu4Al2(OH)12]2+ cluster, obtained ab initio by direct methods in space group C2/m on the basis of electron diffraction data. Geometric and charge-balance considerations allowed the complete structure model of the cyanotrichite-98 phase to be derived. The sulfate group and water molecule are statistically disordered over two possible positions, keeping the average structure consistent with the C-centering symmetry, in agreement with ADT results.

  5. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    NASA Astrophysics Data System (ADS)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche-safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe and a transducer, the latter measuring the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations σ(z) = σ_c0 + Σ_n [σ_n(z) + a_n z + b_n z²] and dσ/dz = Σ_n [dσ_n(z)/dz + Fr_n(z)], where σ_c0 and σ_n are the plastic deformation stresses for the surface and the nth soil structure (e.g. soil crust, layer, horizon or void) respectively, and Fr_n(z)dz is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance, dz, through the nth layer. Both σ_n(z) and Fr_n(z) are related to soil structure. They determine the form of σ(z) measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the
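
    The layered PR model above can be evaluated numerically. A minimal sketch (Python/NumPy) under stated assumptions: each structure contributes a constant plastic stress σ_n plus the linear and quadratic terms only inside its depth interval, and a pore is a layer with zero deformation stress; all layer parameters are illustrative:

        import numpy as np

        def pr_profile(z, sigma_c0, layers):
            # sigma(z) = sigma_c0 + sum_n [sigma_n(z) + a_n*z + b_n*z**2],
            # with each layer given as (z_top, z_bottom, sigma_n, a_n, b_n)
            sigma = np.full_like(z, sigma_c0, dtype=float)
            for z_top, z_bot, s_n, a_n, b_n in layers:
                inside = (z >= z_top) & (z < z_bot)
                sigma[inside] += s_n + a_n * z[inside] + b_n * z[inside] ** 2
            return sigma

        z = np.linspace(0.0, 10.0, 1000)        # depth (mm)
        layers = [(0.0, 1.5, 50.0, 5.0, 0.0),   # biological crust
                  (1.5, 4.0, 20.0, 1.0, 0.2)]   # sub-surface layer
        sigma = pr_profile(z, sigma_c0=10.0, layers=layers)
        dsigma_dz = np.gradient(sigma, z)       # spatial differential used to locate boundaries and pores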

  6. Vibration based structural health monitoring of an arch bridge: From automated OMA to damage detection

    NASA Astrophysics Data System (ADS)

    Magalhães, F.; Cunha, A.; Caetano, E.

    2012-04-01

    In order to evaluate the usefulness of approaches based on modal parameter tracking for structural health monitoring of bridges, a dynamic monitoring system was installed in September 2007 in a concrete arch bridge at the city of Porto, in Portugal. The implementation of algorithms to perform continuous on-line identification of modal parameters based on structural responses to ambient excitation (automated Operational Modal Analysis) has permitted the creation of a very complete database with the time evolution of the bridge modal characteristics over more than 2 years. This paper describes the strategy that was followed to minimize the effects of environmental and operational factors on the bridge natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, the identification of damage is attempted with control charts. Finally, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
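
    The environmental-normalization step described above amounts to regressing the tracked frequencies on measured covariates and monitoring the residuals. A minimal sketch (Python/NumPy; the static regression and 3-sigma control chart are one plausible reading of the strategy, and all data and coefficients are synthetic):

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.normal(size=(730, 2))          # e.g. daily temperature and traffic intensity
        f = 0.812 - 0.002 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(0, 1e-4, 730)  # tracked frequency (Hz)

        # Static linear regression model of the environmental/operational effects
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)
        residuals = f - A @ coef

        # Control chart on the residuals: a persistent ~0.2% frequency shift
        # would push points beyond the 3-sigma limits of the reference period
        mu, sd = residuals[:365].mean(), residuals[:365].std()
        alarms = np.abs(residuals - mu) > 3 * sd
        print(alarms.sum(), "out-of-control observations")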

  7. Structure-Function relationships using the Cirrus Spectral Domain Optical Coherence Tomograph and Standard Automated Perimetry

    PubMed Central

    Leite, Mauro T.; Zangwill, Linda M.; Weinreb, Robert N.; Rao, Harsha L.; Alencar, Luciana M.; Medeiros, Felipe A.

    2012-01-01

    Purpose: To evaluate the relationship between glaucomatous structural damage assessed by the Cirrus Spectral Domain OCT (SDOCT) and functional loss as measured by standard automated perimetry (SAP). Methods: Four hundred twenty-two eyes (78 healthy, 210 suspects, 134 glaucomatous) of 250 patients were recruited from the longitudinal Diagnostic Innovations in Glaucoma Study (DIGS) and from the African Descent and Glaucoma Evaluation Study (ADAGES). All eyes underwent testing with the Cirrus SDOCT and SAP within a 6-month period. The relationship between parapapillary retinal nerve fiber layer (RNFL) thickness sectors and corresponding topographic SAP locations was evaluated using locally weighted scatterplot smoothing (LOWESS) and regression analysis. SAP sensitivity values were evaluated using both linear and logarithmic scales. We also tested the fit of a model (Hood) for the structure-function relationship in glaucoma. Results: Structure was significantly related to function for all but the nasal thickness sector. The relationship was strongest for superotemporal RNFL thickness and inferonasal sensitivity (R2 = 0.314, P<0.001). The Hood model fitted the data relatively well, with 88% of the eyes inside the 95% confidence interval predicted by the model. Conclusion: RNFL thinning measured by the Cirrus SDOCT was associated with corresponding visual field loss in glaucoma. PMID:21952500

  8. Automated metric characterization of urban structure using building decomposition from very high resolution imagery

    NASA Astrophysics Data System (ADS)

    Heinzel, Johannes; Kemper, Thomas

    2015-03-01

    Classification approaches for urban areas are mostly of a qualitative and semantic nature. They produce interpreted classes similar to those from land cover and land use classifications. As a complement to those classes, quantitative measures directly derived from the image can lead to a metric characterization of the urban area. While these metrics lack a qualitative interpretation, they provide objective measures of urban structures. Such quantitative measures are especially important in rapidly growing cities since, besides growth in area, they can provide structural information for specific areas and detect changes. Rustenburg, which serves as the test area for the present study, is amongst the fastest growing cities in South Africa. It reveals a heterogeneous face of housing and building structures reflecting social and/or economic differences, often linked to the spatial distribution of industrial and local mining sites. Up-to-date coverage with aerial photographs is provided by aerial surveys at regular intervals. Recent satellite systems also provide imagery with suitable resolution. Using such a set of very high resolution images, a fully automated algorithm has been developed which outputs metric classes by systematically combining important measures of building structure. The measurements are gained by decomposing buildings directly from the imagery using methods from mathematical morphology. The decomposed building objects serve as the basis for the computation of grid statistics. Finally, a systematic combination of the single features leads to combined metric classes. For the dominant urban structures, verification results indicate an overall accuracy of at least 80% at the single-feature level and 70% for the combined classes.

  9. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    NASA Astrophysics Data System (ADS)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beam lines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  10. ABodyBuilder: Automated antibody structure prediction with data-driven accuracy estimation.

    PubMed Central

    Leem, Jinwoo; Dunbar, James; Georges, Guy; Shi, Jiye; Deane, Charlotte M.

    2016-01-01

    Computational modeling of antibody structures plays a critical role in therapeutic antibody design. Several antibody modeling pipelines exist, but no freely available methods currently model nanobodies, provide estimates of expected model accuracy, or highlight potential issues with the antibody's experimental development. Here, we describe our automated antibody modeling pipeline, ABodyBuilder, designed to overcome these issues. The algorithm itself follows the standard 4 steps of template selection, orientation prediction, complementarity-determining region (CDR) loop modeling, and side chain prediction. ABodyBuilder then annotates the 'confidence' of the model as a probability that a component of the antibody (e.g., CDRL3 loop) will be modeled within a root-mean-square deviation threshold. It also flags structural motifs on the model that are known to cause issues during in vitro development. ABodyBuilder was tested on 4 separate datasets, including the 11 antibodies from the Antibody Modeling Assessment-II competition. ABodyBuilder builds models that are of similar quality to other methodologies, with sub-Angstrom predictions for the 'canonical' CDR loops. Its ability to model nanobodies and rapidly generate models (∼30 seconds per model) widens its potential usage. ABodyBuilder can also help users in decision-making for the development of novel antibodies because it provides model confidence and potential sequence liabilities. ABodyBuilder is freely available at http://opig.stats.ox.ac.uk/webapps/abodybuilder. PMID:27392298

  11. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    NASA Astrophysics Data System (ADS)

    Girolamo, D.; Girolamo, L.; Yuan, F. G.

    2015-03-01

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damage by propagating ultrasonic waves normal to the surface. However, they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high-resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed for processing the wave field and estimating spatially dependent wavenumber values, related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high-fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative for overcoming the limitations of conventional NDI technologies.
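
    The SWEA idea of mapping energy trapped in the damaged region can be sketched as a band-limited energy image computed from the scanned wavefield. A minimal sketch (Python/NumPy; the array layout, sampling rate and frequency band are illustrative assumptions, not the authors' implementation):

        import numpy as np

        def standing_wave_energy(wavefield, fs, band):
            # wavefield: (ny, nx, nt) out-of-plane response from the LDV scan
            # fs: sampling rate (Hz); band: (f_lo, f_hi) in Hz
            spec = np.fft.rfft(wavefield, axis=-1)
            freqs = np.fft.rfftfreq(wavefield.shape[-1], d=1.0 / fs)
            sel = (freqs >= band[0]) & (freqs <= band[1])
            return (np.abs(spec[..., sel]) ** 2).sum(axis=-1)   # per-point energy image

        # Synthetic example: a damaged region ringing near 80 kHz would light up
        # in the energy image for a band centred on that frequency
        field = np.random.default_rng(2).normal(size=(50, 50, 1024))
        energy = standing_wave_energy(field, fs=1.0e6, band=(7.5e4, 8.5e4))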

  12. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    SciTech Connect

    Girolamo, D. Yuan, F. G.; Girolamo, L.

    2015-03-31

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damage by propagating ultrasonic waves normal to the surface. However, they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high-resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed for processing the wave field and estimating spatially dependent wavenumber values, related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high-fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative for overcoming the limitations of conventional NDI technologies.

  13. Automated Assignment of MS/MS Cleavable Cross-Links in Protein 3D-Structure Analysis

    NASA Astrophysics Data System (ADS)

    Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H.; Schäfer, Mathias; Sinz, Andrea

    2015-01-01

    CID-MS/MS cleavable cross-linkers hold enormous potential for automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The characteristic fragment ion patterns they create can easily be used for automated assignment and discrimination of cross-linked products. To date, only a few software solutions are available that make use of these properties, and none allows for automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows automatic screening of characteristic fragment ions, considering static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for a correct assignment of cross-links, and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access to an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at www.StavroX.com.

  14. Automated Structure-Activity Relationship Mining: Connecting Chemical Structure to Biological Profiles.

    PubMed

    Wawer, Mathias J; Jaramillo, David E; Dančík, Vlado; Fass, Daniel M; Haggarty, Stephen J; Shamji, Alykhan F; Wagner, Bridget K; Schreiber, Stuart L; Clemons, Paul A

    2014-06-01

    Understanding the structure-activity relationships (SARs) of small molecules is important for developing probes and novel therapeutic agents in chemical biology and drug discovery. Increasingly, multiplexed small-molecule profiling assays allow simultaneous measurement of many biological response parameters for the same compound (e.g., expression levels for many genes or binding constants against many proteins). Although such methods promise to capture SARs with high granularity, few computational methods are available to support SAR analyses of high-dimensional compound activity profiles. Many of these methods are not generally applicable or reduce the activity space to scalar summary statistics before establishing SARs. In this article, we present a versatile computational method that automatically extracts interpretable SAR rules from high-dimensional profiling data. The rules connect chemical structural features of compounds to patterns in their biological activity profiles. We applied our method to data from novel cell-based gene-expression and imaging assays collected on more than 30,000 small molecules. Based on the rules identified for this data set, we prioritized groups of compounds for further study, including a novel set of putative histone deacetylase inhibitors.

  15. Application of an automated wireless structural monitoring system for long-span suspension bridges

    SciTech Connect

    Kurata, M.; Lynch, J. P.; Linden, G. W. van der; Hipley, P.; Sheng, L.-H.

    2011-06-23

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  16. Combined computational metabolite prediction and automated structure-based analysis of mass spectrometric data.

    PubMed

    Stranz, David D; Miao, Shichang; Campbell, Scott; Maydwell, George; Ekins, Sean

    2008-01-01

    As high-throughput technologies have developed in the pharmaceutical industry, the demand for identification of possible metabolites, using predominantly liquid chromatography/mass spectrometry and tandem mass spectrometry (LC/MS-MS/MS), for a large number of molecules in drug discovery has also increased. In parallel, computational technologies have been developed to generate predictions for metabolites, alongside methods to predict MS spectra and score the quality of the match with experimental spectra. The goal of the current study was to generate metabolite predictions from molecular structure with a software product, MetaDrug. In vitro microsomal incubations were used to produce MS data that could be used to verify the predictions with Apex, a new software tool that can predict the molecular ion spectrum and a fragmentation spectrum, automating the detailed examination of both MS and MS/MS spectra. For the test molecule imipramine, used to illustrate the proposed combined in vitro/in silico process, MetaDrug predicts 16 metabolites. Following rat microsomal incubations with imipramine and analysis of the MS(n) data using the Apex software, strong evidence was found for imipramine and five metabolites and weaker evidence for five additional metabolites. This study suggests a new approach to streamline MS data analysis using a combination of predictive computational approaches with software capable of comparing the predicted metabolite output with empirical data when looking at drug metabolites.
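
    The verification step pairs predicted metabolite masses with observed peaks. A minimal sketch of such mass matching (Python; the tolerance, adduct handling and all masses except imipramine's are illustrative assumptions, and this is not the Apex scoring itself):

        def match_metabolites(predicted, peaks, tol_ppm=10.0):
            # predicted: dict name -> neutral monoisotopic mass (Da)
            # peaks: observed [M+H]+ m/z values
            PROTON = 1.007276  # proton mass, Da
            hits = {}
            for name, mass in predicted.items():
                target = mass + PROTON
                for mz in peaks:
                    if abs(mz - target) / target * 1e6 <= tol_ppm:
                        hits.setdefault(name, []).append(mz)
            return hits

        # Imipramine (C19H24N2, 280.1939 Da) plus a hypothetical hydroxylated metabolite
        predicted = {"imipramine": 280.1939, "OH-imipramine": 296.1889}
        print(match_metabolites(predicted, peaks=[281.2010, 297.1960]))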

  17. Multiple computer-automated structure evaluation study of aquatic toxicity. 2. Fathead minnow

    SciTech Connect

    Klopman, G.; Saiakhov, R.; Rosenkranz, H.S.

    2000-02-01

    An acute toxicity model was constructed on the basis of experimental data for 685 chemicals tested for toxicity to fathead minnow. The multiple computer-automated structure evaluation (M-CASE) program was used for the construction of the model. Based on a comparison between their results and published results, the authors found that the methodology is able to describe acute toxicity for the fathead minnow with high accuracy. The model incorporates the concept of a baseline activity as one of the parameters for the correlation, as well as other parameters such as the presence of biophores, hardness-softness parameters, and other characteristics determined from quantum mechanical calculations. By using its artificial intelligence algorithm, M-CASE automatically chooses the most suitable set of parameters for evaluating the minnow toxicity of any organic molecule. The authors found that M-CASE can correctly predict acute toxicity for minnow for 80% of organic compounds, with an average error of only 0.4 log units of lethal dose. The main toxicophores, corresponding to polar narcosis and to other types of reactive chemicals, were identified.

  18. Automated cerebellar lobule segmentation with application to cerebellar structural analysis in cerebellar disease.

    PubMed

    Yang, Zhen; Ye, Chuyang; Bogovic, John A; Carass, Aaron; Jedynak, Bruno M; Ying, Sarah H; Prince, Jerry L

    2016-02-15

    The cerebellum plays an important role in both motor control and cognitive function. Cerebellar function is topographically organized and diseases that affect specific parts of the cerebellum are associated with specific patterns of symptoms. Accordingly, delineation and quantification of cerebellar sub-regions from magnetic resonance images are important in the study of cerebellar atrophy and associated functional losses. This paper describes an automated cerebellar lobule segmentation method based on a graph cut segmentation framework. Results from multi-atlas labeling and tissue classification contribute to the region terms in the graph cut energy function and boundary classification contributes to the boundary term in the energy function. A cerebellar parcellation is achieved by minimizing the energy function using the α-expansion technique. The proposed method was evaluated using a leave-one-out cross-validation on 15 subjects including both healthy controls and patients with cerebellar diseases. Based on reported Dice coefficients, the proposed method outperforms two state-of-the-art methods. The proposed method was then applied to 77 subjects to study the region-specific cerebellar structural differences in three spinocerebellar ataxia (SCA) genetic subtypes. Quantitative analysis of the lobule volumes shows distinct patterns of volume changes associated with different SCA subtypes consistent with known patterns of atrophy in these genetic subtypes. PMID:26408861

  19. Automated artery-venous classification of retinal blood vessels based on structural mapping method

    NASA Astrophysics Data System (ADS)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2012-03-01

    Retinal blood vessels show morphologic modifications in response to various retinopathies. However, the specific responses exhibited by arteries and veins may provide more precise diagnostic information; e.g., diabetic retinopathy may be detected more accurately from venous dilatation than from average vessel dilatation. In order to analyze vessel-type-specific morphologic modifications, the classification of a vessel network into arteries and veins is required. We previously described a method for identification and separation of retinal vessel trees, i.e., structural mapping. We therefore propose artery-venous classification based on structural mapping and identification of color properties prominent to the vessel types. The mean and standard deviation of each of green channel intensity and hue channel intensity are analyzed in a region of interest around each centerline pixel of a vessel. Using the vector of color properties extracted from each centerline pixel, it is classified into one of the two clusters (artery and vein), obtained by fuzzy C-means clustering. According to the proportion of clustered centerline pixels in a particular vessel, and utilizing the artery-venous crossing property of retinal vessels, each vessel is assigned a label of artery or vein. The classification results are compared with the manually annotated ground truth (gold standard). We applied the proposed method to a dataset of 15 retinal color fundus images, resulting in an accuracy of 88.28% correctly classified vessel pixels. The automated classification results match well with the gold standard, suggesting its potential in artery-venous classification and the respective morphology analysis.
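
    The clustering step above takes a four-dimensional colour-feature vector per centerline pixel. A minimal sketch (Python with scikit-learn; KMeans is used here as a stand-in for the paper's fuzzy C-means, and the window size, images and pixel lists are synthetic):

        import numpy as np
        from sklearn.cluster import KMeans

        def colour_features(green, hue, centerline_pixels, radius=4):
            # mean/std of green and hue intensities in a window around each centerline pixel
            feats = []
            for r, c in centerline_pixels:
                g = green[r - radius:r + radius + 1, c - radius:c + radius + 1]
                h = hue[r - radius:r + radius + 1, c - radius:c + radius + 1]
                feats.append([g.mean(), g.std(), h.mean(), h.std()])
            return np.asarray(feats)

        rng = np.random.default_rng(3)
        green, hue = rng.random((256, 256)), rng.random((256, 256))
        pixels = list(zip(rng.integers(8, 248, 100), rng.integers(8, 248, 100)))
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
            colour_features(green, hue, pixels))   # cluster 0/1 -> artery/vein, decided afterwards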

  20. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    SciTech Connect

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-19

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs.

  1. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013.

    PubMed

    Rosato, Antonio; Vranken, Wim; Fogh, Rasmus H; Ragan, Timothy J; Tejero, Roberto; Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H; Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H; Kennedy, Michael; Acton, Thomas B; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T; Vuister, Geerten W

    2015-08-01

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) and un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71% of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results, with up to 100% of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90% of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged. PMID:26071966

  2. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision
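
    The multi-tiered use of classifiers described above can be sketched as a coarse screen followed by a refined decision, each an SVM on different features. A minimal sketch (Python with scikit-learn; feature dimensions and data are synthetic, and the real pipeline couples such tiers to image-processing stages):

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(4)
        X_cheap = rng.normal(size=(500, 8))    # fast features for every candidate region
        X_rich = rng.normal(size=(500, 64))    # richer features, computed on demand
        y = rng.integers(0, 2, 500)            # 1 = structure of interest (e.g. head region)

        tier1 = make_pipeline(StandardScaler(), SVC()).fit(X_cheap, y)
        keep = tier1.predict(X_cheap) == 1     # tier 1: cheap screen
        tier2 = make_pipeline(StandardScaler(), SVC()).fit(X_rich[keep], y[keep])
        final = tier2.predict(X_rich[keep])    # tier 2: refined decision on survivors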

  3. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans

    PubMed Central

    Zhan, Mei; Crane, Matthew M.; Entchev, Eugeni V.; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch’ng, QueeLim; Lu, Hang

    2015-01-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision

  4. Automated structural design with aeroelastic constraints - A review and assessment of the state of the art

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.

    1974-01-01

    A review and assessment of the state of the art in automated aeroelastic design is presented. Most of the aeroelastic design studies appearing in the literature deal with flutter, and, therefore, this paper also concentrates on flutter. The flutter design problem is divided into three cases: an isolated flutter mode, neighboring flutter modes, and a hump mode which can rise and cause a sudden, discontinuous change in the flutter velocity. Synthesis procedures are presented in terms of techniques that are appropriate for problems of various levels of difficulty. Current trends, which should result in more efficient, powerful and versatile design codes, are discussed. Approximate analysis procedures and the need for simultaneous consideration of multiple design requirements are emphasized.

  5. Automated retrieval of forest structure variables based on multi-scale texture analysis of VHR satellite imagery

    NASA Astrophysics Data System (ADS)

    Beguet, Benoit; Guyon, Dominique; Boukir, Samia; Chehata, Nesrine

    2014-10-01

    The main goal of this study is to design a method to describe the structure of forest stands from Very High Resolution satellite imagery, relying on typical variables such as crown diameter, tree height, trunk diameter, tree density and tree spacing. The emphasis is placed on automating the identification of the most relevant image features for the forest structure retrieval task, exploiting both spectral and spatial information. Our approach is based on linear regressions between the forest structure variables to be estimated and various spectral and Haralick texture features. The main drawback of this well-known texture representation is its underlying parameters, which are extremely difficult to set due to the spatial complexity of the forest structure. To tackle this major issue, an automated feature selection process is proposed, based on statistical modeling and exploring a wide range of parameter values. It provides texture measures for diverse spatial parameters, hence implicitly inducing a multi-scale texture analysis. A new feature selection technique, which we call Random PRiF, is proposed. It relies on random sampling in feature space and carefully addresses the multicollinearity issue in multiple-linear regression while ensuring accurate prediction of forest variables. Our automated forest variable estimation scheme was tested on Quickbird and Pléiades panchromatic and multispectral images, acquired at different periods on the maritime pine stands of two sites in South-Western France. It outperforms two well-established variable subset selection techniques. It has been successfully applied to identify the best texture features in modeling the five considered forest structure variables. The RMSE of all predicted forest variables is improved by combining multispectral and panchromatic texture features, with various parameterizations, highlighting the potential of a multi-resolution approach for retrieving forest structure
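
    A simplified analogue of the random-sampling feature selection described above (not the published Random PRiF itself): draw random feature subsets, score each by cross-validated regression, and keep the best. A minimal sketch (Python with scikit-learn; all data are synthetic):

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        def random_subset_selection(X, y, n_draws=200, subset_size=5, seed=0):
            # Score random feature subsets by cross-validated R^2; keep the best
            rng = np.random.default_rng(seed)
            best_score, best_subset = -np.inf, None
            for _ in range(n_draws):
                cols = rng.choice(X.shape[1], size=subset_size, replace=False)
                score = cross_val_score(LinearRegression(), X[:, cols], y, cv=5).mean()
                if score > best_score:
                    best_score, best_subset = score, cols
            return best_subset, best_score

        # X would hold texture features computed with many window sizes and offsets;
        # y a stand variable such as crown diameter
        rng = np.random.default_rng(5)
        X = rng.normal(size=(120, 40))
        y = X[:, 3] - 0.5 * X[:, 17] + rng.normal(0, 0.1, 120)
        print(random_subset_selection(X, y))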

  6. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds.

    PubMed

    Farine, Damien R; Firth, Josh A; Aplin, Lucy M; Crates, Ross A; Culina, Antica; Garroway, Colin J; Hinde, Camilla A; Kidd, Lindall R; Milligan, Nicole D; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C

    2015-04-01

    Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission-fusion dynamics, can interact to drive phenotypic structure in animal populations.

  7. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds

    PubMed Central

    Farine, Damien R.; Firth, Josh A.; Aplin, Lucy M.; Crates, Ross A.; Culina, Antica; Garroway, Colin J.; Hinde, Camilla A.; Kidd, Lindall R.; Milligan, Nicole D.; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C.

    2015-01-01

    Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission–fusion dynamics, can interact to drive phenotypic structure in animal populations. PMID:26064644

  8. Cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1988-01-01

    The aims and methods of aircraft cockpit automation are reviewed from a human-factors perspective. Consideration is given to the mixed pilot reception of increased automation, government concern with the safety and reliability of highly automated aircraft, the formal definition of automation, and the ground-proximity warning system and accidents involving controlled flight into terrain. The factors motivating automation include technology availability; safety; economy, reliability, and maintenance; workload reduction and two-pilot certification; more accurate maneuvering and navigation; display flexibility; economy of cockpit space; and military requirements.

  9. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes the complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag and drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  10. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    PubMed

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Viksit, Kumar; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study. We compared the results of our method with those of time-of-flight techniques. Our method performs better than time-of-flight techniques. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods.
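
    The dynamic-programming tracker above can be pictured as finding the maximum-energy path through a (position, time) displacement map, with the path slope giving the wave speed. A minimal sketch (Python/NumPy; the step constraint and energy map are illustrative assumptions, not the authors' formulation):

        import numpy as np

        def dp_track(energy, max_step=2):
            # energy[i, t]: signal envelope at position i, time sample t.
            # Returns one time index per position; between neighbouring positions
            # the path may move at most `max_step` time samples (outlier-robust).
            n_pos, n_t = energy.shape
            cost = np.full((n_pos, n_t), -np.inf)
            back = np.zeros((n_pos, n_t), dtype=int)
            cost[0] = energy[0]
            for i in range(1, n_pos):
                for t in range(n_t):
                    lo, hi = max(0, t - max_step), min(n_t, t + max_step + 1)
                    j = lo + int(np.argmax(cost[i - 1, lo:hi]))
                    cost[i, t] = energy[i, t] + cost[i - 1, j]
                    back[i, t] = j
            path = [int(np.argmax(cost[-1]))]
            for i in range(n_pos - 1, 0, -1):
                path.append(back[i, path[-1]])
            return path[::-1]   # TWS follows from element spacing / (dt * mean slope of path)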

  11. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    PubMed

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Viksit, Kumar; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study. We compared the results of our method with those of time-of-flight techniques. Our method performs better than time-of-flight techniques. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods. PMID:27425150

  12. Deposit3D: a tool for automating structure depositions to the Protein Data Bank

    SciTech Connect

    Badger, J. Hendle, J.; Burley, S. K.; Kissinger, C. R.

    2005-09-01

    This paper describes a Python script that may be used to gather all required structure-annotation information into an mmCIF file for upload through the RCSB PDB ADIT structure-deposition interface. Almost all successful protein structure-determination projects in the public sector culminate in a structure deposition to the Protein Data Bank (PDB). In order to expedite the deposition process, Deposit3D has been developed. This command-line script calculates or gathers all the required structure-deposition information and outputs these data into an mmCIF file for subsequent upload through the RCSB PDB ADIT interface. Deposit3D might be particularly useful for structural genomics pipeline projects because it allows workers involved with various stages of a structure-determination project to pool their different categories of annotation information before starting a deposition session.

  13. Method and system for automated on-chip material and structural certification of MEMS devices

    DOEpatents

    Sinclair, Michael B.; DeBoer, Maarten P.; Smith, Norman F.; Jensen, Brian D.; Miller, Samuel L.

    2003-05-20

    A new approach toward MEMS quality control and materials characterization is provided by a combined test structure measurement and mechanical response modeling approach. Simple test structures are cofabricated with the MEMS devices being produced. These test structures are designed to isolate certain types of physical response, so that measurement of their behavior under applied stress can be easily interpreted as quality control and material properties information.

  14. Automated detection and labeling of high-density EEG electrodes from structural MR images

    NASA Astrophysics Data System (ADS)

    Marino, Marco; Liu, Quanying; Brem, Silvia; Wenderoth, Nicole; Mantini, Dante

    2016-10-01

    Objective. Accurate knowledge about the positions of electrodes in electroencephalography (EEG) is very important for precise source localization. Direct detection of electrodes from magnetic resonance (MR) images is particularly interesting, as it makes it possible to avoid errors of co-registration between electrode and head coordinate systems. In this study, we propose an automated MR-based method for electrode detection and labeling, particularly tailored to high-density montages. Approach. Anatomical MR images were processed to create an electrode-enhanced image in individual space. Image processing included intensity non-uniformity correction and removal of background noise and goggle artifacts. Next, we defined a search volume around the head in which electrode positions were detected. Electrodes were identified as local maxima in the search volume and registered to the Montreal Neurological Institute standard space using an affine transformation. This allowed the matching of the detected points with the specific EEG montage template, as well as their labeling. Matching and labeling were performed by the coherent point drift method. Our method was assessed on 8 MR images collected in subjects wearing a 256-channel EEG net, using the displacement with respect to manually selected electrodes as the performance metric. Main results. The average displacement achieved by our method was significantly lower than that of alternative techniques, such as the photogrammetry technique. For more than 99% of the electrodes, the displacement was lower than 1 cm, which is typically considered an acceptable upper limit for errors in electrode positioning. Our method showed robustness and reliability, even in suboptimal conditions, such as net rotation, imprecisely gathered wires, electrode detachment from the head, and MR image ghosting. Significance. We showed that our method provides objective, repeatable and precise estimates of EEG electrode coordinates. We hope our work
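
    The electrode-detection step above reduces to finding bright local maxima inside a search volume. A minimal sketch (Python with SciPy; the shell mask, threshold and neighbourhood size are illustrative assumptions):

        import numpy as np
        from scipy.ndimage import maximum_filter

        def detect_electrodes(volume, mask, threshold, size=5):
            # A voxel is a candidate electrode centre if it is the maximum of its
            # neighbourhood, lies inside the scalp search shell, and is bright enough
            local_max = volume == maximum_filter(volume, size=size)
            candidates = local_max & mask & (volume > threshold)
            return np.argwhere(candidates)   # voxel coordinates, one row per candidate

        vol = np.random.default_rng(7).random((64, 64, 64))   # stand-in electrode-enhanced image
        shell = np.ones_like(vol, dtype=bool)                 # stand-in for the scalp shell mask
        coords = detect_electrodes(vol, shell, threshold=0.99)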

  15. ArchDB: automated protein loop classification as a tool for structural genomics.

    PubMed

    Espadaler, Jordi; Fernandez-Fuentes, Narcis; Hermoso, Antonio; Querol, Enrique; Aviles, Francesc X; Sternberg, Michael J E; Oliva, Baldomero

    2004-01-01

    The annotation of protein function has become a crucial problem with the advent of sequence and structural genomics initiatives. A large body of evidence suggests that protein structural information is frequently encoded in local sequences, and that folds are mainly made up of a number of simple local units of super-secondary structural motifs, consisting of a few secondary structures and their connecting loops. Moreover, protein loops play an important role in protein function. Here we present ArchDB, a classification database of structural motifs, consisting of one loop plus its bracing secondary structures. ArchDB currently contains 12,665 super-secondary elements classified into 1496 motif subclasses. The database provides an easy way to retrieve functional information from protein structures sharing a common motif, to search motifs found in a given SCOP family, superfamily or fold, or to search by keywords on proteins with classified loops. The ArchDB database of loops is located at http://sbi.imim.es/archdb. PMID:14681390

  16. Semi-automated measurement of anatomical structures using statistical and morphological priors

    NASA Astrophysics Data System (ADS)

    Ashton, Edward A.; Du, Tong

    2004-05-01

    Rapid, accurate and reproducible delineation and measurement of arbitrary anatomical structures in medical images is a widely held goal, with important applications in both clinical diagnostics and, perhaps more significantly, pharmaceutical trial evaluation. This process requires the ability first to localize a structure within the body, and then to find a best approximation of the structure's boundaries within a given scan. Structures that are tortuous and small in cross section, such as the hippocampus in the brain or the abdominal aorta, present a particular challenge. Their apparent shape and position can change significantly from slice to slice, and accurate prior shape models for such structures are often difficult to form. In this work, we have developed a system that makes use of both a user-defined shape model and a statistical maximum likelihood classifier to identify and measure structures of this sort in MRI and CT images. Experiments show that this system can reduce analysis time by 75% or more with respect to manual tracing, with no loss of precision or accuracy.

  17. A semi-automated system for the characterization of NLC accelerating structures

    SciTech Connect

    Hanna, S.M.; Bowden, G.B.; Hoag, H.A.; Loewen, R.; Vlieks, A.E.; Wang, J.W.

    1995-06-01

    A system for characterizing the phase shift per cell of a long X-band accelerator structure is described. The fields within the structure are perturbed by a small cylindrical metal bead pulled along the axis. A computer controls the bead position and processes the data from a network analyzer connected to the accelerator section. Measurements made on prototype accelerator sections are described, and they are shown to be in good agreement with theory.

  18. Automated Lipid A Structure Assignment from Hierarchical Tandem Mass Spectrometry Data

    NASA Astrophysics Data System (ADS)

    Ting, Ying S.; Shaffer, Scott A.; Jones, Jace W.; Ng, Wailap V.; Ernst, Robert K.; Goodlett, David R.

    2011-05-01

    Infusion-based electrospray ionization (ESI) coupled to multiple-stage tandem mass spectrometry (MSn) is a standard methodology for investigating lipid A structural diversity (Shaffer et al. J. Am. Soc. Mass Spectrom. 18(6), 1080-1092, 2007). Annotation of these MSn spectra, however, has remained a manual, expert-driven process. In order to keep up with the data acquisition rates of modern instruments, we devised a computational method to annotate lipid A MSn spectra rapidly and automatically, which we refer to as the hierarchical tandem mass spectrometry (HiTMS) algorithm. As a first-pass tool, HiTMS aids expert interpretation of lipid A MSn data by providing the analyst with a set of candidate structures that may then be confirmed or rejected. HiTMS deciphers the signature ions (e.g., A-, Y-, and Z-type ions) and neutral losses of MSn spectra using a species-specific library based on general prior structural knowledge of the given lipid A species under investigation. Candidates are selected by calculating the correlation between theoretical and acquired MSn spectra. At a false discovery rate of less than 0.01, HiTMS correctly assigned 85% of the structures in a library of 133 manually annotated Francisella tularensis subspecies novicida lipid A structures. Additionally, HiTMS correctly assigned 85% of the structures in a smaller library of lipid A species from Yersinia pestis, demonstrating that it may be used across species.
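
    Candidate scoring by correlating theoretical against acquired spectra can be sketched by binning both peak lists on m/z and taking a Pearson correlation. A minimal sketch (Python/NumPy; the bin width and peak lists are illustrative, and HiTMS's actual scoring may differ):

        import numpy as np

        def spectrum_correlation(theoretical, acquired, bin_width=1.0, mz_max=2000.0):
            # Each spectrum is a list of (m/z, intensity) pairs
            edges = np.arange(0.0, mz_max + bin_width, bin_width)

            def binned(spec):
                mz, inten = np.asarray(spec).T
                vec, _ = np.histogram(mz, bins=edges, weights=inten)
                return vec

            return np.corrcoef(binned(theoretical), binned(acquired))[0, 1]

        theo = [(97.0, 1.0), (241.2, 0.6), (528.3, 0.3)]                  # hypothetical signature ions
        meas = [(97.1, 0.9), (241.3, 0.7), (528.1, 0.2), (655.4, 0.1)]
        print(spectrum_correlation(theo, meas))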

  19. Comparison of the bacterial community structure within the equine hindgut and faeces using Automated Ribosomal Intergenic Spacer Analysis (ARISA).

    PubMed

    Sadet-Bourgeteau, S; Philippeau, C; Dequiedt, S; Julliand, V

    2014-12-01

    The horse's hindgut bacterial ecosystem has often been studied using faecal samples. However, few studies have compared the two bacterial ecosystems, and the validity of using faecal samples may be questionable. Hence, the present study aimed to compare the structure of the equine bacterial community in the hindgut (caecum, right ventral colon) and faeces using a fingerprint technique known as Automated Ribosomal Intergenic Spacer Analysis (ARISA). Two DNA extraction methods were also assessed. Intestinal contents and faeces were sampled 3 h after the morning meal from four adult fistulated horses fed meadow hay and pelleted concentrate. Irrespective of the intestinal segment, Principal Component Analysis of ARISA profiles showed a strong individual effect (P<0.0001). However, across the study, the faecal bacterial community structure significantly (P<0.001) differed from those of the caecum and colon, while there was no difference between the two hindgut communities. The use of a QIAamp® DNA Stool Mini kit increased the quality of DNA extracted irrespective of sample type. The differences observed between faecal and hindgut bacterial communities challenge the use of faeces as a representative of hindgut activity. Further investigations are necessary to compare bacterial activity between the hindgut and faeces in order to understand the validity of using faecal samples. PMID:25075719

  20. Using Structure-Based Organic Chemistry Online Tutorials with Automated Correction for Student Practice and Review

    ERIC Educational Resources Information Center

    O'Sullivan, Timothy P.; Hargaden, Gráinne C.

    2014-01-01

    This article describes the development and implementation of an open-access organic chemistry question bank for online tutorials and assessments at University College Cork and Dublin Institute of Technology. SOCOT (structure-based organic chemistry online tutorials) may be used to supplement traditional small-group tutorials, thereby allowing…

  1. Automated global structure extraction for effective local building block processing in XCS.

    PubMed

    Butz, Martin V; Pelikan, Martin; Llorà, Xavier; Goldberg, David E

    2006-01-01

    Learning Classifier Systems (LCSs), such as the accuracy-based XCS, evolve distributed problem solutions represented by a population of rules. During evolution, features are specialized, propagated, and recombined to provide increasingly accurate subsolutions. Recently, it was shown that, as in conventional genetic algorithms (GAs), some problems require efficient processing of subsets of features to find problem solutions efficiently. In such problems, standard variation operators of genetic and evolutionary algorithms used in LCSs suffer from potential disruption of groups of interacting features, resulting in poor performance. This paper introduces efficient crossover operators to XCS by incorporating techniques derived from competent GAs: the extended compact GA (ECGA) and the Bayesian optimization algorithm (BOA). Instead of simple crossover operators such as uniform crossover or one-point crossover, ECGA or BOA-derived mechanisms are used to build a probabilistic model of the global population and to generate offspring classifiers locally using the model. Several offspring generation variations are introduced and evaluated. The results show that it is possible to achieve performance similar to runs with an informed crossover operator that is specifically designed to yield ideal problem-dependent exploration, exploiting provided problem structure information. Thus, we create the first competent LCSs, XCS/ECGA and XCS/BOA, that detect dependency structures online and propagate corresponding lower-level dependency structures effectively without any information about these structures given in advance.
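
    The key departure from uniform or one-point crossover is that offspring are sampled from a model of the population rather than spliced from two parents. A minimal sketch of the sampling side, assuming the linkage groups are already known; ECGA actually learns these groups with a minimum-description-length criterion, and BOA learns a Bayesian network instead:

```python
import numpy as np

rng = np.random.default_rng(0)

def build_model(population, groups):
    """Estimate a marginal product model: a joint distribution over each
    linked feature group, independent across groups (ECGA-style)."""
    model = []
    for g in groups:
        sub = population[:, g]                       # rows restricted to group g
        patterns, counts = np.unique(sub, axis=0, return_counts=True)
        model.append((g, patterns, counts / counts.sum()))
    return model

def sample_offspring(model, n, n_bits):
    """Generate offspring by sampling each group's joint distribution."""
    off = np.zeros((n, n_bits), dtype=int)
    for g, patterns, probs in model:
        idx = rng.choice(len(patterns), size=n, p=probs)
        off[:, g] = patterns[idx]
    return off

# Toy usage: a 6-bit problem with two linked groups of 3 bits each.
pop = rng.integers(0, 2, size=(20, 6))
groups = [[0, 1, 2], [3, 4, 5]]
children = sample_offspring(build_model(pop, groups), n=20, n_bits=6)
```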

  2. Automated assignment of NMR chemical shifts based on a known structure and 4D spectra.

    PubMed

    Trautwein, Matthias; Fredriksson, Kai; Möller, Heiko M; Exner, Thomas E

    2016-08-01

    Apart from their central role during 3D structure determination of proteins, the backbone chemical shift assignment is the basis for a number of applications, like chemical shift perturbation mapping and studies on the dynamics of proteins. This assignment is not a trivial task even if a 3D protein structure is known and, if performed manually, needs almost as much effort as the assignment for structure prediction. We present here a new algorithm based solely on 4D [(1)H,(15)N]-HSQC-NOESY-[(1)H,(15)N]-HSQC spectra which is able to assign a large percentage of chemical shifts (73-82%) unambiguously, demonstrated with proteins up to a size of 250 residues. For the remaining residues, a small number of possible assignments is filtered out. This is done by comparing distances in the 3D structure to restraints obtained from the peak volumes in the 4D spectrum. Using dead-end elimination, assignments are removed in which at least one of the restraints is violated. Including additional information from chemical shift predictions, a complete unambiguous assignment was obtained for Ubiquitin, and 95% of the residues were correctly assigned in the 251-residue N-terminal domain of enzyme I. The program including source code is available at https://github.com/thomasexner/4Dassign. PMID:27484442
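
    A toy rendering of the dead-end elimination step described above: assignments whose structure-derived distances cannot satisfy a NOE-derived upper bound are discarded until a fixed point is reached. The containers and their layout are our assumptions (restraints are assumed listed in both orientations and the distance map keyed symmetrically):

```python
def dead_end_elimination(candidates, restraints, distances):
    """candidates: {peak: set of candidate residues}
    restraints:  [(peak_i, peak_j, upper_bound_in_angstroms)]
    distances:   {(res_a, res_b): distance from the 3D structure}"""
    changed = True
    while changed:                     # iterate until no assignment is removed
        changed = False
        for peak_i, peak_j, bound in restraints:
            for res_a in list(candidates[peak_i]):
                # res_a survives only if some partner residue can satisfy the bound
                ok = any(distances.get((res_a, res_b), float("inf")) <= bound
                         for res_b in candidates[peak_j])
                if not ok:
                    candidates[peak_i].discard(res_a)
                    changed = True
    return candidates
```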

  3. Low-Cost Impact Detection and Location for Automated Inspections of 3D Metallic Based Structures

    PubMed Central

    Morón, Carlos; Portilla, Marina P.; Somolinos, José A.; Morales, Rafael

    2015-01-01

    This paper describes a new low-cost means to detect and locate mechanical impacts (collisions) on a 3D metal-based structure. We employ the simple and reasonable hypothesis that the use of a homogeneous material will allow certain details of the impact to be automatically determined by measuring the time delays of acoustic wave propagation throughout the 3D structure. The placement of strategic piezoelectric sensors on the structure, together with an electronic-computerized system, has allowed us to determine the instant and position at which the impact is produced. The proposed automatic system allows us to fully integrate impact point detection and the task of inspecting the point or zone at which this impact occurs. What is more, the proposed method can be easily integrated into a robot-based inspection system capable of moving over 3D metallic structures, thus avoiding (or minimizing) the need for direct human intervention. Experimental results are provided to show the effectiveness of the proposed approach. PMID:26029951
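
    The localization idea, time delays of a wave front reaching spatially distributed sensors, can be illustrated with a simple grid search over candidate impact points. We assume straight-line propagation at a known speed; in a real lattice structure the waves travel along the members, so the distance model would have to follow the geometry:

```python
import numpy as np

def locate_impact(sensors, arrival_times, speed, grid):
    """Grid-search the impact point from arrival-time differences (TDOA).
    sensors: (N, 3) sensor coordinates; arrival_times: (N,) measured times;
    speed: assumed acoustic wave speed; grid: (M, 3) candidate points."""
    dt_meas = arrival_times - arrival_times[0]        # delays w.r.t. sensor 0
    best, best_err = None, np.inf
    for p in grid:
        d = np.linalg.norm(sensors - p, axis=1)       # assumed path lengths
        dt_pred = (d - d[0]) / speed                  # predicted delays
        err = np.sum((dt_pred - dt_meas) ** 2)
        if err < best_err:
            best, best_err = p, err
    return best
```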

  4. Automated antibody structure prediction using Accelrys tools: Results and best practices

    PubMed Central

    Fasnacht, Marc; Butenhof, Ken; Goupil-Lamy, Anne; Hernandez-Guzman, Francisco; Huang, Hongwei; Yan, Lisa

    2014-01-01

    We describe the methodology and results from our participation in the second Antibody Modeling Assessment experiment. During the experiment we predicted the structure of eleven unpublished antibody Fv fragments. Our prediction methods centered on template-based modeling; potential templates were selected from an antibody database based on their sequence similarity to the target in the framework regions. Depending on the quality of the templates, we constructed models of the antibody framework regions using a single-template, chimeric, or multiple-template approach. The hypervariable loop regions in the initial models were rebuilt by grafting the corresponding regions from suitable templates onto the model. For the H3 loop region, we further refined models using ab initio methods. The final models were subjected to constrained energy minimization to resolve severe local structural problems. The analysis of the submitted models shows that Accelrys tools allow for the construction of quite accurate models for the framework and the canonical CDR regions, with RMSDs to the X-ray structure on average below 1 Å for most of these regions. The results show that accurate prediction of the H3 hypervariable loops remains a challenge. Furthermore, model quality assessment of the submitted models shows that the models are of quite high quality, with local geometry assessment scores similar to those of the target X-ray structures. Proteins 2014; 82:1583–1598. © 2014 The Authors. Proteins published by Wiley Periodicals, Inc. PMID:24833271

  5. Semi-automated structural analysis of high resolution magnetic and gamma-ray spectrometry airborne surveys

    NASA Astrophysics Data System (ADS)

    Debeglia, N.; Martelet, G.; Perrin, J.; Truffert, C.; Ledru, P.; Tourlière, B.

    2005-08-01

    A user-controlled procedure was implemented for the structural analysis of geophysical maps. Local edge segments are first extracted using a suitable edge detector function, then linked into straight discontinuities and, finally, organised into complex boundary lines best delineating geophysical features. Final boundary lines may be attributed by a geologist to lithological contacts and/or structural geological features. Tests of some edge detectors, (i) horizontal gradient magnitude (HGM), (ii) various orders of the analytic signal (An), reduced to the pole or not, (iii) enhanced horizontal derivative (EHD), (iv) composite analytic signal (CAS), were performed on synthetic magnetic data (with and without noise). As a result of these comparisons, the horizontal gradient appears to remain the best operator for the analysis of magnetic data. Computation of gradients in the frequency domain, including filtering and upward continuation of noisy data, is well suited to the extraction of magnetic gradients associated with deep sources, while space-domain smoothing and differentiation techniques are generally preferable in the case of shallow magnetic sources, or for gamma-ray spectrometry analysis. Algorithms for edge extraction, segment linking, and line following can be controlled by choosing an adequate edge detector and processing parameters, which allows adaptation to a desired scale of interpretation. Tests on synthetic and real case data demonstrate the adaptability of the procedure and its ability to produce a basic layer for multi-data analysis. The method was applied to the interpretation of high-resolution airborne magnetic and gamma-ray spectrometry data collected in northern Namibia. It allowed the delineation of dyke networks concealed by superficial weathering and demonstrated the presence of lithological variations in alluvial flows. The outputs from the structural analysis procedure are compatible with standard GIS software and enable the geologist to (i) compare
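
    For a gridded anomaly map, the HGM operator singled out above is simply the magnitude of the horizontal gradient, HGM = sqrt((dT/dx)^2 + (dT/dy)^2). A minimal sketch with a crude ridge-picking step; the segment-linking and line-following stages of the procedure are not reproduced here:

```python
import numpy as np
from scipy import ndimage

def horizontal_gradient_magnitude(field, dx=1.0, dy=1.0):
    """HGM of a gridded map; ridges of the HGM mark source edges."""
    gy, gx = np.gradient(field, dy, dx)   # rows = y, columns = x
    return np.hypot(gx, gy)

def ridge_pixels(hgm):
    """Crude edge picking: keep pixels that are 3x3-neighbourhood maxima."""
    return hgm == ndimage.maximum_filter(hgm, size=3)
```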

  6. Automating unambiguous NOE data usage in NVR for NMR protein structure-based assignments.

    PubMed

    Akhmedov, Murodzhon; Çatay, Bülent; Apaydın, Mehmet Serkan

    2015-12-01

    Nuclear Magnetic Resonance (NMR) Spectroscopy is an important technique that allows determining protein structure in solution. An important problem in protein structure determination using NMR spectroscopy is the mapping of peaks to corresponding amino acids, also known as the assignment problem. Structure-Based Assignment (SBA) is an approach to solve this problem using a template structure that is homologous to the target. Our previously developed approach Nuclear Vector Replacement-Binary Integer Programming (NVR-BIP) computed the optimal solution for small proteins, but was unable to solve the assignments of large proteins. NVR-Ant Colony Optimization (ACO) extended the applicability of the NVR approach for such proteins. One of the input data utilized in these approaches is the Nuclear Overhauser Effect (NOE) data. NOE is an interaction observed between two protons if the protons are located close in space. These protons could be amide protons, protons attached to the alpha-carbon atom in the backbone of the protein, or side chain protons. NVR only uses backbone protons. In this paper, we reformulate the NVR-BIP model to distinguish the type of proton in NOE data and use the corresponding proton coordinates in the extended formulation. In addition, the threshold value over interproton distances is set in a standard manner for all proteins by extracting the NOE upper bound distance information from the data. We also convert NOE intensities into distance thresholds. Our new approach thus handles the NOE data correctly and without manually determined parameters. We accordingly adapt NVR-ACO solution methodology to these changes. Computational results show that our approaches obtain optimal solutions for small proteins. For the large proteins our ant colony optimization-based approach obtains promising results.
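
    The conversion of NOE intensities into distance thresholds rests on the approximate I proportional to r^-6 dependence of the NOE. A sketch assuming calibration against a reference proton pair of known separation; the reference values below are illustrative only:

```python
def noe_distance_threshold(intensity, ref_intensity, ref_distance=1.8):
    """Upper-bound distance from a cross-peak intensity via I ~ r^-6,
    calibrated on a reference pair (e.g. geminal protons at ~1.8 Angstrom)."""
    return ref_distance * (ref_intensity / intensity) ** (1.0 / 6.0)

# A peak 64x weaker than the reference implies twice the distance: ~3.6 A.
print(noe_distance_threshold(intensity=1.0, ref_intensity=64.0))
```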

  8. A Machine Learning Approach to Automated Structural Network Analysis: Application to Neonatal Encephalopathy

    PubMed Central

    Ziv, Etay; Tymofiyeva, Olga; Ferriero, Donna M.; Barkovich, A. James; Hess, Chris P.; Xu, Duan

    2013-01-01

    Neonatal encephalopathy represents a heterogeneous group of conditions associated with life-long developmental disabilities and neurological deficits. Clinical measures and current anatomic brain imaging remain inadequate predictors of outcome in children with neonatal encephalopathy. Some studies have suggested that brain development and, therefore, brain connectivity may be altered in the subgroup of patients who subsequently go on to develop clinically significant neurological abnormalities. Large-scale structural brain connectivity networks constructed using diffusion tractography have been posited to reflect organizational differences in white matter architecture at the mesoscale, and thus offer a unique tool for characterizing brain development in patients with neonatal encephalopathy. In this manuscript we use diffusion tractography to construct structural networks for a cohort of patients with neonatal encephalopathy. We systematically map these networks to a high-dimensional space and then apply standard machine learning algorithms to predict neurological outcome in the cohort. Using nested cross-validation we demonstrate high prediction accuracy that is both statistically significant and robust over a broad range of thresholds. Our algorithm offers a novel tool to evaluate neonates at risk for developing neurological deficit. The described approach can be applied to any brain pathology that affects structural connectivity. PMID:24282501
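
    The evaluation scheme, nested cross-validation over vectorized connectivity networks, can be sketched compactly. We assume each subject's connectivity matrix is flattened to its upper triangle and that an RBF-kernel SVM stands in for the classifier (the abstract does not name the estimator; the nested split is the point):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.svm import SVC

def connectome_features(matrices):
    """Flatten the upper triangle of each subject's connectivity matrix."""
    iu = np.triu_indices(matrices[0].shape[0], k=1)
    return np.array([m[iu] for m in matrices])

def nested_cv_accuracy(X, y):
    """Inner CV tunes the hyperparameters; outer CV estimates accuracy
    without letting the tuning leak into the estimate."""
    inner = KFold(5, shuffle=True, random_state=0)
    outer = KFold(5, shuffle=True, random_state=1)
    grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}
    clf = GridSearchCV(SVC(kernel="rbf"), grid, cv=inner)
    return cross_val_score(clf, X, y, cv=outer).mean()
```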

  9. Automated procedure for design of wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1973-01-01

    A pilot computer program was developed for the design of minimum mass wing structures under flutter, strength, and minimum gage constraints. The wing structure is idealized by finite elements, and second-order piston theory aerodynamics is used in the flutter calculation. Mathematical programing methods are used for the optimization. Computation times during the design process are reduced by three techniques. First, iterative analysis methods are used to significantly reduce reanalysis times. Second, the number of design variables is kept small by not using a one-to-one correspondence between finite elements and design variables. Third, a technique for using approximate second derivatives with Newton's method for the optimization is incorporated. The program output is compared with previously published results. It is found that some flutter characteristics, such as the flutter speed, can display discontinuous dependence on the design variables (which are the thicknesses of the structural elements). It is concluded that it is undesirable to use such quantities in the formulation of the flutter constraint.

  10. CHEM-PATH-TRACKER: An automated tool to analyze chemical motifs in molecular structures.

    PubMed

    Ribeiro, João V; Cerqueira, N M F S A; Fernandes, Pedro A; Ramos, Maria J

    2014-07-01

    In this article, we propose a method for locating functionally relevant chemical motifs in protein structures. The chemical motifs can be small groups of residues or protein structure fragments with highly conserved properties that have important biological functions. However, the detection of chemical motifs is rather difficult because they often consist of a set of amino acid residues separated by long, variable regions, and they only come together to form a functional group when the protein is folded into its three-dimensional structure. Furthermore, the assemblage of these residues is often dependent on non-covalent interactions among the constituent amino acids that are difficult to detect or visualize. To simplify the analysis of these chemical motifs and make it generally accessible to all users, we developed chem-path-tracker. This software is a VMD plug-in that allows the user to highlight and reveal potential chemical motifs with only a few selections. The analysis is based on atom/residue pair distances applying a modified version of Dijkstra's algorithm, and it makes it possible to monitor the distances of a large pathway, even during a molecular dynamics simulation. This tool turned out to be very useful, fast, and user-friendly in the performed tests. The chem-path-tracker package is distributed as an independent platform and can be found at http://www.fc.up.pt/PortoBioComp/database/doku.php?id=chem-path-tracker. PMID:24775806
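
    The distance-pathway idea can be illustrated with plain Dijkstra over a graph whose nodes are residues and whose edges connect pairs closer than a cutoff; the authors use a modified variant inside VMD, so the data layout and cutoff here are our assumptions:

```python
import heapq

def dijkstra_path(dist, source, target, cutoff=5.0):
    """dist: full pairwise residue-distance matrix (in Angstroms).
    Edges exist where distance <= cutoff; edge weight = the distance."""
    n = len(dist)
    best, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > best.get(u, float("inf")):
            continue                     # stale heap entry
        for v in range(n):
            if v != u and dist[u][v] <= cutoff:
                nd = d + dist[u][v]
                if nd < best.get(v, float("inf")):
                    best[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
    if target not in prev and target != source:
        return None                      # no pathway under this cutoff
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1]
```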

  11. Automated preliminary design of simplified wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.; Dexter, C. B.; Stein, M.

    1972-01-01

    A simple structural model of an aircraft wing is used to show the effects of strength (stress) and flutter requirements on the design of minimum-weight aircraft-wing structures. The wing is idealized as an isotropic sandwich plate with a variable cover thickness distribution and a variable depth between covers. Plate theory is used for the structural analysis, and piston theory is used for the unsteady aerodynamics in the flutter analysis. Mathematical programming techniques are used to find the minimum-weight cover thickness distribution which satisfies flutter, strength, and minimum-gage constraints. The method of solution, some sample results, and the computer program used to obtain these results are presented. The results indicate that the cover thickness distribution obtained when designing for the strength requirement alone may be quite different from the cover thickness distribution obtained when designing for either the flutter requirement alone or for both the strength and flutter requirements concurrently. This conclusion emphasizes the need for designing for both flutter and strength from the outset.

  12. Automated Voxel Model from Point Clouds for Structural Analysis of Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Bitelli, G.; Castellazzi, G.; D'Altri, A. M.; De Miranda, S.; Lambertini, A.; Selvaggi, I.

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings with different complexity and dimensions; one typical product is a point cloud. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. In order to achieve this result, a voxel model with variable resolution is produced. Different parameters are compared, and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
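
    The core point-cloud-to-voxel conversion can be sketched as occupancy binning; the variable-resolution scheme and the filling of the interior described in the paper are omitted (scipy.ndimage.binary_fill_holes would be one plausible way to obtain a filled model):

```python
import numpy as np

def voxelize(points, voxel_size):
    """Bin an (N, 3) point cloud into a boolean occupancy grid.
    Occupied voxels can later be exported as hexahedral cells for FEM."""
    mins = points.min(axis=0)
    idx = np.floor((points - mins) / voxel_size).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid, mins
```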

  13. Automated method for the identification and analysis of vascular tree structures in retinal vessel network

    NASA Astrophysics Data System (ADS)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2011-03-01

    Structural analysis of the retinal vessel network has so far served in the diagnosis of retinopathies and systemic diseases. The retinopathies are known to affect the morphologic properties of retinal vessels such as course, shape, caliber, and tortuosity. Whether the arteries and the veins respond to these changes together or independently has always been a topic of discussion. However, diseases such as diabetic retinopathy and retinopathy of prematurity have been diagnosed from morphologic changes specific either to arteries or to veins. Thus a method for separating the retinal vessel trees imaged in a two-dimensional color fundus image may assist in artery-vein classification and quantitative assessment of morphologic changes particular to arteries or veins. We propose a method based on mathematical morphology and graph search to identify and label the retinal vessel trees, which provides a structural mapping of the vessel network in terms of each individual primary vessel, its branches, and the spatial positions of branching and cross-over points. The method was evaluated on a dataset of 15 fundus images, resulting in an accuracy of 92.87% correctly assigned vessel pixels when compared with the manual labeling of separated vessel trees. Accordingly, the structural mapping method performs well, and we are currently investigating its potential in evaluating the characteristic properties specific to arteries or veins.

  14. Automated polyp measurement based on colon structure decomposition for CT colonography

    NASA Astrophysics Data System (ADS)

    Wang, Huafeng; Li, Lihong C.; Han, Hao; Peng, Hao; Song, Bowen; Wei, Xinzhou; Liang, Zhengrong

    2014-03-01

    Accurate assessment of colorectal polyp size is of great significance for early diagnosis and management of colorectal cancers. Due to the complexity of colon structure, polyps with diverse geometric characteristics grow from different landform surfaces. In this paper, we present a new colon decomposition approach for polyp measurement. We first apply an efficient maximum a posteriori expectation-maximization (MAP-EM) partial volume segmentation algorithm to achieve an effective electronic cleansing of the colon. The global colon structure is then decomposed into different kinds of morphological shapes, e.g. haustral folds or haustral wall. Meanwhile, the polyp location is identified by an automatic computer-aided detection algorithm. By integrating the colon structure decomposition with the computer-aided detection system, a patch volume of colon polyps is extracted. Thus, polyp size assessment can be achieved by finding abnormal protrusions on a relatively uniform morphological surface from the decomposed colon landform. We evaluated our method via physical phantom and clinical datasets. Experiment results demonstrate the feasibility of our method in consistently quantifying polyp volume and, therefore, facilitating characterization for clinical management.

  15. Automated tracing of open-field coronal structures for an optimized large-scale magnetic field reconstruction

    NASA Astrophysics Data System (ADS)

    Uritsky, V. M.; Davila, J. M.; Jones, S. I.

    2014-12-01

    Solar Probe Plus and Solar Orbiter will provide detailed measurements in the inner heliosphere magnetically connected with the topologically complex and eruptive solar corona. Interpretation of these measurements will require accurate reconstruction of the large-scale coronal magnetic field. In a related presentation by S. Jones et al., we argue that such reconstruction can be performed using photospheric extrapolation methods constrained by white-light coronagraph images. Here, we present the image-processing component of this project dealing with an automated segmentation of fan-like coronal loop structures. In contrast to the existing segmentation codes designed for detecting small-scale closed loops in the vicinity of active regions, we focus on the large-scale geometry of the open-field coronal features observed at significant radial distances from the solar surface. The coronagraph images used for the loop segmentation are transformed into a polar coordinate system and undergo radial detrending and initial noise reduction. The preprocessed images are subject to an adaptive second order differentiation combining radial and azimuthal directions. An adjustable thresholding technique is applied to identify candidate coronagraph features associated with the large-scale coronal field. A blob detection algorithm is used to extract valid features and discard noisy data pixels. The obtained features are interpolated using higher-order polynomials which are used to derive empirical directional constraints for magnetic field extrapolation procedures based on photospheric magnetograms.
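
    A sketch of the first two preprocessing stages, resampling a coronagraph image onto an (r, theta) grid and dividing out the steep radial brightness falloff. Image conventions (center, sampling) are assumed; the differentiation, thresholding, and blob-detection stages are not reproduced:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def to_polar(img, center, n_r=300, n_theta=720, r_max=None):
    """Resample a coronagraph image onto an (r, theta) grid."""
    cy, cx = center
    r_max = r_max or min(cy, cx)
    r = np.linspace(0.0, r_max, n_r)
    t = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, t, indexing="ij")
    coords = np.array([cy + rr * np.sin(tt), cx + rr * np.cos(tt)])
    return map_coordinates(img, coords, order=1)

def radial_detrend(polar):
    """Divide out the mean radial brightness profile so faint large-scale
    rays become comparable across heights."""
    profile = polar.mean(axis=1, keepdims=True)
    return polar / np.where(profile > 0, profile, 1.0)
```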

  16. Upper-mantle shear-wave structure under East and Southeast Asia from Automated Multimode Inversion of waveforms

    NASA Astrophysics Data System (ADS)

    Legendre, C. P.; Zhao, L.; Chen, Q.-F.

    2015-10-01

    We present a new Sv-velocity model of the upper mantle under East and Southeast Asia constrained by the inversion of seismic waveforms recorded by broad-band stations. Seismograms from earthquakes that occurred between 1977 and 2012 are collected from about 4786 permanent and temporary stations in the region whenever and wherever available. Automated Multimode Inversion of surface and multiple-S waveforms is applied to extract structural information from the seismograms, in the form of linear equations with uncorrelated uncertainties. The equations are then solved for the seismic velocity perturbations in the crust and upper mantle with respect to a three-dimensional (3-D) reference model and a realistic crust. Major features of the lithosphere-asthenosphere system in East and Southeast Asia are identified in the resulting model. At lithospheric depths, low velocities can be seen beneath Tibet, whereas high velocities are found beneath cratonic regions, such as the Siberian, North China, Yangtze, Tarim, and Dharwar cratons. A number of microplates are mapped, and their interaction with neighbouring plates is discussed. Slabs from the Pacific and Indian Oceans can be seen in the upper mantle. Passive marginal basins and subduction zones are also properly resolved.

  17. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. PMID:21357413
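
    As a flavor of the intermediary step, a sketch that serializes one quantitative measurement to XML with the standard library. The element and attribute names below are illustrative only; they do not follow the actual AIM schema:

```python
import xml.etree.ElementTree as ET

def measurement_to_xml(study_uid, label, value, unit):
    """Wrap one quantitative finding in a machine-readable XML fragment."""
    root = ET.Element("ImageAnnotation", studyInstanceUid=study_uid)
    calc = ET.SubElement(root, "Calculation", description=label)
    ET.SubElement(calc, "Value", unit=unit).text = str(value)
    return ET.tostring(root, encoding="unicode")

# e.g. a lesion diameter measured at an advanced imaging workstation
print(measurement_to_xml("1.2.3.4", "LesionDiameter", 14.2, "mm"))
```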

  19. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
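
    A minimal GA for DSM clustering in the spirit described: a chromosome assigns each element to a cluster, and fitness penalizes dependencies that fall outside every cluster. The encoding, operators, and penalty are our assumptions; the paper's Excel-macro implementation is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(42)

def fitness(assign, dsm):
    """Sum of dependency weights between elements in different clusters."""
    same = assign[:, None] == assign[None, :]
    return dsm[~same].sum()

def ga_cluster(dsm, n_clusters=3, pop_size=40, generations=200):
    """dsm: symmetric numpy dependency matrix. Returns a cluster label
    per element, minimizing inter-cluster coupling."""
    n = len(dsm)
    pop = rng.integers(0, n_clusters, size=(pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind, dsm) for ind in pop])
        pop = pop[np.argsort(scores)]             # best individuals first
        for i in range(pop_size // 2, pop_size):  # breed over the worst half
            a, b = pop[rng.integers(0, pop_size // 2, size=2)]
            cut = rng.integers(1, n)
            child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
            mask = rng.random(n) < 0.05                 # light mutation
            child[mask] = rng.integers(0, n_clusters, size=mask.sum())
            pop[i] = child
    return pop[0]
```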

  20. Endoscopic system for automated high dynamic range inspection of moving periodic structures

    NASA Astrophysics Data System (ADS)

    Hahlweg, Cornelius; Rothe, Hendrik

    2015-09-01

    In the current paper an advanced endoscopic system for high-resolution and high dynamic range inspection of periodic structures in rotating machines is presented. We address the system architecture, short-time illumination, special optical problems, such as excluding the specular reflex, image processing, forward velocity prediction and metrological image processing. There are several special requirements to be met, such as thermal stability above 100°C, robustness of the image field, illumination in the view direction and the separation of diffuse scatter from metallic surfaces. To find a compromise between image resolution and frame rate, an external sensor system was applied for synchronization with the moving target. The system was originally intended for the inspection of thermal engines, but turned out to be of more general use. Besides the theoretical part and dimensioning issues, practical examples and measurement results are included.

  1. Automated segmentation of intramacular layers in Fourier domain optical coherence tomography structural images from normal subjects

    PubMed Central

    Zhang, Xusheng; Yousefi, Siavash; An, Lin

    2012-01-01

    Segmentation of optical coherence tomography (OCT) cross-sectional structural images is important for assisting ophthalmologists in clinical decision making in terms of both diagnosis and treatment. We present an automatic approach for segmenting intramacular layers in Fourier domain optical coherence tomography (FD-OCT) images using a searching strategy based on locally weighted gradient extrema, coupled with an error-removing technique based on statistical error estimation. A two-step denoising preprocess in different directions is also employed to suppress random speckle noise while preserving the layer boundaries as intact as possible. The algorithm was tested on FD-OCT volume images obtained from four normal subjects; it successfully identified the boundaries of seven physiological layers, consistent with results based on manual determination of macular OCT images. PMID:22559689
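
    A toy version of a gradient-extrema search: for each A-scan, pick the strongest axial gradient, weighted toward the boundary found in the neighbouring A-scan to keep the surface smooth. The Gaussian weighting and its width are our assumptions; the paper's two-step denoising and statistical error removal are omitted:

```python
import numpy as np

def boundary_positions(bscan, weight_sigma=5.0):
    """For each column (A-scan) of a B-scan, return the depth index of the
    locally weighted axial-gradient maximum as a candidate layer boundary."""
    grad = np.gradient(bscan.astype(float), axis=0)      # axial gradient
    depths = np.arange(bscan.shape[0])
    boundaries = []
    for col in grad.T:
        strength = np.abs(col)
        if boundaries:
            # Gaussian weight centered on the previous column's estimate
            w = np.exp(-0.5 * ((depths - boundaries[-1]) / weight_sigma) ** 2)
            strength = strength * w
        boundaries.append(int(np.argmax(strength)))
    return np.array(boundaries)
```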

  2. Direct evidence of intra- and interhemispheric corticomotor network degeneration in amyotrophic lateral sclerosis: an automated MRI structural connectivity study.

    PubMed

    Rose, Stephen; Pannek, Kerstin; Bell, Christopher; Baumann, Fusun; Hutchinson, Nicole; Coulthard, Alan; McCombe, Pamela; Henderson, Robert

    2012-02-01

    Although the pathogenesis of amyotrophic lateral sclerosis (ALS) is uncertain, there is mounting neuroimaging evidence to suggest a mechanism involving the degeneration of multiple white matter (WM) motor and extramotor neural networks. This insight has been achieved, in part, by using MRI Diffusion Tensor Imaging (DTI) and the voxelwise analysis of anisotropy indices, along with DTI tractography to determine which specific motor pathways are involved with ALS pathology. Automated MRI structural connectivity analyses, which probe WM connections linking various functionally discrete cortical regions, have the potential to provide novel information about degenerative processes within multiple WM pathways. Our hypothesis is that measures of altered intra- and interhemispheric structural connectivity of the primary motor and somatosensory cortex will provide an improved assessment of corticomotor involvement in ALS. To test this hypothesis, we acquired High Angular Resolution Diffusion Imaging (HARDI) scans along with high-resolution structural images (sMRI) on 15 patients with clinical evidence of upper and lower motor neuron involvement, and 20 matched control participants. Whole brain probabilistic tractography was applied to define specific WM pathways connecting discrete corticomotor targets generated from anatomical parcellation of sMRI of the brain. The integrity of these connections was interrogated by comparing the mean fractional anisotropy (FA) derived for each WM pathway. To assist in the interpretation of results, we measured the reproducibility of the FA summary measures over time (6 months) in control participants. We also incorporated into our analysis pipeline the evaluation and replacement of outlier voxels due to head motion and physiological noise. When assessing corticomotor connectivity, we found a significant reduction in mean FA within a number of intra- and interhemispheric motor pathways in ALS patients. The abnormal

  3. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to the high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality for reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of an analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility, in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  4. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  5. Automated grid generation from models of complex geologic structure and stratigraphy

    SciTech Connect

    Gable, C.; Trease, H.; Cherry, T.

    1996-04-01

    The construction of computational grids which accurately reflect complex geologic structure and stratigraphy for flow and transport models poses a formidable task. Even with an understanding of stratigraphy, material properties and boundary and initial conditions, the task of incorporating these data into a numerical model can be difficult and time consuming. Most GIS tools for representing complex geologic volumes and surfaces are not designed for producing optimal grids for flow and transport computation. We have developed a tool, GEOMESH, for generating finite element grids that maintain the geometric integrity of input volumes, surfaces, and geologic data and produce an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. GEOMESH also satisfies the constraint that the geometric coupling coefficients of the grid are positive for all elements. GEOMESH generates grids for two-dimensional cross sections and three-dimensional regional models, represents faults and fractures, and can incorporate finer grids representing tunnels and well bores. GEOMESH also permits adaptive grid refinement in three dimensions. The tools to glue, merge and insert grids together demonstrate how complex grids can be built from simpler pieces. The resulting grid can be utilized by unstructured finite element or integrated finite difference computational physics codes.
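
    The Delaunay tetrahedralization at the heart of such grid generation is available off the shelf; a toy sketch on random points (GEOMESH adds the geologic constraints, positive coupling coefficients, and grid-quality control that plain Delaunay does not provide):

```python
import numpy as np
from scipy.spatial import Delaunay

pts = np.random.default_rng(1).random((200, 3))  # stand-in for geologic nodes
tet = Delaunay(pts)                              # Delaunay tetrahedral grid
print(tet.simplices.shape)                       # (n_tets, 4) vertex indices
```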

  6. Automated Structure- and Sequence-Based Design of Proteins for High Bacterial Expression and Stability.

    PubMed

    Goldenzweig, Adi; Goldsmith, Moshe; Hill, Shannon E; Gertman, Or; Laurino, Paola; Ashani, Yacov; Dym, Orly; Unger, Tamar; Albeck, Shira; Prilusky, Jaime; Lieberman, Raquel L; Aharoni, Amir; Silman, Israel; Sussman, Joel L; Tawfik, Dan S; Fleishman, Sarel J

    2016-07-21

    Upon heterologous overexpression, many proteins misfold or aggregate, thus resulting in low functional yields. Human acetylcholinesterase (hAChE), an enzyme mediating synaptic transmission, is a typical case of a human protein that necessitates mammalian systems to obtain functional expression. We developed a computational strategy and designed an AChE variant bearing 51 mutations that improved core packing, surface polarity, and backbone rigidity. This variant expressed at ∼2,000-fold higher levels in E. coli compared to wild-type hAChE and exhibited 20°C higher thermostability with no change in enzymatic properties or in the active-site configuration as determined by crystallography. To demonstrate broad utility, we similarly designed four other human and bacterial proteins. Testing at most three designs per protein, we obtained enhanced stability and/or higher yields of soluble and active protein in E. coli. Our algorithm requires only a 3D structure and several dozen sequences of naturally occurring homologs, and is available at http://pross.weizmann.ac.il. PMID:27425410

  7. Structures' validation profiles in Transmission of Imaging and Data (TRIAD) for automated National Clinical Trials Network (NCTN) clinical trial digital data quality assurance.

    PubMed

    Giaddui, Tawfik; Yu, Jialu; Manfredi, Denise; Linnemann, Nancy; Hunter, Joanne; O'Meara, Elizabeth; Galvin, James; Bialecki, Brian; Xiao, Ying

    2016-01-01

    Transmission of Imaging and Data (TRIAD) is a standards-based system built by the American College of Radiology to provide the seamless exchange of images and data for accreditation of clinical trials and registries. Scripts of structure-name validation profiles created in TRIAD are used in the automated submission process. It is essential for users to understand the logistics of these scripts for successful submission of radiation therapy cases with fewer iterations.

  8. Internal Transcribed Spacer 2 (nu ITS2 rRNA) Sequence-Structure Phylogenetics: Towards an Automated Reconstruction of the Green Algal Tree of Life

    PubMed Central

    Buchheim, Mark A.; Keller, Alexander; Koetschan, Christian; Förster, Frank; Merget, Benjamin; Wolf, Matthias

    2011-01-01

    Background Chloroplast-encoded genes (matK and rbcL) have been formally proposed for use in DNA barcoding efforts targeting embryophytes. Extending such a protocol to chlorophytan green algae, though, is fraught with problems including non-homology (matK) and heterogeneity that prevents the creation of a universal PCR toolkit (rbcL). Some have advocated the use of the nuclear-encoded internal transcribed spacer two (ITS2) as an alternative to the traditional chloroplast markers. However, the ITS2 is broadly perceived to be insufficiently conserved or to be confounded by introgression or biparental inheritance patterns, precluding its broad use in phylogenetic reconstruction or as a DNA barcode. A growing body of evidence has shown that simultaneous analysis of nucleotide data with secondary structure information can overcome at least some of the limitations of ITS2. The goal of this investigation was to assess the feasibility of an automated, sequence-structure approach for analysis of ITS2 data from a large sampling of the phylum Chlorophyta. Methodology/Principal Findings Sequences and secondary structures from 591 chlorophycean, 741 trebouxiophycean and 938 ulvophycean algae, all obtained from the ITS2 Database, were aligned using a sequence structure-specific scoring matrix. Phylogenetic relationships were reconstructed by Profile Neighbor-Joining coupled with a sequence structure-specific, general time reversible substitution model. Results from analyses of the ITS2 data were robust at multiple nodes and showed considerable congruence with results from published phylogenetic analyses. Conclusions/Significance Our observations on the power of automated, sequence-structure analyses of ITS2 to reconstruct phylum-level phylogenies of the green algae validate this approach to assessing diversity for large sets of chlorophytan taxa. Moreover, our results indicate that objections to the use of ITS2 for DNA barcoding should be weighed against the utility of an automated

  9. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to the evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind their design, the nuances pertaining to founding and tuning have largely been empirical, and conveyed from one generation to the next. Post-production assessment of bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future model designs. To establish confidence in the FEA process it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically, to collect the data for an EMA, the vibratory response of the structure is measured with accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect, due to the non-contact nature of the technique, resulting in higher-accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution, resulting in a higher-confidence EMA. This is

  10. Human brain atlas for automated region of interest selection in quantitative susceptibility mapping: application to determine iron content in deep gray matter structures.

    PubMed

    Lim, Issel Anne L; Faria, Andreia V; Li, Xu; Hsu, Johnny T C; Airan, Raag D; Mori, Susumu; van Zijl, Peter C M

    2013-11-15

    The purpose of this paper is to extend the single-subject Eve atlas from Johns Hopkins University, which currently contains diffusion tensor and T1-weighted anatomical maps, by including contrast based on quantitative susceptibility mapping. The new atlas combines a "deep gray matter parcellation map" (DGMPM) derived from a single-subject quantitative susceptibility map with the previously established "white matter parcellation map" (WMPM) from the same subject's T1-weighted and diffusion tensor imaging data into an MNI coordinate map named the "Everything Parcellation Map in Eve Space," also known as the "EvePM." It allows automated segmentation of gray matter and white matter structures. Quantitative susceptibility maps from five healthy male volunteers (30 to 33 years of age) were coregistered to the Eve Atlas with AIR and Large Deformation Diffeomorphic Metric Mapping (LDDMM), and the transformation matrices were applied to the EvePM to produce automated parcellation in subject space. Parcellation accuracy was measured with a kappa analysis for the left and right structures of six deep gray matter regions. For multi-orientation QSM images, the Kappa statistic was 0.85 between automated and manual segmentation, with the inter-rater reproducibility Kappa being 0.89 for the human raters, suggesting "almost perfect" agreement between all segmentation methods. Segmentation seemed slightly more difficult for human raters on single-orientation QSM images, with the Kappa statistic being 0.88 between automated and manual segmentation, and 0.85 and 0.86 between human raters. Overall, this atlas provides a time-efficient tool for automated coregistration and segmentation of quantitative susceptibility data to analyze many regions of interest. These data were used to establish a baseline for normal magnetic susceptibility measurements for over 60 brain structures of 30- to 33-year-old males. Correlating the average susceptibility with age-based iron concentrations in gray
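
    The agreement measure used above is Cohen's kappa, observed agreement corrected for chance. A sketch over per-voxel labels from two segmentations of the same region:

```python
import numpy as np

def cohens_kappa(labels_a, labels_b):
    """kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e the agreement expected by chance from the label frequencies."""
    a, b = np.asarray(labels_a), np.asarray(labels_b)
    p_o = np.mean(a == b)
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))
    return (p_o - p_e) / (1.0 - p_e)
```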

  12. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  13. Automation of analytical isotachophoresis

    NASA Technical Reports Server (NTRS)

    Thormann, Wolfgang

    1985-01-01

    The basic features of automation of analytical isotachophoresis (ITP) are reviewed. Experimental setups consisting of narrow bore tubes which are self-stabilized against thermal convection are considered. Sample detection in free solution is discussed, listing the detector systems presently used or expected to be of potential use in the near future. The combination of a universal detector measuring the evolution of ITP zone structures with detector systems specific to desired components is proposed as a concept of an automated chemical analyzer based on ITP. Possible miniaturization of such an instrument by means of microlithographic techniques is discussed.

  14. Phytoplankton community structure in the North Sea: coupling between remote sensing and automated in situ analysis at the single cell level

    NASA Astrophysics Data System (ADS)

    Thyssen, M.; Alvain, S.; Lefèbvre, A.; Dessailly, D.; Rijkeboer, M.; Guiselin, N.; Creach, V.; Artigas, L.-F.

    2014-11-01

    Phytoplankton observation in the ocean can be a challenge in oceanography. Accurate estimations of their biomass and dynamics will help to understand ocean ecosystems and refine global climate models. This requires relevant datasets of phytoplankton at a functional level and at daily and sub-mesoscale resolution. In order to achieve this, an automated, high-frequency, dedicated scanning flow cytometer (SFC, Cytobuoy, NL) has been developed to cover the entire size range of phytoplankton cells whilst simultaneously taking pictures of the largest of them. This cytometer was directly connected to the water inlet of a pocket Ferry Box during a cruise in the North Sea, 8-12 May 2011 (DYMAPHY project, INTERREG IV A "2 Seas"), in order to identify the phytoplankton community structure of near-surface waters (6 m) with high spatial resolution (2.2 ± 1.8 km). Ten groups of cells, distinguished on the basis of their optical pulse shapes, were described (abundance, size estimate, red fluorescence per unit volume). Abundances varied depending on the hydrological status of the traversed waters, reflecting different stages of the North Sea blooming period. Comparisons between several techniques analyzing chlorophyll a and the scanning flow cytometer, using the integrated red fluorescence emitted by each counted cell, showed significant correlations. The community structure observed from the automated flow cytometry was compared with the PHYSAT reflectance anomalies on a daily scale. The number of matchups observed between the SFC automated high-frequency in situ sampling and the remote sensing was found to be two to three times better than when using traditional water sampling strategies. Significant differences in the phytoplankton community structure within the two days for which matchups were available suggest that it is possible to label PHYSAT anomalies not only with dominant groups, but at the level of the community structure.

  15. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  16. Automated dispenser

    SciTech Connect

    Hollen, R.M.; Stalnaker, N.D.

    1989-04-06

    An automated dispenser having a conventional pipette attached to an actuating cylinder through a flexible cable for delivering precise quantities of a liquid through commands from remotely located computer software. The travel of the flexible cable is controlled by adjustable stops and a locking shaft. The pipette can be positioned manually or by the hands of a robot. 1 fig.

  17. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools that have taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  18. Scaling Out and Evaluation of OBSecAn, an Automated Section Annotator for Semi-Structured Clinical Documents, on a Large VA Clinical Corpus

    PubMed Central

    Tran, Le-Thuy T.; Divita, Guy; Redd, Andrew; Carter, Marjorie E.; Samore, Matthew; Gundlapalli, Adi V.

    2015-01-01

    “Identifying and labeling” (annotating) sections improves the effectiveness of extracting information stored in the free text of clinical documents. OBSecAn, an automated ontology-based section annotator, was developed to identify and label sections of semi-structured clinical documents from the Department of Veterans Affairs (VA). In the first step, the algorithm reads and parses the document to obtain and store information regarding sections in a structure that supports the hierarchy of sections. The second stage detects and corrects errors in the parsed structure. The third stage produces the section annotation output using the final parsed tree. In this study, we present the OBSecAn method, scale it to a corpus of one million documents, and evaluate its performance in identifying family history sections. We identify high-yield sections for this use case from note titles such as primary care and demonstrate a median rate of 99% in correctly identifying a family history section. PMID:26958260

  19. SU-C-9A-02: Structured Noise Index as An Automated Quality Control for Nuclear Medicine: A Two Year Experience

    SciTech Connect

    Nelson, J; Christianson, O; Samei, E

    2014-06-01

    Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics, which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection that is subjective and time-consuming. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues and reports them in a timely manner for efficient correction prior to clinical involvement, all incorporated into an automated, effortless workflow, and to characterize the program over a two-year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed by expert observer visual analysis. The metric, termed the Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in non-uniformity identification in about 12% of daily flood images. In addition, owing to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of gamma camera uniformity performance. It operates seamlessly across a fleet of multiple camera models. The automated process provides an effective workflow within the NM spectra between physicist, technologist, and clinical engineer. The reliability of this process has made it the preferred

  20. Primary structure of rat cardiac beta-adrenergic and muscarinic cholinergic receptors obtained by automated DNA sequence analysis: further evidence for a multigene family.

    PubMed Central

    Gocayne, J; Robinson, D A; FitzGerald, M G; Chung, F Z; Kerlavage, A R; Lentes, K U; Lai, J; Wang, C D; Fraser, C M; Venter, J C

    1987-01-01

    Two cDNA clones, lambda RHM-MF and lambda RHB-DAR, encoding the muscarinic cholinergic receptor and the beta-adrenergic receptor, respectively, have been isolated from a rat heart cDNA library. The cDNA clones were characterized by restriction mapping and automated DNA sequence analysis utilizing fluorescent dye primers. The rat heart muscarinic receptor consists of 466 amino acids and has a calculated molecular weight of 51,543. The rat heart beta-adrenergic receptor consists of 418 amino acids and has a calculated molecular weight of 46,890. The two cardiac receptors have substantial amino acid homology (27.2% identity, 50.6% with favored substitutions). The rat cardiac beta receptor has 88.0% homology (92.5% with favored substitutions) with the human brain beta receptor and the rat cardiac muscarinic receptor has 94.6% homology (97.6% with favored substitutions) with the porcine cardiac muscarinic receptor. The muscarinic cholinergic and beta-adrenergic receptors appear to be as conserved as hemoglobin and cytochrome c but less conserved than histones and are clearly members of a multigene family. These data support our hypothesis, based upon biochemical and immunological evidence, that suggests considerable structural homology and evolutionary conservation between adrenergic and muscarinic cholinergic receptors. To our knowledge, this is the first report utilizing automated DNA sequence analysis to determine the structure of a gene. PMID:2825184

  1. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  2. Developing a Graphical User Interface to Automate the Estimation and Prediction of Risk Values for Flood Protective Structures using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Hasan, M.; Helal, A.; Gabr, M.

    2014-12-01

    In this project, we focus on providing a computer-automated platform for better assessment of the potential failures and retrofit measures of flood-protecting earth structures, e.g., dams and levees. Such structures play an important role during extreme flooding events as well as during normal operating conditions. Furthermore, they are part of other civil infrastructure such as water storage and hydropower generation. Hence, there is a clear need for accurate evaluation of stability and functionality levels during their service lifetime so that rehabilitation and maintenance costs are effectively guided. Among condition assessment approaches based on the factor of safety, the limit states (LS) approach utilizes numerical modeling to quantify the probability of potential failures. The parameters for LS numerical modeling include i) geometry and side slopes of the embankment, ii) loading conditions in terms of rate of rise and duration of high water levels in the reservoir, and iii) cycles of rising and falling water levels simulating the effect of consecutive storms throughout the service life of the structure. Sample data regarding the correlations of these parameters are available through previous research studies. We have unified these criteria and extended the risk assessment in terms of loss of life through the implementation of a graphical user interface that automates the input parameters, divides the data into training and testing sets, and then feeds them into an Artificial Neural Network (ANN) tool through MATLAB programming. The ANN modeling allows us to predict risk values of flood protective structures based on user feedback quickly and easily. In the future, we expect to fine-tune the software by adding extensive data on variations of parameters.
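
    A minimal sketch of the data split and ANN regression step described above, assuming tabular parameter data. The study used an ANN tool in MATLAB; scikit-learn is shown here purely for illustration, and the feature set and synthetic data are invented.

        # Hypothetical sketch: train/test split plus an ANN regressor for risk values.
        # Features stand in for embankment geometry and loading parameters (invented).
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        # Columns: side slope, rate of rise, high-water duration, storm cycles (synthetic)
        X = rng.uniform(size=(500, 4))
        y = X @ np.array([0.5, 1.2, 0.8, 0.3]) + 0.05 * rng.normal(size=500)  # synthetic risk values

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
        model.fit(X_train, y_train)
        print("held-out R^2:", model.score(X_test, y_test))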

  3. Automated lithocell

    NASA Astrophysics Data System (ADS)

    Englisch, Andreas; Deuter, Armin

    1990-06-01

    Integration and automation have gained more and more ground in modern IC manufacturing. It is difficult to make a direct calculation of the profit these investments yield. On the other hand, the demands on man, machine and technology have increased enormously of late; it is not difficult to see that only by means of integration and automation can these demands be coped with. Some salient points: the complexity and costs incurred by the equipment and processes have become significantly higher; owing to the reduction of all dimensions, the tolerances within which the various process steps have to be carried out have become smaller and smaller, and adherence to these tolerances more and more difficult; and the cycle time has become more and more important, both for the development and control of new processes and, to a great extent, for a rapid and reliable supply to the customer. In order that the products be competitive under these conditions, all sorts of costs have to be reduced and the yield has to be maximized. Therefore, computer-aided control of the equipment and the process, combined with automatic data collection and real-time SPC (statistical process control), has become absolutely necessary for successful IC manufacturing. Human errors must be eliminated from the execution of the various process steps by automation. The work time set free in this way makes it possible for human creativity to be employed on a larger scale in stabilizing the processes. Besides, computer-aided equipment control can ensure the optimal utilization of the equipment round the clock.

  4. H++ 3.0: automating pK prediction and the preparation of biomolecular structures for atomistic molecular modeling and simulations.

    PubMed

    Anandakrishnan, Ramu; Aguilar, Boris; Onufriev, Alexey V

    2012-07-01

    The accuracy of atomistic biomolecular modeling and simulation studies depends on the accuracy of the input structures. Preparing these structures for an atomistic modeling task, such as a molecular dynamics (MD) simulation, can involve the use of a variety of different tools for: correcting errors, adding missing atoms, filling valences with hydrogens, predicting pK values for titratable amino acids, assigning predefined partial charges and radii to all atoms, and generating force field parameter/topology files for MD. Identifying, installing and effectively using the appropriate tools for each of these tasks can be difficult for novice users and time-consuming for experienced ones. H++ (http://biophysics.cs.vt.edu/) is a free open-source web server that automates the above key steps in the preparation of biomolecular structures for molecular modeling and simulations. H++ also performs extensive error and consistency checking, providing error/warning messages together with suggested corrections. In addition to numerous minor improvements, the latest version of H++ includes several new capabilities and options: fixing erroneous (flipped) side-chain conformations for HIS, GLN and ASN, including a ligand in the input structure, processing nucleic acid structures, and generating a solvent box with a specified number of common ions for explicit-solvent MD.

  5. The automation of science.

    PubMed

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-01

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge. PMID:19342587

  6. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    To run a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving through production and fulfillment, to evaluating results is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted will be how the complexity of running a targeted campaign is hidden from the user through technologies, all while providing the benefits of a professionally managed campaign.

  7. Automated discovery of structural features of the optic nerve head on the basis of image and genetic data

    NASA Astrophysics Data System (ADS)

    Christopher, Mark; Tang, Li; Fingert, John H.; Scheetz, Todd E.; Abramoff, Michael D.

    2014-03-01

    Evaluation of optic nerve head (ONH) structure is a commonly used clinical technique for both diagnosis and monitoring of glaucoma. Glaucoma is associated with characteristic changes in the structure of the ONH. We present a method for computationally identifying ONH structural features using both imaging and genetic data from a large cohort of participants at risk for primary open angle glaucoma (POAG). Using 1054 participants from the Ocular Hypertension Treatment Study, ONH structure was measured by application of a stereo correspondence algorithm to stereo fundus images. In addition, the genotypes of several known POAG genetic risk factors were considered for each participant. ONH structural features were discovered using both a principal component analysis approach to identify the major modes of variance within structural measurements and a linear discriminant analysis approach to capture the relationship between genetic risk factors and ONH structure. The identified ONH structural features were evaluated based on the strength of their associations with genotype and development of POAG by the end of the OHTS study. ONH structural features with strong associations with genotype were identified for each of the genetic loci considered. Several identified ONH structural features were significantly associated (p < 0.05) with the development of POAG after Bonferroni correction. Further, incorporation of genetic risk status was found to substantially increase performance of early POAG prediction. These results suggest incorporating both imaging and genetic data into ONH structural modeling significantly improves the ability to explain POAG-related changes to ONH structure.
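
    An illustration-only sketch of the two feature-discovery steps described above: PCA to extract the major modes of structural variance and LDA to capture the structural axis most associated with genotype. The arrays below are synthetic placeholders; the OHTS measurements are not reproduced.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        onh = rng.normal(size=(1054, 200))        # stereo-derived ONH measurements (synthetic)
        genotype = rng.integers(0, 2, size=1054)  # risk-allele carrier status (synthetic)

        modes = PCA(n_components=10).fit_transform(onh)    # major modes of structural variance
        lda = LinearDiscriminantAnalysis(n_components=1)
        genetic_axis = lda.fit_transform(modes, genotype)  # axis best separated by genotype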

  8. Identification of new leishmanicidal peptide lead structures by automated real-time monitoring of changes in intracellular ATP.

    PubMed Central

    Luque-Ortega, J Román; Saugar, José M; Chiva, Cristina; Andreu, David; Rivas, Luis

    2003-01-01

    Leishmanicidal drugs interacting stoichiometrically with parasite plasma membrane lipids, thus promoting permeability, have raised significant expectations for Leishmania chemotherapy due to their nil or very low induction of resistance. Inherent in this process is a decrease in intracellular ATP, either wasted by ionic pumps to restore membrane potential or directly leaked through larger membrane lesions caused by the drug. We have adapted a luminescence method for fast automated real-time monitoring of this process, using Leishmania donovani promastigotes transfected with a cytoplasmic luciferase form, previously tested for anti-mitochondrial drugs. The system was first assayed against a set of well-known membrane-active drugs [amphotericin B, nystatin, cecropin A-melittin peptide CA(1-8)M(1-18)], plus two ionophoric polyethers (narasin and salinomycin) not previously tested on Leishmania, then used to screen seven new cecropin A-melittin hybrid peptides. All membrane-active compounds showed a good correlation between inhibition of luminescence and leishmanicidal activity. Induction of membrane permeability was demonstrated by dissipation of membrane potential, SYTOX Green influx and membrane damage assessed by electron microscopy, except for the polyethers, where ATP decrease was due to inhibition of its mitochondrial synthesis. Five of the test peptides showed an ED50 around 1 microM on promastigotes. These peptides, with equal or better activity than 26-residue-long CA(1-8)M(1-18), are the shortest leishmanicidal peptides described so far, and validate our luminescence assay as a fast and cheap screening tool for membrane-active compounds. PMID:12864731

  9. Solution synthesis of a new thermoelectric Zn(1+x)Sb nanophase and its structure determination using automated electron diffraction tomography.

    PubMed

    Birkel, Christina S; Mugnaioli, Enrico; Gorelik, Tatiana; Kolb, Ute; Panthöfer, Martin; Tremel, Wolfgang

    2010-07-21

    Efforts to engineer materials with specific physical properties have recently focused on the effect of nanoscopic inhomogeneities at the 10 nm scale. Such features are expected to scatter medium- and long-wavelength phonons, thereby lowering the thermal conductivity of the system. Low thermal conductivity is a prerequisite for effective thermoelectric materials, and the challenge is to limit the transport of heat by phonons without simultaneously decreasing charge transport. A solution-phase technique was devised for the synthesis of thermoelectric "Zn(4)Sb(3)" nanocrystals as a precursor for phase segregation into ZnSb and a new Zn-Sb intermetallic phase, Zn(1+delta)Sb, in a peritectoid reaction. Our approach uses activated metal nanoparticles as precursors for the synthesis of this intermetallic compound. The small particle size of the reactants ensures minimum diffusion paths, low activation barriers, and low reaction temperatures, thereby eliminating solid-solid diffusion as the rate-limiting step of conventional bulk-scale solid-state synthesis. Both phases were identified and structurally characterized by automated electron diffraction tomography combined with precession electron diffraction. An ab initio structure solution based on electron diffraction data revealed two different phases. The new pseudo-hexagonal phase, Zn(1+delta)Sb, was identified and classified within the structural diversity of the Zn-Sb phase diagram.

  10. Automated 3D architecture reconstruction from photogrammetric structure-and-motion: A case study of the One Pillar pagoda, Hanoi, Vietnam

    NASA Astrophysics Data System (ADS)

    To, T.; Nguyen, D.; Tran, G.

    2015-04-01

    Vietnam's heritage structures have declined because of poor conservation conditions. Sustainable development requires firm control, spatial planning, and reasonable investment. Moreover, in the field of Cultural Heritage, automated photogrammetric systems based on Structure-from-Motion (SfM) techniques are widely used. With the potential of high resolution, low cost, a large field of view, easiness, rapidity and completeness, the derivation of 3D metric information from Structure-and-Motion images is receiving great attention. In addition, heritage objects in the form of 3D physical models are recorded not only for documentation issues, but also for historical interpretation, restoration, and cultural and educational purposes. This study addresses the archaeological documentation of the One Pillar pagoda in Hanoi, Vietnam. The data were acquired with a Canon EOS 550D digital camera (CMOS APS-C sensor, 22.3 x 14.9 mm). Camera calibration and orientation were carried out with the VisualSFM, CMPMVS (Multi-View Reconstruction) and SURE (Photogrammetric Surface Reconstruction from Imagery) software. The final result is a scaled 3D model of the One Pillar pagoda, displayed in different views in the MeshLab software.

  11. Intensity targeted radial structure tensor analysis and its application for automated mediastinal lymph node detection from CT volumes

    NASA Astrophysics Data System (ADS)

    Oda, Hirohisa; Nimura, Yukitaka; Oda, Masahiro; Kitasaka, Takayuki; Iwano, Shingo; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku

    2016-03-01

    This paper presents a new blob-like enhancement filter based on Intensity Targeted Radial Structure Tensor (ITRST) analysis to improve mediastinal lymph node detection from chest CT volumes. A blob-like structure enhancement filter based on Radial Structure Tensor (RST) analysis can be utilized for initial detection of lymph node candidate regions. However, some lymph nodes cannot be detected because RST analysis is influenced by neighboring regions whose intensity is very high or low, such as contrast-enhanced blood vessels and air. To overcome this problem, we propose ITRST analysis, which integrates prior knowledge of the detection target's intensity into RST analysis. Our lymph node detection method consists of two steps. First, candidate regions are obtained by ITRST analysis. Second, false positives (FPs) are removed by a Support Vector Machine (SVM) classifier. We applied the proposed method to 47 cases. Of the 19 lymph nodes whose short axis was at least 10 mm, 100.0% were detected with 247.7 FPs/case by ITRST analysis, while only 80.0% were detected with 123.0 FPs/case by RST analysis. After FP reduction by the SVM, ITRST analysis outperformed RST analysis in lymph node detection performance.
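
    A generic 2D structure-tensor "blobness" sketch conveying the eigenvalue idea behind RST-style filters; it is not the authors' radial or intensity-targeted formulation, which this abstract does not specify.

        import numpy as np
        from scipy import ndimage

        def blobness(img, sigma_grad=1.0, sigma_win=2.0):
            """Enhance blob-like regions: both structure-tensor eigenvalues strong and similar."""
            gy, gx = np.gradient(ndimage.gaussian_filter(img, sigma_grad))
            # Per-pixel structure tensor: smoothed outer products of the gradient
            jxx = ndimage.gaussian_filter(gx * gx, sigma_win)
            jxy = ndimage.gaussian_filter(gx * gy, sigma_win)
            jyy = ndimage.gaussian_filter(gy * gy, sigma_win)
            tr = jxx + jyy
            det = jxx * jyy - jxy * jxy
            disc = np.sqrt(np.maximum(tr ** 2 / 4.0 - det, 0.0))
            lam1, lam2 = tr / 2.0 + disc, tr / 2.0 - disc  # eigenvalues, lam1 >= lam2
            return (lam2 / (lam1 + 1e-12)) * tr  # high where response is strong and isotropic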

  12. Using a semi-automated filtering process to improve large footprint lidar sub-canopy elevation models and forest structure metrics

    NASA Astrophysics Data System (ADS)

    Fricker, G. A.; Saatchi, S.; Meyer, V.; Gillespie, T.; Sheng, Y.

    2011-12-01

    Quantification of sub-canopy topography and forest structure is important for developing a better understanding of how forest ecosystems function. This study focuses on a three-step method to adapt discrete return lidar (DRL) filtering techniques to Laser Vegetation Imaging Sensor (LVIS) large-footprint lidar (LFL) waveforms to improve the accuracy of both sub-canopy digital elevation models (DEMs) and forest structure measurements. The results of the experiment demonstrate that LFL ground surfaces can be effectively filtered using methods adapted from DRL point filtering, and the resulting data produce more accurate digital elevation models as well as improved estimates of forest structure. The first step quantifies the slope at the center of each LFL pulse and models the average error expected at each degree of slope. Areas of high terrain slope show consistently more error in LFL ground detection, and empirical relationships between terrain angle and expected LVIS ground detection error are established. These relationships are then used to create an algorithm for LFL ground elevation correction. The second step uses an iterative, expanding-window filter to identify outlier points which are not part of the ground surface, together with manual editing to identify laser pulses which are not at ground level. The semi-automated methods improved the LVIS DEM accuracy substantially by identifying significant outliers in the LVIS point cloud. The final step develops an approach that utilizes both the filtered LFL DEMs and the modeled error introduced by terrain slope to improve both sub-canopy elevation models and above-ground LFL waveform metrics. DRL and LVIS data from Barro Colorado Island, Panama, and La Selva, Costa Rica were used to develop and test the algorithm. Acknowledgements: Special thanks to Dr. Jim Dilling for providing the DRL lidar data for Barro Colorado Island.
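
    A hedged sketch of the second step, an iterative expanding-window outlier filter, shown on a 1D transect of ground elevations. The window size, growth factor, and MAD threshold are invented; the paper's exact parameters are not given in the abstract.

        import numpy as np

        def expanding_window_filter(x, z, start_win=50.0, grow=2.0, n_iter=4, k=3.0):
            """Flag returns whose elevation departs from the local median by > k local MADs;
            the search window widens on each iteration. x: positions, z: elevations."""
            keep = np.ones(len(z), dtype=bool)
            win = start_win
            for _ in range(n_iter):
                for i in range(len(z)):
                    if not keep[i]:
                        continue  # already flagged as non-ground
                    near = keep & (np.abs(x - x[i]) < win)
                    med = np.median(z[near])
                    mad = np.median(np.abs(z[near] - med)) + 1e-6
                    if abs(z[i] - med) > k * mad:
                        keep[i] = False
                win *= grow
            return keep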

  13. Semiautomated inspection versus fully automated inspection of lyophilized products.

    PubMed

    Seidenader, N W

    1994-01-01

    The development of fully automated inspection systems for parenteral products has created a situation of high expectations regarding productivity and quality improvements. However, not all products and production situations are suited for automation. A guideline for inspection and automation strategies will be discussed, structuring the field of lyophilized products according to the critical decision parameters.

  14. Automated method for determination of dissolved organic carbon-water distribution constants of structurally diverse pollutants using pre-equilibrium solid-phase microextraction.

    PubMed

    Ripszam, Matyas; Haglund, Peter

    2015-02-01

    Dissolved organic carbon (DOC) plays a key role in determining the environmental fate of semivolatile organic environmental contaminants. The goal of the present study was to develop a method using commercially available hardware to rapidly characterize the sorption properties of DOC in water samples. The resulting method uses negligible-depletion direct-immersion solid-phase microextraction (SPME) and gas chromatography-mass spectrometry. Its performance was evaluated using Nordic reference fulvic acid and 40 priority environmental contaminants that cover a wide range of physicochemical properties. Two SPME fibers had to be used to cope with the span of properties, one coated with polydimethylsiloxane and one coated with polystyrene-divinylbenzene-polydimethylsiloxane, for nonpolar and semipolar contaminants, respectively. The measured DOC-water distribution constants showed reasonably good reproducibility (standard deviation ≤ 0.32) and good correlation (R² = 0.80) with log octanol-water partition coefficients for nonpolar persistent organic pollutants. The sample pretreatment is limited to filtration, and the method is easy to adjust to different DOC concentrations. These experiments also utilized the latest SPME automation, which largely decreases the total cycle time (to 20 min or shorter) and increases sample throughput, which is advantageous when many DOC samples must be characterized or when the determinations must be performed quickly, for example, to avoid precipitation, aggregation, and other changes of DOC structure and properties.
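
    A back-of-the-envelope sketch of one common way a DOC-water distribution constant is estimated from negligible-depletion SPME data: the drop in freely dissolved concentration relative to a DOC-free reference gives the bound fraction. The function and numbers are illustrative; the paper's exact calibration is not reproduced.

        import math

        def log_kdoc(free_with_doc, free_without_doc, doc_kg_per_L):
            """log10 of K_DOC = (bound/free) / [DOC], concentrations in matching units."""
            bound_over_free = free_without_doc / free_with_doc - 1.0
            return math.log10(bound_over_free / doc_kg_per_L)

        print(log_kdoc(free_with_doc=0.4, free_without_doc=1.0, doc_kg_per_L=1e-5))  # ~5.2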

  15. Computer-assisted structure identification (CASI)--an automated platform for high-throughput identification of small molecules by two-dimensional gas chromatography coupled to mass spectrometry.

    PubMed

    Knorr, Arno; Monge, Aurelien; Stueber, Markus; Stratmann, André; Arndt, Daniel; Martin, Elyette; Pospisil, Pavel

    2013-12-01

    Compound identification is widely recognized as a major bottleneck for modern metabolomic approaches and high-throughput nontargeted characterization of complex matrices. To tackle this challenge, an automated platform entitled computer-assisted structure identification (CASI) was designed and developed in order to accelerate and standardize the identification of compound structures. In the first step of the process, CASI automatically searches mass spectral libraries for matches using a NIST MS Search algorithm, which proposes structural candidates for experimental spectra from two-dimensional gas chromatography with time-of-flight mass spectrometry (GC × GC-TOF-MS) measurements, each with an associated match factor. Next, quantitative structure-property relationship (QSPR) models implemented in CASI predict three specific parameters to enhance the confidence for correct compound identification, which were Kovats Index (KI) for the first dimension (1D) separation, relative retention time for the second dimension separation (2DrelRT) and boiling point (BP). In order to reduce the impact of chromatographic variability on the second dimension retention time, a concept based upon hypothetical reference points from linear regressions of a deuterated n-alkanes reference system was introduced, providing a more stable relative retention time measurement. Predicted values for KI and 2DrelRT were calculated and matched with experimentally derived values. Boiling points derived from 1D separations were matched with predicted boiling points, calculated from the chemical structures of the candidates. As a last step, CASI combines the NIST MS Search match factors (NIST MF) with up to three predicted parameter matches from the QSPR models to generate a combined CASI Score representing the measure of confidence for the identification. Threshold values were applied to the CASI Scores assigned to proposed structures, which improved the accuracy for the classification of true
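
    One plausible way to fuse a spectral match factor with the three QSPR-predicted parameter matches into a single score, purely as an illustration; the actual CASI weighting scheme is not given in this abstract, and the tolerances and weights below are invented.

        def casi_like_score(nist_mf, ki_err, rt2_err, bp_err,
                            tol=(25.0, 0.05, 15.0), weights=(0.55, 0.15, 0.15, 0.15)):
            """nist_mf on its native 0-1000 scale; *_err are |predicted - experimental|."""
            penalties = [max(0.0, 1.0 - err / t)
                         for err, t in zip((ki_err, rt2_err, bp_err), tol)]
            w_mf, *w_params = weights
            return w_mf * (nist_mf / 1000.0) + sum(w * p for w, p in zip(w_params, penalties))

        # Candidate with a strong library match and close KI / retention / BP predictions
        print(casi_like_score(nist_mf=850, ki_err=10.0, rt2_err=0.02, bp_err=5.0))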

  16. A linear programming approach to reconstructing subcellular structures from confocal images for automated generation of representative 3D cellular models

    PubMed Central

    Wood, Scott T.; Dean, Brian C.; Dean, Delphine

    2013-01-01

    This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery. PMID:23395283
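
    A minimal linear-programming sketch of the fiber-superposition idea: choose non-negative weights for candidate fiber images so their superposition matches the observed image in the L1 sense. Candidate-fiber generation and the 3D segmentation from the paper are omitted, and scipy's LP solver stands in for whatever solver the authors used.

        import numpy as np
        from scipy.optimize import linprog

        def fit_fiber_weights(A, b):
            """A: (n_pix, n_fib) candidate fiber images as columns; b: observed image, flattened.
            Minimizes sum |A @ w - b| over w >= 0 via the standard residual-splitting LP."""
            n_pix, n_fib = A.shape
            # Variables: [w, r_plus, r_minus]; minimize sum(r_plus + r_minus)
            c = np.concatenate([np.zeros(n_fib), np.ones(2 * n_pix)])
            A_eq = np.hstack([A, np.eye(n_pix), -np.eye(n_pix)])  # A@w + r+ - r- = b
            res = linprog(c, A_eq=A_eq, b_eq=b,
                          bounds=[(0, None)] * (n_fib + 2 * n_pix), method="highs")
            return res.x[:n_fib]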

  17. A new methodology for non-contact accurate crack width measurement through photogrammetry for automated structural safety evaluation

    NASA Astrophysics Data System (ADS)

    Jahanshahi, Mohammad R.; Masri, Sami F.

    2013-03-01

    In mechanical, aerospace and civil structures, cracks are important defects that can cause catastrophes if neglected. Visual inspection is currently the predominant method for crack assessment. This approach is tedious, labor-intensive, subjective and highly qualitative. An inexpensive alternative to current monitoring methods is to use a robotic system that could perform autonomous crack detection and quantification. To reach this goal, several image-based crack detection approaches have been developed; however, the crack thickness quantification, which is an essential element for a reliable structural condition assessment, has not been sufficiently investigated. In this paper, a new contact-less crack quantification methodology, based on computer vision and image processing concepts, is introduced and evaluated against a crack quantification approach which was previously developed by the authors. The proposed approach in this study utilizes depth perception to quantify crack thickness and, as opposed to most previous studies, needs no scale attachment to the region under inspection, which makes this approach ideal for incorporation with autonomous or semi-autonomous mobile inspection systems. Validation tests are performed to evaluate the performance of the proposed approach, and the results show that the new proposed approach outperforms the previously developed one.
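
    The scale-free quantification rests on ordinary pinhole geometry: a width measured in pixels converts to metric units through the perceived depth and the calibrated focal length. A tiny sketch with invented numbers:

        def crack_width_mm(width_px, depth_mm, focal_length_px):
            """Pinhole model: metric size = pixel size * depth / focal length (in pixels)."""
            return width_px * depth_mm / focal_length_px

        # e.g., a 4.2 px crack seen at 850 mm with a 2400 px focal length is ~1.5 mm wide
        print(crack_width_mm(width_px=4.2, depth_mm=850.0, focal_length_px=2400.0))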

  18. Automated protein NMR resonance assignments.

    PubMed

    Wan, Xiang; Xu, Dong; Slupsky, Carolyn M; Lin, Guohui

    2003-01-01

    NMR resonance peak assignment is one of the key steps in solving an NMR protein structure. The assignment process links resonance peaks to individual residues of the target protein sequence, providing the prerequisite for establishing intra- and inter-residue spatial relationships between atoms. The assignment process is tedious and time-consuming, and can take many weeks. Though a number of computer programs exist to assist the assignment process, many NMR labs still do the assignments manually to ensure quality. This paper presents (1) a new scoring system for mapping spin systems to residues, (2) an automated adjacency-information extraction procedure from NMR spectra, and (3) a very fast assignment algorithm, based on our previously proposed greedy filtering method and a maximum matching algorithm, to automate the assignment process. Computational tests on 70 instances of (pseudo) experimental NMR data for 14 proteins demonstrate that the new scoring scheme has much better discerning power with the aid of adjacency information between spin systems simulated across various NMR spectra. Typically, with automated extraction of adjacency information, our method achieves nearly complete assignments for most of the proteins. The experiments show very promising prospects that the fast automated assignment algorithm, together with the new scoring scheme and automated adjacency extraction, may be ready for practical use. PMID:16452794
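
    A toy version of the mapping step only: score every spin system against every residue position, then compute a best one-to-one assignment. The paper's greedy filtering and adjacency constraints are more elaborate; scipy's Hungarian solver stands in here for the maximum-matching step, and the score matrix is synthetic.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        rng = np.random.default_rng(2)
        scores = rng.uniform(size=(20, 20))  # spin-system x residue match scores (synthetic)
        rows, cols = linear_sum_assignment(scores, maximize=True)
        assignment = dict(zip(rows, cols))   # spin system i -> residue assignment[i]
        print(scores[rows, cols].sum())      # total score of the optimal mapping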

  19. Space power subsystem automation technology

    NASA Technical Reports Server (NTRS)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  20. Automated External Defibrillator

    MedlinePlus

    What Is an Automated External Defibrillator? An automated external defibrillator (AED) is a portable device that ...

  1. Automated, in-water determination of colored dissolved organic material and phytoplankton community structure using the optical phytoplankton discriminator

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Gary J.; Millie, David F.; Moline, Mark A.; Lohrenz, Steven E.; Schofield, Oscar M.

    2011-06-01

    The Optical Phytoplankton Discriminator (OPD, a.k.a. BreveBuster) determines colored dissolved organic material (CDOM) absorption spectra and particulate light absorbance spectra. The CDOM absorption spectra and the correlation coefficients (referred to as 'similarity indexes') between the particulate absorbance spectra and those of known phytoplankton classes are available in real time. Post-deployment processing calculates the best fit of multiple absorbance spectra from known phytoplankton taxonomic classes. Through this process the OPD provides an estimate of the distribution of phytoplankton community chlorophyll among the classes included in the fit. The major components of the OPD include a liquid-waveguide capillary cell (LWCC), a fiber-optic spectrometer, a tungsten-deuterium fiber-optic light source and a 0.2 micrometer pore cross-flow filter. In-water operation of the OPD began in May 2003. Since that date, 25 of these instruments have been deployed on a variety of autonomous underwater vehicles, buoys, piers, channel markers, boats and ships. The OPD has been utilized in CDOM studies off the New Jersey coast, in HAB monitoring efforts in the Gulf of Mexico and the Great Lakes, and in phytoplankton community structure studies in the Galapagos Islands and the Mediterranean Sea. Most recently, it has been deployed to Veracruz, Mexico for HAB monitoring. Presently, several OPDs operating on Slocum gliders and coastal buoys make up a local HAB observatory south of Tampa Bay, Florida, partially supported by NOAA/IOOS through GCOOS. This presentation will detail the OPD's capabilities and report results from several of the deployments listed above. The ongoing effort to effectively visualize 4D phytoplankton community structure will also be discussed.

  2. Automation: triumph or trap?

    PubMed

    Smythe, M H

    1997-01-01

    Automation, a hot topic in the laboratory world today, can be a very expensive option. Those who are considering implementing automation can save time and money by examining the issues from the standpoint of an industrial/manufacturing engineer. The engineer not only asks what problems will be solved by automation, but what problems will be created. This article discusses questions that must be asked and answered to ensure that automation efforts will yield real and substantial payoffs.

  3. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  4. Application of DEN refinement and automated model building to a difficult case of molecular-replacement phasing: the structure of a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum

    PubMed Central

    Brunger, Axel T.; Das, Debanu; Deacon, Ashley M.; Grant, Joanna; Terwilliger, Thomas C.; Read, Randy J.; Adams, Paul D.; Levitt, Michael; Schröder, Gunnar F.

    2012-01-01

    Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence. PMID:22505259

  5. Shoe-String Automation

    SciTech Connect

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  6. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors will use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems some milestones are highlighted. Special attention is given to the influence of Standards and Guidelines on the development of automation systems.

  7. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  8. Top-down characterization of nucleic acids modified by structural probes using high-resolution tandem mass spectrometry and automated data interpretation.

    PubMed

    Kellersberger, Katherine A; Yu, Eizadora; Kruppa, Gary H; Young, Malin M; Fabris, Daniele

    2004-05-01

    A new program called MS2Links was developed for the automated reduction and interpretation of fragmentation data obtained from modified nucleic acids. Based on an algorithm that searches for plausible isotopic patterns, the data reduction module is capable of discriminating legitimate signals from noise spikes of comparable intensity. The fragment identification module calculates the monoisotopic mass of ion products expected from a certain sequence and user-defined covalent modifications, which are finally matched with the signals selected by the data reduction program. Considering that MS2Links can generate similar fragment libraries for peptides and their covalent conjugates with other peptides or nucleic acids, this program provides an integrated platform for the structural investigation of protein-nucleic acid complexes based on cross-linking strategies and top-down ESI-FTMS.

  9. Protein fabrication automation

    PubMed Central

    Cox, J. Colin; Lape, Janel; Sayed, Mahmood A.; Hellinga, Homme W.

    2007-01-01

    Facile “writing” of DNA fragments that encode entire gene sequences potentially has widespread applications in biological analysis and engineering. Rapid writing of open reading frames (ORFs) for expressed proteins could transform protein engineering and production for protein design, synthetic biology, and structural analysis. Here we present a process, protein fabrication automation (PFA), which facilitates the rapid de novo construction of any desired ORF from oligonucleotides with low effort, high speed, and little human interaction. PFA comprises software for sequence design, data management, and the generation of instruction sets for liquid-handling robotics, a liquid-handling robot, a robust PCR scheme for gene assembly from synthetic oligonucleotides, and a genetic selection system to enrich correctly assembled full-length synthetic ORFs. The process is robust and scalable. PMID:17242375

  10. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  11. Protein fabrication automation.

    PubMed

    Cox, J Colin; Lape, Janel; Sayed, Mahmood A; Hellinga, Homme W

    2007-03-01

    Facile "writing" of DNA fragments that encode entire gene sequences potentially has widespread applications in biological analysis and engineering. Rapid writing of open reading frames (ORFs) for expressed proteins could transform protein engineering and production for protein design, synthetic biology, and structural analysis. Here we present a process, protein fabrication automation (PFA), which facilitates the rapid de novo construction of any desired ORF from oligonucleotides with low effort, high speed, and little human interaction. PFA comprises software for sequence design, data management, and the generation of instruction sets for liquid-handling robotics, a liquid-handling robot, a robust PCR scheme for gene assembly from synthetic oligonucleotides, and a genetic selection system to enrich correctly assembled full-length synthetic ORFs. The process is robust and scalable.

  12. Automating the multiprocessing environment

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.

    1989-01-01

    An approach to automating the programming and operation of tree-structured networks of multiprocessor systems is discussed. A conceptual, knowledge-based operating environment is presented, and requirements for two major technology elements are identified as follows: (1) an intelligent information translator is proposed for implementing information transfer between dissimilar hardware and software, thereby enabling independent and modular development of future systems and promoting language-independence of codes and information; (2) a resident system activity manager, which recognizes the system's capabilities and monitors the status of all systems within the environment, is proposed for integrating dissimilar systems into effective parallel processing resources to optimally meet user needs. Finally, key computational capabilities which must be provided before the environment can be realized are identified.

  13. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  14. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory.

  15. Automating checks of plan check automation.

    PubMed

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  16. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New ROBOSIM features, such as collision detection and new kinematics simulation methods, are also discussed. Based on the experience of the work on ROBOSIM, a new graphics structural modeling environment is suggested, intended to be part of a new knowledge-based multiple-aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First, a geometrical structural model of the station is presented; this model was developed using the ROBOSIM package. Next, the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple-aspect modeling methodology, a fault propagation model of this system is being built and is described.

  17. Hardware flexibility of laboratory automation systems: analysis and new flexible automation architectures.

    PubMed

    Najmabadi, Peyman; Goldenberg, Andrew A; Emili, Andrew

    2007-03-01

    Development of flexible laboratory automation systems has attracted tremendous attention in recent years, as biotechnology scientists perform diverse types of protocols and tend to continuously modify them as part of their research. This article is a system-level study of the hardware flexibility of laboratory automation architectures for high-throughput automation of various sample preparation protocols. Hardware flexibility (system components' adaptability to protocol variations) of automation systems is addressed through the introduction of three main parametric flexibility measures: functional, structural, and throughput. A new quantitative measurement method for these parameters in the realm of Axiomatic Design theory is introduced in this article. The method relies on defining probability-of-success functions for the flexibility parameters and calculating their information contents. As the flexibility information content decreases, automation system flexibility increases.
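
    The information-content calculation referenced above follows the usual axiomatic-design form, I = log2(1/p) for a probability of success p, so lower information content means higher flexibility. A sketch with invented probabilities; the article's probability-of-success models are not reproduced.

        import math

        def information_content(p_success):
            """Axiomatic-design information content in bits: I = log2(1 / p)."""
            return math.log2(1.0 / p_success)

        for p in (0.5, 0.9, 0.99):  # more likely to succeed -> fewer bits -> more flexible
            print(p, round(information_content(p), 4))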

  18. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  19. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  20. Library Automation Style Guide.

    ERIC Educational Resources Information Center

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  1. More Benefits of Automation.

    ERIC Educational Resources Information Center

    Getz, Malcolm

    1988-01-01

    Describes a study that measured the benefits of an automated catalog and automated circulation system from the library user's point of view in terms of the value of time saved. Topics discussed include patterns of use, access time, availability of information, search behaviors, and the effectiveness of the measures used. (seven references)…

  2. Educating Archivists for Automation.

    ERIC Educational Resources Information Center

    Weber, Lisa B.

    1988-01-01

    Archivists indicate they want to learn more about automation in archives, the MARC AMC (Archival and Manuscripts Control) format, and emerging computer technologies; they look for educational opportunities through professional associations, publications, and college coursework; future archival automation education needs include standards, shared…

  3. Automation and robotics

    NASA Technical Reports Server (NTRS)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  4. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and archiving of results, are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  5. Automation in immunohematology.

    PubMed

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and archiving of results, are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  7. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, the availability of the requisite expertise, and timeliness are often cited as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of the systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.
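
    Of the pipeline steps envisaged above, the meta-analysis calculation is concrete enough to sketch. The following is a minimal fixed-effect inverse-variance pooling routine in Python; the effect sizes and variances are invented, and production systems would add random-effects models and heterogeneity statistics:

      import math

      # Fixed-effect inverse-variance meta-analysis: each trial contributes an
      # effect estimate y_i with variance v_i, weighted by w_i = 1/v_i.
      def fixed_effect_pool(effects, variances):
          weights = [1.0 / v for v in variances]
          pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
          se = math.sqrt(1.0 / sum(weights))
          return pooled, se, (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% CI

      # Three made-up trials reporting log odds ratios:
      print(fixed_effect_pool([-0.4, -0.2, -0.6], [0.05, 0.08, 0.12]))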

  8. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, the availability of the requisite expertise, and timeliness are often cited as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of the systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  9. Definitive Metabolite Identification Coupled with Automated Ligand Identification System (ALIS) Technology: A Novel Approach to Uncover Structure-Activity Relationships and Guide Drug Design in a Factor IXa Inhibitor Program.

    PubMed

    Zhang, Ting; Liu, Yong; Yang, Xianshu; Martin, Gary E; Yao, Huifang; Shang, Jackie; Bugianesi, Randal M; Ellsworth, Kenneth P; Sonatore, Lisa M; Nizner, Peter; Sherer, Edward C; Hill, Susan E; Knemeyer, Ian W; Geissler, Wayne M; Dandliker, Peter J; Helmy, Roy; Wood, Harold B

    2016-03-10

    A potent and selective Factor IXa (FIXa) inhibitor was subjected to a series of liver microsomal incubations, which generated a number of metabolites. Using automated ligand identification system-affinity selection (ALIS-AS) methodology, metabolites in the incubation mixture were prioritized by their binding affinities to the FIXa protein. Microgram quantities of the metabolites of interest were then isolated through microisolation analytical capabilities, and structurally characterized using MicroCryoProbe heteronuclear 2D NMR techniques. The isolated metabolites recovered from the NMR experiments were then submitted directly to an in vitro FIXa enzymatic assay. The order of the metabolites' binding affinity to the Factor IXa protein from the ALIS assay was completely consistent with the enzymatic assay results. This work showcases an innovative and efficient approach to uncover structure-activity relationships (SARs) and guide drug design via microisolation-structural characterization and ALIS capabilities. PMID:26871940

  10. To automate or not to automate: this is the question

    PubMed Central

    Cymborowski, M.; Klimecka, M.; Chruszcz, M.; Zimmerman, M. D.; Shumilin, I. A.; Borek, D.; Lazarski, K.; Joachimiak, A.; Otwinowski, Z.; Anderson, W.

    2010-01-01

    New protocols and instrumentation significantly boost the outcome of structural biology, which has resulted in significant growth in the number of deposited Protein Data Bank structures. However, even an enormous increase in the productivity of a single step of the structure determination process may not significantly shorten the time between clone and deposition or publication. Using the example of a medium-size laboratory equipped with the LabDB and HKL-3000 systems, we show that automation of some (and integration of all) steps of the X-ray structure determination pathway is critical for laboratory productivity. Moreover, we show that the lag period after which the impact of a technology change is observed is longer than expected. PMID:20526815

  11. To automate or not to automate : this is the question.

    SciTech Connect

    Cymborowski, M.; Klimecka, M.; Chruszcz, M.; Zimmerman, M.; Shumilin, I.; Borek, D.; Lazarski, K.; Joachimiak, A.; Otwinowski, Z.; Anderson, W.; Minor, W.; Biosciences Division; Univ. of Virginia; Univ. of Texas; Northwestern Univ.; Univ. of Chicago

    2010-06-06

    New protocols and instrumentation significantly boost the outcome of structural biology, which has resulted in significant growth in the number of deposited Protein Data Bank structures. However, even an enormous increase in the productivity of a single step of the structure determination process may not significantly shorten the time between clone and deposition or publication. Using the example of a medium-size laboratory equipped with the LabDB and HKL-3000 systems, we show that automation of some (and integration of all) steps of the X-ray structure determination pathway is critical for laboratory productivity. Moreover, we show that the lag period after which the impact of a technology change is observed is longer than expected.

  12. Automation synthesis modules review.

    PubMed

    Boschi, S; Lodi, F; Malizia, C; Cicoria, G; Marengo, M

    2013-06-01

    The introduction of ⁶⁸Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived ⁶⁸Ge/⁶⁸Ga generator has been at the basis of the development of ⁶⁸Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the need for careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for ⁶⁸Ga radiochemistry, different engineering and software strategies and post-processing of the eluate are discussed, along with the impact of regulations on automation.

  13. Automated Fabrication Technologies for High Performance Polymer Composites

    NASA Technical Reports Server (NTRS)

    Shuart, M. J.; Johnston, N. J.; Dexter, H. B.; Marchello, J. M.; Grenoble, R. W.

    1998-01-01

    New fabrication technologies are being exploited for building high-performance graphite-fiber-reinforced composite structures. Stitched fiber preforms and resin film infusion have been successfully demonstrated for large composite wing structures. Other automated processes being developed include automated placement of tacky, drapable epoxy towpreg; automated heated-head placement of consolidated ribbon/tape; and vacuum-assisted resin transfer molding. These methods have the potential to yield low-cost, high-performance structures by fabricating composite structures to net shape out-of-autoclave.

  14. Fully automated segmentation of cartilage from the MR images of knee using a multi-atlas and local structural analysis method

    PubMed Central

    Lee, June-Goo; Gumus, Serter; Moon, Chan Hong; Kwoh, C. Kent; Bae, Kyongtae Ty

    2014-01-01

    Purpose: To develop a fully automated method to segment cartilage from MR images of the knee and to evaluate the performance of the method on a public, open dataset. Methods: The segmentation scheme consisted of three procedures: multiple-atlas building, applying a locally weighted vote (LWV), and region adjustment. In the atlas building procedure, all training cases were registered to a target image by a nonrigid registration scheme and the best matched atlases selected. A LWV algorithm was applied to merge the information from these atlases and generate the initial segmentation result. Subsequently, for the region adjustment procedure, the statistical information of bone, cartilage, and surrounding regions was computed from the initial segmentation result. The statistical information directed the automated determination of the seed points inside and outside bone regions for the graph-cut based method. Finally, the region adjustment was conducted by the revision of outliers and the inclusion of abnormal bone regions. Results: A total of 150 knee MR images from a public, open dataset (available at www.ski10.org) were used for the development and evaluation of this approach. The 150 cases were divided into a training set (100 cases) and a test set (50 cases). The cartilages were segmented successfully in all test cases, in an average of 40 min of computation time. The average Dice similarity coefficient was 71.7% ± 8.0% for femoral and 72.4% ± 6.9% for tibial cartilage. Conclusions: The authors have developed a fully automated segmentation program for knee cartilage from MR images. The performance of the program based on 50 test cases was highly promising. PMID:25186408
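
    The locally weighted vote at the heart of this pipeline is compact enough to sketch. The Python snippet below is an illustration, not the authors' implementation; it assumes each registered atlas supplies a label map and a per-voxel similarity weight (e.g. derived from local intensity agreement with the target image):

      import numpy as np

      # Locally weighted vote: each atlas casts a vote for its label at every
      # voxel, weighted by its local similarity to the target image.
      def locally_weighted_vote(atlas_labels, atlas_weights, n_classes):
          votes = np.zeros((n_classes,) + atlas_labels.shape[1:])
          for labels, weights in zip(atlas_labels, atlas_weights):
              for c in range(n_classes):
                  votes[c] += weights * (labels == c)
          return np.argmax(votes, axis=0)  # label with the largest weighted vote

      # Toy 1D example: three atlases, two classes (0 = background, 1 = cartilage).
      labels = np.array([[0, 1, 1], [0, 0, 1], [1, 1, 1]])
      weights = np.array([[0.9, 0.8, 0.7], [0.6, 0.9, 0.5], [0.2, 0.3, 0.9]])
      print(locally_weighted_vote(labels, weights, n_classes=2))  # -> [0 1 1]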

  15. Automated fiber placement: Evolution and current demonstrations

    NASA Technical Reports Server (NTRS)

    Grant, Carroll G.; Benson, Vernon M.

    1993-01-01

    The automated fiber placement process has been in development at Hercules since 1980. Fiber placement is being developed specifically for aircraft and other high performance structural applications. Several major milestones have been achieved during process development. These milestones are discussed in this paper. The automated fiber placement process is currently being demonstrated on the NASA ACT program. All demonstration projects to date have focused on fiber placement of transport aircraft fuselage structures. Hercules has worked closely with Boeing and Douglas on these demonstration projects. This paper gives a description of demonstration projects and results achieved.

  16. Monitoring of the physical status of Mars-500 subjects as a model of structuring an automated system in support of the training process in an exploration mission

    NASA Astrophysics Data System (ADS)

    Fomina, Elena; Savinkina, Alexandra; Kozlovskaya, Inesa; Lysova, Nataliya; Angeli, Tomas; Chernova, Maria; Uskov, Konstantin; Kukoba, Tatyana; Sonkin, Valentin; Ba, Norbert

    Physical training sessions aboard the ISS are performed under continuous control from Earth. Every week the instructors give their recommendations on how to proceed with the training, considering the results of analysis of the daily records of training cosmonauts and data from the monthly fitness testing. It is obvious that in very long exploration missions this system of monitoring will be inapplicable. For this reason we ventured to develop an automated system to control the physical training process, using the current ISS locomotion test parameters as the leading criteria. Simulation of an extended exploration mission in the MARS-500 experiment enabled the trial application of the automated system for assessing shifts in the cosmonauts' physical status in response to exercises of varying category and dismissal periods. Methods. Six subjects spent 520 days in the analog of an interplanetary vehicle at IBMP (Moscow). A variety of training regimens and facilities were used to maintain a high level of physical performance in the subjects. The resistance exercises involved expanders, a strength training device (MDS) and a vibrotraining device (Galileo). The cycling exercises were performed on a bicycle ergometer (VB-3) and a treadmill with the motor in or out of motion. To study the effect of prolonged periods of dismissal from training on physical performance, the training flow was interrupted for a month once in the middle and then at the end of isolation. In addition to the in-flight locomotion test integrated into the automated training control system, the physical status of the subjects was assessed by analysis of the records of the monthly incremental testing on the bicycle ergometer and MDS. Results. It was demonstrated that the recommended training regimens maintained high physical performance levels despite the limited motor activities in isolation. According to the locomotion testing, the subjects increased velocity significantly and reduced the physiological

  17. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  18. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low cost automated systems can provide air traffic and aviation weather advisory information at high density uncontrolled airports. The system was designed to enhance the see and be seen rule of flight, and pilots who used the system preferred it over the self announcement system presently used at uncontrolled airports.

  19. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
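
    The pattern-scanning core of such a monitor is straightforward to illustrate. The Python sketch below is a rough analogue of the NAWK approach described above; the log path and the patterns are hypothetical, not those used at Lewis:

      import re
      import time

      # Hypothetical alarm patterns; a real deployment would tune these to the
      # telephone switch's actual log format.
      PATTERNS = {
          "failed_login": re.compile(r"authentication failure"),
          "toll_call": re.compile(r"trunk seize .* 011\d+"),
      }

      def follow(path):
          """Yield lines appended to a log file, polling like 'tail -f'."""
          with open(path) as f:
              f.seek(0, 2)  # start at the current end of the file
              while True:
                  line = f.readline()
                  if not line:
                      time.sleep(1.0)
                      continue
                  yield line

      for line in follow("/var/log/switch.log"):  # assumed log location
          for name, pattern in PATTERNS.items():
              if pattern.search(line):
                  print(f"ALERT [{name}]: {line.strip()}")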

  20. Automated Groundwater Screening

    SciTech Connect

    Taylor, Glenn A.; Collard, Leonard, B.

    2005-10-31

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.
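
    The daughter extension can be illustrated with a toy calculation. The sketch below assumes, as a simplification, that a parent's effective screening factor is its own factor plus those of its significant daughters; all nuclide data and the limit are invented, and the report's rigorous NCRP-based calculation is more involved:

      # Invented screening data: name -> (own factor, significant daughters).
      NUCLIDES = {
          "Ge-68": (0.7, ["Ga-68"]),
          "Ga-68": (0.5, []),
          "Cs-137": (0.2, ["Ba-137m"]),
          "Ba-137m": (0.1, []),
      }
      SCREENING_LIMIT = 1.0  # fails screening above this (illustrative value)

      def effective_factor(name):
          """Parent factor plus ingrown daughter contributions (simplified)."""
          own, daughters = NUCLIDES[name]
          return own + sum(effective_factor(d) for d in daughters)

      for name in NUCLIDES:
          eff = effective_factor(name)
          verdict = "FAILS -> further analysis" if eff > SCREENING_LIMIT else "passes"
          print(f"{name:8s} effective factor {eff:.2f}: {verdict}")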

  1. Automated imagery orthorectification pilot

    NASA Astrophysics Data System (ADS)

    Slonecker, E. Terrence; Johnson, Brad; McMahon, Joe

    2009-10-01

    Automated orthorectification of raw image products is now possible, based on the comprehensive metadata collected by Global Positioning System and Inertial Measurement Unit technology aboard aircraft and satellite digital imaging systems, and on emerging pattern-matching and automated image-to-image and control point selection capabilities in many advanced image processing systems. Automated orthorectification of standard aerial photography is also possible if a camera calibration report and sufficient metadata are available. Orthorectification of historical imagery, for which only limited metadata was available, was also attempted and found to require some user input, creating a semi-automated process that still has significant potential to reduce processing time and expense for the conversion of archival historical imagery into geospatially enabled, digital formats, facilitating preservation and utilization of a vast archive of historical imagery. Over 90 percent of the frames of historical aerial photos used in this experiment were successfully orthorectified to the accuracy of the USGS 100K base map series utilized for the geospatial reference of the archive. The accuracy standard for the 100K series maps is approximately 167 feet (51 meters). The main problems associated with orthorectification failure were cloud cover, shadow, and historical landscape change, which confused automated image-to-image matching processes. Further research is recommended to optimize automated orthorectification methods and enable broad operational use, especially as related to historical imagery archives.

  2. Desperately Seeking Authority Control: Automated Systems Are Not Providing It.

    ERIC Educational Resources Information Center

    Johnston, Sarah Hager

    1990-01-01

    Reports on a survey which assessed automated authority control capabilities of 18 vendors' automated library systems, software, or services. Graphs rank vendors according to overall score, authority record source, format/storage of authority records, database dynamics, matching/linking authority and bibliographic records, syndetic structure,…

  3. Approaches to automated protein crystal harvesting

    PubMed Central

    Deller, Marc C.; Rupp, Bernhard

    2014-01-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  4. Approaches to automated protein crystal harvesting.

    PubMed

    Deller, Marc C; Rupp, Bernhard

    2014-02-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  5. Improving Acceptance of Automated Counseling Procedures.

    ERIC Educational Resources Information Center

    Johnson, James H.; And Others

    This paper discusses factors that may influence the acceptance of automated counseling procedures by the military. A consensual model of the change process is presented which structures organizational readiness, the change strategy, and acceptance as integrated variables to be considered in a successful installation. A basic introduction to the…

  6. Automated telescope scheduling

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    With the ever-increasing level of automation of astronomical telescopes, the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on these systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for the Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. The Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling ground-based telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  7. Materials Testing and Automation

    NASA Astrophysics Data System (ADS)

    Cooper, Wayne D.; Zweigoron, Ronald B.

    1980-07-01

    The advent of automation in materials testing has been in large part responsible for recent radical changes in the materials testing field: Tests virtually impossible to perform without a computer have become more straightforward to conduct. In addition, standardized tests may be performed with enhanced efficiency and repeatability. A typical automated system is described in terms of its primary subsystems — an analog station, a digital computer, and a processor interface. The processor interface links the analog functions with the digital computer; it includes data acquisition, command function generation, and test control functions. Features of automated testing are described with emphasis on calculated variable control, control of a variable that is computed by the processor and cannot be read directly from a transducer. Three calculated variable tests are described: a yield surface probe test, a thermomechanical fatigue test, and a constant-stress-intensity range crack-growth test. Future developments are discussed.
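
    The constant-stress-intensity-range test makes the idea of calculated variable control concrete: the processor computes the stress-intensity range from the measured crack length and adjusts the commanded load accordingly. A minimal sketch, assuming a wide center-cracked plate for which ΔK ≈ Δσ·√(πa), with illustrative numbers:

      import math

      DELTA_K_TARGET = 30.0  # MPa·sqrt(m), desired stress-intensity range

      def commanded_stress_range(crack_length_m):
          """Stress range (MPa) holding delta-K at the target for crack length a."""
          return DELTA_K_TARGET / math.sqrt(math.pi * crack_length_m)

      # As the measured crack grows, the controller backs off the load:
      for a_mm in (5, 10, 15, 20):
          ds = commanded_stress_range(a_mm / 1000.0)
          print(f"a = {a_mm:2d} mm -> commanded delta-sigma = {ds:6.1f} MPa")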

  8. Automated Factor Slice Sampling

    PubMed Central

    Tibbits, Matthew M.; Groendyke, Chris; Haran, Murali; Liechty, John C.

    2013-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the “factor slice sampler”, a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002
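
    The univariate slice sampler that the factor slice sampler generalizes is short enough to sketch (after Neal's stepping-out and shrinkage procedure). In the Python sketch below, log_f is an unnormalized log density and w is the step-size tuning parameter that the authors' approach would select automatically:

      import math
      import random

      def slice_sample(log_f, x, w=1.0, max_steps=50):
          """One univariate slice-sampling update from the density exp(log_f)."""
          log_y = log_f(x) + math.log(1.0 - random.random())  # slice height
          left = x - w * random.random()  # randomly positioned initial interval
          right = left + w
          steps = max_steps
          while steps > 0 and log_f(left) > log_y:  # step out the left edge
              left -= w
              steps -= 1
          steps = max_steps
          while steps > 0 and log_f(right) > log_y:  # step out the right edge
              right += w
              steps -= 1
          while True:  # shrinkage: sample until a point lands on the slice
              x_new = random.uniform(left, right)
              if log_f(x_new) > log_y:
                  return x_new
              if x_new < x:
                  left = x_new
              else:
                  right = x_new

      # Draw from a standard normal and check the sample mean is near zero.
      log_norm = lambda z: -0.5 * z * z
      x, total, n = 0.0, 0.0, 5000
      for _ in range(n):
          x = slice_sample(log_norm, x)
          total += x
      print(total / n)  # ~0.0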

  9. Automation in medicinal chemistry.

    PubMed

    Reader, John C

    2004-01-01

    The implementation of appropriate automation can make a significant improvement in productivity at each stage of the drug discovery process, if it is incorporated into an efficient overall process. Automated chemistry has evolved rapidly from the 'combinatorial' techniques implemented in many industrial laboratories in the early 1990s, which focused primarily on the hit discovery phase and were highly dependent on solid-phase techniques and instrumentation derived from peptide synthesis. Automated tools and strategies have been developed which can impact the hit discovery, hit expansion and lead optimization phases, not only in synthesis, but also in reaction optimization, work-up, and purification of compounds. This article discusses the implementation of some of these techniques, based especially on experiences at Millennium Pharmaceuticals Research and Development Ltd.

  10. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
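
    The core computation that such calibration data enables can be sketched with the direct linear transform (DLT), which estimates a 3×4 projection matrix from known 3D-to-2D correspondences. This is a generic illustration, not ACAL itself; the synthetic camera in the self-check is invented:

      import numpy as np

      def dlt_projection_matrix(points_3d, points_2d):
          """Estimate P (3x4) from n >= 6 correspondences via the DLT."""
          rows = []
          for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
              rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
              rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
          _, _, vt = np.linalg.svd(np.asarray(rows))
          return vt[-1].reshape(3, 4)  # null-space vector minimizes |A p|

      def project(P, point_3d):
          x = P @ np.append(point_3d, 1.0)
          return x[:2] / x[2]  # perspective divide

      # Self-check with a synthetic camera and fiducial points in front of it:
      P_true = np.array([[800., 0., 320., 10.],
                         [0., 800., 240., 5.],
                         [0., 0., 1., 2.]])
      pts_3d = np.random.rand(8, 3) * 4 + np.array([0., 0., 5.])
      pts_2d = np.array([project(P_true, p) for p in pts_3d])
      P_est = dlt_projection_matrix(pts_3d, pts_2d)
      print(max(np.linalg.norm(project(P_est, p) - q)
                for p, q in zip(pts_3d, pts_2d)))  # ~0 reprojection error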

  11. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned between the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, the need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology readiness date for the space station.

  12. Automated fiber pigtailing technology

    NASA Astrophysics Data System (ADS)

    Strand, O. T.; Lowry, M. E.; Lu, S. Y.; Nelson, D. C.; Nikkel, D. J.; Pocha, M. D.; Young, K. D.

    1994-02-01

    The high cost of optoelectronic (OE) devices is due mainly to the labor-intensive packaging process. Manually pigtailing such devices as single-mode laser diodes and modulators is very time consuming, with poor quality control. The Photonics Program and the Engineering Research Division at LLNL are addressing several issues associated with automatically packaging OE devices. A fully automated system must include high-precision fiber alignment, fiber attachment techniques, in-situ quality control, and parts handling and feeding. This paper presents ongoing work at LLNL in the areas of automated fiber alignment and fiber attachment. For fiber alignment, we are building an automated fiber pigtailing machine (AFPM) which combines computer vision and object recognition algorithms with active feedback to perform sub-micron alignments of single-mode fibers to modulators and laser diodes. We expect to perform sub-micron alignments in less than five minutes with this technology. For fiber attachment, we are building various geometries of silicon microbenches which include on-board heaters to solder metal-coated fibers and other components in place; these designs are completely compatible with an automated process of OE packaging. We have manually attached a laser diode, a thermistor, and a thermoelectric heater to one of our microbenches in less than 15 minutes using the on-board heaters for solder reflow; an automated process could perform this same exercise in only a few minutes. Automated packaging techniques such as these will help lower the costs of OE devices.

  13. Automated gas chromatography

    DOEpatents

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  14. Ground based automated telescope

    SciTech Connect

    Colgate, S.A.; Thompson, W.

    1980-01-01

    Recommendation that a ground-based automated telescope of the 2-meter class be built for remote multiuser use as a national facility. Experience dictates that a primary consideration is a time-shared multitasking operating system with virtual memory, overlaid with a real-time priority interrupt. The primary user facility is a remote terminal networked to the single computer. Many users must have simultaneous time-shared access to the computer for program development. The telescope should be rapid slewing, and hence of lightweight construction. Automation allows for closed-loop correction of pointing errors, independent of extreme accuracy in the mount.

  15. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program data base with methods for browsing the designs stored; a system for graphical specification of designs including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  16. Automating the CMS DAQ

    SciTech Connect

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  17. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).

  18. Human Factors In Aircraft Automation

    NASA Technical Reports Server (NTRS)

    Billings, Charles

    1995-01-01

    Report presents survey of state of art in human factors in automation of aircraft operation. Presents examination of aircraft automation and effects on flight crews in relation to human error and aircraft accidents.

  19. RCrane: semi-automated RNA model building

    PubMed Central

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems. PMID:22868764

  20. A technique for recording polycrystalline structure and orientation during in situ deformation cycles of rock analogues using an automated fabric analyser.

    PubMed

    Peternell, M; Russell-Head, D S; Wilson, C J L

    2011-05-01

    Two in situ plane-strain deformation experiments on norcamphor and natural ice using synchronous recording of crystal c-axis orientations have been performed with an automated fabric analyser and a newly developed sample press and deformation stage. Without interrupting the deformation experiment, c-axis orientations are determined for each pixel in a 5 × 5 mm sample area at a spatial resolution of 5 μm/pixel. In the case of norcamphor, changes in microstructures and associated crystallographic information, at a strain rate of ∼2 × 10⁻⁵ s⁻¹, were recorded for the first time during a complete in situ deformation-cycle experiment that consisted of an annealing, deformation and post-deformation annealing path. In the case of natural ice, slower external strain rates (∼1 × 10⁻⁶ s⁻¹) enabled the investigation of small changes in the polycrystal aggregate's crystallography and microstructure for small amounts of strain. The technical setup and first results from the experiments are presented.

  1. The 3D Euler solutions using automated Cartesian grid generation

    NASA Technical Reports Server (NTRS)

    Melton, John E.; Enomoto, Francis Y.; Berger, Marsha J.

    1993-01-01

    Viewgraphs on 3-dimensional Euler solutions using automated Cartesian grid generation are presented. Topics covered include: computational fluid dynamics (CFD) and the design cycle; Cartesian grid strategy; structured body fit; grid generation; prolate spheroid; and ONERA M6 wing.

  2. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  3. Library Automation: An Overview.

    ERIC Educational Resources Information Center

    Saffady, William

    1989-01-01

    Surveys the current state of computer applications in six areas of library work: circulation control; descriptive cataloging; catalog maintenance and production; reference services; acquisitions; and serials control. Motives for automation are discussed, and examples of representative vendors, products, and services are given. (15 references) (LRW)

  4. Automation in haemostasis.

    PubMed

    Huber, A R; Méndez, A; Brunner-Agten, S

    2013-01-01

    Automatia, an ancient Greek goddess of luck who makes things happen by themselves and of her own will, without human engagement, is present in our daily life in the medical laboratory. Automation was introduced and perfected by clinical chemistry and has since expanded into other fields such as haematology, immunology, molecular biology and also coagulation testing. The initial small and relatively simple standalone instruments have been replaced by more complex systems that allow for multitasking. Integration of automated coagulation testing into total laboratory automation has become possible in the most recent years. Automation has many strengths and opportunities, provided its weaknesses and threats are respected. On the positive side, standardization, reduction of errors, reduction of cost and increase of throughput are clearly beneficial. Dependence on manufacturers, high initiation cost and somewhat expensive maintenance are less favourable factors. The modern laboratory, and especially today's laboratory technicians and academic personnel, do not add value for the doctor and his patients by spending much of their time behind the machines. In the future the laboratory needs to contribute at the bedside, suggesting laboratory testing and providing support and interpretation of the obtained results. The human factor will continue to play an important role in haemostasis testing, yet under different circumstances.

  5. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  6. Automated CCTV Tester

    2000-09-13

    The purpose of an automated CCTV tester is to automatically and continuously monitor multiple perimeter security cameras for changes in a camera's measured resolution and alignment (camera looking at the proper area). It shall track and record the image quality and position of each camera and produce an alarm when a camera is out of specification.

  7. Blastocyst microinjection automation.

    PubMed

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cells delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  8. Library Automation in Australia.

    ERIC Educational Resources Information Center

    Blank, Karen L.

    1984-01-01

    Discussion of Australia's move toward library automation highlights development of a national bibliographic network, local and regional cooperation, integrated library systems, telecommunications, and online systems, as well as microcomputer usage, ergonomics, copyright issues, and national information policy. Information technology plans of the…

  9. Automated Management Of Documents

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1995-01-01

    Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.

  10. Mining Your Automated System.

    ERIC Educational Resources Information Center

    Larsen, Patricia M., Ed.; And Others

    1996-01-01

    Four articles address issues of collecting, compiling, reporting, and interpreting statistics generated by automated library systems for administrative decision making. Topics include using a management information system to forecast growth and assess areas for downsizing; statistics for collection development and analysis; and online system…

  11. Automated conflict resolution issues

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  12. Automating Food Service.

    ERIC Educational Resources Information Center

    Kavulla, Timothy A.

    1986-01-01

    The Wichita, Kansas, Public Schools' Food Service Department Project Reduction in Paperwork (RIP) is designed to automate certain paperwork functions, thus reducing cost and flow of paper. This article addresses how RIP manages free/reduced meal applications and meets the objectives of reducing paper and increasing accuracy, timeliness, and…

  13. Automated Estimating System (AES)

    SciTech Connect

    Holder, D.A.

    1989-09-01

    This document describes Version 3.1 of the Automated Estimating System, a personal computer-based software package designed to aid in the creation, updating, and reporting of project cost estimates for the Estimating and Scheduling Department of the Martin Marietta Energy Systems Engineering Division. Version 3.1 of the Automated Estimating System is capable of running in a multiuser environment across a token ring network. The token ring network makes possible services and applications that will more fully integrate all aspects of information processing, provides a central area for large data bases to reside, and allows access to the data base by multiple users. Version 3.1 of the Automated Estimating System also has been enhanced to include an Assembly pricing data base that may be used to retrieve cost data into an estimate. A WBS Title File program has also been included in Version 3.1. The WBS Title File program allows for the creation of a WBS title file that has been integrated with the Automated Estimating System to provide WBS titles in update mode and in reports. This provides for consistency in WBS titles and provides the capability to display WBS titles on reports generated at a higher WBS level.

  14. Automated Administrative Data Bases

    NASA Technical Reports Server (NTRS)

    Marrie, M. D.; Jarrett, J. R.; Reising, S. A.; Hodge, J. E.

    1984-01-01

    Improved productivity and more effective response to information requirements for internal management, NASA Centers, and Headquarters resulted from using automated techniques. Modules developed to provide information on manpower, RTOPS, full time equivalency, and physical space reduced duplication, increased communication, and saved time. There is potential for greater savings by sharing and integrating with those who have the same requirements.

  15. Automating Small Libraries.

    ERIC Educational Resources Information Center

    Swan, James

    1996-01-01

    Presents a four-phase plan for small libraries strategizing for automation: inventory and weeding, data conversion, implementation, and enhancements. Other topics include selecting a system, MARC records, compatibility, ease of use, industry standards, searching capabilities, support services, system security, screen displays, circulation modules,…

  16. CLAN Automation Plan.

    ERIC Educational Resources Information Center

    Nevada State Library and Archives, Carson City.

    The Central Libraries Automated Network (CLAN) of Nevada is a cooperative system which shares circulation, cataloging, and acquisitions systems and numerous online databases. Its mission is to provide public access to information and efficient library administration through shared computer systems, databases, and telecommunications. This document…

  17. Automated EEG acquisition

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Hillman, C. E., Jr.

    1977-01-01

    Automated self-contained portable device can be used by technicians with minimal training. Data acquired from patient at remote site are transmitted to centralized interpretation center using conventional telephone equipment. There, diagnostic information is analyzed, and results are relayed back to remote site.

  18. Automated Essay Scoring

    ERIC Educational Resources Information Center

    Dikli, Semire

    2006-01-01

    The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e., word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…

  19. Automated pipelines for spectroscopic analysis

    NASA Astrophysics Data System (ADS)

    Allende Prieto, C.

    2016-09-01

    The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10 % of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1 %. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking an ever increasing relevance, and the concept is applying to many more areas, from targeting to analysis. In this paper, I provide a quick overview of recent, ongoing, and upcoming spectroscopic surveys, and the strategies adopted in their automated analysis pipelines.

  20. Automated Conflict Resolution For Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2005-01-01

    The ability to detect and resolve conflicts automatically is considered to be an essential requirement for the next generation air traffic control system. While systems for automated conflict detection have been used operationally by controllers for more than 20 years, automated resolution systems have so far not reached the level of maturity required for operational deployment. Analytical models and algorithms for automated resolution have yet to be demonstrated under realistic traffic conditions to show that they can handle the complete spectrum of conflict situations encountered in actual operations. The resolution algorithm described in this paper was formulated to meet the performance requirements of the Automated Airspace Concept (AAC). The AAC, which was described in a recent paper [1], is a candidate for the next generation air traffic control system. The AAC's performance objectives are to increase safety and airspace capacity and to accommodate user preferences in flight operations to the greatest extent possible. In the AAC, resolution trajectories are generated by an automation system on the ground and sent to the aircraft autonomously via data link. The algorithm generating the trajectories must take into account the performance characteristics of the aircraft and the route structure of the airway system, and must be capable of resolving all types of conflicts for properly equipped aircraft without requiring supervision and approval by a controller. Furthermore, the resolution trajectories should be compatible with the clearances, vectors and flight plan amendments that controllers customarily issue to pilots in resolving conflicts. The algorithm described herein, although formulated specifically to meet the needs of the AAC, provides a generic engine for resolving conflicts. Thus, it can be incorporated into any operational concept that requires a method for automated resolution, including concepts for autonomous air-to-air resolution.
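
    The detection side referred to above can be illustrated with a standard closest-point-of-approach probe. The sketch below is a simplified horizontal-plane version with assumed separation and lookahead values; it is not the AAC resolution algorithm:

      import math

      def conflict(p1, v1, p2, v2, sep_nmi=5.0, lookahead_hr=1.0 / 3.0):
          """Positions in nmi, velocities in kt; returns (flag, t_cpa, min_sep)."""
          dx, dy = p1[0] - p2[0], p1[1] - p2[1]
          dvx, dvy = v1[0] - v2[0], v1[1] - v2[1]
          dv2 = dvx * dvx + dvy * dvy
          t_cpa = 0.0 if dv2 == 0 else max(0.0, -(dx * dvx + dy * dvy) / dv2)
          t_cpa = min(t_cpa, lookahead_hr)  # only probe ~20 minutes ahead
          min_sep = math.hypot(dx + t_cpa * dvx, dy + t_cpa * dvy)
          return min_sep < sep_nmi, t_cpa, min_sep

      # Two aircraft 40 nmi apart, converging head-on at 480 kt each:
      print(conflict((0, 0), (480, 0), (40, 0), (-480, 0)))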

  1. Automated generation of weld path trajectories.

    SciTech Connect

    Sizemore, John M.; Hinman-Sweeney, Elaine Marie; Ames, Arlo Leroy

    2003-06-01

    AUTOmated GENeration of Control Programs for Robotic Welding of Ship Structure (AUTOGEN) is software that automates the planning and compiling of control programs for robotic welding of ship structure. The software works by evaluating computer representations of the ship design and the manufacturing plan. Based on this evaluation, AUTOGEN internally identifies and appropriately characterizes each weld. Then it constructs the robot motions necessary to accomplish the welds and determines for each the correct assignment of process control values. AUTOGEN generates these robot control programs completely without manual intervention or edits except to correct wrong or missing input data. Most ship structure assemblies are unique or at best manufactured only a few times. Accordingly, the high cost inherent in all previous methods of preparing complex control programs has made robot welding of ship structures economically unattractive to the U.S. shipbuilding industry. AUTOGEN eliminates the cost of creating robot control programs. With programming costs eliminated, capitalization of robots to weld ship structures becomes economically viable. Robot welding of ship structures will result in reduced ship costs, uniform product quality, and enhanced worker safety. Sandia National Laboratories and Northrop Grumman Ship Systems worked with the National Shipbuilding Research Program to develop a means of automated path and process generation for robotic welding. This effort resulted in the AUTOGEN program, which has successfully demonstrated automated path generation and robot control. Although the current implementation of AUTOGEN is optimized for welding applications, the path and process planning capability has applicability to a number of industrial applications, including painting, riveting, and adhesive delivery.

  2. Automated gas chromatography

    DOEpatents

    Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.

    1999-07-13

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.

  3. Automated theorem proving.

    PubMed

    Plaisted, David A

    2014-03-01

    Automated theorem proving is the use of computers to prove or disprove mathematical or logical statements. Such statements can express properties of hardware or software systems, or facts about the world that are relevant for applications such as natural language processing and planning. A brief introduction to propositional and first-order logic is given, along with some of the main methods of automated theorem proving in these logics. These methods of theorem proving include resolution, Davis and Putnam-style approaches, and others. Methods for handling the equality axioms are also presented. Methods of theorem proving in propositional logic are presented first, and then methods for first-order logic. WIREs Cogn Sci 2014, 5:115-128. doi: 10.1002/wcs.1269 CONFLICT OF INTEREST: The author has declared no conflicts of interest for this article. For further resources related to this article, please visit the WIREs website. PMID:26304304
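
    Of the methods named above, Davis-Putnam-style search is the most compact to demonstrate. The following is a minimal, textbook DPLL satisfiability checker, not drawn from the article itself; clauses use the DIMACS convention of signed integers.

```python
def simplify(clauses, lit):
    """Drop clauses satisfied by lit and delete the falsified literal -lit."""
    return [[l for l in c if l != -lit] for c in clauses if lit not in c]

def dpll(clauses, assignment=None):
    """DPLL: unit propagation plus case-splitting on a chosen literal."""
    if assignment is None:
        assignment = {}
    changed = True
    while changed:                         # unit propagation
        changed = False
        for clause in clauses:
            if len(clause) == 1:
                lit = clause[0]
                clauses = simplify(clauses, lit)
                assignment[abs(lit)] = lit > 0
                changed = True
                break
    if not clauses:
        return assignment                  # every clause satisfied
    if any(len(c) == 0 for c in clauses):
        return None                        # empty clause: contradiction
    lit = clauses[0][0]                    # branch on the first open literal
    for choice in (lit, -lit):
        result = dpll(simplify(clauses, choice),
                      {**assignment, abs(choice): choice > 0})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(dpll([[1, 2], [-1, 3], [-2, -3]]))   # -> a satisfying assignment
```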

  4. Automated macromolecular crystallization screening

    DOEpatents

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.
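
    As a hedged sketch of the feedback loop the patent describes (random first-round mixes, image scoring, a biased second-round design), the toy below fakes the image-analysis score; the reagent names and scoring surface are invented for illustration.

```python
import random

REAGENTS = ["PEG4000", "(NH4)2SO4", "NaCl", "MgCl2"]       # hypothetical components

def random_mix():
    """First round: randomly selected reagent components and concentrations."""
    return {r: round(random.uniform(0.0, 2.0), 2) for r in random.sample(REAGENTS, 2)}

def score_image(mix):
    """Stand-in for automated drop-image analysis; a fake response surface."""
    return -abs(mix.get("PEG4000", 0.0) - 1.2) + random.gauss(0, 0.1)

def next_round(scored, n=24, jitter=0.1):
    """Second round: perturb the best-scoring first-round mixes."""
    best = [m for m, s in sorted(scored, key=lambda t: -t[1])[:4]]
    return [{r: max(0.0, c + random.gauss(0, jitter))
             for r, c in random.choice(best).items()} for _ in range(n)]

round1 = [random_mix() for _ in range(96)]        # one 96-condition plate
scored = [(m, score_image(m)) for m in round1]    # incubate, image, analyze
print(next_round(scored)[0])                      # a refined second-round condition
```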

  5. Automated breeder fuel fabrication

    SciTech Connect

    Goldmann, L.H.; Frederickson, J.R.

    1983-09-01

    The objective of the Secure Automated Fabrication (SAF) Project is to develop remotely operated equipment for the processing and manufacturing of breeder reactor fuel pins. The SAF line will be installed in the Fuels and Materials Examination Facility (FMEF). The FMEF is presently under construction at the Department of Energy's (DOE) Hanford site near Richland, Washington, and is operated by the Westinghouse Hanford Company (WHC). The fabrication and support systems of the SAF line are designed for computer-controlled operation from a centralized control room. Remote and automated fuel fabrication operations will result in: reduced radiation exposure to workers; enhanced safeguards; improved product quality; near real-time accountability; and increased productivity. The present schedule calls for installation of SAF line equipment in the FMEF beginning in 1984, with qualifying runs starting in 1986 and production commencing in 1987. 5 figures.

  6. Compact reactor design automation

    NASA Technical Reports Server (NTRS)

    Nassersharif, Bahram; Gaeta, Michael J.

    1991-01-01

    A conceptual compact reactor design automation experiment was performed using the real-time expert system G2. The purpose of this experiment was to investigate the utility of an expert system in design, in particular reactor design. The experiment consisted of the automation and integration of two design phases: reactor neutronic design and fuel pin design. The utility of this approach is shown using simple examples of formulating rules to ensure design-parameter consistency between the two design phases. The ability of G2 to communicate with external programs, even across networks, allows the system to supplement its knowledge-processing features with conventional programs, with possible applications to realistic iterative design tools.
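
    A hedged sketch of what such a consistency rule might look like, with invented parameter names and tolerances (the original G2 rules are not reproduced in the abstract):

```python
# Hypothetical design-parameter records from the two phases
neutronic = {"fuel_pin_diameter_cm": 0.95, "core_power_MW": 120.0}
fuel_pin  = {"fuel_pin_diameter_cm": 0.91, "clad_thickness_cm": 0.06}

def consistency_rules(neu, pin, tol=0.01):
    """Rule-style checks that shared parameters agree across design phases."""
    issues = []
    d_neu, d_pin = neu["fuel_pin_diameter_cm"], pin["fuel_pin_diameter_cm"]
    if abs(d_neu - d_pin) > tol:
        issues.append(f"pin diameter mismatch: neutronics {d_neu} vs fuel pin {d_pin}")
    if pin["clad_thickness_cm"] >= d_pin / 2:
        issues.append("clad thickness exceeds pin radius")
    return issues

print(consistency_rules(neutronic, fuel_pin))
```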

  7. Automated assembly in space

    NASA Technical Reports Server (NTRS)

    Srivastava, Sandanand; Dwivedi, Suren N.; Soon, Toh Teck; Bandi, Reddy; Banerjee, Soumen; Hughes, Cecilia

    1989-01-01

    The installation of robots and their use for assembly in space will create an exciting and promising future for the U.S. Space Program. Assembly in space is complicated and error prone, and it is not possible unless the various parts and modules are suitably designed for automation. Certain guidelines are developed for part design and for easy precision assembly. Major design problems associated with automated assembly are considered, and solutions to resolve these problems are evaluated in the guidelines format. Methods for gripping and for part feeding are developed with regard to the absence of gravity in space. Guidelines for part orientation, adjustments, compliances, and various assembly constructions are discussed. Design modifications of various fasteners and fastening methods are also investigated.

  8. Millisecond single-molecule localization microscopy combined with convolution analysis and automated image segmentation to determine protein concentrations in complexly structured, functional cells, one cell at a time.

    PubMed

    Wollman, Adam J M; Leake, Mark C

    2015-01-01

    We present a single-molecule tool called the CoPro (concentration of proteins) method that uses millisecond imaging with convolution analysis, automated image segmentation, and super-resolution localization microscopy to generate robust estimates for protein concentration in different compartments of single living cells, validated using realistic simulations of complex multiple-compartment cell types. We demonstrate its utility experimentally on model Escherichia coli bacteria and Saccharomyces cerevisiae budding yeast cells, and use it to address the biological question of how signals are transduced in cells. Cells in all domains of life dynamically sense their environment through signal transduction mechanisms, many involving gene regulation. The glucose sensing mechanism of S. cerevisiae is a model system for studying gene regulatory signal transduction. It uses the multi-copy expression inhibitor of the GAL gene family, Mig1, to repress unwanted genes in the presence of elevated extracellular glucose concentrations. We fluorescently labelled Mig1 molecules with green fluorescent protein (GFP) via chromosomal integration at physiological expression levels in living S. cerevisiae cells, in addition to the RNA polymerase protein Nrd1 with the fluorescent protein reporter mCherry. Using CoPro we make quantitative estimates of Mig1 and Nrd1 protein concentrations in the cytoplasm and nucleus compartments on a cell-by-cell basis under physiological conditions. These estimates indicate a ∼4-fold shift towards higher values in the concentration of diffusive Mig1 in the nucleus if the external glucose concentration is raised, whereas equivalent levels in the cytoplasm shift to smaller values with a relative change an order of magnitude smaller. This compares with Nrd1, which is not involved directly in glucose sensing and which is almost exclusively localized in the nucleus under both high and low external glucose levels. CoPro facilitates time-resolved quantification of
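
    The convolution analysis itself is beyond a short sketch, but the final arithmetic, turning a per-compartment copy number and a segmented compartment volume into a concentration, is simple; the numbers below are hypothetical, not the paper's measurements.

```python
AVOGADRO = 6.022e23   # molecules per mole

def concentration_nM(copy_number, volume_um3):
    """Molar concentration from a molecule count and compartment volume.
    1 um^3 = 1e-15 L, so c = N / (N_A * V); reported in nM."""
    return copy_number / (AVOGADRO * volume_um3 * 1e-15) * 1e9

# Hypothetical: 600 Mig1-GFP copies in a 2.9 um^3 yeast nucleus
print(round(concentration_nM(600, 2.9), 1), "nM")
```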

  9. Automated design of flexible linkers.

    PubMed

    Manion, Charles; Arlitt, Ryan; Campbell, Matthew I; Tumer, Irem; Stone, Rob; Greaney, P Alex

    2016-03-14

    This paper presents a method for the systematic and automated design of flexible organic linkers for construction of metal-organic frameworks (MOFs) in which flexibility, compliance, or other mechanically exotic properties originate at the linker level rather than from the framework kinematics. Our method couples a graph-grammar method for systematically generating linker-like molecules with molecular dynamics modeling of the linkers' mechanical response. Using this approach we have generated a candidate pool of >59,000 hypothetical linkers. We screen linker candidates according to their mechanical behaviors under large deformation, and extract fragments common to the best-performing candidate materials. To demonstrate the general approach to MOF design we apply our system to designing linkers for pressure-switching MOFs: MOFs that undergo reversible structural collapse after a stress threshold is exceeded. PMID:26687337

  10. Automated Testing System

    2006-05-09

    ATS is a Python-language program for automating test suites for software programs that do not interact with their users, such as scripted scientific simulations. ATS features a decentralized approach especially suited to larger projects. In its multinode mode it can utilize many nodes of a cluster in order to run many tests in parallel. It has features for submitting longer-running tests to a batch system and would have to be customized for use elsewhere.
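
    ATS's actual interface is not shown in this record, so the sketch below is only a generic illustration of the core idea, running non-interactive scripted tests in parallel and reducing each to pass/fail, using nothing but the Python standard library; the sim.py test commands are hypothetical.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_test(cmd, timeout=300):
    """Run one non-interactive test command; pass/fail is its exit status."""
    try:
        proc = subprocess.run(cmd, capture_output=True, timeout=timeout)
        return cmd, proc.returncode == 0
    except (subprocess.TimeoutExpired, FileNotFoundError):
        return cmd, False

# Hypothetical suite: each entry is a command for a scripted simulation.
suite = [["python", "sim.py", "--case", str(i)] for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:   # many tests in parallel
    for cmd, ok in pool.map(run_test, suite):
        print("PASS" if ok else "FAIL", " ".join(cmd))
```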

  11. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.

    1984-01-01

    The purpose of phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.

  12. Cavendish Balance Automation

    NASA Technical Reports Server (NTRS)

    Thompson, Bryan

    2000-01-01

    This is the final report for a project carried out to modify a manual commercial Cavendish Balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf manually operated Cavendish Balance to allow for automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in the study of effects of superconducting materials on the local gravitational field strength, to determine if the strength of gravitational fields can be reduced. A Cavendish Balance was chosen because it is a fairly simple piece of equipment for measuring the gravitational constant, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) All the components necessary to hold and automate the Cavendish Balance in a cryostat were designed; engineering drawings were made of custom parts to be fabricated, and other off-the-shelf parts were procured; (2) Software was written in LabView to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing; (3) Software was written to take the data collected from the Cavendish Balance and reduce it to give a value for the gravitational constant; (4) The components of the system were assembled and fitted to a cryostat, along with the LabView hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling; and (5) The system was operated for a number of periods, and the data were collected and reduced to give an average value for the gravitational constant.
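
    The data reduction in step (3) is not detailed in this summary; the standard small-oscillation torsion-balance relation is a plausible sketch of it, assuming small masses m on a beam of length L, large masses M at center-to-center separation r, equilibrium twist theta, and torsion period T:

```latex
% Torsion constant from the measured period T, with beam inertia I = m L^2 / 2:
\kappa = \frac{4\pi^2 I}{T^2}, \qquad I = \frac{m L^2}{2}
% Equating the restoring torque to the gravitational torque,
% \kappa\,\theta = 2\,\frac{G m M}{r^2}\,\frac{L}{2}, and solving for G:
G = \frac{2\pi^2 L\, r^2\, \theta}{M\, T^2}
```

    Note that the small mass m cancels, so only the beam geometry, the deflection, the period, and the large mass M enter the final value.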

  13. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.

  14. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  15. Automation in biological crystallization.

    PubMed

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  16. Automation in biological crystallization

    PubMed Central

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  17. Automation model of sewerage rehabilitation planning.

    PubMed

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, and it is tedious and time-consuming. This paper proposes an automation model that plans optimal sewerage rehabilitation strategies for a sewer system by integrating image processing, clustering technology, optimization, and visualization display. First, image processing techniques such as wavelet transformation and co-occurrence feature extraction were employed to extract various characteristics of structural failures from CCTV inspection images. Second, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction and even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information on the sewer system and sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.
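
    A minimal sketch of the genetic-algorithm step, choosing a rehabilitation method per pipe section to trade off cost against residual failure risk, is given below; the method table, risk values, and fitness weights are invented for illustration, not the paper's data.

```python
import random

# (cost, residual risk factor) per method -- hypothetical values
METHODS = {"none": (0, 1.0), "lining": (40, 0.3), "replace": (100, 0.05)}
SECTION_RISK = [0.9, 0.2, 0.6, 0.8, 0.1, 0.5]   # per-section risk from the classifier

def fitness(plan, risk_weight=200.0):
    """Lower is better: total cost plus weighted residual risk."""
    cost = sum(METHODS[m][0] for m in plan)
    risk = sum(r * METHODS[m][1] for r, m in zip(SECTION_RISK, plan))
    return cost + risk_weight * risk

def evolve(pop_size=40, generations=100, mut=0.1):
    pop = [[random.choice(list(METHODS)) for _ in SECTION_RISK] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents, children = pop[: pop_size // 2], []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(SECTION_RISK))     # one-point crossover
            child = [random.choice(list(METHODS)) if random.random() < mut else g
                     for g in a[:cut] + b[cut:]]             # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

print(evolve())   # e.g. a method choice for each of the six sections
```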

  18. Automation model of sewerage rehabilitation planning.

    PubMed

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, and it is tedious and time-consuming. This paper proposes an automation model that plans optimal sewerage rehabilitation strategies for a sewer system by integrating image processing, clustering technology, optimization, and visualization display. First, image processing techniques such as wavelet transformation and co-occurrence feature extraction were employed to extract various characteristics of structural failures from CCTV inspection images. Second, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction and even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information on the sewer system and sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan. PMID:17302324

  19. Automation and Robotics for Space-Based Systems, 1991

    NASA Technical Reports Server (NTRS)

    Williams, Robert L., II (Editor)

    1992-01-01

    The purpose of this in-house workshop was to assess the state of the art in automation and robotics for space operations from a LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.

  20. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  1. Medical linguistics: automated indexing into SNOMED.

    PubMed

    Wingert, F

    1988-01-01

    This paper reviews the state of the art in processing medical language data. The area is divided into the topics: (1) morphologic analysis, (2) syntactic analysis, (3) semantic analysis, and (4) pragmatics. Additional attention is given to medical nomenclatures and classifications as the bases of (automated) indexing procedures which are required whenever medical information is formalized. These topics are completed by an evaluation of related data structures and methods used to organize language-based medical knowledge.

  2. Numerical analysis of stiffened shells of revolution. Volume 4: Engineer's program manual for STARS-2S shell theory automated for rotational structures - 2 (statics) digital computer program

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.; Ogilvie, P.

    1973-01-01

    The engineering programming information for the digital computer program for analyzing shell structures is presented. The program is designed to permit small changes such as altering the geometry or a table size to fit the specific requirements. Each major subroutine is discussed and the following subjects are included: (1) subroutine description, (2) pertinent engineering symbols and the FORTRAN coded counterparts, (3) subroutine flow chart, and (4) subroutine FORTRAN listing.

  3. Assessment of the Molecular Expression and Structure of Gangliosides in Brain Metastasis of Lung Adenocarcinoma by an Advanced Approach Based on Fully Automated Chip-Nanoelectrospray Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zamfir, Alina D.; Serb, Alina; Vukeli, Željka; Flangea, Corina; Schiopu, Catalin; Fabris, Dragana; Kalanj-Bognar, Svjetlana; Capitan, Florina; Sisu, Eugen

    2011-12-01

    Gangliosides (GGs), sialic acid-containing glycosphingolipids, are known to be involved in the invasive/metastatic behavior of brain tumor cells. The development of modern methods for determining the variations in GG expression and structure during neoplastic cell transformation is a priority in the field of biomedical analysis. In this context, we report here the first optimization and application of chip-based nanoelectrospray (NanoMate robot) mass spectrometry (MS) for the investigation of gangliosides in a secondary brain tumor. In our work, a native GG mixture extracted and purified from a brain metastasis of lung adenocarcinoma was screened by the NanoMate robot coupled to a quadrupole time-of-flight MS. A native GG mixture from an age-matched healthy brain tissue, sampled and analyzed under identical conditions, served as a control. Comparative MS analysis demonstrated an evident dissimilarity in GG expression in the two tissue types. The brain metastasis is characterized by many species having a reduced N-acetylneuraminic acid (Neu5Ac) content that are, however, modified by fucosylation or O-acetylation, such as Fuc-GM4, Fuc-GM3, di-O-Ac-GM1, and O-Ac-GM3. In contrast, healthy brain tissue is dominated by longer structures exhibiting mono- to hexasialylated sugar chains. Significant differences in ceramide composition were also discovered. By tandem MS using collision-induced dissociation at low energies, the brain metastasis-associated GD3 (d18:1/18:0) species, as well as an uncommon Fuc-GM1 (d18:1/18:0) detected in the normal brain tissue, could be structurally characterized. The novel protocol provided reliable compositional and structural characterization at a high pace of analysis and with sensitivity in the fmol range.

  4. Automated method for relating regional pulmonary structure and function: integration of dynamic multislice CT and thin-slice high-resolution CT

    NASA Astrophysics Data System (ADS)

    Tajik, Jehangir K.; Kugelmass, Steven D.; Hoffman, Eric A.

    1993-07-01

    We have developed a method utilizing x-ray CT for relating pulmonary perfusion to global and regional anatomy, allowing for detailed study of structure-to-function relationships. A thick-slice, high-temporal-resolution mode is used to follow a bolus of contrast agent for blood flow evaluation and is fused with a high-spatial-resolution, thin-slice mode to obtain structure-function detail. To aid the analysis of blood flow, we have developed a software module for our image analysis package (VIDA) to produce the combined structure-function image. Color-coded images representing blood flow, mean transit time, regional tissue content, regional blood volume, regional air content, etc., are generated and embedded in the high-resolution volume image. A text file containing these values along with each voxel's 3-D coordinates is also generated. User input can be minimized to identifying the location of the pulmonary artery, from which the input function to a blood flow model is derived. Any flow model utilizing one input and one output function can be easily added to a user-selectable list. We present examples from our physiologically based research findings to demonstrate the strengths of combining dynamic CT and HRCT, relative to other scanning modalities, for uniquely characterizing normal pulmonary physiology and pathophysiology.
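
    One common reduction for such one-input/one-output flow models is the first moment of a baseline-corrected time-density curve; the sketch below shows only that computation, not VIDA's actual module, with a hypothetical bolus curve.

```python
import numpy as np

def mean_transit_time(t, c):
    """Discrete first moment of a baseline-corrected time-density curve:
    MTT = sum(t*c) / sum(c), assuming uniform sampling."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    c = np.clip(c - c[0], 0.0, None)      # crude baseline correction
    return float(np.sum(t * c) / np.sum(c))

# Hypothetical region-of-interest curve: gamma-variate-like bolus, sampled at 1 s
t = np.arange(0.0, 30.0, 1.0)
c = t ** 2 * np.exp(-t / 3.0)
print(round(mean_transit_time(t, c), 2), "s")   # ~9 s for this synthetic curve
```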

  5. NASA space station automation: AI-based technology review

    NASA Technical Reports Server (NTRS)

    Firschein, O.; Georgeff, M. P.; Park, W.; Neumann, P.; Kautz, W. H.; Levitt, K. N.; Rom, R. J.; Poggio, A. A.

    1985-01-01

    Research and Development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

  6. [Automated anesthesia record system].

    PubMed

    Zhu, Tao; Liu, Jin

    2005-12-01

    Based on a Client/Server architecture, software for an automated anesthesia record system running under the Windows operating system and networks has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0, and SQL Server. The system can manage the patient's information throughout the anesthesia. It can collect and integrate data from several kinds of medical equipment, such as the monitor, infusion pump, and anesthesia machine, automatically and in real time. The system then generates the anesthesia record sheets automatically. The record system makes the anesthesia record more accurate and complete and can improve the anesthesiologist's working efficiency.

  7. Automated fiber pigtailing machine

    DOEpatents

    Strand, O.T.; Lowry, M.E.

    1999-01-05

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems. 26 figs.

  8. Automated fiber pigtailing machine

    DOEpatents

    Strand, Oliver T.; Lowry, Mark E.

    1999-01-01

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems.

  9. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    2000-01-01

    An automated propellant blending apparatus and method that uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation is discussed. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.
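
    As a hedged sketch of how an empirical precipitation-versus-ratio profile could drive metering near the cloud point (the patent's actual algorithm and numbers are not given here), consider:

```python
import numpy as np

# Empirical profile: fraction of binder precipitated vs countersolvent:solvent ratio
ratio  = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
precip = np.array([0.0, 0.01, 0.05, 0.35, 0.80, 0.95])   # hypothetical measurements

def cloud_point(threshold=0.02):
    """Ratio at which precipitation first exceeds the threshold (interpolated)."""
    return float(np.interp(threshold, precip, ratio))

def addition_rate(current_ratio, fast=5.0, slow=0.5, band=0.15):
    """Meter countersolvent quickly far from the cloud point, slowly near it (mL/min)."""
    return slow if abs(current_ratio - cloud_point()) < band else fast

print(cloud_point(), addition_rate(0.3), addition_rate(0.9))
```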

  10. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    1999-01-01

    An automated propellant blending apparatus and method uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.

  11. The Automated Medical Office

    PubMed Central

    Petreman, Mel

    1990-01-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation force physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless, using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a clinic shows that practical thinking linked to advanced technology can greatly improve office efficiency. PMID:21233899

  12. Automated Hazard Analysis

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  13. The automated medical office.

    PubMed

    Petreman, M

    1990-08-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation force physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless, using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a clinic shows that practical thinking linked to advanced technology can greatly improve office efficiency.

  14. World-wide distribution automation systems

    SciTech Connect

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems.

  15. Numerical analysis of stiffened shells of revolution. Volume 2: Users' manual for STAR-02S - shell theory automated for rotational structures - 2 (statics), digital computer program

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    A procedure for the structural analysis of stiffened shells of revolution is presented. A digital computer program based on the Love-Reissner first order shell theory was developed. The computer program can analyze orthotropic thin shells of revolution, subjected to unsymmetric distributed loading or concentrated line loads, as well as thermal strains. The geometrical shapes of the shells which may be analyzed are described. The shell wall cross section can be a sheet, sandwich, or reinforced sheet or sandwich. General stiffness input options are also available.

  16. Automated System Marketplace 1995: The Changing Face of Automation.

    ERIC Educational Resources Information Center

    Barry, Jeff; And Others

    1995-01-01

    Discusses trends in the automated system marketplace with specific attention to online vendors and their customers: academic, public, school, and special libraries. Presents vendor profiles; tables and charts on computer systems and sales; and sidebars that include a vendor source list and the differing views on procuring an automated library…

  17. The Center for Optimized Structural Studies (COSS) platform for automation in cloning, expression, and purification of single proteins and protein-protein complexes.

    PubMed

    Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos

    2014-06-01

    Expression in Escherichia coli represents the simplest and most cost-effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry, where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel and subsequently screen in small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top-ranking targets for large-scale purification. In establishing our pipeline, emphasis was put on streamlining the processes so that they can be easily, but need not be, automated. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs, followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows for easy adoption, either partially or entirely, by any average hypothesis-driven laboratory in a manual or robot-assisted manner.
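
    A toy version of the scoring-and-ranking decision, with invented readouts and weights (the paper's actual scoring system is not reproduced in this abstract), might look like:

```python
# Hypothetical small-scale screen readouts per construct
constructs = [
    {"name": "frag1-His", "yield_mg": 0.8, "soluble_frac": 0.90, "monodisperse": True},
    {"name": "frag2-His", "yield_mg": 2.0, "soluble_frac": 0.30, "monodisperse": False},
    {"name": "frag1-GST", "yield_mg": 1.1, "soluble_frac": 0.70, "monodisperse": True},
    {"name": "frag3-His", "yield_mg": 0.2, "soluble_frac": 0.95, "monodisperse": True},
]

def score(c):
    """Weighted soluble-expression score; weights are illustrative only."""
    return (2.0 * c["soluble_frac"] + 1.0 * c["yield_mg"]
            + (1.0 if c["monodisperse"] else 0.0))

# Advance the top-ranking constructs to large-scale purification
top = sorted(constructs, key=score, reverse=True)[:2]
print([c["name"] for c in top])
```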

  18. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
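
    The PERL scripts themselves are not public in this summary; the sketch below only illustrates the "single button" pattern of sequencing external maneuver tools and halting on the first failure, with entirely hypothetical program names and arguments.

```python
import subprocess
import sys

# Hypothetical stand-ins for the maneuver-related application programs MAS sequences
PIPELINE = [
    ["maneuver_design", "--tracking", "od_solution.dat"],
    ["build_sequence", "--design", "mvr_design.out"],
    ["predict_performance", "--sequence", "mvr_sequence.out"],
]

def run_pipeline(steps):
    """Run each stage in order, stopping on the first failure."""
    for cmd in steps:
        print("running:", " ".join(cmd))
        try:
            result = subprocess.run(cmd)
        except FileNotFoundError:
            sys.exit(f"program not found: {cmd[0]}")
        if result.returncode != 0:
            sys.exit(f"stage failed: {cmd[0]}")

run_pipeline(PIPELINE)
```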

  19. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive, and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. To use AA technology for the augmentation of system management functions requires a development model which consists of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques, and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, then verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.

  20. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer now considered to be an environmental hazard. Manual BP is also subject to measurement error on the part of the physician/nurse and patient-related anxiety which can result in poor quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension. PMID:22265230
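
    The arithmetic behind an AOBP reading is just averaging and a threshold test; the sketch below assumes the common device convention of discarding the first reading, which is an assumption here rather than something stated in the article.

```python
def aobp_average(readings, discard_first=True):
    """Average automated (systolic, diastolic) readings in mm Hg.
    Discarding the first reading mimics a common device convention (assumption)."""
    kept = readings[1:] if discard_first and len(readings) > 1 else readings
    return (sum(s for s, _ in kept) / len(kept),
            sum(d for _, d in kept) / len(kept))

def is_elevated(avg, cutoff=(135.0, 85.0)):
    """AOBP at or above 135/85 mm Hg exceeds the normal cut point cited above."""
    return avg[0] >= cutoff[0] or avg[1] >= cutoff[1]

readings = [(148, 92), (138, 86), (134, 84), (132, 83)]   # hypothetical visit
avg = aobp_average(readings)
print(avg, is_elevated(avg))   # roughly 134.7/84.3 mm Hg, not elevated
```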

  1. Automated classification of antibody complementarity determining region 3 of the heavy chain (H3) loops into canonical forms and its application to protein structure prediction.

    PubMed

    Oliva, B; Bates, P A; Querol, E; Avilés, F X; Sternberg, M J

    1998-06-26

    A computer-based algorithm was used to cluster the loops forming the complementarity determining region (CDR) 3 of the heavy chain (H3) into canonical classes. Previous analyses of the three-dimensional structures of CDR loops (also known as the hypervariable regions) within antibody immunoglobulin variable domains have shown that for five of the six CDRs there are only a few main-chain conformations (known as canonical forms) that show clear relationships between sequence and structure. However, the larger variation in length and conformation of loops within H3 has limited the classification of these loops into canonical forms. The clustering procedure presented here is based on aligning the Ramachandran-coded main-chain conformation of the residues using a dynamic algorithm that allows the insertion of gaps to obtain an optimum alignment. A total of 41 H3 loops out of 62 non-identical loops, extracted from the Brookhaven Protein Data Bank, have been automatically grouped into 22 clusters. Inspection of the clusters for consensus sequences or intra-loop interactions or invariant conformation led to the proposal of 13 canonical forms representing 31 loops. These canonical forms include a consideration of the geometry of both the take-off region adjacent to the bracing beta-strands and the remaining loop apex. Subsequently a new set of 15 H3 loops not included in the initial analysis was considered. The clustering procedure was repeated and nine of these 15 loops could be assigned to original clusters, including seven to canonical forms. A sequence profile was generated for each canonical form from the original set of loops and matched against the sequences of the new H3 loops. For five out of the seven new H3 loops that were in a canonical form, the correct form was identified at first rank by this predictive scheme. PMID:9642095
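
    The alignment step can be illustrated with a textbook global-alignment (Needleman-Wunsch) scorer over Ramachandran-coded strings; the codes, score parameters, and example loops below are hypothetical, and the paper's actual scoring scheme is not reproduced here.

```python
def align_score(a, b, match=1.0, mismatch=-1.0, gap=-0.6):
    """Global alignment score of two Ramachandran-coded conformation strings
    (e.g. 'a' = alpha region, 'b' = beta region, 'l' = left-handed alpha)."""
    n, m = len(a), len(b)
    F = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap                  # leading gaps in b
    for j in range(1, m + 1):
        F[0][j] = j * gap                  # leading gaps in a
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            F[i][j] = max(diag, F[i - 1][j] + gap, F[i][j - 1] + gap)
    return F[n][m]

# Hypothetical H3 conformation strings of different lengths
print(align_score("bbaalb", "bbalb"))    # similar loops, one insertion -> 4.4
print(align_score("bbaalb", "llabba"))   # dissimilar loops -> lower score
```

    Clustering then follows by grouping loops whose pairwise alignment distance falls below a chosen threshold.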

  2. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would give other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs might be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in their early stages was cost. It took a significant investment in software and hardware for prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to a patient going for a radiograph or an ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard. This

  3. A Demonstration of Automated DNA Sequencing.

    ERIC Educational Resources Information Center

    Latourelle, Sandra; Seidel-Rogol, Bonnie

    1998-01-01

    Details a simulation that employs a paper-and-pencil model to demonstrate the principles behind automated DNA sequencing. Discusses the advantages of automated sequencing as well as the chemistry of automated DNA sequencing. (DDR)

  4. Robotics/Automated Systems Technicians.

    ERIC Educational Resources Information Center

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  5. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
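
    At realistic bank sizes this selection is done with a mixed-integer solver, but the objective and constraints can be shown with a brute-force toy: pick a fixed-length form maximizing summed item information subject to a content-balance constraint. The item bank below is invented.

```python
from itertools import combinations

# Hypothetical bank: (item id, content area, Fisher information at theta = 0)
bank = [("i1", "algebra", 0.52), ("i2", "algebra", 0.44), ("i3", "geometry", 0.61),
        ("i4", "geometry", 0.35), ("i5", "algebra", 0.58), ("i6", "geometry", 0.40)]

def assemble(form_len=4, min_per_area=2):
    """Enumerate candidate forms; a MIP solver searches this space implicitly."""
    best, best_info = None, -1.0
    for form in combinations(bank, form_len):
        areas = [area for _, area, _ in form]
        if min(areas.count("algebra"), areas.count("geometry")) < min_per_area:
            continue                        # content-balance constraint
        info = sum(i for _, _, i in form)   # objective: measurement information
        if info > best_info:
            best, best_info = form, info
    return [item_id for item_id, _, _ in best], round(best_info, 2)

print(assemble())   # -> (['i1', 'i3', 'i5', 'i6'], 2.11)
```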

  6. Opening up Library Automation Software

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  7. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry; Riedesel, Joel; Myers, Chris; Miller, William; Jones, Ellen F.; Freeman, Kenneth; Walsh, Richard; Walls, Bryan K.; Weeks, David J.; Bechtel, Robert T.

    1992-01-01

    Autonomous power-distribution system includes power-control equipment and automation equipment. System automatically schedules connection of power to loads and reconfigures itself when it detects fault. Potential terrestrial applications include optimization of consumption of power in homes, power supplies for autonomous land vehicles and vessels, and power supplies for automated industrial processes.

  8. Automating a clinical management system.

    PubMed

    Gordon, B; Braun, D

    1990-06-01

    Automating the clinical documentation of a home health care agency will prove crucial as the industry continues to grow and becomes increasingly complex. Kimberly Quality Care, a large, multi-office home care company, made a major commitment to the automation of its clinical management documents.

  9. Translation: Aids, Robots, and Automation.

    ERIC Educational Resources Information Center

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  10. Progress Toward Automated Cost Estimation

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1992-01-01

    Report discusses efforts to develop standard system of automated cost estimation (ACE) and computer-aided design (CAD). Advantage of system is time saved and accuracy enhanced by automating extraction of quantities from design drawings, consultation of price lists, and application of cost and markup formulas.

  11. Automated Circulation. SPEC Kit 43.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    Of the 64 libraries responding to a 1978 Association of Research Libraries (ARL) survey, 37 indicated that they used automated circulation systems; half of these were commercial systems, and most were batch-process or combination batch process and online. Nearly all libraries without automated systems cited lack of funding as the reason for not…

  12. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

    Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectroscopy (ESI/MS), capillary electrophoresis (CE), and biological assays where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and a volume of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by sequentially flowing 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a

  13. Automated spectral classification and the GAIA project

    NASA Technical Reports Server (NTRS)

    Lasala, Jerry; Kurtz, Michael J.

    1995-01-01

    Two-dimensional spectral types for each of the stars observed in the Global Astrometric Interferometer for Astrophysics (GAIA) mission would provide additional information for galactic structure and stellar evolution studies, as well as help in the identification of unusual objects and populations. The classification of the large quantity of spectra generated requires that automated techniques be implemented. Approaches to automatic classification are reviewed, and a metric-distance method is discussed. In tests, the metric-distance method produced spectral types with mean errors comparable to those of human classifiers working at similar resolution. Data and equipment requirements for an automated classification survey are discussed. A program of auxiliary observations is proposed to yield spectral types and radial velocities for the GAIA-observed stars.
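
    A metric-distance classifier reduces to finding the template spectrum nearest the observation under some metric; the sketch below uses Euclidean distance on peak-normalized flux vectors, with an invented three-template library.

```python
import numpy as np

# Hypothetical template library: spectral type -> normalized flux samples
templates = {
    "A0 V": np.array([0.9, 1.0, 0.8, 0.4, 0.2]),
    "G2 V": np.array([0.5, 0.7, 1.0, 0.9, 0.6]),
    "M5 V": np.array([0.1, 0.3, 0.6, 1.0, 0.9]),
}

def classify(spectrum):
    """Nearest template in Euclidean distance, after normalizing to unit peak
    so that only the spectral shape matters."""
    s = np.asarray(spectrum, float)
    s = s / s.max()
    dists = {t: float(np.linalg.norm(s - f)) for t, f in templates.items()}
    return min(dists, key=dists.get)

print(classify([0.48, 0.72, 1.05, 0.88, 0.55]))   # -> 'G2 V'
```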

  14. Automation impact study of Army Training Management

    SciTech Connect

    Sanquist, T.F.; Schuller, C.R.; McCallum, M.C.; Underwood, J.A.; Bettin, P.J.; King, J.L.; Melber, B.D.; Hostick, C.J.; Seaver, D.A.

    1988-01-01

    The main objectives of this impact study were to identify the potential cost savings associated with automated Army Training Management (TM) and to perform a cost-benefit analysis for an Army-wide automated TM system. A subsidiary goal was to establish baseline data for an independent evaluation of a prototype Integrated Training Management System (ITMS), to be tested in the fall of 1988. A structured analysis of TM doctrine was performed for comparison with empirical data gathered in a job analysis survey of selected units of the 9ID (MTZ) at Ft. Lewis, Washington. These observations will be extended to other units in subsequent surveys. The survey data concerning staffing levels and the amount of labor expended on eight distinct TM tasks were analyzed in a cost-effectiveness model. The main results of the surveys and cost-effectiveness modeling are summarized. 18 figs., 47 tabs.

  15. Flexible automation of cell culture and tissue engineering tasks.

    PubMed

    Knoll, Alois; Scherer, Torsten; Poggendorf, Iris; Lütkemeyer, Dirk; Lehmann, Jürgen

    2004-01-01

    Until now, the predominant use cases of industrial robots have been routine handling tasks in the automotive industry. In biotechnology and tissue engineering, in contrast, only very few tasks have been automated with robots. New developments in robot platform and robot sensor technology, however, make it possible to automate plants that largely depend on human interaction with the production process, e.g., for material and cell culture fluid handling, transportation, operation of equipment, and maintenance. In this paper we present a robot system that lends itself to automating routine tasks in biotechnology but also has the potential to automate other production facilities that are similar in process structure. After motivating the design goals, we describe the system and its operation, illustrate sample runs, and give an assessment of the advantages. We conclude this paper by giving an outlook on possible further developments. PMID:15575718

  17. Automated Analysis Workstation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Information from NASA Tech Briefs on work done at Langley Research Center and the Jet Propulsion Laboratory assisted DiaSys Corporation in manufacturing its first product, the R/S 2000. Since then, the R/S 2000 and R/S 2003 have followed. Recently, DiaSys released its fourth workstation, the FE-2, which automates the process of making and manipulating wet-mount preparations of fecal concentrates. The time needed to read the sample is decreased, permitting technologists to rapidly spot parasites, ova, and cysts sometimes carried in the lower intestinal tract of humans and animals. Use of the FE-2 is non-invasive, can be performed on an out-patient basis, and quickly provides confirmatory results.

  18. Robust automated knowledge capture.

    SciTech Connect

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task- and experience-related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications and, in particular, the individual characteristics that underlie adaptive thinking.

  19. Automated Defect Classification (ADC)

    SciTech Connect

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.
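
    The pipeline the record describes (image, then region features, then a supervised classifier) can be illustrated with a toy sketch. The feature set, training data, and random-forest model below are generic stand-ins, not the ADC system's own algorithms.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def region_features(image, mask):
            """Extract simple statistics from the anomalous region of a defect image."""
            pixels = image[mask]
            return [pixels.mean(), pixels.std(), int(mask.sum()), pixels.max() - pixels.min()]

        # Hypothetical labeled training set: feature vectors and analyst-assigned classes.
        X_train = [[0.8, 0.1, 120, 0.5], [0.2, 0.3, 40, 0.9], [0.5, 0.2, 300, 0.4]]
        y_train = ["particle", "scratch", "stain"]
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

        image = np.random.rand(64, 64)   # stand-in defect image
        mask = image > 0.95              # stand-in anomaly segmentation
        print(clf.predict([region_features(image, mask)]))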

  20. Health care automation companies.

    PubMed

    1995-12-01

    Health care automation companies: card transaction processing/EFT/EDI-capable banks; claims auditing/analysis; claims processors/clearinghouses; coding products/services; computer hardware; computer networking/LAN/WAN; consultants; data processing/outsourcing; digital dictation/transcription; document imaging/optical disk storage; executive information systems; health information networks; hospital/health care information systems; interface engines; laboratory information systems; managed care information systems; patient identification/credit cards; pharmacy information systems; POS terminals; radiology information systems; software--claims related/computer-based patient records/home health care/materials management/supply ordering/physician practice management/translation/utilization review/outcomes; telecommunications products/services; telemedicine/teleradiology; value-added networks. PMID:10153839

  1. Expedition automated flow fluorometer

    NASA Astrophysics Data System (ADS)

    Krikun, V. A.; Salyuk, P. A.

    2015-11-01

    This paper describes the design and operation of an automated flow-through dual-channel fluorometer for studying the fluorescence of dissolved organic matter, and the fluorescence of phytoplankton cells with open and closed reaction centers, in sea areas of both oligotrophic and eutrophic water types. The current device implements step-by-step excitation by two semiconductor lasers or two light-emitting diodes. The excitation wavelengths are 405 nm and 532 nm in the default configuration. The excitation radiation of each light source can be varied in duration, intensity, and repetition rate. Registration of the fluorescence signal is carried out by two photomultipliers with different optical filters, with band-pass ranges of 580-600 nm and 680-700 nm. The configuration of excitation sources and the spectral ranges of the registered radiation can be changed to suit the task at hand.

  2. Automated external defibrillators (AEDs).

    PubMed

    2003-06-01

    Automated external defibrillators, or AEDs, will automatically analyze a patient's ECG and, if needed, deliver a defibrillating shock to the heart. We sometimes refer to these devices as AED-only devices or stand-alone AEDs. The basic function of AEDs is similar to that of defibrillator/monitors, but AEDs lack their advanced capabilities and generally don't allow manual defibrillation. A device that functions strictly as an AED is intended to be used by basic users only. Such devices are often referred to as public access defibrillators. In this Evaluation, we present our findings for a newly evaluated model, the Zoll AED Plus. We also summarize our findings for the previously evaluated model that is still on the market and describe other AEDs that are also available but that we haven't evaluated. We rate the models collectively for first-responder use and public access defibrillation (PAD) applications.

  3. Health care automation companies.

    PubMed

    1995-12-01

    Health care automation companies: card transaction processing/EFT/EDI-capable banks; claims auditing/analysis; claims processors/clearinghouses; coding products/services; computer hardware; computer networking/LAN/WAN; consultants; data processing/outsourcing; digital dictation/transcription; document imaging/optical disk storage; executive information systems; health information networks; hospital/health care information systems; interface engines; laboratory information systems; managed care information systems; patient identification/credit cards; pharmacy information systems; POS terminals; radiology information systems; software--claims related/computer-based patient records/home health care/materials management/supply ordering/physician practice management/translation/utilization review/outcomes; telecommunications products/services; telemedicine/teleradiology; value-added networks.

  4. [From automation to robotics].

    PubMed

    1985-01-01

    The introduction of automation into the biology laboratory seems unavoidable. But at what cost, if a new machine must be purchased for every new application? Fortunately, the same image processing techniques, belonging to a theoretical framework called Mathematical Morphology, may be used in visual inspection tasks both in the car industry and in the biology lab. Since the market for industrial robotics applications is much larger than the market for biomedical applications, the price of image processing devices is dropping, and sometimes falls below the price of a complete microscope setup. The power of the image processing methods of Mathematical Morphology will be illustrated by various examples, such as automatic silver grain counting in autoradiography, determination of HLA genotype, electrophoretic gel analysis, and automatic screening of cervical smears... Thus several heterogeneous applications may share the same image processing device, provided there is a separate, dedicated workstation for each of them.
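
    Grain counting of the kind mentioned reduces to a handful of morphological operations. A minimal sketch with scipy follows; the threshold and structuring-element size are arbitrary choices, not values from the paper.

        import numpy as np
        from scipy import ndimage

        def count_grains(image, threshold=0.5):
            """Count bright grains: threshold, clean up with an opening, label blobs."""
            binary = image > threshold                                   # segment candidates
            opened = ndimage.binary_opening(binary, structure=np.ones((3, 3)))
            labels, n_grains = ndimage.label(opened)                     # connected components
            return n_grains

        rng = np.random.default_rng(0)
        print(count_grains(rng.random((128, 128))))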

  5. Berkeley automated supernova search

    SciTech Connect

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.
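
    The real-time comparison amounts to subtracting the stored reference image from each new galaxy image and flagging significant positive residuals. A schematic version follows; the registration and photometric scaling a real search needs are omitted, and the images are synthetic.

        import numpy as np

        def detect_candidates(new_image, reference, n_sigma=5.0):
            """Flag pixels that brightened well beyond the noise of the difference image."""
            diff = new_image - reference
            noise = np.std(diff)                    # crude global noise estimate
            return np.argwhere(diff > n_sigma * noise)

        rng = np.random.default_rng(1)
        reference = rng.normal(100.0, 3.0, size=(64, 64))
        new_image = reference + rng.normal(0.0, 3.0, size=(64, 64))
        new_image[32, 32] += 60.0                   # inject a synthetic supernova
        print(detect_candidates(new_image, reference))  # expected hit near [32 32]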

  6. Automating Frame Analysis

    SciTech Connect

    Sanfilippo, Antonio P.; Franklin, Lyndsey; Tratz, Stephen C.; Danielson, Gary R.; Mileson, Nicholas D.; Riensche, Roderick M.; McGrath, Liam

    2008-04-01

    Frame Analysis has come to play an increasingly strong role in the study of social movements in Sociology and Political Science. While significant steps have been made in providing a theory of frames and framing, a systematic characterization of the frame concept is still largely lacking, and there are no recognized criteria and methods that can be used to identify and marshal frame evidence reliably and in a time- and cost-effective manner. Consequently, current Frame Analysis work is still too reliant on manual annotation and subjective interpretation. The goal of this paper is to present an approach to the representation, acquisition and analysis of frame evidence which leverages Content Analysis, Information Extraction and Semantic Search methods to provide a systematic treatment of Frame Analysis and automate frame annotation.

  7. Automated calorimeter testing system

    SciTech Connect

    Rodenburg, W.W.; James, S.J.

    1990-01-01

    The Automated Calorimeter Testing System (ACTS) is a portable measurement device that provides an independent measurement of all critical parameters of a calorimeter system. The ACTS was developed to improve productivity and performance of Mound-produced calorimeters. With ACTS, an individual with minimal understanding of calorimetry operation can perform a consistent set of diagnostic measurements on the system. The operator can identify components whose performance has deteriorated by a simple visual comparison of the current data plots with previous measurements made when the system was performing properly. Thus, downtime and "out of control" situations can be reduced. Should a system malfunction occur, a flowchart of troubleshooting procedures has been developed to facilitate quick identification of the malfunctioning component. If diagnosis is beyond the capability of the operator, the ACTS provides a consistent set of test data for review by a knowledgeable expert. The first field test was conducted at the Westinghouse Savannah River Site in early 1990. 6 figs.

  8. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as temporary storage. Data from the terminals is applied to the data buffers on a digit-by-digit basis for transfer via a multiplexer to the computer.
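
    The digit-by-digit conversion the terminals perform is essentially binary-coded decimal. A toy rendering of that encoding (not the patented circuit itself):

        def to_bcd(number: int) -> list[str]:
            """Encode each decimal digit as a 4-bit binary word, one word per keypress."""
            return [format(int(digit), "04b") for digit in str(number)]

        print(to_bcd(4095))  # ['0100', '0000', '1001', '0101']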

  9. Automated Defect Classification (ADC)

    SciTech Connect

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.

  10. Automating the analytical laboratory via the Chemical Analysis Automation paradigm

    SciTech Connect

    Hollen, R.; Rzeszutko, C.

    1997-10-01

    To address the need for standardization within the analytical chemistry laboratories of the nation, the Chemical Analysis Automation (CAA) program within the US Department of Energy, Office of Science and Technology's Robotic Technology Development Program is developing laboratory sample analysis systems that will automate the environmental chemical laboratories. The current laboratory automation paradigm consists of islands-of-automation that do not integrate into a system architecture. Thus, today the chemist must perform most aspects of environmental analysis manually using instrumentation that generally cannot communicate with other devices in the laboratory. CAA is working towards a standardized and modular approach to laboratory automation based upon the Standard Analysis Method (SAM) architecture. Each SAM system automates a complete chemical method. The building block of a SAM is known as the Standard Laboratory Module (SLM). The SLM, either hardware or software, automates a subprotocol of an analysis method and can operate as a standalone or as a unit within a SAM. The CAA concept allows the chemist to easily assemble an automated analysis system, from sample extraction through data interpretation, using standardized SLMs without the worry of hardware or software incompatibility or the necessity of generating complicated control programs. A Task Sequence Controller (TSC) software program schedules and monitors the individual tasks to be performed by each SLM configured within a SAM. The chemist interfaces with the operation of the TSC through the Human Computer Interface (HCI), a logical, icon-driven graphical user interface. The CAA paradigm has successfully been applied in automating EPA SW-846 Methods 3541/3620/8081 for the analysis of PCBs in a soil matrix utilizing commercially available equipment in tandem with SLMs constructed by CAA.
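
    The SAM/SLM decomposition can be pictured as a pipeline of interchangeable modules driven by a sequence controller. The module names and interfaces below are illustrative inventions, not the CAA software's actual API.

        from typing import Callable

        # A Standard Laboratory Module is modeled here as a named processing step.
        SLM = tuple[str, Callable[[dict], dict]]

        def extract(sample: dict) -> dict:
            sample["extracted"] = True
            return sample

        def analyze(sample: dict) -> dict:
            sample["result_ppb"] = 42.0   # stand-in measurement
            return sample

        class TaskSequenceController:
            """Schedules and monitors the SLMs configured within a SAM."""
            def __init__(self, slms: list[SLM]):
                self.slms = slms

            def run(self, sample: dict) -> dict:
                for name, step in self.slms:
                    sample = step(sample)
                    print(f"SLM '{name}' complete")
                return sample

        sam = TaskSequenceController([("extraction", extract), ("analysis", analyze)])
        print(sam.run({"id": "soil-001"}))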

  11. Automated imatinib immunoassay

    PubMed Central

    Beumer, Jan H.; Kozo, Daniel; Harney, Rebecca L.; Baldasano, Caitlin N.; Jarrah, Justin; Christner, Susan M.; Parise, Robert; Baburina, Irina; Courtney, Jodi B.; Salamone, Salvatore J.

    2014-01-01

    Background Imatinib pharmacokinetic variability and the relationship of trough concentrations with clinical outcomes have been extensively reported. Though physical methods to quantitate imatinib exist, they are not widely available for routine use. An automated homogenous immunoassay for imatinib has been developed, facilitating routine imatinib testing. Methods Imatinib-selective monoclonal antibodies, without substantial cross-reactivity to the N-desmethyl metabolite or N-desmethyl conjugates, were produced. The antibodies were conjugated to 200 nm particles to develop immunoassay reagents on the Beckman Coulter AU480™ analyzer. These reagents were analytically validated using Clinical Laboratory Standards Institute protocols. Method comparison to LC-MS/MS was conducted using 77 plasma samples collected from subjects receiving imatinib. Results The assay requires 4 µL of sample without pre-treatment. The non-linear calibration curve ranges from 0 to 3,000 ng/mL. With automated sample dilution, concentrations of up to 9,000 ng/mL can be quantitated. The AU480 produces the first result in 10 minutes, and up to 400 tests per hour. Repeatability ranged from 2.0 to 6.0% coefficient of variation (CV), and within-laboratory reproducibility ranged from 2.9 to 7.4% CV. Standard curve stability was two weeks and on-board reagent stability was 6 weeks. For clinical samples with imatinib concentrations from 438–2,691 ng/mL, method comparison with LC-MS/MS gave a slope of 0.995 with a y-intercept of 24.3 and a correlation coefficient of 0.978. Conclusion The immunoassay is suitable for quantitating imatinib in human plasma, demonstrating good correlation with a physical method. Testing for optimal imatinib exposure can now be performed on routine clinical analyzers. PMID:25551407
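
    The reported method comparison (slope, intercept, and correlation against LC-MS/MS) reduces to a regression over paired measurements. A sketch with invented paired concentrations:

        import numpy as np

        # Hypothetical paired concentrations (ng/mL): reference method vs. immunoassay.
        lcmsms = np.array([450.0, 800.0, 1200.0, 1800.0, 2650.0])
        immunoassay = np.array([470.0, 820.0, 1215.0, 1790.0, 2660.0])

        slope, intercept = np.polyfit(lcmsms, immunoassay, 1)   # ordinary least squares
        r = np.corrcoef(lcmsms, immunoassay)[0, 1]              # correlation coefficient
        print(f"slope={slope:.3f}  intercept={intercept:.1f}  r={r:.3f}")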

  12. Automated Cognome Construction and Semi-automated Hypothesis Generation

    PubMed Central

    Voytek, Jessica B.; Voytek, Bradley

    2012-01-01

    Modern neuroscientific research stands on the shoulders of countless giants. PubMed alone contains more than 21 million peer-reviewed articles with 40–50,000 more published every month. Understanding the human brain, cognition, and disease will require integrating facts from dozens of scientific fields spread amongst millions of studies locked away in static documents, making any such integration daunting, at best. The future of scientific progress will be aided by bridging the gap between the millions of published research articles and modern databases such as the Allen Brain Atlas (ABA). To that end, we have analyzed the text of over 3.5 million scientific abstracts to find associations between neuroscientific concepts. From the literature alone, we show that we can blindly and algorithmically extract a “cognome”: relationships between brain structure, function, and disease. We demonstrate the potential of data-mining and cross-platform data-integration with the ABA by introducing two methods for semiautomated hypothesis generation. By analyzing statistical “holes” and discrepancies in the literature we can find understudied or overlooked research paths. That is, we have added a layer of semi-automation to a part of the scientific process itself. This is an important step toward fundamentally incorporating data-mining algorithms into the scientific method in a manner that is generalizable to any scientific or medical field. PMID:22584238
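
    The core of such literature mining is counting how often pairs of concepts co-occur in abstracts. A minimal sketch follows; the term list and abstracts are fabricated examples, not the authors' corpus.

        from collections import Counter
        from itertools import combinations

        terms = {"hippocampus", "memory", "dopamine", "parkinson"}
        abstracts = [
            "the hippocampus supports episodic memory consolidation",
            "dopamine loss in parkinson disease impairs movement",
            "memory deficits and hippocampus atrophy were observed",
        ]

        pair_counts = Counter()
        for text in abstracts:
            present = sorted(t for t in terms if t in text.lower())
            for pair in combinations(present, 2):
                pair_counts[pair] += 1   # one co-occurrence per abstract

        print(pair_counts.most_common(1))  # [(('hippocampus', 'memory'), 2)]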

  13. The historical development and basis of human factors guidelines for automated systems in aeronautical operations

    NASA Technical Reports Server (NTRS)

    Ciciora, J. A.; Leonard, S. D.; Johnson, N.; Amell, J.

    1984-01-01

    In order to derive general design guidelines for automated systems, a study was conducted on the utilization and acceptance of existing automated systems as currently employed in several commercial fields. Four principal study areas were investigated by means of structured interviews, and in some cases questionnaires. The study areas were aviation, both scheduled airline and general commercial aviation; process control and factory applications; office automation; and automation in the power industry. The results of over eighty structured interviews were analyzed and responses categorized as various human factors issues for use by both designers and users of automated equipment. These guidelines address such items as general physical features of automated equipment; personnel orientation, acceptance, and training; and both personnel and system reliability.

  14. Automated Fluid Interface System (AFIS)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Automated remote fluid servicing will be necessary for future space missions, as future satellites will be designed for on-orbit consumable replenishment. In order to develop an on-orbit remote servicing capability, a standard interface between a tanker and the receiving satellite is needed. The objective of the Automated Fluid Interface System (AFIS) program is to design, fabricate, and functionally demonstrate compliance with all design requirements for an automated fluid interface system. A description and documentation of the Fairchild AFIS design is provided.

  15. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The Automated Engineering Design (AED) system is reviewed, consisting of a high-level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem- and user-oriented languages. Software production phases are diagrammed, and factors which inhibit effective documentation are evaluated.

  16. Automating Ontological Annotation with WordNet

    SciTech Connect

    Sanfilippo, Antonio P.; Tratz, Stephen C.; Gregory, Michelle L.; Chappell, Alan R.; Whitney, Paul D.; Posse, Christian; Paulson, Patrick R.; Baddeley, Bob L.; Hohimer, Ryan E.; White, Amanda M.

    2006-01-22

    Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet has not been designed as an ontology, and while it can be easily turned into one, the result of doing this would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of each concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.
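
    The class-recognition step can be approximated with NLTK's WordNet interface by walking a word's hypernym paths until a target upper-level class is reached. The three-class inventory below is a made-up miniature, not the platform's actual concept set.

        from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

        # Miniature set of target concept classes, keyed by WordNet synset name.
        CLASSES = {"animal.n.01": "ANIMAL", "artifact.n.01": "ARTIFACT", "person.n.01": "PERSON"}

        def concept_class(word: str) -> str:
            """Return the first target class found on any hypernym path of the word."""
            for synset in wn.synsets(word, pos=wn.NOUN):
                for path in synset.hypernym_paths():
                    for ancestor in path:
                        if ancestor.name() in CLASSES:
                            return CLASSES[ancestor.name()]
            return "UNKNOWN"

        for word in ("dog", "hammer", "teacher"):
            print(word, "->", concept_class(word))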

  17. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  18. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
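
    The identification trees at the heart of the approach can be sketched as simple recursive structures that match threat conditions against facts about a design; mitigation trees organize countermeasures the same way. The node fields and the example threat below are illustrative guesses, not AutSEC's actual schema.

        from dataclasses import dataclass, field

        @dataclass
        class IdentificationNode:
            """A condition over a design element; a node may name the threat identified."""
            condition: str
            threat: str = ""
            children: list["IdentificationNode"] = field(default_factory=list)

        def identify(node: IdentificationNode, facts: set[str]) -> list[str]:
            """Collect threats whose chain of conditions all hold for the design facts."""
            if node.condition not in facts:
                return []
            threats = [node.threat] if node.threat else []
            for child in node.children:
                threats += identify(child, facts)
            return threats

        tree = IdentificationNode(
            condition="data flow crosses trust boundary",
            children=[IdentificationNode("flow is unauthenticated", threat="tampering")],
        )
        print(identify(tree, {"data flow crosses trust boundary", "flow is unauthenticated"}))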

  19. Fuzzy Control/Space Station automation

    NASA Technical Reports Server (NTRS)

    Gersh, Mark

    1990-01-01

    Viewgraphs on fuzzy control/space station automation are presented. Topics covered include: Space Station Freedom (SSF); SSF evolution; factors pointing to automation & robotics (A&R); astronaut office inputs concerning A&R; flight system automation and ground operations applications; transition definition program; and advanced automation software tools.

  20. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  1. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  2. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  3. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  4. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  5. Automated shell theory for rotating structures (ASTROS)

    NASA Technical Reports Server (NTRS)

    Foster, B. J.; Thomas, J. M.

    1971-01-01

    A computer program for analyzing axisymmetric shells with inertial forces caused by rotation about the shell axis is developed by revising the STARS II shell program. The basic capabilities of the STARS II shell program, such as the treatment of the branched shells, stiffened wall construction, and thermal gradients, are retained.

  6. Human factors in cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.

    1984-01-01

    The rapid advance in microprocessor technology has made it possible to automate many functions that were previously performed manually. Several research areas have been identified which are basic to the question of the implementation of automation in the cockpit. One of the identified areas deserving further research is warning and alerting systems. Modern transport aircraft have had one warning and alerting system after another added, and computer-based cockpit systems make it possible to add even more. Three major areas of concern are: input methods (including voice, keyboard, touch panel, etc.), output methods and displays (from traditional instruments to CRTs, to exotic displays including the human voice), and training for automation. Training for operating highly automated systems requires considerably more attention than it has been given in the past. Training methods have not kept pace with the advent of flight-deck automation.

  7. Automating the Purple Crow Lidar

    NASA Astrophysics Data System (ADS)

    Hicks, Shannon; Sica, R. J.; Argall, P. S.

    2016-06-01

    The Purple Crow LiDAR (PCL) was built to measure short- and long-term coupling between the lower, middle, and upper atmosphere. The initial component of my M.Sc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to characterize the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  8. Real Automation in the Field

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Mayero, Micaela; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    We provide a package of strategies for the automation of non-linear arithmetic in PVS. In particular, we describe a simplification procedure for the field of real numbers and a strategy for the cancellation of common terms.

  9. Automated Supernova Discovery (Abstract)

    NASA Astrophysics Data System (ADS)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloudless night, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s, with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or brighter, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  10. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization. PMID:27034378
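
    Conceptually, the flow compiles a Boolean specification into a network of repressor-based gates, with NOR/NOT logic as the natural primitive. The toy below conveys only the flavor of checking a spec against a gate and picking a repressor from a scored library; the gate names and scores are invented, and the real tool consumes Verilog and models regulator response functions.

        # A two-input specification as a truth table: (a, b) -> output.
        spec = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0}   # NOR

        def nor(a: int, b: int) -> int:
            return int(not (a or b))

        # Hypothetical repressor gates, each scored by dynamic range.
        gate_library = {"PhlF": 8.5, "SrpR": 6.2, "BM3R1": 4.9}

        if all(nor(a, b) == out for (a, b), out in spec.items()):
            best = max(gate_library, key=gate_library.get)
            print(f"spec is a single NOR stage; assign repressor {best}")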

  12. Automated Gas Distribution System

    NASA Astrophysics Data System (ADS)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few and prized cyclotrons in the country. Behind the scenes of the cyclotron is a confusing and dangerous setup of the ion sources that supply the cyclotron with particles for acceleration. Using this machine requires a time-consuming and even wasteful step-by-step process of switching gases, purging, and other important operations that must be done manually to keep the system functioning properly, while also trying to maintain the safety of the working environment. Developing a new gas distribution system for the ion source prevents many of the problems generated by the older manual setup. The developed system can be controlled manually more easily than before but, like most of the technology and machines in the cyclotron now, is mainly operated by software developed in the graphical programming environment LabVIEW. The automated gas distribution system provides multiple ports for a selection of different gases, to decrease the amount of gas wasted in switching gases, and a port for the vacuum, to decrease the amount of time spent purging the manifold. The LabVIEW software makes the operation of the cyclotron and ion sources easier and safer for anyone to use.

  13. Automated call tracking systems

    SciTech Connect

    Hardesty, C.

    1993-03-01

    User Services groups are on the front line with user support. We are the first to hear about problems. The speed, accuracy, and intelligence with which we respond determines the user's perception of our effectiveness and our commitment to quality and service. To keep pace with the complex changes at our sites, we must have tools to help build a knowledge base of solutions, a history base of our users, and a record of every problem encountered. Recently, I completed a survey of twenty sites similar to the National Energy Research Supercomputer Center (NERSC). This informal survey reveals that 27% of the sites use a paper system to log calls, 60% employ homegrown automated call tracking systems, and 13% use a vendor-supplied system. Fifty-four percent of those using homegrown systems are exploring the merits of switching to a vendor-supplied system. The purpose of this paper is to provide guidelines for evaluating a call tracking system. In addition, insights are provided to assist User Services groups in selecting a system that fits their needs.

  14. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The Automated Microbial Metabolism Laboratory (AMML) 1971-1972 program involved the investigation of three separate life detection schemes. The first was the continued further development of the labeled release experiment. The possibility of chamber reuse without in-between sterilization, to provide comparative biochemical information, was tested. Findings show that individual substrates or concentrations of antimetabolites may be sequentially added to a single test chamber. The second detection system investigated for possible inclusion in the AMML package of assays was nitrogen fixation as detected by acetylene reduction. Thirdly, a series of preliminary steps were taken to investigate the feasibility of detecting biopolymers in soil. A strategy for the safe return to Earth of a Mars sample prior to manned landings on Mars is outlined. The program assumes that the probability of indigenous life on Mars is unity and then broadly presents the procedures for acquisition and analysis of the Mars sample in a manner that satisfies the scientific community and the public that adequate safeguards are being taken.

  15. Technology modernization assessment flexible automation

    SciTech Connect

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits of and barriers to automation and concludes that, while significant benefits do exist for automation, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  16. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  17. Evolution paths for advanced automation

    NASA Technical Reports Server (NTRS)

    Healey, Kathleen J.

    1990-01-01

    As Space Station Freedom (SSF) evolves, increased automation and autonomy will be required to meet Space Station Freedom Program (SSFP) objectives. As a precursor to the use of advanced automation within the SSFP, especially if it is to be used on SSF (e.g., to automate the operation of the flight systems), the underlying technologies will need to be elevated to a high level of readiness to ensure safe and effective operations. Ground facilities supporting the development of these flight systems -- from research and development laboratories through formal hardware and software development environments -- will be responsible for achieving these levels of technology readiness. These facilities will need to evolve to support the general evolution of the SSFP. This evolution will include support for increasing use of advanced automation. The SSF Advanced Development Program has funded a study to define evolution paths for advanced automation within the SSFP's ground-based facilities which will enable, promote, and accelerate the appropriate use of advanced automation on board SSF. The current capability of the test beds and facilities, such as the Software Support Environment, with regard to advanced automation has been assessed, and their desired evolutionary capabilities have been defined. Plans and guidelines for achieving this necessary capability have been constructed. The approach taken has combined in-depth interviews of test-bed personnel at all SSF Work Package centers with awareness of relevant state-of-the-art technology and technology insertion methodologies. Key recommendations from the study include advocating a NASA-wide task force for advanced automation and the creation of software prototype transition environments to facilitate the incorporation of advanced automation in the SSFP.

  18. Automated compound classification using a chemical ontology

    PubMed Central

    2012-01-01

    Background Classification of chemical compounds into compound classes by using structure-derived descriptors is a well-established method to aid the evaluation and abstraction of compound properties in chemical compound databases. MeSH and, more recently, ChEBI are examples of chemical ontologies that provide a hierarchical classification of compounds into general compound classes of biological interest based on their structural as well as property or use features. In these ontologies, compounds have been assigned manually to their respective classes. However, with the ever-increasing possibilities to extract new compounds from text documents using name-to-structure tools, and considering the large number of compounds deposited in databases, automated and comprehensive chemical classification methods are needed to avoid the error-prone and time-consuming manual classification of compounds. Results In the present work we implement principles and methods to construct a chemical ontology of classes that shall support the automated, high-quality compound classification in chemical databases or text documents. While SMARTS expressions have already been used to define chemical structure class concepts, in the present work we have extended the expressive power of such class definitions by expanding their structure-based reasoning logic. Thus, to achieve the required precision and granularity of chemical class definitions, sets of SMARTS class definitions are connected by OR and NOT logical operators. In addition, AND logic has been implemented to allow the concomitant use of flexible atom lists and stereochemistry definitions. The resulting chemical ontology is a multi-hierarchical taxonomy of concept nodes connected by directed, transitive relationships. Conclusions A proposal for a rule-based definition of chemical classes has been made that allows chemical compound classes to be defined more precisely than before. The proposed structure-based reasoning logic allows to translate
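
    The OR/NOT/AND combination of SMARTS definitions can be reproduced with RDKit, assuming it is available. The class rule below is a simplified invented example, not one of the ontology's actual definitions.

        from rdkit import Chem

        # Toy class rule: (primary amine OR secondary amine) AND NOT amide.
        AMINE_PATTERNS = [Chem.MolFromSmarts(s) for s in ("[NX3;H2]", "[NX3;H1]")]  # OR branch
        AMIDE = Chem.MolFromSmarts("[NX3][CX3](=[OX1])")                            # NOT branch

        def is_simple_amine(smiles: str) -> bool:
            mol = Chem.MolFromSmiles(smiles)
            has_amine = any(mol.HasSubstructMatch(p) for p in AMINE_PATTERNS)
            return has_amine and not mol.HasSubstructMatch(AMIDE)

        print(is_simple_amine("CCN"))       # ethylamine -> True
        print(is_simple_amine("CC(=O)N"))   # acetamide  -> False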

  19. Advances in In Situ Inspection of Automated Fiber Placement Systems

    NASA Technical Reports Server (NTRS)

    Juarez, Peter D.; Cramer, K. Elliott; Seebo, Jeffrey P.

    2016-01-01

    Automated Fiber Placement (AFP) systems have been developed to help take advantage of the tailorability of composite structures in aerospace applications. AFP systems allow the repeatable placement of uncured, spool-fed, preimpregnated carbon fiber tape (tows) onto substrates in desired thicknesses and orientations. This automated process can incur defects, such as overlapping tow lines, which can severely undermine the structural integrity of the part. Current defect detection and abatement methods are very labor-intensive and still rely mostly on manual human inspection. Proposed is a thermographic in situ inspection technique which monitors tow placement with an on-board thermal camera, using the preheated substrate as a through-transmission heat source. An investigation of the concept is conducted, and preliminary laboratory results are presented. Also included is a brief overview of other emerging technologies that tackle the same issue. Keywords: Automated Fiber Placement, Manufacturing defects, Thermography

  20. Automated ship image acquisition

    NASA Astrophysics Data System (ADS)

    Hammond, T. R.

    2008-04-01

    The experimental Automated Ship Image Acquisition System (ASIA) collects high-resolution ship photographs at a shore-based laboratory, with minimal human intervention. The system uses Automatic Identification System (AIS) data to direct a high-resolution SLR digital camera to ship targets and to identify the ships in the resulting photographs. The photo database is then searchable using the rich data fields from AIS, which include the name, type, call sign and various vessel identification numbers. The high-resolution images from ASIA are intended to provide information that can corroborate AIS reports (e.g., extract identification from the name on the hull) or provide information that has been omitted from the AIS reports (e.g., missing or incorrect hull dimensions, cargo, etc). Once assembled into a searchable image database, the images can be used for a wide variety of marine safety and security applications. This paper documents the author's experience with the practicality of composing photographs based on AIS reports alone, describing a number of ways in which this can go wrong, from errors in the AIS reports, to fixed and mobile obstructions and multiple ships in the shot. The frequency with which various errors occurred in automatically composed photographs collected in Halifax harbour in wintertime was determined by manual examination of the images. 45% of the images examined were considered of a quality sufficient to read identification markings, numbers and text off the entire ship. One of the main technical challenges for ASIA lies in automatically differentiating good and bad photographs, so that few bad ones would be shown to human users. Initial attempts at automatic photo rating showed 75% agreement with manual assessments.

  1. The logic of automated glycan assembly.

    PubMed

    Seeberger, Peter H

    2015-05-19

    Carbohydrates are the most abundant biopolymers on earth and part of every living creature. Glycans are essential as materials for nutrition and for information transfer in biological processes. To date, a detailed correlation between glycan structure and glycan function has been established in only a few cases. A molecular understanding of glycan function will require pure glycans for biological, immunological, and structural studies. Given the immense structural complexity of glycans found in living organisms and the lack of amplification methods or expression systems, chemical synthesis is the only means to access usable quantities of pure glycan molecules. While the solid-phase synthesis of DNA and peptides has been routine for decades, access to glycans has been technically difficult, time-consuming, and confined to a few expert laboratories. In this Account, the development of a comprehensive approach to the automated synthesis of all classes of mammalian glycans, including glycosaminoglycans and glycosylphosphatidylinositol (GPI) anchors, as well as bacterial and plant carbohydrates, is described. A conceptual advance concerning the logic of glycan assembly was required to enable automated execution of the synthetic process. Based on the central glycosidic bond forming reaction, a general concept for the protecting groups and leaving groups has been developed. Building blocks were identified that can be procured on a large scale, are stable for prolonged periods of time, and upon activation give high yields and selectivities. A coupling-capping-deprotection cycle was invented that can be executed by an automated synthesis instrument. Straightforward postsynthetic protocols for cleavage from the solid support, as well as purification of conjugation-ready oligosaccharides, have been established. The introduction of methods to install a wide variety of glycosidic linkages selectively has enabled the rapid assembly of linear and branched oligo- and
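
    The coupling-capping-deprotection cycle maps naturally onto a loop over building blocks. A schematic rendering follows; the building-block names and printed steps are placeholders, not actual reagents or the instrument's command set.

        building_blocks = ["Glc-BB1", "Gal-BB2", "Man-BB3"]   # placeholder monosaccharides

        def synthesize(blocks: list[str]) -> list[str]:
            """Run one coupling-capping-deprotection cycle per building block."""
            chain = ["linker-on-resin"]
            for block in blocks:
                print(f"couple    {block} onto the growing chain")  # glycosylation step
                print("cap       any unreacted sites")              # block deletion sequences
                print("deprotect the temporary protecting group")   # expose the next site
                chain.append(block)
            print("cleave from the solid support and purify")
            return chain

        print(synthesize(building_blocks))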

  2. Automation: Decision Aid or Decision Maker?

    NASA Technical Reports Server (NTRS)

    Skitka, Linda J.

    1998-01-01

    This study clarified that automation bias is something unique to automated decision-making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission-error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the same level of reliability as the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.

  3. THE SKILL IMPACT OF AUTOMATION. REPRINT NO. 136.

    ERIC Educational Resources Information Center

    SULTAN, PAUL; PRASOW, P.

    THIS SAMPLING OF COLLECTED TESTIMONY WAS INTENDED TO ILLUSTRATE SOME OF THE DIMENSIONS OF MANPOWER PROBLEMS FACED EVEN IN EXPANDING LABOR MARKETS. A REVIEW OF SELECTED "STRUCTURAL" ASPECTS OF EMPLOYMENT ANALYSIS GAVE PARTICULAR ATTENTION TO THE IMPACT OF AUTOMATION ON EMPLOYMENT WHEN CONSIDERATION WAS GIVEN, NOT TO THE AMOUNT OF LABOR DEMANDED,…

  4. TSORT - an automated tool for allocating tasks to training strategies

    SciTech Connect

    Carter, R.J.; Jorgensen, C.C.

    1986-01-01

    An automated tool (TSORT) that can aid training system developers in determining which training strategy should be applied to a particular task and in grouping similar tasks into training categories has been developed. This paper describes the rationale for TSORT's development and addresses its structure, including training categories, task description dimensions, and categorization metrics. It also provides some information on TSORT's application.

  5. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  6. Automated Bilingual Circulation System Using PC Local Area Networks.

    ERIC Educational Resources Information Center

    Iskanderani, A. I.; Anwar, M. A.

    1992-01-01

    Describes a local automated bilingual circulation system using personal computers in a local area network that was developed at King Abdulaziz University (Saudi Arabia) for Arabic and English materials. Topics addressed include the system structure, hardware, major features, storage requirements, and costs. (nine references) (LRW)

  7. Design automation for integrated circuits

    NASA Astrophysics Data System (ADS)

    Newell, S. B.; de Geus, A. J.; Rohrer, R. A.

    1983-04-01

    Consideration is given to the development status of computer-based methods for automated integrated circuit design, which promise to minimize both design time and the incidence of design errors. Integrated circuit design encompasses two major tasks: functional specification, in which the goal is a logic diagram that accurately represents the desired electronic function, and physical specification, in which the goal is an exact description of the physical locations of all circuit elements and their interconnections on the chip. Design automation not only saves money by reducing design and fabrication time, but also helps the community of systems and logic designers to work more innovatively. Attention is given to established design automation methodologies, programmable logic arrays, and design shortcuts.

  8. Automated power management and control

    NASA Technical Reports Server (NTRS)

    Dolce, James L.

    1991-01-01

    A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. The initial station operation will use ground-based dispatchers to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.
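
    A hedged sketch of the split described above: expert-system rules handle diagnosis, while conventional algorithms handle scheduling. The single rule below is an invented illustration of a failure-detection heuristic, not the project's knowledge base.

    ```python
    # Hedged sketch: one invented failure-detection rule of the kind an
    # expert system might apply to power-system telemetry. Thresholds and
    # field names are illustrative assumptions, not Space Station Freedom values.
    def diagnose(telemetry):
        """Return a list of (component, diagnosis) pairs from one telemetry frame."""
        findings = []
        for name, ch in telemetry.items():
            # Rule: a closed switch that carries voltage but no current
            # suggests an open circuit downstream of the switchgear.
            if ch["switch_closed"] and ch["volts"] > 100.0 and ch["amps"] < 0.1:
                findings.append((name, "open circuit downstream"))
        return findings
    ```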

  9. Automated mapping of hammond's landforms

    USGS Publications Warehouse

    Gallant, A.L.; Brown, D.D.; Hoffer, R.M.

    2005-01-01

    We automated a method for mapping Hammond's landforms over large landscapes using digital elevation data. We compared our results against Hammond's published landform maps, derived using manual interpretation procedures. We found general agreement in landform patterns mapped by the manual and the automated approaches, and very close agreement in characterization of local topographic relief. The two approaches produced different interpretations of intermediate landforms, which relied upon quantification of the proportion of landscape having gently sloping terrain. This type of computation is more efficiently and consistently applied by computer than by a human. Today's ready access to digital data and computerized geospatial technology provides a good foundation for mapping terrain features, but the mapping criteria guiding manual techniques in the past may not be appropriate for automated approaches. We suggest that future efforts center on the advantages offered by digital advancements in refining an approach to better characterize complex landforms. © 2005 IEEE.
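
    The computation the authors single out as better suited to a machine - the proportion of gently sloping terrain, along with local relief, inside a moving analysis window - reduces to a few array operations over a digital elevation model. A minimal sketch, assuming an illustrative cell size, window size, and the commonly cited 8% slope cutoff (none of these values are taken from the paper):

    ```python
    import numpy as np
    from scipy import ndimage

    def hammond_inputs(dem, cellsize=30.0, window=25, gentle_slope_pct=8.0):
        """Return (percent gently sloping terrain, local relief) per cell."""
        dzdy, dzdx = np.gradient(dem, cellsize)        # elevation gradients
        slope_pct = np.hypot(dzdx, dzdy) * 100.0       # slope as a percentage
        gentle = (slope_pct < gentle_slope_pct).astype(float)
        pct_gentle = ndimage.uniform_filter(gentle, size=window) * 100.0
        relief = (ndimage.maximum_filter(dem, size=window)
                  - ndimage.minimum_filter(dem, size=window))
        return pct_gentle, relief
    ```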

  10. Automated brain segmentation using neural networks

    NASA Astrophysics Data System (ADS)

    Powell, Stephanie; Magnotta, Vincent; Johnson, Hans; Andreasen, Nancy

    2006-03-01

    Automated methods to delineate brain structures of interest are required to analyze large amounts of imaging data such as that being collected in several ongoing multi-center studies. We have previously reported on using artificial neural networks (ANN) to define subcortical brain structures such as the thalamus (0.825), caudate (0.745), and putamen (0.755). One of the inputs into the ANN is the a priori probability of a structure existing at a given location. In this previous work, the a priori probability information was generated in Talairach space using a piecewise linear registration. In this work we have increased the dimensionality of this registration using Thirion's demons registration algorithm. The input vector consisted of the a priori probability, spherical coordinates, and an iris of surrounding signal intensity values. The output of the neural network determined whether the voxel was defined as one of the N regions used for training. Training was performed using a standard back-propagation algorithm. The ANN was trained on a set of 15 images for 750,000,000 iterations. The resulting ANN weights were then applied to 6 test images not part of the training set. Relative overlap calculated for each structure was 0.875 for the thalamus, 0.845 for the caudate, and 0.814 for the putamen. With the modifications to the neural net algorithm and the use of multi-dimensional registration, we found substantial improvement in the automated segmentation method. The resulting segmented structures are as reliable as manual raters, and the output of the neural network can be used without additional rater intervention.
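
    The abstract reports per-structure "relative overlap" scores without defining the metric; a common choice for comparing an automated mask against a manual rater's mask is intersection over union, sketched here as an assumption rather than the paper's stated formula:

    ```python
    import numpy as np

    def relative_overlap(auto_mask, manual_mask):
        """Relative overlap of two binary segmentations (intersection / union)."""
        a = np.asarray(auto_mask, dtype=bool)
        m = np.asarray(manual_mask, dtype=bool)
        union = np.logical_or(a, m).sum()
        return np.logical_and(a, m).sum() / union if union else 1.0
    ```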

  11. Automated gaseous criteria pollutant audits

    SciTech Connect

    Watson, J.P.

    1998-12-31

    The Quality Assurance Section (QAS) of the California Air Resources Board (CARB) began performing automated gaseous audits of its ambient air monitoring sites in July 1996. The concept of automated audits evolved from the constant streamlining of the through-the-probe audit process. Continual audit van development and the desire to utilize advanced technology to save time and improve the accuracy of the overall audit process also contributed to the concept. The automated audit process is a computer program which controls an audit van's ambient gas calibration system, isolated relay and analog-to-digital cards, and a monitoring station's data logging system. The program instructs the audit van's gas calibration system to deliver specified audit concentrations to a monitoring station's instruments through their collection probe inlet. The monitoring station's responses to the audit concentrations are obtained by the program polling the station's datalogger through its RS-232 port. The program calculates relevant audit statistics and stores all data collected during an audit in a relational database. Planning for the development of an automated gaseous audit system began in earnest in 1993, when the CARB purchased computerized ambient air calibration systems which could be remotely controlled by computer through their serial ports. After receiving all the required components of the automated audit system, they were individually tested to confirm their correct operation. Subsequently, a prototype program was developed to perform through-the-probe automated ozone audits. Numerous simulated ozone audits documented the program's ability to control audit equipment and extract data from a monitoring station's data logging system. The program was later modified to incorporate the capability to perform audits for carbon monoxide, total hydrocarbons, methane, nitrogen dioxide, sulfur dioxide, and hydrogen sulfide.
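
    A hedged sketch of the audit loop this describes: command the calibration system to deliver a known concentration through the probe, wait for the instrument to stabilize, then poll the station's datalogger over RS-232. The port names, command strings, and timing are hypothetical; real calibrators and dataloggers each speak their own protocol.

    ```python
    import time
    import serial  # pyserial

    def run_audit(audit_points_ppb):
        cal = serial.Serial("/dev/ttyUSB0", 9600, timeout=2)  # calibrator
        dlg = serial.Serial("/dev/ttyUSB1", 9600, timeout=2)  # station datalogger
        results = []
        for target in audit_points_ppb:
            cal.write(f"SET O3 {target}\r".encode())          # hypothetical command
            time.sleep(15 * 60)                               # let reading stabilize
            dlg.write(b"POLL O3\r")                           # hypothetical poll
            response = float(dlg.readline().decode().strip())
            results.append((target, response, response - target))
        return results
    ```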

  12. BOA: Framework for automated builds

    SciTech Connect

    N. Ratnikova et al.

    2003-09-30

    Managing large-scale software products is a complex software engineering task. Automation of the software development, release, and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms, and a distributed environment are typical. This paper describes the Build and Output Analyzer framework and its components, which have been developed in CMS to facilitate software maintenance and improve software quality. The system can generate, control, and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases, and installation of existing versions.

  13. Advanced automation for space missions

    SciTech Connect

    Freitas, R.A., Jr.; Healy, T.J.; Long, J.E.

    1982-01-01

    A NASA/ASEE summer study conducted at the University of Santa Clara in 1980 examined the feasibility of using advanced artificial intelligence and automation technologies in future NASA space missions. Four candidate application missions were considered: an intelligent earth-sensing information system; an autonomous space exploration system; an automated space manufacturing facility; and a self-replicating, growing lunar factory. The study assessed the various artificial intelligence and machine technologies which must be developed if such sophisticated missions are to become feasible by the century's end. 18 references.

  14. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the "Autojuggie" showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. In some easy-access environments, this device could…

  15. Automated Tools for Subject Matter Expert Evaluation of Automated Scoring

    ERIC Educational Resources Information Center

    Williamson, David M.; Bejar, Isaac I.; Sax, Anne

    2004-01-01

    As automated scoring of complex constructed-response examinations reaches operational status, the process of evaluating the quality of resultant scores, particularly in contrast to scores of expert human graders, becomes as complex as the data itself. Using a vignette from the Architectural Registration Examination (ARE), this article explores the…

  16. Automation U.S.A.: Overcoming Barriers to Automation.

    ERIC Educational Resources Information Center

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  17. Automation and Unemployment.

    ERIC Educational Resources Information Center

    Schmidt, Emerson P.; Stewart, Charles T.

    High unemployment results in economic losses to the economy and imposes suffering on millions of individuals and families. Of the many types, long-term structural unemployment affects more than one million workers and is most intractable to treatment and disturbing in terms of human hardship. Most of the workers classified as structurally…

  18. Automation effects in a multiloop manual control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Mcnally, B. D.

    1986-01-01

    An experimental and analytical study was undertaken to investigate human interaction with a simple multiloop manual control system in which the human's activity was systematically varied by changing the level of automation. The system simulated was the longitudinal dynamics of a hovering helicopter. The automation systems stabilized vehicle responses, from attitude to velocity to position, and also provided display automation in the form of a flight director. The control-loop structure resulting from the task definition can be considered a simple stereotype of a hierarchical control system. The experimental study was complemented by an analytical modeling effort which utilized simple crossover models of the human operator. It was shown that such models can be extended to the description of multiloop tasks involving preview and precognitive human operator behavior. The existence of time-optimal manual control behavior was established for these tasks, and the role which internal models may play in establishing human-machine performance was discussed.

  19. An automated test system for terahertz receiver characterization

    NASA Astrophysics Data System (ADS)

    Kuenzi, Linda C.; Groppi, Christopher E.; Wheeler, Caleb H.; Mani, Hamdi

    2014-07-01

    An automated test system was developed to characterize detectors for the Kilopixel Array Pathfinder Project (KAPPa), a 16-pixel 2D integrated heterodyne focal plane array. Although primarily designed for KAPPa, the system can be used with other instruments to automate tests that might be tedious and time-consuming by hand. Mechanical components include an adjustable structure of aluminum t-slot framing that supports a rotating chopper. Driven by a stepper motor, the wheel alternates between blackbodies at room temperature and 77 K. The cold load consists of absorbing material submerged in liquid nitrogen in an open Styrofoam cooler. Python scripts control the mechanical system, interface with receiver components, and process data. Test system operation was verified by sweeping the local oscillator frequency with a Virginia Diodes room temperature receiver. The system was then integrated with the KAPPa receiver to allow complete and automated testing of all array pixels with minimal user intervention.
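
    The abstract notes that Python scripts tie the chopper, receiver components, and data processing together. The sketch below shows the kind of hot/cold (Y-factor) sweep such scripts automate; move_chopper, read_if_power, and set_lo are hypothetical stand-ins for instrument drivers, and the 295 K hot-load temperature is an assumption.

    ```python
    import numpy as np

    T_HOT, T_COLD = 295.0, 77.0   # blackbody temperatures in kelvin (assumed)

    def y_factor_sweep(lo_frequencies_ghz, move_chopper, read_if_power, set_lo):
        """Sweep the LO and return receiver noise temperature at each point."""
        t_rx = []
        for f in lo_frequencies_ghz:
            set_lo(f)
            move_chopper("hot");  p_hot = read_if_power()
            move_chopper("cold"); p_cold = read_if_power()
            y = p_hot / p_cold                          # Y-factor
            t_rx.append((T_HOT - y * T_COLD) / (y - 1.0))
        return np.array(t_rx)
    ```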

  20. A centralized global automation group in a decentralized organization.

    PubMed

    Ormand, J; Bruner, J; Birkemo, L; Hinderliter-Smith, J; Veitch, J

    2000-01-01

    In the latter part of the 1990s, many companies worked to foster a 'matrix'-style culture through several changes in organizational structure. This type of culture facilitates communication and development of new technology across organizational and global boundaries. At Glaxo Wellcome, this matrix culture is reflected in an automation strategy that relies on both centralized and decentralized resources. The Group Development Operations Information Systems Robotics Team is a centralized resource providing development, support, integration, and training in laboratory automation across businesses in the Development organization. The matrix culture still presents challenges with respect to communication and managing the development of technology. A current challenge for our team is to go beyond our recognized role as a technology resource and actually to influence automation strategies across the global Development organization. We shall provide an overview of our role as a centralized resource, our team strategy, examples of current and past successes and failures, and future directions.

  1. Acoustic hemostasis device for automated treatment of bleeding in limbs

    NASA Astrophysics Data System (ADS)

    Sekins, K. Michael; Zeng, Xiaozheng; Barnes, Stephen; Hopple, Jerry; Kook, John; Moreau-Gobard, Romain; Hsu, Stephen; Ahiekpor-Dravi, Alexis; Lee, Chi-Yin; Ramachandran, Suresh; Maleke, Caroline; Eaton, John; Wong, Keith; Keneman, Scott

    2012-10-01

    A research prototype automated image-guided acoustic hemostasis system for treatment of deep bleeding was developed and tested in limb phantoms. The system incorporated a flexible, conformal acoustic applicator cuff. Electronically steered and focused therapeutic arrays (Tx) populated the cuff to enable dosing from multiple Tx's simultaneously. Similarly, multiple imaging arrays (Ix) were deployed on the cuff to enable 3D compounded images for targeting and treatment monitoring. To achieve a lightweight cuff, highly integrated Tx electrical circuitry was implemented, fabric and lightweight structural materials were used, and components were minimized. Novel cuff, Ix, and Tx mechanical registration approaches were used to ensure targeting accuracy. Two-step automation was implemented: 1) targeting (3D image volume acquisition and stitching, Power and Pulsed Wave Doppler automated bleeder detection, and identification of bone, followed by closed-loop iterative Tx beam targeting), and 2) automated dosing (auto-selection of arrays and Tx dosing parameters, power initiation, and then monitoring by acoustic thermometry for power shut-off). In final testing the device automatically detected 65% of all bleeders (with various bleeder flow rates). Accurate targeting was achieved in HIFU phantoms, with end-dose (30 sec) temperature rise reaching the desired 33-58°C. Automated closed-loop targeting and treatment were demonstrated in separate phantoms.
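
    A minimal sketch of the two-step automation loop described above, with every device interface reduced to a hypothetical callable; the 45°C target rise and polling interval are illustrative values, the former chosen inside the reported 33-58°C range.

    ```python
    import time

    def automated_treatment(detect_bleeder, steer_beam, set_tx_power,
                            read_temperature_rise, target_rise_c=45.0,
                            max_dose_s=30.0):
        target = detect_bleeder()            # Doppler-based bleeder detection
        if target is None:
            return "no bleeder found"
        steer_beam(target)                   # closed-loop Tx beam targeting
        set_tx_power(on=True)                # begin automated dosing
        start = time.time()
        while time.time() - start < max_dose_s:
            if read_temperature_rise() >= target_rise_c:
                break                        # acoustic thermometry says done
            time.sleep(0.1)
        set_tx_power(on=False)               # power shut-off
        return "dose delivered"
    ```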

  2. Plenoptic Imager for Automated Surface Navigation

    NASA Technical Reports Server (NTRS)

    Zollar, Byron; Milder, Andrew; Mayo, Michael

    2010-01-01

    An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved the feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprising a main aperture lens, a mechanical structure that holds an array of microlenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the microlenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.
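
    One way a single-aperture plenoptic camera yields range, sketched under assumptions: two sub-aperture views are extracted by taking a fixed pixel offset under every microlens, and their disparity (which maps to depth through the camera geometry, roughly depth proportional to baseline/disparity) is found by shift-and-compare. The raw-image layout and tile size are illustrative, not Nanohmics' design.

    ```python
    import numpy as np

    def subaperture(raw, n, u, v):
        """Pull pixel (u, v) from under each n-by-n microlens tile."""
        return raw[u::n, v::n]

    def mean_disparity(view_a, view_b, max_shift=5):
        """Integer-pixel shift that best aligns the two views (whole-scene mean)."""
        trim = max_shift
        errors = [np.mean((view_a[:, trim:-trim]
                           - np.roll(view_b, s, axis=1)[:, trim:-trim]) ** 2)
                  for s in range(-max_shift, max_shift + 1)]
        return np.argmin(errors) - max_shift

    # Usage on a raw plenoptic frame with (assumed) 8x8 pixels per microlens:
    # view_l, view_r = subaperture(raw, 8, 4, 1), subaperture(raw, 8, 4, 6)
    # depth is then proportional to baseline / mean_disparity(view_l, view_r)
    ```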

  3. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is becoming an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  4. What Is an Automated External Defibrillator?

    MedlinePlus

    An automated external defibrillator (AED) is a lightweight, portable device ... Non-medical personnel such as police, fire service personnel, flight attendants, security guards and other lay ...

  5. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  6. Automated beam builder

    NASA Technical Reports Server (NTRS)

    Muench, W. K.

    1980-01-01

    Requirements for the space fabrication of large space structures are considered with emphasis on the design, development, manufacture, and testing of a machine which automatically produces a basic building block aluminum beam. Particular problems discussed include those associated with beam cap forming; brace storage, dispensing, and transporting; beam component fastening; and beam cut-off. Various critical process tests conducted to develop technology for a machine to produce composite beams are also discussed.

  7. Ask the experts: automation: part I.

    PubMed

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation, and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  8. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

    OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessments of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are examples of areas where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis of abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  9. Automated Cataloging. SPEC Kit 47.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    Results of a 1978 Association of Research Libraries (ARL) survey indicated that 68 (89%) of responding libraries utilized an automated cataloging system. Of those 68, 53 participated in the Ohio College Library Center (OCLC), five in BALLOTS, and the rest in other networks or local systems. At the beginning of this collection, a concise summary…

  10. Automated species identification: why not?

    PubMed Central

    Gaston, Kevin J; O'Neill, Mark A

    2004-01-01

    Where possible, automation has been a common response of humankind to many activities that have to be repeated numerous times. The routine identification of specimens of previously described species has many of the characteristics of other activities that have been automated, and poses a major constraint on studies in many areas of both pure and applied biology. In this paper, we consider some of the reasons why automated species identification has not become widely employed, and whether it is a realistic option, addressing the notions that it is too difficult, too threatening, too different or too costly. Although recognizing that there are some very real technical obstacles yet to be overcome, we argue that progress in the development of automated species identification is extremely encouraging, and that such an approach has the potential to make a valuable contribution to reducing the burden of routine identifications. Vision and enterprise are perhaps more limiting at present than practical constraints on what might possibly be achieved. PMID:15253351

  11. Fully automated solid weighing workstation.

    PubMed

    Wong, Stephen K-F; Lu, YiFeng; Heineman, William; Palmer, Janice; Courtney, Carter

    2005-08-01

    A fully automated, solid-to-solid weighing workstation (patent pending) is described in this article. The core of this automated process is the use of an electrostatically charged pipette tip to attract solid particles on its outside surface. The particles are then dislodged into a 1.2-mL destination vial in a microbalance by spinning the pipette tip. Sample textures that could be weighed included powders, crystalline solids, liquids, and semi-solids. The workstation can pick up submilligram quantities of sample (about 0.3 mg) from source vials containing as little as 1 mg. The destination vials containing the samples were stored in a 96-well rack to enable subsequent automated liquid handling. Using bovine serum albumin as the test solid, the coefficient of variation of the protein concentration for 48 samples is less than 6%. The workstation was used successfully to weigh out 48 different synthetic compounds. Time required for automated weighing was similar to manual weighing. The use of this workstation reduced hands-on time, and thus exposure to potentially toxic compounds, by 90%. In addition, it minimized sample waste and reduced artifacts due to the poor solubility of compounds in solvents. Moreover, it enabled compounds synthesized in milligram quantities to be weighed out and tested in biological assays.
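
    The acceptance statistic quoted above, the coefficient of variation, is simply the standard deviation of replicate results divided by their mean; a quick illustration with made-up replicate values:

    ```python
    import statistics

    weights_mg = [0.31, 0.29, 0.30, 0.32, 0.30]           # illustrative replicates
    cv = statistics.stdev(weights_mg) / statistics.mean(weights_mg)
    print(f"CV = {cv:.1%}")                               # acceptance: CV < 6%
    ```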

  12. Automated Filtering of Internet Postings.

    ERIC Educational Resources Information Center

    Rosenfeld, Louis B.; Holland, Maurita P.

    1994-01-01

    Discussion of the use of dynamic data resources, such as Internet LISTSERVs or Usenet newsgroups, focuses on an experiment using an automated filtering system with Usenet newsgroups. Highlights include user satisfaction, based on retrieval size, data sources, and user interface and the need for some human mediation. (Contains two references.) (LRW)

  13. Automated analysis of oxidative metabolites

    NASA Technical Reports Server (NTRS)

    Furner, R. L. (Inventor)

    1974-01-01

    An automated system for the study of drug metabolism is described. The system monitors the oxidative metabolites of aromatic amines and of compounds which produce formaldehyde on oxidative dealkylation. It includes color-developing compositions suitable for detecting hydroxylated aromatic amines and formaldehyde.

  14. Teacherbot: Interventions in Automated Teaching

    ERIC Educational Resources Information Center

    Bayne, Sian

    2015-01-01

    Promises of "teacher-light" tuition and of enhanced "efficiency" via the automation of teaching have been with us since the early days of digital education, sometimes embraced by academics and institutions, and sometimes resisted as a set of moves which are damaging to teacher professionalism and to the humanistic values of…

  15. Automating the conflict resolution process

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are the way resource conflicts are currently resolved, as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  16. Automating a High School Restroom.

    ERIC Educational Resources Information Center

    Ritner-Heir, Robbin

    1999-01-01

    Discusses how one high school transformed its restrooms into cleaner and more vandal-resistant environments by automating them. Solutions discussed include installing perforated stainless steel panel ceilings, using epoxy-based paint for walls, selecting china commode fixtures instead of stainless steel, installing electronic faucets and sensors,…

  17. Automation; The New Industrial Revolution.

    ERIC Educational Resources Information Center

    Arnstein, George E.

    Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…

  18. Safety in the Automated Office.

    ERIC Educational Resources Information Center

    Graves, Pat R.; Greathouse, Lillian R.

    1990-01-01

    Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)

  19. Library Automation: Guidelines to Costing.

    ERIC Educational Resources Information Center

    Ford, Geoffrey

    As with all new programs, the costs associated with library automation must be carefully considered before implementation. This document suggests guidelines to be followed and areas to be considered in the costing of library procedures. An existing system model has been suggested as a standard (Appendix A) and a classification of library tasks…

  20. Delaware: Library Automation and Networking.

    ERIC Educational Resources Information Center

    Sloan, Tom

    1996-01-01

    Describes automation and networking activities among Delaware libraries, including integrated library systems for public libraries, the Delaware Technical and Community College telecommunications network, Delaware Public Library Internet access planning, digital resources, a computer/technology training center, and the Delaware Center for…

  1. Automation of Space Inventory Management

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W.; Ngo, Phong; Wagner, Raymond; Barton, Richard; Gifford, Kevin

    2009-01-01

    This viewgraph presentation describes the utilization of automated space-based inventory management through handheld RFID readers and BioNet Middleware. The contents include: 1) Space-Based Inventory Management; 2) Real-Time RFID Location and Tracking; 3) Surface Acoustic Wave (SAW) RFID; and 4) BioNet Middleware.

  2. Automation on the Laboratory Bench.

    ERIC Educational Resources Information Center

    Legrand, M.; Foucard, A.

    1978-01-01

    A kit is described for use in the automation of routine chemical research procedures. The kit uses sensors to evaluate the state of the system, actuators to modify the adjustable parameters, and a decision unit that acts on the information from the sensors. (BB)

  3. Illinois: Library Automation and Connectivity Initiatives.

    ERIC Educational Resources Information Center

    Lamont, Bridget L.; Bloomberg, Kathleen L.

    1996-01-01

    Discussion of library automation in Illinois focuses on ILLINET, the Illinois Library and Information Network. Topics include automated resource sharing; ILLINET's online catalog; regional library system automation; community networking and public library technology development; telecommunications initiatives; electronic access to state government…

  4. You're a What? Automation Technician

    ERIC Educational Resources Information Center

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  5. Does Automated Feedback Improve Writing Quality?

    ERIC Educational Resources Information Center

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  6. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  7. Automated System Marketplace 1987: Maturity and Competition.

    ERIC Educational Resources Information Center

    Walton, Robert A.; Bridge, Frank R.

    1988-01-01

    This annual review of the library automation marketplace presents profiles of 15 major library automation firms and looks at emerging trends. Seventeen charts and tables provide data on market shares, number and size of installations, hardware availability, operating systems, and interfaces. A directory of 49 automation sources is included. (MES)

  8. Archives and Automation: Issues and Trends.

    ERIC Educational Resources Information Center

    Weiner, Rob

    This paper focuses on archives and automation, and reviews recent literature on various topics concerning archives and automation. Topics include: resistance to technology and the need to educate about automation; the change in archival theory due to the information age; problems with technology use; the history of organizing archival records…

  9. Generative Representations for Automated Design of Robots

    NASA Technical Reports Server (NTRS)

    Homby, Gregory S.; Lipson, Hod; Pollack, Jordan B.

    2007-01-01

    according to a fitness criterion to yield a figure of merit that is fed back into the evolutionary subprocess of the next iteration. In comparison with prior approaches to automated evolutionary design of robots, the use of generative representations offers two advantages. First, a generative representation enables the reuse of components in regular and hierarchical ways and thereby serves as a systematic means of creating more complex modules out of simpler ones. Second, the evolved generative representation may capture intrinsic properties of the design problem, so that variations in the representations move through the design space more effectively than do equivalent variations in a nongenerative representation. This method has been demonstrated by using it to design some robots that move, variously, by walking, rolling, or sliding. Some of the robots were built (see figure). Although these robots are very simple in comparison with robots designed by humans, their structures are more regular, modular, hierarchical, and complex than are those of evolved designs of comparable functionality synthesized by use of nongenerative representations.
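
    The reuse-of-components idea is easiest to see in a toy grammar: one rule defines a module, the design string references it several times, and a single edit to that rule changes every copy. The rules below are invented for illustration and are not the paper's actual encoding.

    ```python
    def expand(symbol, rules, depth):
        """Recursively rewrite symbols into a flat list of build commands."""
        if depth == 0 or symbol not in rules:
            return [symbol]
        out = []
        for s in rules[symbol]:
            out.extend(expand(s, rules, depth - 1))
        return out

    # A 'leg' module reused four times; editing the LEG rule alters all four
    # legs at once, which is the regularity advantage the authors describe.
    rules = {
        "BODY": ["LEG", "rotate90", "LEG", "rotate90", "LEG", "rotate90", "LEG"],
        "LEG": ["joint", "bar", "joint", "bar"],
    }
    print(expand("BODY", rules, depth=3))
    ```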

  10. Automated Core Design

    SciTech Connect

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2005-07-15

    Multistate searching methods are a subfield of distributed artificial intelligence that aims to provide both principles for construction of complex systems involving multiple states and mechanisms for coordination of independent agents' actions. This paper proposes a multistate searching algorithm with reinforcement learning for the automatic core design of a boiling water reactor. The characteristics of this algorithm are that the coupling structure and the coupling operation suitable for the assigned problem are assumed and an optimal solution is obtained by mutual interference in multistate transitions using multiagents. Calculations in an actual plant confirmed that the proposed algorithm increased the convergence ability of the optimization process.

  11. Automated analytical microarrays: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2008-07-01

    Microarrays provide a powerful analytical tool for the simultaneous detection of multiple analytes in a single experiment. The specific affinity reaction of nucleic acids (hybridization) and of antibodies towards antigens is the most common bioanalytical method for generating multiplexed quantitative results. Nucleic acid-based analysis is restricted to the detection of cells and viruses. Antibodies are more universal biomolecular receptors that selectively bind to small molecules such as pesticides, small toxins, and pharmaceuticals, as well as to biopolymers (e.g., toxins, allergens) and complex biological structures like bacterial cells and viruses. By producing an appropriate antibody, the corresponding antigenic analyte can be detected on a multiplexed immunoanalytical microarray. Food and water analysis along with clinical diagnostics constitute potential application fields for multiplexed analysis. Diverse fluorescence, chemiluminescence, electrochemical, and label-free microarray readout systems have been developed in the last decade. Some of them are constructed as flow-through microarrays by combination with a fluidic system. Microarrays have the potential to become widely accepted as a system for analytical applications, provided that robust and validated results on fully automated platforms are successfully generated. This review gives an overview of the current research on microarrays with a focus on automated systems and quantitative multiplexed applications.

  12. Specimen coordinate automated measuring machine/fiducial automated measuring machine

    DOEpatents

    Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.

    1991-01-01

    The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for: supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.

  13. Automation: how much is too much?

    PubMed

    Hancock, P A

    2014-01-01

    The headlong rush to automate continues apace. The dominant question still remains whether we can automate, not whether we should automate. However, it is this latter question that is featured and considered explicitly here. The suggestion offered is that unlimited automation of all technical functions will eventually prove anathema to the fundamental quality of human life. Examples of tasks, pursuits, and pastimes that should potentially be excused from the automation imperative are discussed. This deliberation leads us back to the question of balance in the cooperation, coordination, and potential conflict between humans and the machines they create.

  14. Automated trabecular bone histomorphometry

    NASA Technical Reports Server (NTRS)

    Polig, E.; Jee, W. S. S.

    1985-01-01

    The toxicity of alpha-emitting bone-seeking radionuclides and the relationship between bone tumor incidence and the local dosimetry of radionuclides in bone are investigated. The microdistributions of alpha-emitting radionuclides in the trabecular bone from the proximal humerus, distal humerus, proximal ulna, proximal femur, and distal femur of six young adult beagles injected with Am-241 (three with 2.8 micro-Ci/kg and three with 0.9 micro-Ci/kg) are estimated using a computer-controlled microscope photometer system; the components of the University of Utah Optical Track Scanner are described. The morphometric parameters for the beagles are calculated and analyzed. It is observed that the beagles injected with 0.9 micro-Ci of Am-241/kg showed an increase in the percentage of bone and trabecular bone thickness, and a reduction in the width of the bone marrow space and surface/volume ratio. The data reveal that radiation damage causes abnormal bone structure.

  15. Automating clinical dietetics documentation.

    PubMed

    Grace-Farfaglia, P; Rosow, P

    1995-06-01

    A review of commonly used charting formats discussed in the dietetics literature revealed that the subjective, objective, assessment, and planning (SOAP) approach is most frequently used by dietitians. Formats reported in the nursing literature were charting by exception (CBE); problem, intervention, evaluation (PIE); and focus/data, action, response (Focus/DAR). The strengths and weaknesses of the charting styles as they apply to the needs of clinical dietetic specialists were reviewed. We then decided to test the Focus/DAR format in house by assessing chart entries for adherence to style, brevity, and physician response. Dietitians pilot tested all the methods, but found them time-consuming to use. The consensus was that SOAP could be adapted to the documentation needs of the individual situation and required little additional staff training. Often, because of time limitations, a narrative summary was most appropriate. Chart entry length was reduced by as much as 200% when staff were given brief clinical communication as a goal, with a further reduction when line limits were imposed. The physician response was positive, with recommendations followed in 50% of charts, compared with 34% in a previous audit. A nutrition documentation system was developed by the researchers by reviewing medical chart structure, documentation standards, methods of risk identification, and terminology for clinical documentation style. The resulting system affected the decision making of physicians, who could now scan notes more quickly and implement nutrition recommendations in a more timely fashion.

  16. Automated Estimating System

    1996-04-15

    AES6.1 is a PC software package developed to aid in the preparation and reporting of cost estimates. AES6.1 provides an easy means for entering and updating the detailed cost and schedule information, project work breakdown structure, and escalation information contained in a typical project cost estimate, through the use of menus and formatted input screens. AES6.1 combines this information to calculate both unescalated and escalated cost for a project, which can be reported at varying levels of detail. Following are the major modifications to AES6.0f: the contingency update was modified to provide greater flexibility for user updates; the schedule update was modified to give users the ability to schedule Bills of Material at the WBS/Participant/Cost Code level; the schedule plot was modified to graphically show schedule by WBS/Participant/Cost Code; all fiscal-year reporting was modified to use the new schedule format; the Schedule 1-B-7, Cost Schedule, and WBS/Participant reports were modified to determine Phase of Work from the B/M Cost Code; the utility program was modified to allow selection by cost code and to update cost code in the Global Schedule update; generic summary and line-item download were added to the utility program; and an option was added to all reports which allows the user to indicate where overhead is to be reported (bottom line or in body of report).
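
    The unescalated-versus-escalated roll-up such a package performs can be sketched in a few lines: each WBS line item carries a base-year cost and a scheduled year, and escalation compounds annually. The rates, years, and items below are illustrative assumptions, not AES6.1 data.

    ```python
    BASE_YEAR = 1996
    ESCALATION = {1997: 0.031, 1998: 0.029}   # assumed annual escalation rates

    def escalated(cost, year):
        """Compound a base-year cost out to its scheduled year."""
        factor = 1.0
        for y in range(BASE_YEAR + 1, year + 1):
            factor *= 1.0 + ESCALATION[y]
        return cost * factor

    wbs = [("1.1 Site prep", 120_000, 1997), ("1.2 Structure", 450_000, 1998)]
    total = sum(escalated(c, y) for _, c, y in wbs)
    print(f"unescalated: {sum(c for _, c, _ in wbs):,.0f}  escalated: {total:,.0f}")
    ```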

  17. Automated security response robot

    NASA Astrophysics Data System (ADS)

    Ciccimaro, Dominic A.; Everett, Hobart R.; Gilbreath, Gary A.; Tran, Tien T.

    1999-01-01

    ROBART III is intended as an advanced demonstration platform for non-lethal response measures, extending the concepts of reflexive teleoperation into the realm of coordinated weapons control in law enforcement and urban warfare scenarios. A rich mix of ultrasonic and optical proximity and range sensors facilitates remote operation in unstructured and unexplored buildings with minimal operator supervision. Autonomous navigation and mapping of interior spaces is significantly enhanced by an innovative algorithm which exploits the fact that the majority of man-made structures are characterized by parallel and orthogonal walls. Extremely robust intruder detection and assessment capabilities are achieved through intelligent fusion of a multitude of inputs from various onboard motion sensors. Intruder detection is addressed by a 360-degree staring array of passive-IR motion detectors, augmented by a number of positionable head-mounted sensors. Automatic camera tracking of a moving target is accomplished using a video line digitizer. Non-lethal response systems include a six-barrelled pneumatically-powered Gatling gun, high-powered strobe lights, and three ear-piercing 103-decibel sirens.

  18. Development of a framework of human-centered automation for the nuclear industry

    SciTech Connect

    Nelson, W.R.; Haney, L.N.

    1993-01-01

    Introduction of automated systems into control rooms for advanced reactor designs is often justified on the basis of increased efficiency and reliability, without a detailed assessment of how the new technologies will influence the role of the operator. Such a "technology-centered" approach carries with it the risk that entirely new mechanisms for human error will be introduced, resulting in some unpleasant surprises when the plant goes into operation. The aviation industry has experienced some of these surprises since the introduction of automated systems into the cockpits of advanced technology aircraft. Pilot errors have actually been induced by automated systems, especially when the pilot doesn't fully understand what the automated systems are doing during all modes of operation. In order to structure the research program for investigating these problems, the National Aeronautics and Space Administration (NASA) has developed a framework for human-centered automation. This framework is described in the NASA document Human-Centered Aircraft Automation Philosophy by Charles Billings. It is the thesis of this paper that a corresponding framework of human-centered automation should be developed for the nuclear industry. Such a framework would serve to guide the design and regulation of automated systems for advanced reactor designs, and would help prevent some of the problems that have arisen in other applications that have followed a "technology-centered" approach.

  19. Automated nutrient analyses in seawater

    SciTech Connect

    Whitledge, T.E.; Malloy, S.C.; Patton, C.J.; Wirick, C.D.

    1981-02-01

    This manual was assembled for use as a guide for analyzing the nutrient content of seawater samples collected in the marine coastal zone of the Northeast United States and the Bering Sea. Some modifications (changes in dilution or sample pump tube sizes) may be necessary to achieve optimum measurements in very pronounced oligotrophic, eutrophic or brackish areas. Information is presented under the following section headings: theory and mechanics of automated analysis; continuous flow system description; operation of autoanalyzer system; cookbook of current nutrient methods; automated analyzer and data analysis software; computer interfacing and hardware modifications; and trouble shooting. The three appendixes are entitled: references and additional reading; manifold components and chemicals; and software listings. (JGB)

  1. Automated Demand Response and Commissioning

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability and manage electricity costs. Fully automated demand response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal; we refer to this as Auto-DR. To be evaluated, the control and communications systems must be properly configured and pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve the connection of demand-responsive building systems to electric grid demand-response systems.
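
    A minimal sketch of the Auto-DR pattern: a facility-side price client polls a price server and applies a pre-programmed shed strategy with no human in the loop. The URL, JSON fields, and shed levels are assumptions for illustration, not the project's actual protocol.

    ```python
    import json
    import time
    import urllib.request

    SHED_LEVELS = {"normal": 0, "moderate": 1, "high": 2}  # pre-programmed strategies

    def poll_and_shed(server_url, apply_shed_level):
        """Poll the price server and apply the matching shed level, forever."""
        while True:
            with urllib.request.urlopen(server_url) as resp:
                signal = json.load(resp)                # e.g. {"price_level": "high"}
            apply_shed_level(SHED_LEVELS[signal["price_level"]])
            time.sleep(5 * 60)                          # poll every five minutes
    ```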

  2. Automating occupational protection records systems

    SciTech Connect

    Lyon, M.; Martin, J.B.

    1991-10-01

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs.

  3. Automated illustration of patients' instructions.

    PubMed

    Bui, Duy; Nakamura, Carlos; Bray, Bruce E; Zeng-Treitler, Qing

    2012-01-01

    A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration.
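
    The text-to-picture conversion can be caricatured as concept extraction followed by pictograph lookup; the real system uses natural language processing and computer graphics rather than the keyword matching below, and the lexicon and file names are invented.

    ```python
    PICTOGRAPHS = {
        "take": "pictos/take_medication.png",
        "daily": "pictos/once_per_day.png",
        "weigh": "pictos/scale.png",
        "call": "pictos/phone_doctor.png",
    }

    def illustrate(instruction):
        """Return the pictograph files whose concepts appear in the text."""
        words = instruction.lower().split()
        return [path for key, path in PICTOGRAPHS.items() if key in words]

    print(illustrate("Weigh yourself daily and call your doctor if you gain weight"))
    ```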

  4. Automation design and crew coordination

    NASA Technical Reports Server (NTRS)

    Segal, Leon D.

    1993-01-01

    Advances in technology have greatly impacted the appearance of the modern aircraft cockpit, where once one would see rows upon rows of instruments. The introduction of automation has greatly altered the demands on the pilots and the dynamics of aircrew task performance. While engineers and designers continue to implement the latest technological innovations in the cockpit - claiming higher reliability and decreased workload - a large percentage of aircraft accidents are still attributed to human error. Rather than being the main instigators of accidents, operators tend to be the inheritors of system defects created by poor design, incorrect installation, faulty maintenance, and bad management decisions. This paper looks at some of the variables that need to be considered if we are to eliminate at least one of these inheritances - poor design. Specifically, this paper describes the first part of a comprehensive study aimed at identifying the effects of automation on crew coordination.

  5. Automated labeling in document images

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2000-12-01

    The National Library of Medicine (NLM) is developing an automated system to produce bibliographic records for its MEDLINE database. This system, named Medical Article Record System (MARS), employs document image analysis and understanding techniques and optical character recognition (OCR). This paper describes a key module in MARS called the Automated Labeling (AL) module, which labels all zones of interest (title, author, affiliation, and abstract) automatically. The AL algorithm is based on 120 rules that are derived from an analysis of journal page layouts and features extracted from OCR output. Experiments carried out on more than 11,000 articles in over 1,000 biomedical journals show the accuracy of this rule-based algorithm to exceed 96%.
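
    The 120 rules themselves are not reproduced in this record; the fragment below is a hedged, much-simplified illustration of rule-based zone labeling on OCR output, with all feature names and thresholds invented.

      def label_zone(zone):
          """Toy rule-based labeling of a bibliographic zone.
          zone: dict of OCR-derived features (text, font size, position)."""
          text = zone["text"].lower()
          if zone["font_size"] >= 14 and zone["y"] < 200:
              return "title"
          if "abstract" in text:
              return "abstract"
          if "university" in text or "department" in text or "@" in text:
              return "affiliation"
          if zone["y"] < 300 and "," in text:
              return "author"
          return "other"

      zone = {"text": "Kim J, Le DX, Thoma GR", "font_size": 11, "y": 240}
      print(label_zone(zone))  # -> 'author'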

  6. Fatigue crack growth automated testing method

    SciTech Connect

    Hatch, P.W.; VanDenAvyle, J.A.; Laing, J.

    1989-06-01

    A computer controlled servo-hydraulic mechanical test system has been configured to conduct automated fatigue crack growth testing. This provides two major benefits: it allows continuous cycling of specimens without operator attention over evenings and weekends; and complex load histories, including random loading and spectrum loading, can be applied to the specimens to simulate cyclic loading of engineering structures. The software is written in MTS Multi-User Basic to control test machine output and acquire data at predetermined intervals. Compact tension specimens are cycled according to ASTM specification E647-86. Fatigue crack growth is measured via specimen compliance during the test using a compliance/crack length calibration determined earlier by visual crack length measurements. This setup was used to measure crack growth rates in 6063 aluminum alloy for a variety of cyclic loadings, including spectrum loads. Data collected compared well with tests run manually. 13 figs.
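
    The compliance/crack length calibration is only referenced above. For a compact tension (CT) specimen, ASTM E647 expresses crack length as a polynomial in a normalized compliance variable; a minimal sketch follows, using coefficients commonly quoted for the CT geometry (verify against the current edition of the standard before use).

      import math

      def crack_length_ct(P, v, E, B, W):
          """Estimate crack length a for a CT specimen from compliance.
          P: load, v: crack-mouth opening displacement, E: elastic modulus,
          B: specimen thickness, W: specimen width (consistent units).
          Coefficients are those commonly cited for ASTM E647's CT relation,
          valid roughly for 0.2 <= a/W <= 0.975."""
          u = 1.0 / (math.sqrt(E * v * B / P) + 1.0)
          c = (1.0010, -4.6695, 18.460, -236.82, 1214.9, -2143.6)
          a_over_W = sum(ci * u**i for i, ci in enumerate(c))
          return a_over_W * W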

  7. Algorithms Could Automate Cancer Diagnosis

    NASA Technical Reports Server (NTRS)

    Baky, A. A.; Winkler, D. G.

    1982-01-01

    Five new algorithms constitute a complete statistical procedure for quantifying cell abnormalities from digitized images. Procedure could be basis for automated detection and diagnosis of cancer. Objective of procedure is to assign each cell an atypia status index (ASI), which quantifies level of abnormality. It is possible that ASI values will be accurate and economical enough to allow diagnoses to be made quickly and accurately by computer processing of laboratory specimens extracted from patients.

  8. Home automation in the workplace.

    PubMed

    McCormack, J E; Tello, S F

    1994-01-01

    Environmental control units and home automation devices contribute to the independence and potential of individuals with disabilities, both at work and at home. Devices currently exist that can assist people with physical, cognitive, and sensory disabilities to control lighting, appliances, temperature, security, and telephone communications. This article highlights several possible applications for these technologies and discusses emerging technologies that will increase the benefits these devices offer people with disabilities.

  9. Automated Scheduling Via Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.
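
    The OMP approach is described only as iterative refinement; the toy scheduler below conveys the general shape of iterative repair (find a resource conflict, move one task, repeat), with all structures hypothetical.

      def conflicts(schedule, capacity):
          """Return time slots where the number of tasks exceeds capacity."""
          load = {}
          for task, slot in schedule.items():
              load[slot] = load.get(slot, 0) + 1
          return [s for s, n in load.items() if n > capacity]

      def repair(schedule, horizon, capacity, max_iters=100):
          """Iteratively move one task out of a congested slot until feasible."""
          for _ in range(max_iters):
              bad = conflicts(schedule, capacity)
              if not bad:
                  return schedule
              slot = bad[0]
              task = next(t for t, s in schedule.items() if s == slot)
              schedule[task] = (slot + 1) % horizon
          return schedule  # may still contain conflicts if max_iters is hit

      print(repair({"obs1": 0, "obs2": 0, "cal": 0}, horizon=4, capacity=1))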

  10. Convection automated logic oven control

    SciTech Connect

    Boyer, M.A.; Eke, K.I.

    1998-03-01

    For the past few years, there has been a greater push to bring more automation to the cooking process. There have been attempts at automated cooking using a wide range of sensors and procedures, but with limited success. The authors offer an answer to the automated cooking problem; this patented technology is called Convection AutoLogic (CAL). The beauty of the technology is that it requires no extra hardware for the existing oven system. It uses the existing temperature probe, whether it is an RTD, thermocouple, or thermistor. This means that the manufacturer does not have to be burdened with extra costs associated with automated cooking in comparison to standard ovens. The only change to the oven is the program in the central processing unit (CPU) on the board. As for its operation, when the user places the food into the oven, he or she is required to select a category (e.g., beef, poultry, or casseroles) and then simply press the start button. The CAL program then begins its cooking program. It first looks at the ambient oven temperature to see if it is a cold, warm, or hot start. CAL stores this data and then begins to look at the food's thermal footprint. After CAL has properly detected this thermal footprint, it can calculate the time and temperature at which the food needs to be cooked. CAL then sets up these factors for the cooking stage of the program and, when the food has finished cooking, the oven is turned off automatically. The total time for this entire process is the same as the standard cooking time the user would normally set. The CAL program can also compensate for varying line voltages and detect when the oven door is opened. With all of these varying factors being monitored, CAL can produce a perfectly cooked item with minimal user input.
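
    CAL's detection logic is proprietary and only outlined above; the fragment below merely dramatizes the described flow (classify the start condition, then pick time and temperature for a category), with every threshold and adjustment factor invented.

      def classify_start(ambient_c):
          """Classify the oven start condition from cavity temperature."""
          if ambient_c < 50:
              return "cold"
          if ambient_c < 120:
              return "warm"
          return "hot"

      def cook_plan(category, ambient_c):
          """Pick time/temperature from category defaults, adjusted for start type."""
          base = {"beef": (90, 180), "poultry": (75, 190), "casserole": (60, 175)}
          minutes, temp_c = base[category]
          factor = {"cold": 1.0, "warm": 0.92, "hot": 0.85}[classify_start(ambient_c)]
          return minutes * factor, temp_c

      print(cook_plan("poultry", ambient_c=30))  # -> (75.0, 190)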

  11. Automated Platform Management System Scheduling

    NASA Technical Reports Server (NTRS)

    Hull, Larry G.

    1990-01-01

    The Platform Management System was established to coordinate the operation of platform systems and instruments. The management functions are split between ground and space components. Since platforms are to be out of contact with the ground more than the manned base, the on-board functions are required to be more autonomous than those of the manned base. Under this concept, automated replanning and rescheduling, including on-board real-time schedule maintenance and schedule repair, are required to effectively and efficiently meet Space Station Freedom mission goals. In a FY88 study, we developed several promising alternatives for automated platform planning and scheduling. We recommended both a specific alternative and a phased approach to automated platform resource scheduling. Our recommended alternative was based upon use of exactly the same scheduling engine in both ground and space components of the platform management system. Our phased approach recommendation was based upon evolutionary development of the platform. In the past year, we developed platform scheduler requirements and implemented a rapid prototype of a baseline platform scheduler. Presently we are rehosting this platform scheduler rapid prototype and integrating the scheduler prototype into two Goddard Space Flight Center testbeds, as the ground scheduler in the Scheduling Concepts, Architectures, and Networks Testbed and as the on-board scheduler in the Platform Management System Testbed. Using these testbeds, we will investigate rescheduling issues, evaluate operational performance and enhance the platform scheduler prototype to demonstrate our evolutionary approach to automated platform scheduling. The work described in this paper was performed prior to Space Station Freedom rephasing, transfer of platform responsibility to Code E, and other recently discussed changes. We neither speculate on these changes nor attempt to predict the impact of the final decisions. As a consequence some of our

  12. Small Business Innovations (Automated Information)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bruce G. Jackson & Associates Document Director is an automated tool that combines word processing and database management technologies to offer the flexibility and convenience of text processing with the linking capability of database management. Originally developed for NASA, it provides a means to collect and manage information associated with requirements development. The software system was used by NASA in the design of the Assured Crew Return Vehicle, as well as by other government and commercial organizations including the Southwest Research Institute.

  13. Fully automated urban traffic system

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.

    1977-01-01

    The replacement of the driver with an automatic system which could perform the functions of guiding and routing a vehicle with a human's capability of responding to changing traffic demands was discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were being developed independently of any need for fully automated urban traffic. A guidance system that would meet system requirements was not being developed but was technically feasible.

  14. Using microwave Doppler radar in automated manufacturing applications

    NASA Astrophysics Data System (ADS)

    Smith, Gregory C.

    Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). By advancing the state of the art in manufacturing automation, the studies may help
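
    The record does not reproduce the dissertation's signal processing; a generic sketch of the underlying idea - extracting the dominant Doppler frequency from a detector's baseband output with an FFT - is shown below, with all parameters illustrative.

      import numpy as np

      def doppler_peak(signal, fs):
          """Return the dominant baseband frequency (Hz) in a Doppler output."""
          windowed = signal * np.hanning(len(signal))
          spectrum = np.abs(np.fft.rfft(windowed))
          spectrum[0] = 0.0  # ignore the DC component
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          return freqs[np.argmax(spectrum)]

      # Synthetic check: a 24 GHz radar sees ~80 Hz of Doppler shift from a
      # target moving at ~0.5 m/s (f_d = 2*v*f0/c).
      fs = 2000.0
      t = np.arange(0, 1.0, 1.0 / fs)
      sig = np.sin(2 * np.pi * 80.0 * t) + 0.1 * np.random.randn(t.size)
      print(doppler_peak(sig, fs))  # ~80.0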

  15. Automated analysis and annotation of basketball video

    NASA Astrophysics Data System (ADS)

    Saur, Drew D.; Tan, Yap-Peng; Kulkarni, Sanjeev R.; Ramadge, Peter J.

    1997-01-01

    Automated analysis and annotation of video sequences are important for digital video libraries, content-based video browsing and data mining projects. A successful video annotation system should provide users with a useful video content summary in a reasonable processing time. Given the wide variety of video genres available today, automatically extracting meaningful video content for annotation remains difficult using currently available techniques. However, a wide range of video has inherent structure such that some prior knowledge about the video content can be exploited to improve our understanding of the high-level video semantic content. In this paper, we develop tools and techniques for analyzing structured video by using the low-level information available directly from MPEG compressed video. Being able to work directly in the video compressed domain can greatly reduce the processing time and enhance storage efficiency. As a testbed, we have developed a basketball annotation system which combines the low-level information extracted from the MPEG stream with prior knowledge of basketball video structure to provide high-level content analysis, annotation and browsing for events such as wide-angle and close-up views, fast breaks, steals, potential shots, number of possessions and possession times. We expect our approach can also be extended to structured video in other domains.
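
    The paper's compressed-domain features are only summarized in this record; as a hedged illustration, one standard building block is shot-cut detection on DC images (thumbnails formed from the DC coefficients of MPEG blocks, assumed extracted elsewhere):

      import numpy as np

      def detect_cuts(dc_frames, threshold=30.0):
          """Flag shot boundaries where consecutive DC images differ strongly.
          dc_frames: sequence of 2-D arrays, one DC image per frame;
          the threshold is illustrative and would be tuned per source."""
          cuts = []
          for i in range(1, len(dc_frames)):
              diff = np.mean(np.abs(dc_frames[i].astype(float) -
                                    dc_frames[i - 1].astype(float)))
              if diff > threshold:
                  cuts.append(i)
          return cuts

    Events such as wide-angle versus close-up views would then be classified per shot from other low-level cues (e.g., color and motion statistics).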

  16. Trust in automation: designing for appropriate reliance.

    PubMed

    Lee, John D; See, Katrina A

    2004-01-01

    Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

  17. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  18. Automated AI-based designer of electrical distribution systems

    NASA Astrophysics Data System (ADS)

    Sumic, Zarko

    1992-03-01

    Designing the electrical supply system for new residential developments (plat design) is an everyday task for electric utility engineers. Presently this task is carried out manually, resulting in an overdesigned, costly, and nonstandardized solution. As an ill-structured and open-ended problem, plat design is difficult to automate with conventional approaches such as operational research or CAD. Additional complexity in automating plat design is imposed by the need to process spatial data such as circuit maps, records, and construction plans. The intelligent decision support system for automated electrical plat design (IDSS for AEPD) is an engineering tool aimed at automating plat design. IDSS for AEPD combines the functionality of geographic information systems (GIS), a geographically referenced database, with the sophistication of artificial intelligence (AI) to deal with the complexity inherent in design problems. A blackboard problem-solving architecture, centered on the INGRES relational database and the NEXPERT Object expert system shell, has been chosen to accommodate the diverse knowledge sources and data models. The GIS's principal task is to create, structure, and formalize the real-world representation required by the rule-based reasoning portion of the AEPD. The IDSS's capability to support and enhance the engineer's design, rather than only automate the design process through a prescribed computation, makes it a preferred choice among the possible techniques for AEPD. This paper presents the results of knowledge acquisition and the knowledge engineering process, along with AEPD tool conceptual design issues. To verify the proposed concept, a comparison of results obtained by the AEPD tool with the design obtained by an experienced human designer is given.

  19. The contaminant analysis automation robot implementation for the automated laboratory

    SciTech Connect

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-12-31

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its task is complete. A standard set of commands and events has been established to allow the TSC to ready the SLMs for transport operations as well as to perform them. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a VME rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  1. Automation and robotics technology for intelligent mining systems

    NASA Technical Reports Server (NTRS)

    Welsh, Jeffrey H.

    1989-01-01

    The U.S. Bureau of Mines is approaching the problems of accidents and efficiency in the mining industry through the application of automation and robotics to mining systems. This technology can increase safety by removing workers from hazardous areas of the mines or from performing hazardous tasks. The short-term goal of the Automation and Robotics program is to develop technology that can be implemented in the form of an autonomous mining machine using current continuous mining machine equipment. In the longer term, the goal is to conduct research that will lead to new intelligent mining systems that capitalize on the capabilities of robotics. The Bureau of Mines Automation and Robotics program has been structured to produce the technology required for the short- and long-term goals. The short-term goal of application of automation and robotics to an existing mining machine, resulting in autonomous operation, is expected to be accomplished within five years. Key technology elements required for an autonomous continuous mining machine are well underway and include machine navigation systems, coal-rock interface detectors, machine condition monitoring, and intelligent computer systems. The Bureau of Mines program is described, including status of key technology elements for an autonomous continuous mining machine, the program schedule, and future work. Although the program is directed toward underground mining, much of the technology being developed may have applications for space systems or mining on the Moon or other planets.

  2. Cognitive engineering in aerospace application: Pilot interaction with cockpit automation

    NASA Technical Reports Server (NTRS)

    Sarter, Nadine R.; Woods, David D.

    1993-01-01

    Because of recent incidents involving glass-cockpit aircraft, there is growing concern with cockpit automation and its potential effects on pilot performance. However, little is known about the nature and causes of problems that arise in pilot-automation interaction. The results of two studies that provide converging, complementary data on pilots' difficulties with understanding and operating one of the core systems of cockpit automation, the Flight Management System (FMS), are reported. A survey asking pilots to describe specific incidents with the FMS and observations of pilots undergoing transition training to a glass cockpit aircraft served as vehicles to gather a corpus on the nature and variety of FMS-related problems. The results of both studies indicate that pilots become proficient in standard FMS operations through ground training and subsequent line experience. But even with considerable line experience, they still have difficulties tracking FMS status and behavior in certain flight contexts, and they show gaps in their understanding of the functional structure of the system. The results suggest that design-related factors such as opaque interfaces contribute to these difficulties, which can affect pilots' situation awareness. The results of this research are relevant for both the design of cockpit automation and the development of training curricula specifically tailored to the needs of glass cockpits.

  3. Towards cooperative guidance and control of highly automated vehicles: H-Mode and Conduct-by-Wire.

    PubMed

    Flemisch, Frank Ole; Bengler, Klaus; Bubb, Heiner; Winner, Hermann; Bruder, Ralph

    2014-01-01

    This article provides a general ergonomic framework of cooperative guidance and control for vehicles, with an emphasis on the cooperation between a human and a highly automated vehicle. In the twenty-first century, mobility and automation technologies are increasingly fused. In the sky, highly automated aircraft are flying with a high safety record. On the ground, a variety of driver assistance systems are being developed, and highly automated vehicles with increasingly autonomous capabilities are becoming possible. Human-centred automation has paved the way for a better cooperation between automation and humans. How can these highly automated systems be structured so that they can be easily understood, and how will they cooperate with the human? The presented research was conducted using the method of iterative build-up and refinement of the framework by triangulation, i.e. by instantiating and testing the framework with at least two derived concepts and prototypes. This article sketches a general, conceptual ergonomic framework of cooperative guidance and control of highly automated vehicles, two concepts derived from the framework, prototypes and pilot data. Cooperation is exemplified in a list of aspects and related to levels of the driving task. With the concept 'Conduct-by-Wire', cooperation happens mainly on the guidance level, where the driver can delegate manoeuvres to the automation with a specialised manoeuvre interface. With H-Mode, a haptic-multimodal interaction with highly automated vehicles based on the H(orse)-Metaphor, cooperation is mainly done on guidance and control with a haptically active interface. Cooperativeness should be a key aspect for future human-automation systems. Especially for highly automated vehicles, cooperative guidance and control is a research direction with already promising concepts and prototypes that should be further explored. The application of the presented approach is every human-machine system that moves and includes high

  4. Parmodel: a web server for automated comparative modeling of proteins.

    PubMed

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models, as well as crystallographers to evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software packages used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .
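
    As a hedged illustration of the kind of job Parmodel distributes, below is a minimal comparative-modeling run in MODELLER's classic scripting interface; the alignment file and the target/template codes are placeholders.

      # Minimal MODELLER comparative-modeling run (names are placeholders).
      from modeller import environ
      from modeller.automodel import automodel

      env = environ()
      a = automodel(env,
                    alnfile="target_template.ali",  # target/template alignment
                    knowns="template_pdb",          # template structure code
                    sequence="target_seq")          # target sequence code
      a.starting_model = 1
      a.ending_model = 5                            # build five candidate models
      a.make()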

  5. Automated 3D vascular segmentation in CT hepatic venography

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Lucidarme, Olivier; Preteux, Francoise

    2005-08-01

    In the framework of preoperative evaluation of the hepatic venous anatomy in living-donor liver transplantation or oncologic resections, this paper proposes an automated approach for the 3D segmentation of the liver vascular structure from 3D CT hepatic venography data. The developed segmentation approach takes into account the specificities of anatomical structures in terms of spatial location, connectivity and morphometric properties. It implements basic and advanced morphological operators (closing, geodesic dilation, gray-level reconstruction, sup-constrained connection cost) in mono- and multi-resolution filtering schemes in order to achieve an automated 3D reconstruction of the opacified hepatic vessels. A thorough investigation of the venous anatomy including morphometric parameter estimation is then possible via computer-vision 3D rendering, interaction and navigation capabilities.
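
    The record names the morphological operators without giving the full pipeline; a minimal sketch (Python/SciPy) of gray-level morphology for enhancing bright, vessel-scale structures in a 3D volume follows, with all sizes and thresholds illustrative.

      import numpy as np
      from scipy import ndimage

      def enhance_vessels(volume_hu):
          """Rough vessel enhancement: close small gaps, then keep bright
          structures thinner than the opening window (a gray-level top-hat)."""
          closed = ndimage.grey_closing(volume_hu, size=(3, 3, 3))
          background = ndimage.grey_opening(closed, size=(9, 9, 9))
          tophat = closed - background
          return tophat > 100  # threshold in HU; tune per acquisition protocol

      mask = enhance_vessels(np.random.randint(-100, 300, (64, 64, 64)))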

  6. Advances in in situ inspection of automated fiber placement systems

    NASA Astrophysics Data System (ADS)

    Juarez, Peter D.; Cramer, K. Elliott; Seebo, Jeffrey P.

    2016-05-01

    Automated Fiber Placement (AFP) systems have been developed to help take advantage of the tailorability of composite structures in aerospace applications. AFP systems allow the repeatable placement of uncured, spool fed, preimpregnated carbon fiber tape (tows) onto substrates in desired thicknesses and orientations. This automated process can incur defects, such as overlapping tow lines, which can severely undermine the structural integrity of the part. Current defect detection and abatement methods are very labor intensive, and still mostly rely on human manual inspection. Proposed is a thermographic in situ inspection technique which monitors tow placement with an on board thermal camera using the preheated substrate as a through transmission heat source. An investigation of the concept is conducted, and preliminary laboratory results are presented. Also included will be a brief overview of other emerging technologies that tackle the same issue.

  7. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, the process can be complicated, with omissions or miscalculations very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  8. Expert Robots For Automated Packaging And Processing

    NASA Astrophysics Data System (ADS)

    Slutzky, G. D.; Hall, E. L.; Shell, R. L.

    1989-02-01

    A variety of problems in automated packaging and processing seem ready for expert robotic solutions. Such problems as automated palletizing, bin-picking, automated storage and retrieval, automated kitting of parts for assembly, and automated warehousing are currently being considered. The use of expert robots, which consist of specialized computer programs, manipulators, and integrated sensors, has been demonstrated with robot checkers, peg games, etc. Actual solutions for automated palletizing, pit-carb basket loading, etc. have also been developed for industrial applications at our Center. The generic concepts arising from this research will be described, unsolved problems discussed, and some important tools demonstrated. The significance of this work lies in its broad application to a host of generic industrial problems which can improve quality, reduce waste, and eliminate human injuries.

  9. AUTOMATED DEFECT CLASSIFICATION USING AN ARTIFICIAL NEURAL NETWORK

    SciTech Connect

    Chady, T.; Caryk, M.; Piekarczyk, B.

    2009-03-03

    An automated defect classification algorithm based on an artificial neural network with a multilayer backpropagation structure was utilized. The selected features of flaws were used as input data. In order to train the neural network it is necessary to prepare learning data, which is a representative database of defects. Database preparation requires the following steps: image acquisition and pre-processing, image enhancement, defect detection, and feature extraction. Real digital radiographs of welded parts of a ship were used for this purpose.
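
    The record does not give the network topology or the flaw features; a hedged sketch of the general approach - feature vectors in, defect classes out - using a small scikit-learn multilayer perceptron on invented data:

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      # Hypothetical features per flaw: area, elongation, mean intensity, contrast
      X = np.array([[120, 4.0, 0.30, 0.60],   # crack-like
                    [ 40, 1.1, 0.20, 0.80],   # porosity-like
                    [300, 6.5, 0.35, 0.55],
                    [ 55, 1.3, 0.25, 0.75]])
      y = ["crack", "porosity", "crack", "porosity"]

      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      clf.fit(X, y)
      print(clf.predict([[200, 5.0, 0.32, 0.58]]))  # -> ['crack'] on this toy data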

  10. Automated systems for identification of microorganisms.

    PubMed Central

    Stager, C E; Davis, J R

    1992-01-01

    Automated instruments for the identification of microorganisms were introduced into clinical microbiology laboratories in the 1970s. During the past two decades, the capabilities and performance characteristics of automated identification systems have steadily progressed and improved. This article explores the development of the various automated identification systems available in the United States and reviews their performance for identification of microorganisms. Observations regarding deficiencies and suggested improvements for these systems are provided. PMID:1498768

  11. Powder handling for automated fuel processing

    SciTech Connect

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-04-09

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs.

  12. Temperature automation for a propellant mixer

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Wilson, R. G.

    1990-01-01

    The analysis and installation of an automatic temperature controller on a propellant mixer is presented. Ultimately, the entire mixing process will come under automation, but since precise adherence to the temperature profile is very difficult to sustain manually, this was the first component to be automated. Automation is not only important for producing a uniform product, but it is necessary for envisioned space-based propellant production.
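
    The controller design is not described in this record; a minimal sketch of a PID loop tracking a temperature profile is given below, with the gains and the toy plant model invented for illustration.

      def pid_track(profile, plant, kp=2.0, ki=0.1, kd=0.5, dt=1.0):
          """Drive a simulated plant along a temperature profile with PID control."""
          integral, prev_err, temps = 0.0, 0.0, []
          temp = profile[0]
          for setpoint in profile:
              err = setpoint - temp
              integral += err * dt
              deriv = (err - prev_err) / dt
              heat = kp * err + ki * integral + kd * deriv
              temp = plant(temp, heat, dt)  # apply heater output to the plant
              prev_err = err
              temps.append(temp)
          return temps

      # Toy first-order plant: gains heat from the input, loses heat to ambient.
      plant = lambda T, u, dt: T + dt * (0.5 * u - 0.05 * (T - 20.0))
      print(pid_track([25, 30, 35, 40, 40, 40], plant))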

  13. New luster for space robots and automation

    NASA Technical Reports Server (NTRS)

    Heer, E.

    1978-01-01

    Consideration is given to the potential role of robotics and automation in space transportation systems. Automation development requirements are defined for projects in space exploration, global services, space utilization, and space transport. In each category the potential automation of ground operations, on-board spacecraft operations, and in-space handling is noted. The major developments of space robot technology are noted for the 1967-1978 period. Economic aspects of ground-operation, ground command, and mission operations are noted.

  14. Human-centered aircraft automation: A concept and guidelines

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1991-01-01

    Aircraft automation and its effects on flight crews are examined. Generic guidelines are proposed for the design and use of automation in transport aircraft, in the hope of stimulating increased and more effective dialogue among designers of automated cockpits, purchasers of automated aircraft, and the pilots who must fly those aircraft in line operations. The goal is to explore the means whereby automation may be a maximally effective tool or resource for pilots without compromising human authority and with an increase in system safety. After definition of the domain of the aircraft pilot and a brief discussion of the history of aircraft automation, a concept of human-centered automation is presented and discussed. Automated devices are categorized as control automation, information automation, and management automation. The environment and context of aircraft automation are then considered, followed by thoughts on the likely future of automation in each category.

  15. Automated decentralized pharmacy dispensing systems.

    PubMed

    1996-12-01

    Automated decentralized pharmacy dispensing systems (ADPDSs) are medication management systems that allow hospitals to store and dispense drugs near the point of use. These systems, which can be compared with the automated teller machines used by banks, provide nurses with ready access to medications while maintaining tight control of drug distribution. In this study, we evaluated three ADPDSs from two suppliers, focusing on whether these systems can store and dispense drugs in a safe, secure, and effective manner. When rating the systems, we considered their applicability to two different implementation schemes: (1) use of a system with a pharmacy profile interface, a feature that broadens the capabilities of the system by allowing more information to be provided at the dispensing cabinet and by providing better integration of the information from this cabinet with the pharmacy's information system (two of the evaluated systems have this feature and were rated Acceptable); and (2) use of a system without a pharmacy profile interface (we rated all three of the evaluated systems Acceptable for such implementations). To decide which scheme is most appropriate for a particular hospital, the facility will need to determine both how it intends to use the ADPDS and what it hopes to achieve by implementing the system. By performing this type of analysis, the facility can then determine which ADPDS features and capabilities are needed to accomplish its goals. To help facilities make these decisions, we have provided an Equipment Management Guide, "Improving the Drug Distribution Process-Do You Need an Automated Decentralized Pharmacy Dispensing System?," which precedes this Evaluation. In addition, readers unfamiliar with the roles of both the pharmacy and the pharmacist within the hospital can refer to the Primer, "Functions of a Hospital Pharmacy," also published in this issue. PMID:8968721

  16. Automated dry powder dispenser for explosive components

    SciTech Connect

    Garcia, P.; Salmonson, J.C.

    1992-09-01

    Sandia and Mound are developing a workcell that will automate the assembly of explosive components. Sandia is responsible for the automated powder dispenser subsystem. Automated dispensing of explosive powders in the past resulted in separation or segregation of powder constituents. The Automated Dry Powder Dispenser designed by Sandia achieves weight tolerances of ±0.1 mg while keeping powder/oxidizer separation to a minimum. A software control algorithm compensates for changes in powder flow due to lot variations, temperature, humidity, and the amount of powder left in the system.

  17. Automated dry powder dispenser for explosive components

    SciTech Connect

    Garcia, P.; Salmonson, J.C.

    1992-01-01

    Sandia and Mound are developing a workcell that will automate the assembly of explosive components. Sandia is responsible for the automated powder dispenser subsystem. Automated dispensing of explosive powders in the past resulted in separation or segregation of powder constituents. The Automated Dry Powder Dispenser designed by Sandia achieves weight tolerances of ±0.1 mg while keeping powder/oxidizer separation to a minimum. A software control algorithm compensates for changes in powder flow due to lot variations, temperature, humidity, and the amount of powder left in the system.

  18. Automation and quality in analytical laboratories

    SciTech Connect

    Valcarcel, M.; Rios, A.

    1994-05-01

    After a brief introduction to the generic aspects of automation in analytical laboratories, the different approaches to quality in analytical chemistry are presented and discussed to establish the following different facets emerging from the combination of quality and automation: automated analytical control of quality of products and systems; quality control of automated chemical analysis; and improvement of capital (accuracy and representativeness), basic (sensitivity, precision, and selectivity), and complementary (rapidity, cost, and personnel factors) analytical features. Several examples are presented to demonstrate the importance of this marriage of convenience in present and future analytical chemistry. 7 refs., 4 figs.

  19. Automation: Turning mixed cullet into cash

    SciTech Connect

    Woods, R.

    1994-01-01

    Ask any glass processor or recycler about the principal reason prices for their cullet can falter, and the second most popular response probably will be "material quality." No matter how well-educated the customers, no matter how well-trained the line pickers, there are always some unnoticed contaminants that fool the eye and get into the recycling bins, slip past on the conveyor belts, and ruin a load. In addition, glass has the tendency to break -- especially in the growing number of high-compaction co-collection vehicles -- leaving unusable, mixed cullet behind that is difficult and dangerous to sort by hand. Most work on automated separation of whole glass containers in this country remains in the research and development stage. So far, this work has had few enthusiastic supporters, and has ground nearly to a halt. "Right now, it just doesn't make economic sense." With most processors sticking to manual sorting of whole bottles, MSS and several other companies are focusing, instead, on beneficiating nearly marketless mixed broken cullet. From that stream, new contaminant-detection technology can pick out bits of unwanted window glass, bottle caps, plastics, labels, ceramics, and porcelain, which have different melting points and can cause impurities and structural weaknesses in recycled glass. Other units can distinguish colored cullet from clear and automatically eject it. To date, applications of these machines have been limited, but news from field tests in the US and commercial operations in Europe -- considered by many to be the birthplace of automated sorting technology -- is encouraging.

  1. Automated mass spectrometer grows up

    SciTech Connect

    McInteer, B.B.; Montoya, J.G.; Stark, E.E.

    1984-01-01

    In 1980 we reported the development of an automated mass spectrometer for large scale batches of samples enriched in nitrogen-15 as ammonium salts. Since that time significant technical progress has been made in the instrument. Perhaps more significantly, administrative and institutional changes have permitted the entire effort to be transferred to the private sector from its original base at the Los Alamos National Laboratory. This has ensured the continuance of a needed service to the international scientific community that grew out of a development project at a national laboratory, and is an excellent example of beneficial technology transfer to private industry.

  2. Automated fuel pin loading system

    DOEpatents

    Christiansen, David W.; Brown, William F.; Steffen, Jim M.

    1985-01-01

    An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inerted as a batch prior to welding of end caps by one of two disclosed welding systems.

  3. Automated solar panel assembly line

    NASA Technical Reports Server (NTRS)

    Somberg, H.

    1981-01-01

    The initial stage of the automated solar panel assembly line program was devoted to concept development and proof of approach through simple experimental verification. In this phase, laboratory bench models were built to demonstrate and verify concepts. Following this phase was machine design and integration of the various machine elements. The third phase was machine assembly and debugging. In this phase, the various elements were operated as a unit and modifications were made as required. The final stage of development was the demonstration of the equipment in a pilot production operation.

  4. Automated planar patch-clamp.

    PubMed

    Milligan, Carol J; Möller, Clemens

    2013-01-01

    Ion channels are integral membrane proteins that regulate the flow of ions across the plasma membrane and the membranes of intracellular organelles of both excitable and non-excitable cells. Ion channels are vital to a wide variety of biological processes and are prominent components of the nervous system and cardiovascular system, as well as controlling many metabolic functions. Furthermore, ion channels are known to be involved in many disease states and as such have become popular therapeutic targets. For many years now manual patch-clamping has been regarded as one of the best approaches for assaying ion channel function, through direct measurement of ion flow across these membrane proteins. Over the last decade there have been many remarkable breakthroughs in the development of technologies enabling the study of ion channels. One of these breakthroughs is the development of automated planar patch-clamp technology. Automated platforms have demonstrated the ability to generate high-quality data with high throughput capabilities, at great efficiency and reliability. Additional features such as simultaneous intracellular and extracellular perfusion of the cell membrane, current clamp operation, fast compound application, an increasing rate of parallelization, and more recently temperature control have been introduced. Furthermore, in addition to the well-established studies of over-expressed ion channel proteins in cell lines, new generations of planar patch-clamp systems have enabled successful studies of native and primary mammalian cells. This technology is becoming increasingly popular and extensively used both within areas of drug discovery as well as academic research. Many platforms have been developed including NPC-16 Patchliner® and SyncroPatch® 96 (Nanion Technologies GmbH, Munich), CytoPatch™ (Cytocentrics AG, Rostock), PatchXpress® 7000A, IonWorks® Quattro and IonWorks Barracuda™ (Molecular Devices, LLC); Dynaflow® HT (Cellectricon

  5. Automated flight test management system

    NASA Technical Reports Server (NTRS)

    Hewett, M. D.; Tartt, D. M.; Agarwal, A.

    1991-01-01

    The Phase 1 development of an automated flight test management system (ATMS) as a component of a rapid prototyping flight research facility for artificial intelligence (AI) based flight concepts is discussed. The ATMS provides a flight engineer with a set of tools that assist in flight test planning, monitoring, and simulation. The system is also capable of controlling an aircraft during flight test by performing closed loop guidance functions, range management, and maneuver-quality monitoring. The ATMS is being used as a prototypical system to develop a flight research facility for AI based flight systems concepts at NASA Ames Dryden.

  6. Automated Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Gangal, M. D.; Isenberg, L.; Lewis, E. V.

    1985-01-01

    Proposed system offers safety and large return on investment. System, operating by year 2000, employs machines and processes based on proven principles. According to concept, line of parallel machines, connected in groups of four to service modules, attacks face of coal seam. High-pressure water jets and central auger on each machine break face. Jaws scoop up coal chunks, and auger grinds them and forces fragments into slurry-transport system. Slurry pumped through pipeline to point of use. Concept for highly automated coal-mining system increases productivity, makes mining safer, and protects health of mine workers.

  7. Programmable, automated transistor test system

    NASA Technical Reports Server (NTRS)

    Truong, L. V.; Sundburg, G. R.

    1986-01-01

    A programmable, automated transistor test system was built to supply experimental data on new and advanced power semiconductors. The data will be used for analytical models and by engineers in designing space and aircraft electric power systems. A pulsed power technique was used at low duty cycles in a nondestructive test to examine the dynamic switching characteristic curves of power transistors in the 500 to 1000 V, 10 to 100 A range. Data collection, manipulation, storage, and output are operator interactive but are guided and controlled by the system software.

  8. Automated fuel pin loading system

    DOEpatents

    Christiansen, D.W.; Brown, W.F.; Steffen, J.M.

    An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inerted as a batch prior to welding of end caps by one of two disclosed welding systems.

  9. Automating a residual gas analyzer

    NASA Technical Reports Server (NTRS)

    Petrie, W. F.; Westfall, A. H.

    1982-01-01

    A residual gas analyzer (RGA), a device for measuring the amounts and species of various gases present in a vacuum system, is discussed. In a recent update of the RGA, it was shown that the use of microprocessors could revolutionize data acquisition and data reduction. This revolution is exemplified by the Inficon IQ200 RGA, which was selected to meet the needs of this update. The Inficon RGA and the Zilog microcomputer were interfaced in order to receive and format the digital data from the RGA. This automated approach is discussed in detail.

  10. AUTOMATION.

    ERIC Educational Resources Information Center

    Manpower Research Council, Milwaukee, WI.

    THE MANPOWER RESEARCH COUNCIL, A NONPROFIT SERVICE ORGANIZATION, HAS AS ITS OBJECTIVE THE DEVELOPMENT OF AN INTERCHANGE AMONG THE MANUFACTURING AND SERVICE INDUSTRIES OF THE UNITED STATES OF INFORMATION ON EMPLOYMENT, INDUSTRIAL RELATIONS TRENDS AND ACTIVITIES, AND MANAGEMENT PROBLEMS. A SURVEY OF 200 MEMBER CORPORATIONS, EMPLOYING A TOTAL OF…

  11. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated cell counter. 864.5200 Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  12. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated cell counter. 864.5200 Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  13. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  14. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated cell counter. 864.5200 Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  15. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated cell counter. 864.5200 Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  16. Automated Microwave Low Power Testing Techniques for NLC

    SciTech Connect

    Carter, H.; Finley, D.; Gonin, I.; Khabibullin, T.; Romanov, G.; Sun, D.; Adolphsen, C.; Wang, J.; /SLAC

    2005-07-08

    As part of the Next Linear Collider (NLC) collaboration, the NLC structures group at Fermilab has started an R&D program to fabricate NLC accelerator structures in cooperation with commercial companies in order to prepare for mass production of RF structures. To build the Next Linear Collider, thousands of accelerator structures containing a million cells are needed. Our primary goal is to explore the feasibility of making these structures in an industrial environment. On the other hand, structure mass production requires "industrialized" microwave quality control techniques to characterize these structures at different stages of production as efficiently as possible. We developed several automated set-ups, based on different RF techniques that are mutually complementary, to address this problem.

  17. Automation of Oklahoma School Library Media Centers: Automation at the Local Level.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City. Library and Learning Resources Section.

    This document outlines a workshop for media specialists--"School Library Automation: Solving the Puzzle"--that is designed to reduce automation anxiety and give a broad overview of the concerns confronting school library media centers planning for or involved in automation. Issues are addressed under the following headings: (1) Levels of School…

  18. Massachusetts Library Automation Survey: A Directory of Automated Operations in Massachusetts Libraries.

    ERIC Educational Resources Information Center

    Stephens, Eileen; Nijenberg, Caroline

    This directory is designed to provide information on automated systems and/or equipment used in libraries to provide a tool for planning future automation in the context of interlibrary cooperation considerations, and to inform the library and information community of the state of the art of automation in Massachusetts libraries. The main body is…

  19. Automated Car Park Management System

    NASA Astrophysics Data System (ADS)

    Fabros, J. P.; Tabañag, D.; Espra, A.; Gerasta, O. J.

    2015-06-01

    This study aims to develop a prototype Automated Car Park Management System that increases the quality of service of parking lots by integrating a smart system that assists motorists in finding vacant parking slots. The research was based on implementing an operating system and a monitoring system for the parking facility without the use of manpower. The design incorporates the Parking Guidance and Information System concept, which efficiently assists motorists and helps ensure the safety of vehicles and the valuables inside them. For monitoring, Optical Character Recognition was employed to log every car entering the parking area. All parking events are visible via a MATLAB GUI, which shows time-in, time-out, and time-consumed information, along with the lot number where each car is parked. The system also includes a payment method, implemented as a coin-slot mechanism that controls the exit gate. The Automated Car Park Management System was successfully built using microcontrollers, specifically one PIC18F4550, two PIC16F84s, and one PIC16F628A.
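
    The abstract describes the system only at a high level; as a rough illustration, the following Python sketch shows the kind of event-logging core such a system needs (time-in/time-out records, lot assignment, and a fee computed at the exit gate). All names, the lot-assignment rule, and the fee schedule are assumptions, not the authors' implementation.

        # Hypothetical event-logging core for an automated car park
        # (illustrative sketch; not the system described above).
        from dataclasses import dataclass
        from datetime import datetime
        from typing import Optional

        @dataclass
        class ParkingEvent:
            plate: str
            lot: int
            time_in: datetime
            time_out: Optional[datetime] = None

        class CarPark:
            def __init__(self, n_lots: int):
                self.free = set(range(1, n_lots + 1))
                self.active = {}    # plate -> open ParkingEvent
                self.history = []   # closed events, for the GUI listing

            def enter(self, plate: str) -> Optional[int]:
                if not self.free or plate in self.active:
                    return None                  # lot full, or duplicate plate
                lot = min(self.free)             # guide driver to lowest free lot
                self.free.remove(lot)
                self.active[plate] = ParkingEvent(plate, lot, datetime.now())
                return lot

            def exit(self, plate: str, rate_per_hour: float = 20.0) -> float:
                ev = self.active.pop(plate)
                ev.time_out = datetime.now()
                self.free.add(ev.lot)
                self.history.append(ev)
                hours = (ev.time_out - ev.time_in).total_seconds() / 3600
                return max(1.0, hours) * rate_per_hour   # minimum one-hour fee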

  20. Automated experimentation in ecological networks

    PubMed Central

    2011-01-01

    Background In ecological networks, natural communities are studied from a complex systems perspective by representing interactions among species within them in the form of a graph, which is in turn analysed using mathematical tools. Topological features encountered in complex networks have been proved to provide the systems they represent with interesting attributes such as robustness and stability, which in ecological systems translates into the ability of communities to resist perturbations of different kinds. A focus of research in community ecology is on understanding the mechanisms by which these complex networks of interactions among species in a community arise. We employ an agent-based approach to model ecological processes operating at the species' interaction level for the study of the emergence of organisation in ecological networks. Results We have designed protocols of interaction among agents in a multi-agent system based on ecological processes occurring at the interaction level between species in plant-animal mutualistic communities. Interaction models for agents coordination thus engineered facilitate the emergence of network features such as those found in ecological networks of interacting species, in our artificial societies of agents. Conclusions Agent based models developed in this way facilitate the automation of the design an execution of simulation experiments that allow for the exploration of diverse behavioural mechanisms believed to be responsible for community organisation in ecological communities. This automated way of conducting experiments empowers the study of ecological networks by exploiting the expressive power of interaction models specification in agent systems. PMID:21554669
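
    As a rough, hypothetical illustration of the agent-based approach (the paper's actual interaction protocols are more elaborate), the following Python sketch lets a bipartite plant-animal network emerge from one simple behavioural rule; all parameter names and values are assumptions.

        # Toy agent-based model: a mutualistic network emerging from a simple
        # reinforcement rule (illustrative only; not the paper's protocol).
        import random
        from collections import defaultdict

        def simulate(n_plants=10, n_animals=15, steps=2000, fidelity=0.8, seed=1):
            rng = random.Random(seed)
            visits = defaultdict(int)        # (animal, plant) -> visit count
            last_reward = {}                 # animal -> last rewarding plant
            reward_p = [rng.random() for _ in range(n_plants)]  # reward prob.
            for _ in range(steps):
                a = rng.randrange(n_animals)
                # Revisit the last rewarding plant with probability `fidelity`,
                # otherwise explore a random plant.
                if a in last_reward and rng.random() < fidelity:
                    p = last_reward[a]
                else:
                    p = rng.randrange(n_plants)
                visits[(a, p)] += 1
                if rng.random() < reward_p[p]:
                    last_reward[a] = p       # reinforce successful interaction
            # Emergent bipartite edges, thresholded by interaction frequency.
            return [(a, p, n) for (a, p), n in visits.items() if n >= 5]

        for animal, plant, weight in sorted(simulate())[:10]:
            print(f"animal {animal} -- plant {plant} (weight {weight})")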

  1. Automation of surface observations program

    NASA Technical Reports Server (NTRS)

    Short, Steve E.

    1988-01-01

    At present, surface weather observing methods are still largely manual and labor intensive. Through the nationwide implementation of Automated Surface Observing Systems (ASOS), this situation can be improved. Two ASOS capability levels are planned. The first is a basic-level system which will automatically observe the weather parameters essential for aviation operations and will operate either with or without supplemental contributions by an observer. The second is a more fully automated, stand-alone system which will observe and report the full range of weather parameters and will operate primarily in the unattended mode. Approximately 250 systems are planned by the end of the decade. When deployed, these systems will generate the standard hourly and special long-line transmitted weather observations, as well as provide continuous weather information direct to airport users. Specific ASOS configurations will vary depending upon whether the operation is unattended, minimally attended, or fully attended. The major functions of ASOS are data collection, data processing, product distribution, and system control. The program phases of development, demonstration, production system acquisition, and operational implementation are described.

  2. Cassini Tour Atlas Automated Generation

    NASA Technical Reports Server (NTRS)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

    During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission-planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. A separate Tour Atlas was required for each trajectory so that Cassini scientists could analyze the science opportunities in each candidate tour quickly and thoroughly, and so that the optimal series of orbits for science return could be selected. Manually generating that number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  3. Automated cleaning of electronic components

    SciTech Connect

    Drotning, W.

    1994-03-01

    Environmental and operator safety concerns are leading to the elimination of trichloroethylene (TCE) and chlorofluorocarbon (CFC) solvents in electronic component cleaning processes that remove rosin flux, organic and inorganic contamination, and particulates. Present processes depend heavily on these solvents for manual spray cleaning of small components and subassemblies. Use of alternative solvent systems can lead to longer processing times and reduced quality. Automated spray cleaning can improve the quality of the cleaning process, thus enabling the productive use of environmentally conscious materials, while minimizing personnel exposure to hazardous materials. In addition, the use of robotic and automated systems can reduce the manual handling of parts that necessitates additional cleaning. We describe the development of a prototype robotic system for cleaning electronic components in a spray cleaning workcell. An important feature of the prototype system is the capability to generate the robot paths and motions automatically from the CAD models of the part to be cleaned, and to embed cleaning process knowledge into the automatically programmed operations.

  4. Automation and robotics human performance

    NASA Technical Reports Server (NTRS)

    Mah, Robert W.

    1990-01-01

    The scope of this report is limited to the following: (1) assessing the feasibility of the assumptions for crew productivity during the intra-vehicular activities and extra-vehicular activities; (2) estimating the appropriate level of automation and robotics to accomplish balanced man-machine, cost-effective operations in space; (3) identifying areas where conceptually different approaches to the use of people and machines can leverage the benefits of the scenarios; and (4) recommending modifications to scenarios or developing new scenarios that will improve the expected benefits. The FY89 special assessments are grouped into the five categories shown in the report. The high level system analyses for Automation & Robotics (A&R) and Human Performance (HP) were performed under the Case Studies Technology Assessment category, whereas the detailed analyses for the critical systems and high leverage development areas were performed under the appropriate operations categories (In-Space Vehicle Operations or Planetary Surface Operations). The analysis activities planned for the Science Operations technology areas were deferred to FY90 studies. The remaining activities such as analytic tool development, graphics/video demonstrations and intelligent communicating systems software architecture were performed under the Simulation & Validations category.

  5. Automated glass-fragmentation analysis

    NASA Astrophysics Data System (ADS)

    Gordon, Gaile G.

    1996-02-01

    This paper describes a novel automated inspection process for tempered safety glass. The system is geared toward the European Community (EC) import regulations which are based on fragment count and dimensions in a fractured glass sample. The automation of this test presents two key challenges: image acquisition, and robust particle segmentation. The image acquisition must perform well both for clear and opaque glass. Opaque regions of glass are common in the American auto industry due to painted styling or adhesives (e.g. defroster cables). The system presented uses a multiple light source, reflected light imaging technique, rather than transmitted light imaging which is often used in manual versions of this inspection test. Segmentation of the glass fragments in the resulting images must produce clean and completely connected crack lines in order to compute the correct particle count. Processing must therefore be robust with respect to noise in the imaging process such as dust and glint on the glass. The system presented takes advantage of mathematical morphology algorithms, in particular the watershed algorithm, to perform robust preprocessing and segmentation. Example images and image segmentation results are shown for tempered safety glass which has been painted on the outside edges for styling purposes.
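
    Watershed segmentation of a distance transform is the standard morphological recipe the paper builds on. A minimal sketch using SciPy and recent scikit-image follows, assuming img is a grayscale NumPy array in which crack lines are dark and fragments bright; the threshold choice and peak-distance parameter are illustrative assumptions, not the authors' settings.

        # Illustrative watershed-based fragment counting (not the authors' code).
        import numpy as np
        from scipy import ndimage as ndi
        from skimage import filters, feature, segmentation, measure

        def count_fragments(img):
            # Binarize: bright fragments as foreground, dark cracks as background.
            binary = img > filters.threshold_otsu(img)
            # Peaks of the distance transform seed one marker per fragment.
            distance = ndi.distance_transform_edt(binary)
            peaks = feature.peak_local_max(distance, min_distance=5, labels=binary)
            markers = np.zeros(distance.shape, dtype=int)
            markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
            # Watershed floods from the markers, separating touching fragments.
            labels = segmentation.watershed(-distance, markers, mask=binary)
            return labels.max(), [r.area for r in measure.regionprops(labels)]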

  6. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    PubMed Central

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins’ functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction because of the imperfect mechanical control of the specimen goniometer under both medium-to-high magnification (approximately 50,000–160,000×) and an optimized beam-coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method can reduce the accumulated beam tilt/shift that was previously used to compensate for mechanical-control error but that degraded beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method tracks target proteins as well as other ET methods while maintaining optimized beam-coherence conditions for imaging. PMID:27403922
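
    The closed-loop PI tracking named in the abstract is conventional control. A minimal Python sketch under stated assumptions follows: measure_offset, move_stage, and tilt_to are hypothetical microscope-control hooks, and the gains are illustrative, not the authors' values.

        # Sketch of the closed-loop PI tracking idea: at each tilt step the
        # measured drift of the target center feeds a PI controller whose output
        # is applied as a purely mechanical stage correction (no beam tilt/shift).
        class PIController:
            def __init__(self, kp: float, ki: float):
                self.kp, self.ki = kp, ki
                self.integral = 0.0

            def update(self, error: float, dt: float = 1.0) -> float:
                self.integral += error * dt
                return self.kp * error + self.ki * self.integral

        def acquire_tilt_series(measure_offset, move_stage, tilt_to, angles):
            ctrl_x, ctrl_y = PIController(0.6, 0.1), PIController(0.6, 0.1)
            for angle in angles:
                tilt_to(angle)                 # mechanical goniometer tilt
                dx, dy = measure_offset()      # target-center error, e.g. from
                move_stage(ctrl_x.update(dx),  # cross-correlating images
                           ctrl_y.update(dy))  # stage correction, no beam shift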

  7. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-07-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins’ functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction because of the imperfect mechanical control of the specimen goniometer under both medium-to-high magnification (approximately 50,000–160,000×) and an optimized beam-coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method can reduce the accumulated beam tilt/shift that was previously used to compensate for mechanical-control error but that degraded beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method tracks target proteins as well as other ET methods while maintaining optimized beam-coherence conditions for imaging.

  8. Fully Mechanically Controlled Automated Electron Microscopic Tomography.

    PubMed

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-01-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential to understanding those proteins' functions, but such structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single, unique biological object from a series of tilt angles, but it is challenging to image a single protein for 3D reconstruction because of the imperfect mechanical control of the specimen goniometer under both medium-to-high magnification (approximately 50,000-160,000×) and an optimized beam-coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method can reduce the accumulated beam tilt/shift that was previously used to compensate for mechanical-control error but that degraded beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. Validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method tracks target proteins as well as other ET methods while maintaining optimized beam-coherence conditions for imaging. PMID:27403922

  9. Automated D/3 to Visio Analog Diagrams

    2000-08-10

    ADVAD1 reads an ASCII file containing the D/3 DCS MDL input for analog points in a D/3 continuous database. It uses the information in the files to create a series of Visio files representing the structure of each analog chain, one drawing per Visio file. The actual drawing function is performed by Visio (requires Visio version 4.5+). The user can configure the program to select which fields in the database are shown on the diagram and how the information is to be presented. This gives a visual representation of the structure of the analog chains, showing selected fields in a consistent manner. Documentation can be updated easily, and the automated approach eliminates human error in the CAD drafting process. The program can also create the drawings far faster than a human operator: approximately 270 typical diagrams in about 8 minutes on a Pentium II 400 MHz PC. The program allows multiple option sets to be saved to provide different settings (i.e., different fields, different field presentations, and/or different diagram layouts) for various scenarios or facilities on one workstation. Option sets may be exported from the Windows registry to allow duplication of settings on another workstation.
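
    Since the real D/3 MDL syntax is not shown here, the following Python sketch only illustrates the general parsing stage such a tool needs, grouping analog points into chains before any drawing happens; the record format (POINT, FIELD, CHAIN) is invented for illustration and does not match actual MDL files.

        # Hypothetical parser: group analog points into chains for diagramming.
        from collections import defaultdict

        def parse_chains(lines):
            chains = defaultdict(list)   # chain id -> ordered list of points
            point = None
            for raw in lines:
                tok = raw.strip().split(maxsplit=1)
                if not tok:
                    continue
                if tok[0] == "POINT":
                    point = {"name": tok[1], "fields": {}}
                elif tok[0] == "FIELD" and point:
                    key, _, val = tok[1].partition("=")
                    point["fields"][key.strip()] = val.strip()
                elif tok[0] == "CHAIN" and point:
                    chains[tok[1]].append(point)
            return chains

        sample = ["POINT FIC101", "FIELD desc=Flow controller", "CHAIN 7",
                  "POINT FT101", "FIELD desc=Flow transmitter", "CHAIN 7"]
        for chain, points in parse_chains(sample).items():
            print(chain, "->", [p["name"] for p in points])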

  10. The Automated Planet Finder's automation & first two years of science

    NASA Astrophysics Data System (ADS)

    Burt, Jennifer; Laughlin, Greg; Vogt, Steven S.; Holden, Bradford

    2016-01-01

    The Automated Planet Finder (APF) is the newest facility at Lick Observatory, comprising a 2.4 m telescope coupled with the high-resolution Levy echelle spectrograph. Purpose-built for exoplanet detection and characterization, the telescope dedicates 80% of its observing time to these science goals. The APF has demonstrated 1 m/s radial-velocity precision on bright RV standard stars and performs with the same speed on sky as Keck/HIRES when observing M dwarfs. The telescope is fully automated for RV operations, using a dynamic scheduler that makes informed decisions about which targets to observe based on scientific interest, desired cadence, required precision levels, and current observing conditions, all on a minute-to-minute basis. This ensures that time is not wasted chasing non-optimal targets on nights with poor conditions and enables rapid changes to the overall science observing strategy. The APF has contributed to the detection of four planetary systems in its first two years of scientific operations. Our most recent detection is a six-planet system around the bright (V = 5.5), nearby (d = 6.5 pc), K3V star HD 219134. The planets in this system have masses ranging from 3.5 to 108 Earth masses, with orbital periods from 3 to 2247 days. An independent detection of the inner four planets in this system by the HARPS-N team has shown that the third planet transits the star, making this system ideal for follow-up observations. I will discuss the design and implementation of the APF's dynamic scheduler, the telescope's planet detections to date, overall performance results of the telescope, and our future observing strategy.
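
    The abstract names the inputs to the dynamic scheduler but not its decision rule. A toy Python scoring sketch along those lines follows; all fields, weights, and the seeing cutoff are assumptions for illustration, not the APF's actual criteria.

        # Hypothetical minute-to-minute scheduler: score each target from its
        # priority, how overdue it is relative to its cadence, and whether
        # current conditions are good enough to be worth the telescope time.
        from dataclasses import dataclass

        @dataclass
        class Target:
            name: str
            priority: float        # scientific interest, 0..1
            cadence_days: float    # desired revisit interval
            days_since_obs: float
            max_seeing: float      # worst seeing (arcsec) still useful

        def score(t: Target, current_seeing: float) -> float:
            if current_seeing > t.max_seeing:
                return 0.0                        # conditions too poor for target
            overdue = t.days_since_obs / t.cadence_days   # >1 means behind cadence
            return t.priority * min(overdue, 2.0)         # cap to avoid starvation

        def next_target(targets, current_seeing):
            best = max(targets, key=lambda t: score(t, current_seeing))
            return best if score(best, current_seeing) > 0 else None

        # On a poor night, only seeing-tolerant targets qualify.
        targets = [Target("HD 219134", 0.9, 1.0, 2.0, 2.5),
                   Target("M-dwarf X", 0.7, 3.0, 1.0, 1.2)]
        print(next_target(targets, current_seeing=1.8).name)  # -> HD 219134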

  11. Working toward Transparency in Library Automation

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2007-01-01

    In this article, the author argues the need for transparency with regard to the automation systems used in libraries. As librarians make decisions regarding automation software and services, they should have convenient access to information about the organizations it will potentially acquire technology from and about the collective experiences of…

  12. Library Automation: A Measure of Attitude.

    ERIC Educational Resources Information Center

    Molholt, Pat A.

    Findings of the study described in this report indicate that the attitudes of library school students toward library automation were not changed significantly by a semester of relevant coursework. It has been hypothesized that these students would have a somewhat negative attitude toward automation, but that through relevant course instruction…

  13. At the intersection of automation and culture

    NASA Technical Reports Server (NTRS)

    Sherman, P. J.; Wiener, E. L.

    1995-01-01

    The crash of an automated passenger jet at Nagoya, Japan, in 1994 is used as an example of crew error in using automatic systems. Automation provides pilots with the ability to perform tasks in various ways. National culture is cited as a factor that affects how a pilot and crew interact with each other and with the equipment.

  14. Workflow Automation: A Collective Case Study

    ERIC Educational Resources Information Center

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  15. Investing in the Future: Automation Marketplace 2009

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    In a year where the general economy presented enormous challenges, libraries continued to make investments in automation, especially in products that help improve what and how they deliver to their end users. Access to electronic content remains a key driver. In response to anticipated needs for new approaches to library automation, many companies…

  16. An Automated Library Circulation System: A Justification.

    ERIC Educational Resources Information Center

    Harrell, Charles B.

    This report for an automated circulation control system to replace the currently used automated off-line batch system discusses the general requirements for the requested system, the equipment needed, the planned uses and design of the proposed system, its utilization, its expected benefits, its estimated costs, the alternatives considered, and…

  17. Partial Automated Alignment and Integration System

    NASA Technical Reports Server (NTRS)

    Kelley, Gary Wayne (Inventor)

    2014-01-01

    The present invention is a Partial Automated Alignment and Integration System (PAAIS) used to automate the alignment and integration of space vehicle components. A PAAIS includes ground support apparatuses, a track assembly with a plurality of energy-emitting components and an energy-receiving component containing a plurality of energy-receiving surfaces. Communication components and processors allow communication and feedback through PAAIS.

  18. Validation of Automated Scoring of Science Assessments

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  19. What's New in the Library Automation Arena?

    ERIC Educational Resources Information Center

    Breeding, Marshall

    1998-01-01

    Reviews trends in library automation based on vendors at the 1998 American Library Association Annual Conference. Discusses the major industry trend, a move from host-based computer systems to the new generation of client/server, object-oriented, open systems-based automation. Includes a summary of developments for 26 vendors. (LRW)

  20. The Automated Logistics Element Planning System (ALEPS)

    NASA Technical Reports Server (NTRS)

    Schwaab, Douglas G.

    1991-01-01

    ALEPS (Automated Logistics Element Planning System) is a computer system that will automate planning and decision support for Space Station Freedom Logistical Element (LE) resupply and return operations. ALEPS provides data management, planning, analysis, monitoring, interfacing, and flight certification in support of LE flight load planning activities. The design and functions of ALEPS are described, along with the development of the prototype ALEPS algorithms.