Sample records for automated structure solution

  1. Automated MAD and MIR structure solution

    PubMed Central

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These criteria have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure that is needed for genomic-scale structure determinations. PMID:10089316

  2. Automated structure solution, density modification and model building.

    PubMed

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution into an optimization problem has proven very useful, and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to the choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and the checking of potential NCS operations against the electron-density map have proven to be a reliable method for the identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.
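The scoring idea described above, converting decision making in structure solution into an optimization problem, can be sketched as a weighted ranking of trial heavy-atom solutions. The criterion names, weights and values below are illustrative placeholders, not SOLVE's actual scoring terms:

```python
# Hypothetical sketch: each trial partial structure gets a composite score
# built from several normalized quality criteria, and candidates are ranked
# so the best-scoring solution can be carried forward to phasing.

def composite_score(criteria, weights):
    """Combine normalized quality criteria (0..1) into one weighted score."""
    return sum(weights[name] * value for name, value in criteria.items())

def rank_solutions(candidates, weights):
    """Return candidates sorted best-first by composite score."""
    return sorted(candidates,
                  key=lambda c: composite_score(c["criteria"], weights),
                  reverse=True)

# Invented criterion names and values, for illustration only.
weights = {"figure_of_merit": 0.5, "map_connectivity": 0.3, "patterson_agreement": 0.2}
candidates = [
    {"name": "trial_1", "criteria": {"figure_of_merit": 0.42, "map_connectivity": 0.55, "patterson_agreement": 0.61}},
    {"name": "trial_2", "criteria": {"figure_of_merit": 0.71, "map_connectivity": 0.66, "patterson_agreement": 0.58}},
]
best = rank_solutions(candidates, weights)[0]
print(best["name"])  # trial_2: higher on the dominant weighted criteria
```

Turning the subjective "which trial solution looks right?" question into a single number is what makes the search automatable: the pipeline simply optimizes the score.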

  3. Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Li, Wu; Robinson, Jay

    2016-01-01

    This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.

  4. The 3D Euler solutions using automated Cartesian grid generation

    NASA Technical Reports Server (NTRS)

    Melton, John E.; Enomoto, Francis Y.; Berger, Marsha J.

    1993-01-01

    Viewgraphs on 3-dimensional Euler solutions using automated Cartesian grid generation are presented. Topics covered include: computational fluid dynamics (CFD) and the design cycle; Cartesian grid strategy; structured body fit; grid generation; prolate spheroid; and ONERA M6 wing.

  5. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.

  6. Auto-rickshaw: an automated crystal structure determination platform as an efficient tool for the validation of an X-ray diffraction experiment.

    PubMed

    Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A

    2005-04-01

    The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.

  7. Automated structure determination of proteins with the SAIL-FLYA NMR method.

    PubMed

    Takeda, Mitsuhiro; Ikeya, Teppei; Güntert, Peter; Kainosho, Masatsune

    2007-01-01

    The labeling of proteins with stable isotopes enhances the NMR method for the determination of 3D protein structures in solution. Stereo-array isotope labeling (SAIL) provides an optimal stereospecific and regiospecific pattern of stable isotopes that yields sharpened lines, spectral simplification without loss of information, and the ability to collect rapidly and evaluate fully automatically the structural restraints required to solve a high-quality solution structure for proteins up to twice as large as those that can be analyzed using conventional methods. Here, we describe a protocol for the preparation of SAIL proteins by cell-free methods, including the preparation of S30 extract and their automated structure analysis using the FLYA algorithm and the program CYANA. Once efficient cell-free expression of the unlabeled or uniformly labeled target protein has been achieved, the NMR sample preparation of a SAIL protein can be accomplished in 3 d. A fully automated FLYA structure calculation can be completed in 1 d on a powerful computer system.

  8. Total synthesis of TMG-chitotriomycin based on an automated electrochemical assembly of a disaccharide building block.

    PubMed

    Isoda, Yuta; Sasaki, Norihiko; Kitamura, Kei; Takahashi, Shuji; Manmode, Sujit; Takeda-Okuda, Naoko; Tamura, Jun-Ichi; Nokami, Toshiki; Itoh, Toshiyuki

    2017-01-01

    The total synthesis of TMG-chitotriomycin using an automated electrochemical synthesizer for the assembly of carbohydrate building blocks is demonstrated. We have successfully prepared a precursor of TMG-chitotriomycin, which is a structurally pure tetrasaccharide with typical protecting groups, through the methodology of automated electrochemical solution-phase synthesis developed by us. The synthesis of structurally well-defined TMG-chitotriomycin has been accomplished in 10 steps from a disaccharide building block.

  9. Total synthesis of TMG-chitotriomycin based on an automated electrochemical assembly of a disaccharide building block

    PubMed Central

    Isoda, Yuta; Sasaki, Norihiko; Kitamura, Kei; Takahashi, Shuji; Manmode, Sujit; Takeda-Okuda, Naoko; Tamura, Jun-ichi

    2017-01-01

    The total synthesis of TMG-chitotriomycin using an automated electrochemical synthesizer for the assembly of carbohydrate building blocks is demonstrated. We have successfully prepared a precursor of TMG-chitotriomycin, which is a structurally pure tetrasaccharide with typical protecting groups, through the methodology of automated electrochemical solution-phase synthesis developed by us. The synthesis of structurally well-defined TMG-chitotriomycin has been accomplished in 10 steps from a disaccharide building block. PMID:28684973

  10. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of a business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated with the presented BPMS-LES approach.

  11. Automated polarization control for the precise alignment of laser-induced self-organized nanostructures

    NASA Astrophysics Data System (ADS)

    Hermens, Ulrike; Pothen, Mario; Winands, Kai; Arntz, Kristian; Klocke, Fritz

    2018-02-01

    Laser-induced periodic surface structures (LIPSS), which have found particular application in the field of surface functionalization, have been investigated for many years. The direction of these ripple structures, whose periodicity is on the nanoscale, can be manipulated by changing the laser polarization. For industrial use, it is desirable to manipulate the direction of these structures automatically and to obtain smooth changes of their orientation without any visible inhomogeneity. However, no system solution currently exists that controls the polarization direction in a fully automated way within a single software environment. In this paper, a system solution is presented that includes a liquid crystal polarizer to control the polarization direction. It is synchronized with a scanner, a dynamic beam expander and a five-axis system, and provides fast switching times and small step sizes. First results of fabricated structures are also presented. In a systematic study, the conjunction of LIPSS with different orientations in two parallel line scans has been investigated.

  12. Towards Detection of Learner Misconceptions in a Medical Learning Environment: A Subgroup Discovery Approach

    ERIC Educational Resources Information Center

    Poitras, Eric G.; Doleck, Tenzin; Lajoie, Susanne P.

    2018-01-01

    Ill-structured problems, by definition, have multiple paths to a solution and are multifaceted, making automated assessment and feedback a difficult challenge. Diagnostic reasoning about medical cases meets the criteria of ill-structured problem solving since there are multiple solution paths. The goal of this study was to develop an adaptive…

  13. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    PubMed

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination through advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
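The time savings described above come from recording only a fraction of the indirect-dimension increments. A minimal sketch of generating such a sampling schedule follows; plain uniform random selection is shown for simplicity, whereas real NUS schedules are typically weighted toward early increments:

```python
import random

# Hedged sketch of a non-uniform sampling (NUS) schedule: acquire only a
# random ~33% of the increments of an indirect dimension, consistent with
# the two- to three-fold instrument-time savings reported.

def nus_schedule(n_increments, fraction, seed=0):
    """Pick a sorted random subset of increment indices to acquire."""
    rng = random.Random(seed)
    k = max(1, round(n_increments * fraction))
    return sorted(rng.sample(range(n_increments), k))

schedule = nus_schedule(128, 1 / 3)
print(len(schedule))        # 43 of 128 increments acquired
print(len(schedule) / 128)  # roughly one third of the full acquisition time
```

The sparse data are then reconstructed before conventional analysis; the reconstruction step is outside the scope of this sketch.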

  14. 76 FR 81986 - Honeywell International, Inc., Automation and Control Solutions Division, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ..., Inc., Automation and Control Solutions Division, Including On-Site Leased Workers From Manpower... International, Inc., Automation and Control Solutions Division, Rock Island, Illinois. The notice was published...., Automation and Control Solutions Division. The Department has determined that these workers were sufficiently...

  15. The Phenix Software for Automated Determination of Macromolecular Structures

    PubMed Central

    Adams, Paul D.; Afonine, Pavel V.; Bunkóczi, Gábor; Chen, Vincent B.; Echols, Nathaniel; Headd, Jeffrey J.; Hung, Li-Wei; Jain, Swati; Kapral, Gary J.; Grosse Kunstleve, Ralf W.; McCoy, Airlie J.; Moriarty, Nigel W.; Oeffner, Robert D.; Read, Randy J.; Richardson, David C.; Richardson, Jane S.; Terwilliger, Thomas C.; Zwart, Peter H.

    2011-01-01

    X-ray crystallography is a critical tool in the study of biological systems. It is able to provide information that has been a prerequisite to understanding the fundamentals of life. It is also a method that is central to the development of new therapeutics for human disease. Significant time and effort are required to determine and optimize many macromolecular structures because of the need for manual interpretation of complex numerical data, often using many different software packages, and the repeated use of interactive three-dimensional graphics. The Phenix software package has been developed to provide a comprehensive system for macromolecular crystallographic structure solution with an emphasis on automation. This has required the development of new algorithms that minimize or eliminate subjective input in favour of built-in expert-systems knowledge, the automation of procedures that are traditionally performed by hand, and the development of a computational framework that allows a tight integration between the algorithms. The application of automated methods is particularly appropriate in the field of structural proteomics, where high throughput is desired. Features in Phenix for the automation of experimental phasing with subsequent model building, molecular replacement, structure refinement and validation are described and examples given of running Phenix from both the command line and graphical user interface. PMID:21821126

  16. Three-dimensional electron diffraction as a complementary technique to powder X-ray diffraction for phase identification and structure solution of powders.

    PubMed

    Yun, Yifeng; Zou, Xiaodong; Hovmöller, Sven; Wan, Wei

    2015-03-01

    Phase identification and structure determination are important and widely used techniques in chemistry, physics and materials science. Recently, two methods for automated three-dimensional electron diffraction (ED) data collection, namely automated diffraction tomography (ADT) and rotation electron diffraction (RED), have been developed. Compared with X-ray diffraction (XRD) and two-dimensional zonal ED, three-dimensional ED methods have many advantages in identifying phases and determining unknown structures. Almost complete three-dimensional ED data can be collected using the ADT and RED methods. Since each ED pattern is usually measured off the zone axes by three-dimensional ED methods, dynamic effects are much reduced compared with zonal ED patterns. Data collection is easy and fast, and can start at any arbitrary orientation of the crystal, which facilitates automation. Three-dimensional ED is a powerful technique for structure identification and structure solution from individual nano- or micron-sized particles, while powder X-ray diffraction (PXRD) provides information from all phases present in a sample. ED suffers from dynamic scattering, while PXRD data are kinematic. Three-dimensional ED methods and PXRD are complementary and their combinations are promising for studying multiphase samples and complicated crystal structures. Here, two three-dimensional ED methods, ADT and RED, are described. Examples are given of combinations of three-dimensional ED methods and PXRD for phase identification and structure determination over a large number of different materials, from Ni-Se-O-Cl crystals, zeolites, germanates, metal-organic frameworks and organic compounds to intermetallics with modulated structures. It is shown that three-dimensional ED is now as feasible as X-ray diffraction for phase identification and structure solution, but still needs further development in order to be as accurate as X-ray diffraction. 
It is expected that three-dimensional ED methods will become crucially important in the near future.

  17. Automation of the micro-arc oxidation process

    NASA Astrophysics Data System (ADS)

    Golubkov, P. E.; Pecherskaya, E. A.; Karpanin, O. V.; Shepeleva, Y. V.; Zinchenko, T. O.; Artamonov, D. V.

    2017-11-01

    At present, significantly increased interest in micro-arc oxidation (MAO) encourages scientists to seek solutions to the problem of this technological process's controllability. To solve this problem, an automated MAO installation was developed; its structure and control principles are presented in this article. This device will enable controlled synthesis of MAO coatings and identification of MAO process patterns, which contributes to the commercialization of this technology.

  18. 75 FR 77664 - Honeywell International, Inc., Automation and Control Solutions Division, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-13

    ..., Inc., Automation and Control Solutions Division, Including On-Site Leased Workers From Manpower...., Automation and Control Solutions Division, Rock Island, Illinois. The notice was published in the Federal...-site at the Rock Island, Illinois location of Honeywell International, Inc., Automation and Control...

  19. A Numerical Study of Automated Dynamic Relaxation for Nonlinear Static Tensioned Structures.

    DTIC Science & Technology

    1987-10-01

    system of discrete finite element equations, i.e., an algebraic system. The form of these equations is the same for all nonlinear kinematic structures that...the first phase the solution to the static, prestress configuration is sought. This phase is also referred to as form finding, shape finding, or the...does facilitate stability of the numerical solution. The system of equations, which is the focus of the solution methods presented, is formed by a

  20. Automated smoother for the numerical decoupling of dynamics models.

    PubMed

    Vilela, Marco; Borges, Carlos C H; Vinga, Susana; Vasconcelos, Ana Tereza R; Santos, Helena; Voit, Eberhard O; Almeida, Jonas S

    2007-08-21

    Structure identification of dynamic models for complex biological systems is the cornerstone of their reverse engineering. Biochemical Systems Theory (BST) offers a particularly convenient solution because its parameters are kinetic-order coefficients, which directly identify the topology of the underlying network of processes. We have previously proposed a numerical decoupling procedure that allows the identification of multivariate dynamic models of complex biological processes. While described here within the context of BST, this procedure has general applicability to signal extraction. Our original implementation relied on artificial neural networks (ANN), which caused a slight, undesirable bias during the smoothing of the time courses. As an alternative, we propose here an adaptation of the Whittaker smoother and demonstrate its role within a robust, fully automated structure identification procedure. In this report we propose a robust, fully automated solution for signal extraction from time series, which is the prerequisite for the efficient reverse engineering of biological systems models. The Whittaker smoother is reformulated within the context of information theory and extended by the development of adaptive signal segmentation to account for heterogeneous noise structures. The resulting procedure can be used on arbitrary time series with a nonstationary noise process; it is illustrated here with metabolic profiles obtained from in vivo NMR experiments. The smoothed solution, which is free of parametric bias, permits differentiation, which is crucial for the numerical decoupling of systems of differential equations. The method is applicable to signal extraction from time series with a nonstationary noise structure and to the numerical decoupling of systems of differential equations into algebraic equations, and thus constitutes a rather general tool for the reverse engineering of mechanistic model descriptions from multivariate experimental time series.

  1. Automated Assignment of MS/MS Cleavable Cross-Links in Protein 3D-Structure Analysis

    NASA Astrophysics Data System (ADS)

    Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H.; Schäfer, Mathias; Sinz, Andrea

    2015-01-01

    CID-MS/MS cleavable cross-linkers hold an enormous potential for an automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The characteristic fragment ion patterns created can easily be used for an automated assignment and discrimination of cross-linked products. To date, there are only a few software solutions available that make use of these properties, but none allows for an automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows an automatic screening of characteristic fragment ions, considering static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for a correct assignment of cross-links and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access for an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at www.StavroX.com.
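The characteristic fragment ion patterns mentioned above arise because cleavage of the cross-linker produces peak pairs separated by a fixed, linker-specific mass difference. A hedged sketch of flagging such doublets in a peak list follows; the masses, mass difference and tolerance are placeholders, not MeroX's actual parameters:

```python
# Sketch of doublet detection for MS/MS-cleavable cross-linkers: scan a peak
# list for pairs whose spacing matches the linker-specific mass difference.

def find_doublets(peaks, delta, tol=0.01):
    """Return (m1, m2) peak pairs with m2 - m1 within tol of delta."""
    peaks = sorted(peaks)
    pairs = []
    for i, m1 in enumerate(peaks):
        for m2 in peaks[i + 1:]:
            if abs((m2 - m1) - delta) <= tol:
                pairs.append((m1, m2))
    return pairs

# Hypothetical m/z values; the 500.00/525.90 pair deliberately misses the window.
spectrum = [300.10, 325.95, 351.80, 500.00, 525.90]
print(find_doublets(spectrum, delta=25.85))
# [(300.1, 325.95), (325.95, 351.8)]
```

Spectra containing such doublets can then be prioritized for cross-link assignment, which is the automation step the record emphasizes.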

  2. Automated Assessment in Massive Open Online Courses

    ERIC Educational Resources Information Center

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as a learning and solution-validation tool. This approach allows the use of automated individual variant generation and automated solution checks without involving the course…

  3. Automated identification of cone photoreceptors in adaptive optics retinal images.

    PubMed

    Li, Kaccie Y; Roorda, Austin

    2007-05-01

    In making noninvasive measurements of the human cone mosaic, the task of labeling each individual cone is unavoidable. Manual labeling is a time-consuming process, setting the motivation for the development of an automated method. An automated algorithm for labeling cones in adaptive optics (AO) retinal images is implemented and tested on real data. The optical fiber properties of cones aided the design of the algorithm. Out of 2153 manually labeled cones from six different images, the automated method correctly identified 94.1% of them. The agreement between the automated and the manual labeling methods varied from 92.7% to 96.2% across the six images. Results between the two methods disagreed for 1.2% to 9.1% of the cones. Voronoi analysis of large montages of AO retinal images confirmed the general hexagonal-packing structure of retinal cones as well as the general cone density variability across portions of the retina. The consistency of our measurements demonstrates the reliability and practicality of having an automated solution to this problem.
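As a rough illustration of the labeling task described above, cones appear as bright local maxima in AO retinal images, so a minimal detector can mark pixels that dominate their neighborhood. This sketch ignores the optical fiber properties that the published algorithm exploits:

```python
import numpy as np

# Toy cone detector: a pixel is labeled a cone if it is the maximum of its
# local neighborhood and brighter than a threshold. Radius and threshold are
# illustrative choices, not values from the paper.

def find_local_maxima(img, radius=2, threshold=0.5):
    """Return (row, col) pairs that are neighborhood maxima above threshold."""
    peaks = []
    rows, cols = img.shape
    for r in range(radius, rows - radius):
        for c in range(radius, cols - radius):
            patch = img[r - radius:r + radius + 1, c - radius:c + radius + 1]
            if img[r, c] >= threshold and img[r, c] == patch.max():
                peaks.append((r, c))
    return peaks

# Synthetic "retina": two bright cones on a dark background.
img = np.zeros((20, 20))
img[5, 5] = 1.0
img[12, 14] = 0.9
print(find_local_maxima(img))  # [(5, 5), (12, 14)]
```

A labeled cone list like this is what feeds the Voronoi packing and density analyses mentioned in the abstract.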

  4. Automated crack detection in conductive smart-concrete structures using a resistor mesh model

    NASA Astrophysics Data System (ADS)

    Downey, Austin; D'Alessandro, Antonella; Ubertini, Filippo; Laflamme, Simon

    2018-03-01

    Various nondestructive evaluation techniques are currently used to automatically detect and monitor cracks in concrete infrastructure. However, these methods often lack scalability and cost-effectiveness over large geometries. A solution is the use of self-sensing carbon-doped cementitious materials. These self-sensing materials are capable of providing a measurable change in electrical output that can be related to their damage state. Previous work by the authors showed that a resistor mesh model could be used to track damage in structural components fabricated from electrically conductive concrete, where damage was located through the identification of high-resistance-value resistors in a resistor mesh model. In this work, an automated damage detection strategy is introduced that places high-value resistors into the previously developed resistor mesh model using a sequential Monte Carlo method. Here, high-value resistors are used to mimic the internal condition of damaged cementitious specimens. The proposed automated damage detection method is experimentally validated using a 500 × 500 × 50 mm3 reinforced cement paste plate doped with multi-walled carbon nanotubes exposed to 100 identical impact tests. Results demonstrate that the proposed Monte Carlo method is capable of detecting and localizing the most prominent damage in a structure, demonstrating that automated damage detection in smart-concrete structures is a promising strategy for real-time structural health monitoring of civil infrastructure.
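The localization principle behind the resistor mesh model above can be illustrated in one dimension: in a series chain carrying a known current, a damaged (high-resistance) element shows an anomalously large voltage drop. This toy sketch omits the 2D mesh and the sequential Monte Carlo search used in the paper:

```python
# Toy 1D version of resistor-mesh damage localization: model the specimen as
# a series chain of resistors, inject a known current, and flag the element
# with the largest voltage drop (V = I * R) as damaged.

def locate_damage(resistances, current=1.0):
    """Return the index of the resistor with the largest voltage drop."""
    drops = [r * current for r in resistances]  # same current through a series chain
    return drops.index(max(drops))

# Healthy elements ~10 ohm; element 3 has cracked and jumped to 80 ohm.
chain = [10.2, 9.8, 10.1, 80.0, 10.0]
print(locate_damage(chain))  # 3
```

In the full 2D mesh the same idea applies, but the element resistances are inferred from boundary measurements, which is why a Monte Carlo search over candidate damage placements is needed.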

  5. A hybrid computational-experimental approach for automated crystal structure solution

    NASA Astrophysics Data System (ADS)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  6. ICEG2D (v2.0) - An Integrated Software Package for Automated Prediction of Flow Fields for Single-Element Airfoils With Ice Accretion

    NASA Technical Reports Server (NTRS)

    Thompson, David S.; Soni, Bharat K.

    2001-01-01

    An integrated geometry/grid/simulation software package, ICEG2D, is being developed to automate computational fluid dynamics (CFD) simulations for single- and multi-element airfoils with ice accretions. The current version, ICEG2D (v2.0), was designed to automatically perform four primary functions: (1) generate a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generate high-quality structured and generalized grids starting from a defined surface definition, (3) generate the input and restart files needed to run the structured grid CFD solver NPARC or the generalized grid CFD solver HYBFL2D, and (4) using the flow solutions, generate solution-adaptive grids. ICEG2D (v2.0) can be operated either in a batch mode using a script file or in an interactive mode by entering directives from a command line within a Unix shell. This report summarizes activities completed in the first two years of a three-year research and development program to address automation issues related to CFD simulations for airfoils with ice accretions. As well as describing the technology employed in the software, this document serves as a user's manual providing installation and operating instructions. An evaluation of the software is also presented.

  7. Low Cost Structures, but How Much are we Paying for Them?

    NASA Astrophysics Data System (ADS)

    Gomez Molinero, Vincent

    2014-06-01

    Based on more than 37 years of developing spacecraft structures, both for launchers (starting with Ariane-1 up to the most modern ones) and for satellites of any type, a critical review of the current trends, aimed especially at low-cost solutions, will be presented. Airbus Defence and Space (previously CASA Espacio) has been developing structures for launchers and satellites for more than four decades. All types of spacecraft structures have been developed: primary and secondary ones, high-stability ones, and special critical cases such as antenna reflectors, high-stiffness structures and load-carrying ones, using different types of materials and structural constructions. Although our main expertise is concentrated on composite structures, we have also developed many types of metallic ones, when the best solution was that one, not necessarily based on purely technical reasons. From that perspective and experience, this paper reviews the current trend of imposing low cost as the main requirement for the development of satellites and launchers, and its intrinsic characteristic of being a never-ending process: spacecraft structures are never sufficiently cheap. The main ways used today to justify low-cost spacecraft structures will be reviewed, trying to understand their rationale and some prejudices always present when trade-off studies are performed. Some of the reviewed cost-killing factors (non-exhaustive list) are: material type (i.e. metallic vs composite); low-cost materials in general; manufacturing process (i.e. autoclave vs out-of-autoclave curing); automation in manufacturing; automation in assembly; automation in inspection and verification; lean manufacturing techniques; and standardization.
    Some insight into how to solve this problem without losing our distinctive nature (we develop high-performance systems, many of them unique prototypes, intended to work in environments that are not perfectly known, and in some cases highly unknown) will be provided from the author's point of view.

  8. Marking Student Programs Using Graph Similarity

    ERIC Educational Resources Information Center

    Naude, Kevin A.; Greyling, Jean H.; Vogts, Dieter

    2010-01-01

    We present a novel approach to the automated marking of student programming assignments. Our technique quantifies the structural similarity between unmarked student submissions and marked solutions, and is the basis by which we assign marks. This is accomplished through an efficient novel graph similarity measure ("AssignSim"). Our experiments…
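The abstract does not define the AssignSim measure itself; as a loose illustration of marking by structural similarity, the sketch below scores a student's control-flow graph against a model solution using Jaccard similarity over directed edge sets. All names, graphs and the marking scale are hypothetical, not the paper's method.

```python
def edge_jaccard(g1, g2):
    """Jaccard similarity of two directed edge sets: |E1 & E2| / |E1 | E2|."""
    e1, e2 = set(g1), set(g2)
    if not e1 and not e2:
        return 1.0
    return len(e1 & e2) / len(e1 | e2)

def mark(student_graph, solution_graph, max_marks=10):
    """Scale the available marks by structural similarity to the solution."""
    return round(max_marks * edge_jaccard(student_graph, solution_graph), 1)

# toy control-flow graphs as directed edge lists (node names are arbitrary)
solution = [("entry", "loop"), ("loop", "body"), ("body", "loop"), ("loop", "exit")]
student = [("entry", "loop"), ("loop", "body"), ("loop", "exit")]  # missing back edge
print(mark(student, solution))  # → 7.5
```

Real assignment marking would of course compare richer program graphs and weight sub-structures, but the similarity-to-marks scaling idea is the same.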

  9. Do centrally pre-prepared solutions achieve more reliable drug concentrations than solutions prepared on the ward?

    PubMed

    Dehmel, Carola; Braune, Stephan A; Kreymann, Georg; Baehr, Michael; Langebrake, Claudia; Hilgarth, Heike; Nierhaus, Axel; Dartsch, Dorothee C; Kluge, Stefan

    2011-08-01

    To compare the concentration conformity of infusion solutions manually prepared on intensive care units (ICU) with solutions from pharmacy-based, automated production. A prospective observational study conducted in a university hospital in Germany. Drug concentrations of 100 standardised infusion solutions manually prepared in the ICU and 100 matching solutions from automated production containing amiodarone, noradrenaline or hydrocortisone were measured by high-performance liquid chromatography analysis. Deviations from stated concentrations were calculated, and the quality of achieved concentration conformity of the two production methods was compared. Actual concentrations of 53% of the manually prepared and 16% of the machine-made solutions deviated by >5% above or below the stated concentration. A deviation of >10% was measured in 22% of the manually prepared samples and in 5% of samples from automated production. Of the manually prepared solutions, 15% deviated by >15% above or below the intended concentration. The mean concentration of the manually prepared solutions was 97.2% (SD 12.7%, range 45-129%) and of the machine-made solutions was 101.1% (SD 4.3%, range 90-114%) of the target concentration (p < 0.01). In this preliminary study, ward-based, manually prepared infusion solutions showed clinically relevant deviations in concentration conformity significantly more often than pharmacy-prepared, machine-made solutions. Centralised, automated preparation of standardised infusion solutions may be an effective means to reduce this type of medication error. Further confirmatory studies in larger settings and under conditions of routine automated production are required.
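The deviation statistics reported above reduce to a simple percent-deviation calculation over each batch. A minimal sketch with made-up concentration values; only the 5%, 10% and 15% thresholds are taken from the abstract.

```python
def deviation_pct(measured, stated):
    """Absolute deviation of a measured concentration from the stated one, in %."""
    return abs(measured - stated) / stated * 100.0

def count_exceeding(samples, stated, threshold_pct):
    """Number of samples deviating by more than threshold_pct from the target."""
    return sum(1 for m in samples if deviation_pct(m, stated) > threshold_pct)

# hypothetical measured concentrations (mg/mL) against a 10 mg/mL target
ward = [9.1, 10.8, 8.4, 12.9, 10.2, 4.5]       # manually prepared on the ICU
pharmacy = [9.9, 10.1, 10.3, 9.7, 10.0, 10.2]  # pharmacy-based automated production
for threshold in (5.0, 10.0, 15.0):
    print(threshold, count_exceeding(ward, 10.0, threshold),
          count_exceeding(pharmacy, 10.0, threshold))
```

With these invented numbers, the ward batch exceeds every threshold far more often than the automated batch, mirroring the direction of the study's finding.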

  10. A complete solution of cartographic displacement based on elastic beams model and Delaunay triangulation

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Guo, Q.; Sun, Y.

    2014-04-01

    In map production and generalization, spatial conflicts inevitably arise, but their detection and resolution still require manual operation. This has become a bottleneck hindering the development of automated cartographic generalization. Displacement is the most useful contextual operator for resolving conflicts between two or more map objects. Automated generalization research has reported many displacement approaches, including sequential approaches and optimization approaches. As an optimization approach based on energy-minimization principles, the elastic beams model has been used several times to resolve displacement problems for roads and buildings. However, a complete displacement solution must also take conflict detection and spatial context analysis into consideration. In this paper we therefore propose a complete displacement solution based on the combined use of the elastic beams model and constrained Delaunay triangulation (CDT). The solution is designed as a cyclic, iterative process with two phases: a detection phase and a displacement phase. In the detection phase, the CDT of the map is used to detect proximity conflicts, identify spatial relationships and structures, and construct auxiliary structures, so as to support the displacement phase based on elastic beams. In addition, to improve the displacement algorithm, a method for adaptive parameter setting and a new iterative strategy are put forward. Finally, we implemented our solution on a test map-generalization platform and successfully tested it against two hand-generated test datasets of roads and buildings, respectively.
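The detection phase boils down to flagging pairs of map objects that violate a minimum separation. The sketch below uses coarse vertex-to-vertex distances instead of the paper's CDT-based proximity measure; the objects, coordinates and separation threshold are all hypothetical.

```python
import math

def min_distance(obj_a, obj_b):
    """Minimum vertex-to-vertex distance between two map objects, each a
    list of (x, y) vertices. A coarse stand-in for the CDT-based proximity
    measure used in the paper (true edge-to-edge distance would be tighter)."""
    return min(math.dist(p, q) for p in obj_a for q in obj_b)

def detect_conflicts(objects, min_sep):
    """Detection phase: report index pairs of objects closer than min_sep."""
    conflicts = []
    for i in range(len(objects)):
        for j in range(i + 1, len(objects)):
            if min_distance(objects[i], objects[j]) < min_sep:
                conflicts.append((i, j))
    return conflicts

# hypothetical map objects: one road polyline and two building footprints
road = [(0.0, 0.0), (10.0, 0.0)]
building_a = [(0.5, 0.4), (1.5, 0.4), (1.5, 1.4), (0.5, 1.4)]  # hugs the road
building_b = [(7.0, 5.0), (8.0, 5.0), (8.0, 6.0), (7.0, 6.0)]  # well clear
print(detect_conflicts([road, building_a, building_b], min_sep=1.0))  # → [(0, 1)]
```

In the paper's cyclic process, each such conflict pair would then feed the beams-based displacement phase, and detection would be re-run on the displaced map.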

  11. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving through production and fulfillment, to evaluating results, is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow, so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted is how the complexity of running a targeted campaign is hidden from the user through technology, while still providing the benefits of a professionally managed campaign.

  12. Evaluation of stereo-array isotope labeling (SAIL) patterns for automated structural analysis of proteins with CYANA.

    PubMed

    Ikeya, Teppei; Terauchi, Tsutomu; Güntert, Peter; Kainosho, Masatsune

    2006-07-01

    Recently we have developed the stereo-array isotope labeling (SAIL) technique to overcome the conventional molecular size limitation in NMR protein structure determination by employing complete stereo- and regiospecific patterns of stable isotopes. SAIL sharpens signals and simplifies spectra without the loss of requisite structural information, thus making large classes of proteins newly accessible to detailed solution structure determination. The automated structure calculation program CYANA can efficiently analyze SAIL-NOESY spectra and calculate structures without manual analysis. Nevertheless, the original SAIL method might not be capable of determining the structures of proteins larger than 50 kDa or membrane proteins, for which the spectra are characterized by many broadened and overlapped peaks. Here we have carried out simulations of new SAIL patterns optimized for minimal relaxation and overlap, to evaluate the combined use of SAIL and CYANA for solving the structures of larger proteins and membrane proteins. The modified approach reduces the number of peaks to nearly half of that observed with uniform labeling, while still yielding well-defined structures and is expected to enable NMR structure determinations of these challenging systems.

  13. Comparison of Automated Scoring Methods for a Computerized Performance Assessment of Clinical Judgment

    ERIC Educational Resources Information Center

    Harik, Polina; Baldwin, Peter; Clauser, Brian

    2013-01-01

    Growing reliance on complex constructed response items has generated considerable interest in automated scoring solutions. Many of these solutions are described in the literature; however, relatively few studies have been published that "compare" automated scoring strategies. Here, comparisons are made among five strategies for…

  14. Automation of Vapor-Diffusion Growth of Protein Crystals

    NASA Technical Reports Server (NTRS)

    Hamrick, David T.; Bray, Terry L.

    2005-01-01

    Some improvements have been made in a system of laboratory equipment developed previously for studying the crystallization of proteins from solution by use of dynamically controlled flows of dry gas. The improvements involve mainly (1) automation of dispensing of liquids for starting experiments, (2) automatic control of drying of protein solutions during the experiments, and (3) provision for automated acquisition of video images for monitoring experiments in progress and for post-experiment analysis. The automation of dispensing of liquids was effected by adding an automated liquid-handling robot that can aspirate source solutions and dispense them in either a hanging-drop or a sitting-drop configuration, whichever is specified, in each of 48 experiment chambers. A video camera of approximately the size and shape of a lipstick dispenser was added to a mobile stage that is part of the robot, in order to enable automated acquisition of images in each experiment chamber. The experiment chambers were redesigned to enable the use of sitting drops, enable backlighting of each specimen, and facilitate automation.

  15. Thermal depth profiling of vascular lesions: automated regularization of reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Verkruysse, Wim; Choi, Bernard; Zhang, Jenny R.; Kim, Jeehyun; Nelson, J. Stuart

    2008-03-01

    Pulsed photo-thermal radiometry (PPTR) is a non-invasive, non-contact diagnostic technique used to locate cutaneous chromophores such as melanin (epidermis) and hemoglobin (vascular structures). The clinical utility of PPTR is limited because it typically requires trained user intervention to regularize the inversion solution. Herein, the feasibility of automated regularization was studied. A second objective of this study was to depart from modeling port wine stains (PWS), a vascular skin lesion frequently studied with PPTR, as strictly layered structures, since this may influence conclusions regarding PPTR reconstruction quality. Average blood vessel depths, diameters and densities derived from the histology of 30 PWS patients were used to generate 15 randomized lesion geometries for which we simulated PPTR signals. Reconstruction accuracy for subjective regularization was compared with that for automated regularization methods. The objective regularization approach performed better. However, the average difference was much smaller than the variation between the 15 simulated profiles. Reconstruction quality depended more on the actual profile to be reconstructed than on the reconstruction algorithm or regularization method. Reconstructions of similar or better accuracy can be achieved with an automated regularization procedure, which enhances the prospects for user-friendly implementation of PPTR to optimize laser therapy on an individual patient basis.

  16. An automated LS(β)-NaI(Tl)(γ) coincidence system as absolute standard for radioactivity measurements.

    PubMed

    Joseph, Leena; Das, A P; Ravindra, Anuradha; Kulkarni, D B; Kulkarni, M S

    2018-07-01

    The 4πβ-γ coincidence method is a powerful and widely used method for determining the absolute activity concentration of radioactive solutions. A new automated liquid-scintillator-based coincidence system has been designed, developed, tested and established as an absolute standard for radioactivity measurements. The automation is achieved using a PLC (programmable logic controller) and SCADA (supervisory control and data acquisition). A radioactive solution of 60Co was standardized to compare the performance of the automated system with the proportional-counter-based absolute standard maintained in the laboratory. The activity concentrations determined using the two systems were in very good agreement, and the new automated system can be used for absolute measurement of the activity concentration of radioactive solutions.
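The coincidence relation underlying the method is standard: in the idealised case the absolute activity follows from the β, γ and coincidence counts alone, A = NβNγ/(Nc·t). A sketch with hypothetical counts; real standardisations also apply dead-time, background and decay-scheme corrections.

```python
def coincidence_activity(n_beta, n_gamma, n_coinc, live_time):
    """Absolute source activity (Bq) from 4π(β)-γ coincidence counting.

    Idealised relation A = (N_beta * N_gamma) / (N_coinc * t); ignores
    dead time, background and decay-scheme corrections used in practice."""
    return (n_beta * n_gamma) / (n_coinc * live_time)

# hypothetical counts accumulated over a 100 s live time
print(coincidence_activity(9.0e5, 4.0e5, 3.0e5, 100.0))  # → 12000.0 (Bq)
```

The appeal of the method, as the abstract notes, is that this result is absolute: no reference to detector efficiencies or a calibrated source is needed, since the efficiencies cancel in the ratio.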

  17. Photonomics: automation approaches yield economic aikido for photonics device manufacture

    NASA Astrophysics Data System (ADS)

    Jordan, Scott

    2002-09-01

    In the glory days of photonics, exponentiating demand for photonics devices brought exponentiating competition, with new ventures commencing deliveries seemingly weekly. Suddenly the industry was faced with a commodity marketplace well before a commodity cost structure was in place. Economic issues like cost, scalability and yield (call it all "Photonomics") now drive the industry. Automation and throughput optimization are obvious answers, but until now, suitable modular tools had not been introduced. Available solutions were barely compatible with typical transverse alignment tolerances and could not automate angular alignments of collimated devices and arrays. And settling physics served as the insoluble bottleneck to throughput and resolution advancement in packaging, characterization and fabrication processes. The industry has addressed these needs in several ways, ranging from special configurations of catalog motion devices to integrated microrobots based on a novel mini-hexapod configuration. This intriguing approach allows tip/tilt alignments to be automated about any point in space, such as a beam waist, a focal point, the cleaved face of a fiber, or the optical axis of a waveguide: ideal for MEMS packaging automation and array alignment. Meanwhile, patented new low-cost settling-enhancement technology has been applied in applications ranging from air-bearing long-travel stages to subnanometer-resolution piezo positioners, advancing resolution and process cycle times in sensitive applications such as optical coupling characterization and fiber Bragg grating generation. Background, examples and metrics are discussed, providing an up-to-date industry overview of available solutions.

  18. Building Flexible User Interfaces for Solving PDEs

    NASA Astrophysics Data System (ADS)

    Logg, Anders; Wells, Garth N.

    2010-09-01

    FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.

  19. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system.

    PubMed

    Ikeya, Teppei; Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune; Güntert, Peter

    2009-08-01

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional "through-bond" spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.

  20. 75 FR 22846 - Norgren Automation Solutions, Including Workers Whose Unemployment Insurance (UI) Wages Are Paid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-30

    ... Solutions, Including Workers Whose Unemployment Insurance (UI) Wages Are Paid Through Syron Engineering Erie... from employment at the subject firm had their wages reported under a separated unemployment insurance... of Norgren Automation Solutions, including workers whose unemployment insurance (UI) wages are paid...

  1. Spatial Phase Imaging

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Frequently, scientists grow crystals by dissolving a protein in a specific liquid solution, and then allowing that solution to evaporate. The methods used next have been, variously, invasive (adding a dye that is absorbed by the protein), destructive (crushing protein/salt-crystal mixtures and observing differences between the crushing of salt and protein), or costly and time-consuming (X-ray crystallography). In contrast to these methods, a new technology for monitoring protein growth, developed in part through NASA Small Business Innovation Research (SBIR) funding from Marshall Space Flight Center, is noninvasive, nondestructive, rapid, and more cost effective than X-ray analysis. The partner for this SBIR, Photon-X, Inc., of Huntsville, Alabama, developed spatial phase imaging technology that can monitor crystal growth in real time and in an automated mode. Spatial phase imaging scans for flaws quickly and produces a 3-D structured image of a crystal, showing volumetric growth analysis for future automated growth.

  2. Application of heuristic satellite plan synthesis algorithms to requirements of the WARC-88 allotment plan

    NASA Technical Reports Server (NTRS)

    Heyward, Ann O.; Reilly, Charles H.; Walton, Eric K.; Mata, Fernando; Olen, Carl

    1990-01-01

    Creation of an Allotment Plan for the Fixed Satellite Service at the 1988 Space World Administrative Radio Conference (WARC) represented a complex satellite plan synthesis problem, involving a large number of planned and existing systems. Solutions to this problem at WARC-88 required the use of both automated and manual procedures to develop an acceptable set of system positions. Development of an Allotment Plan may also be attempted through solution of an optimization problem, known as the Satellite Location Problem (SLP). Three automated heuristic procedures, developed specifically to solve SLP, are presented. The heuristics are then applied to two specific WARC-88 scenarios. Solutions resulting from the fully automated heuristics are then compared with solutions obtained at WARC-88 through a combination of both automated and manual planning efforts.

  3. A Phenomenographic Study of the Ways of Understanding Conditional and Repetition Structures in Computer Programming Languages

    ERIC Educational Resources Information Center

    Bucks, Gregory Warren

    2010-01-01

    Computers have become an integral part of how engineers complete their work, allowing them to collect and analyze data, model potential solutions and aiding in production through automation and robotics. In addition, computers are essential elements of the products themselves, from tennis shoes to construction materials. An understanding of how…

  4. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculations of spiking solutions and the matrix solutions preparation scheme, the actual spiking and matrix solutions preparations, as well as the flexible sample extraction procedures after incubation. In addition, the platform also automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process the whole class of assays of varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency.
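The nonlinear-regression step such platforms automate is typically a Hill-type dose-response fit. Below is a stdlib-only sketch using a two-parameter Hill model and a crude grid search on noise-free demo data; a real pipeline would use a proper optimizer (e.g. Levenberg-Marquardt) and a four-parameter logistic, and every value here is hypothetical.

```python
def hill(conc, ic50, slope):
    """Two-parameter Hill model: fractional response in [0, 1]."""
    return 1.0 / (1.0 + (conc / ic50) ** slope)

def fit_ic50(concs, responses):
    """Least-squares grid search for IC50 and Hill slope.

    A stdlib-only stand-in for the nonlinear regression the platform
    automates; illustrative only, not production curve fitting."""
    best = None
    for i in range(1, 1001):                      # candidate IC50s: 0.1 .. 100.0
        ic50 = 0.1 * i
        for slope in (0.5, 1.0, 1.5, 2.0):        # candidate Hill slopes
            sse = sum((hill(c, ic50, slope) - r) ** 2
                      for c, r in zip(concs, responses))
            if best is None or sse < best[0]:
                best = (sse, ic50, slope)
    return best[1], best[2]

concs = [0.1, 0.3, 1.0, 3.0, 10.0, 30.0]
responses = [hill(c, ic50=2.0, slope=1.0) for c in concs]  # noise-free demo data
est_ic50, est_slope = fit_ic50(concs, responses)
print(est_ic50, est_slope)  # → 2.0 1.0
```

Per the abstract, the platform would repeat this fit for up to 32 compounds at up to 10 concentration levels each, then graph and report the results.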

  5. Titanium(IV) isopropoxide mediated solution phase reductive amination on an automated platform: application in the generation of urea and amide libraries.

    PubMed

    Bhattacharyya, S; Fan, L; Vo, L; Labadie, J

    2000-04-01

    Amine libraries and their derivatives are important targets for high throughput synthesis because of their versatility as medicinal agents and agrochemicals. As a part of our efforts towards automated chemical library synthesis, a titanium(IV) isopropoxide mediated solution phase reductive amination protocol was successfully translated to automation on the Trident(TM) library synthesizer of Argonaut Technologies. An array of 24 secondary amines was prepared in high yield and purity from 4 primary amines and 6 carbonyl compounds. These secondary amines were further utilized in a split synthesis to generate libraries of ureas, amides and sulfonamides in solution phase on the Trident(TM). The automated runs included 192 reactions to synthesize 96 ureas in duplicate and 96 reactions to synthesize 48 amides and 48 sulfonamides. A number of polymer-assisted solution phase protocols were employed for parallel work-up and purification of the products in each step.

  6. Parallel solution-phase synthesis of a 2-aminothiazole library including fully automated work-up.

    PubMed

    Buchstaller, Hans-Peter; Anlauf, Uwe

    2011-02-01

    A straightforward and effective procedure for the solution phase preparation of a 2-aminothiazole combinatorial library is described. Reaction, work-up and isolation of the title compounds as free bases was accomplished in a fully automated fashion using the Chemspeed ASW 2000 automated synthesizer. The compounds were obtained in good yields and excellent purities without any further purification procedure.

  7. Automatic pelvis segmentation from x-ray images of a mouse model

    NASA Astrophysics Data System (ADS)

    Al Okashi, Omar M.; Du, Hongbo; Al-Assam, Hisham

    2017-05-01

    The automatic detection and quantification of skeletal structures has a variety of applications in biological research. Accurate segmentation of the pelvis from X-ray images of mice in a high-throughput project such as the Mouse Genomes Project not only saves time and cost but also helps achieve an unbiased quantitative analysis within the phenotyping pipeline. This paper proposes an automatic solution for pelvis segmentation based on structural and orientation properties of the pelvis in X-ray images. The solution consists of three stages: pre-processing the image to extract the pelvis area, initial pelvis mask preparation, and final pelvis segmentation. Experimental results on a set of 100 X-ray images showed consistent performance of the algorithm. The automated solution overcomes the weaknesses of a manual annotation procedure, where intra- and inter-observer variations cannot be avoided.

  8. Modelling and simulating the forming of new dry automated lay-up reinforcements for primary structures

    NASA Astrophysics Data System (ADS)

    Bouquerel, Laure; Moulin, Nicolas; Drapier, Sylvain; Boisse, Philippe; Beraud, Jean-Marc

    2017-10-01

    While weight has so far been the main driver for the development of prepreg-based composite solutions for aeronautics, a new weight-cost trade-off tends to drive choices for next-generation aircraft. As a response, Hexcel has designed a new dry reinforcement type for aircraft primary structures, which combines the benefits of automation, out-of-autoclave process cost-effectiveness, and mechanical performance competitive with prepreg solutions: HiTape® is a unidirectional (UD) dry carbon reinforcement with a thermoplastic veil on each side, designed for aircraft primary structures [1-3]. One privileged process route for HiTape® in high-volume automated processes consists in forming initially flat dry reinforcement stacks before resin infusion [4] or injection. Simulation of the forming step aims at predicting the geometry and mechanical properties of the formed stack (the so-called preform) for process optimisation. Extensive work has been carried out on the forming behaviour and simulation of prepregs and dry woven fabrics, but interest in dry non-woven reinforcements has emerged more recently. Some work has been done on non-crimp fabrics, but studies of the forming behaviour of UDs are rare and deal with UD prepregs only. Tension and bending in the fibre direction, along with inter-ply friction, have been identified as the main mechanisms controlling the HiTape® response during forming. Bending has been characterised using a modified Peirce flexometer [5], and an inter-ply friction study is under development. Anisotropic hyperelastic constitutive models have been selected to represent the assumed decoupled deformation mechanisms. Model parameters are then identified from the associated experimental results. For forming simulation, a continuous approach at the macroscopic scale has been selected first, and simulation is carried out in the Zset framework [6] using appropriate shell finite elements.

  9. KAMO: towards automated data processing for microcrystals.

    PubMed

    Yamashita, Keitaro; Hirata, Kunio; Yamamoto, Masaki

    2018-05-01

    In protein microcrystallography, radiation damage often hampers complete and high-resolution data collection from a single crystal, even under cryogenic conditions. One promising solution is to collect small wedges of data (5-10°) separately from multiple crystals. The data from these crystals can then be merged into a complete reflection-intensity set. However, data processing of multiple small-wedge data sets is challenging. Here, a new open-source data-processing pipeline, KAMO, which utilizes existing programs, including the XDS and CCP4 packages, has been developed to automate whole data-processing tasks in the case of multiple small-wedge data sets. Firstly, KAMO processes individual data sets and collates those indexed with equivalent unit-cell parameters. The space group is then chosen and any indexing ambiguity is resolved. Finally, clustering is performed, followed by merging with outlier rejections, and a report is subsequently created. Using synthetic and several real-world data sets collected from hundreds of crystals, it was demonstrated that merged structure-factor amplitudes can be obtained in a largely automated manner using KAMO, which greatly facilitated the structure analyses of challenging targets that only produced microcrystals.
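The collation step, grouping data sets "indexed with equivalent unit-cell parameters", can be illustrated with a simple tolerance-based grouping. This is a sketch of the idea only; the tolerance, the greedy strategy and the cells below are hypothetical, not KAMO's actual criterion.

```python
def cells_equivalent(cell_a, cell_b, tol=0.02):
    """True if two unit cells (a, b, c, alpha, beta, gamma) agree to within
    a relative tolerance on every parameter. A simplified stand-in for
    KAMO's cell-comparison step."""
    return all(abs(x - y) / max(x, y) <= tol for x, y in zip(cell_a, cell_b))

def collate_by_cell(cells, tol=0.02):
    """Greedy single-pass grouping of indexed data sets by unit cell."""
    groups = []  # list of (reference_cell, member_indices)
    for idx, cell in enumerate(cells):
        for ref, members in groups:
            if cells_equivalent(ref, cell, tol):
                members.append(idx)
                break
        else:
            groups.append((cell, [idx]))
    return [members for _, members in groups]

# hypothetical cells indexed from four small-wedge data sets
cells = [
    (78.1, 78.1, 37.2, 90.0, 90.0, 90.0),
    (78.4, 78.3, 37.3, 90.0, 90.0, 90.0),   # same crystal form
    (60.2, 71.5, 88.0, 90.0, 101.3, 90.0),  # different form
    (78.0, 78.2, 37.1, 90.0, 90.0, 90.0),
]
print(collate_by_cell(cells))  # → [[0, 1, 3], [2]]
```

In the real pipeline, each such group would then go through space-group choice, resolution of indexing ambiguity, intensity-based clustering and merging with outlier rejection.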

  10. Development and application of structural dynamics analysis capabilities

    NASA Technical Reports Server (NTRS)

    Heinemann, Klaus W.; Hozaki, Shig

    1994-01-01

    Extensive research activities were performed in the area of multidisciplinary modeling and simulation of aerospace vehicles that are relevant to NASA Dryden Flight Research Facility. The efforts involved theoretical development, computer coding, and debugging of the STARS code. New solution procedures were developed in such areas as structures, CFD, and graphics, among others. Furthermore, systems-oriented codes were developed for rendering the code truly multidisciplinary and rather automated in nature. Also, work was performed in pre- and post-processing of engineering analysis data.

  11. A Sensor Data Fusion System Based on k-Nearest Neighbor Pattern Classification for Structural Health Monitoring Applications

    PubMed Central

    Vitola, Jaime; Pozo, Francesc; Tibaduiza, Diego A.; Anaya, Maribel

    2017-01-01

    Civil and military structures are susceptible and vulnerable to damage due to the environmental and operational conditions. Therefore, the implementation of technology to provide robust solutions in damage identification (by using signals acquired directly from the structure) is a requirement to reduce operational and maintenance costs. In this sense, the use of sensors permanently attached to the structures has demonstrated a great versatility and benefit since the inspection system can be automated. This automation is carried out with signal processing tasks with the aim of a pattern recognition analysis. This work presents the detailed description of a structural health monitoring (SHM) system based on the use of a piezoelectric (PZT) active system. The SHM system includes: (i) the use of a piezoelectric sensor network to excite the structure and collect the measured dynamic response, in several actuation phases; (ii) data organization; (iii) advanced signal processing techniques to define the feature vectors; and finally; (iv) the nearest neighbor algorithm as a machine learning approach to classify different kinds of damage. A description of the experimental setup, the experimental validation and a discussion of the results from two different structures are included and analyzed. PMID:28230796

  12. A Sensor Data Fusion System Based on k-Nearest Neighbor Pattern Classification for Structural Health Monitoring Applications.

    PubMed

    Vitola, Jaime; Pozo, Francesc; Tibaduiza, Diego A; Anaya, Maribel

    2017-02-21

    Civil and military structures are susceptible and vulnerable to damage due to the environmental and operational conditions. Therefore, the implementation of technology to provide robust solutions in damage identification (by using signals acquired directly from the structure) is a requirement to reduce operational and maintenance costs. In this sense, the use of sensors permanently attached to the structures has demonstrated a great versatility and benefit since the inspection system can be automated. This automation is carried out with signal processing tasks with the aim of a pattern recognition analysis. This work presents the detailed description of a structural health monitoring (SHM) system based on the use of a piezoelectric (PZT) active system. The SHM system includes: (i) the use of a piezoelectric sensor network to excite the structure and collect the measured dynamic response, in several actuation phases; (ii) data organization; (iii) advanced signal processing techniques to define the feature vectors; and finally; (iv) the nearest neighbor algorithm as a machine learning approach to classify different kinds of damage. A description of the experimental setup, the experimental validation and a discussion of the results from two different structures are included and analyzed.
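The nearest-neighbour classification stage described in these two records can be sketched in a few lines; the feature vectors and damage labels below are invented for illustration (the paper uses features derived from the PZT sensor signals after the signal-processing stage).

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Majority vote among the k nearest training feature vectors
    (Euclidean distance), the machine-learning stage of the SHM pipeline."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# invented 2-D feature vectors "extracted" from PZT sensor responses
train = [
    ((0.10, 0.20), "healthy"), ((0.20, 0.10), "healthy"), ((0.15, 0.15), "healthy"),
    ((0.90, 0.80), "crack"),   ((0.80, 0.90), "crack"),   ((0.85, 0.85), "crack"),
]
print(knn_classify(train, (0.12, 0.18)))  # → healthy
print(knn_classify(train, (0.90, 0.90)))  # → crack
```

Real SHM feature vectors are much higher-dimensional (one dimension per processed sensor/actuation-phase feature), but the vote-among-neighbours logic is unchanged.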

  13. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography.

    PubMed

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L; Armour, Wes; Waterman, David G; Iwata, So; Evans, Gwyndaf

    2013-08-01

    The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein.
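
A toy version of the kind of grouping BLEND automates can be sketched by clustering data sets on their unit-cell parameters; the union-find threshold grouping and the cell values below are simplified assumptions, not the program's actual hierarchical cluster analysis:

```python
import numpy as np

def cluster_cells(cells, tol=1.0):
    """Group data sets whose unit-cell parameters differ by less than tol
    (simple union-find; a stand-in for hierarchical clustering on cells)."""
    n = len(cells)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]        # path compression
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(np.asarray(cells[i]) - np.asarray(cells[j])) < tol:
                parent[find(i)] = find(j)        # merge the two groups
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Hypothetical (a, b, c) cell edges in angstroms from four microcrystal data sets
cells = [(78.1, 78.1, 37.2), (78.3, 78.0, 37.3), (79.9, 80.2, 38.9), (78.2, 78.2, 37.1)]
print(cluster_cells(cells))   # data sets 0, 1 and 3 share a cluster
```

Data sets in the same cluster would then be merged and scaled together; outliers (here data set 2) are excluded before merging.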

  14. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    PubMed Central

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L.; Armour, Wes; Waterman, David G.; Iwata, So; Evans, Gwyndaf

    2013-01-01

    The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein. PMID:23897484

  15. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.
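
The grid-based side of such an analysis can be illustrated by binning water-oxygen positions from each frame onto a voxel grid; this is a generic occupancy sketch of the approach, not SSTMap's actual API:

```python
import numpy as np

def water_occupancy(frames, edges):
    """Bin water-oxygen coordinates from each frame onto a 3D grid and return
    per-voxel occupancy averaged over frames (the grid-based starting point
    for IST-style thermodynamic maps)."""
    counts = np.zeros([len(e) - 1 for e in edges])
    for xyz in frames:                        # xyz: (n_waters, 3) array per frame
        h, _ = np.histogramdd(xyz, bins=edges)
        counts += h
    return counts / len(frames)

# Two hypothetical frames with one water each, on a 2-voxel-per-axis grid
edges = [np.array([0.0, 1.0, 2.0])] * 3
frames = [np.array([[0.5, 0.5, 0.5]]), np.array([[0.5, 0.5, 1.5]])]
occ = water_occupancy(frames, edges)
print(occ[0, 0, 0], occ[0, 0, 1])   # 0.5 0.5
```

IST energy and entropy terms are then accumulated per voxel (or per hydration site) from the same binned populations.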

  16. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth

    2005-01-01

    X-ray crystallography remains the primary method for determining the structure of macromolecules. The first requirement is to have crystals, and obtaining them is often the rate-limiting step. The numbers of crystallization trials that are set up for any one protein for structural genomics, and the rate at which they are being set up, now overwhelm the ability for strictly human analysis of the results. Automated analysis methods are now being implemented with varying degrees of success, but these typically cannot reliably extract intermediate results. By covalently modifying a subpopulation, ≤1%, of a macromolecule solution with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination the crystals show up as bright objects against a dark background. As crystalline packing is more dense than amorphous precipitate, the fluorescence intensity can be used as a guide in distinguishing different types of precipitated phases, even in the absence of obvious crystalline features, widening the available potential lead conditions in the absence of clear "hits." Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Also, brightly fluorescent crystals are readily found against less fluorescent precipitated phases, which under white light illumination may serve to obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries and by having the protein or protein structures be all that show up.
The trace fluorescently labeled crystals will also emit with sufficient intensity to aid in the automation of crystal alignment using relatively low cost optics, further increasing throughput at synchrotrons. This presentation will focus on the methodology for fluorescent labeling, the crystallization results, and the effects of the trace labeling on the crystal quality.
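
The "bright objects against a dark background" detection step can be sketched as a threshold followed by a connected-component count; the image values and threshold below are invented for illustration:

```python
import numpy as np

def bright_objects(img, thresh):
    """Count connected bright regions (candidate crystals) in a fluorescence
    image: threshold, then 4-connected flood fill over the binary mask."""
    mask = np.asarray(img) > thresh
    seen = np.zeros(mask.shape, bool)
    n = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and not seen[i, j]:
                n += 1                           # new bright region found
                stack = [(i, j)]
                while stack:
                    r, c = stack.pop()
                    if (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]
                            and mask[r, c] and not seen[r, c]):
                        seen[r, c] = True
                        stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return n

# Toy image: dark background with two bright fluorescent regions
img = np.array([[0, 9, 9, 0, 0],
                [0, 9, 9, 0, 8],
                [0, 0, 0, 0, 8]])
print(bright_objects(img, thresh=5))   # 2
```

Because salt crystals and drop boundaries stay dark under fluorescent illumination, a simple pipeline like this avoids the segmentation steps that white-light analysis requires.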

  17. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Minamitani, Elizabeth Forsythe; Pusey, Marc L.

    2004-01-01

    X-ray crystallography remains the primary method for determining the structure of macromolecules. The first requirement is to have crystals, and obtaining them is often the rate-limiting step. The numbers of crystallization trials that are set up for any one protein for structural genomics, and the rate at which they are being set up, now overwhelm the ability for strictly human analysis of the results. Automated analysis methods are now being implemented with varying degrees of success, but these typically cannot reliably extract intermediate results. By covalently modifying a subpopulation, ≤1%, of a macromolecule solution with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of a macromolecule's purification. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination the crystals will show up as bright objects against a dark background. As crystalline packing is more dense than amorphous precipitate, the fluorescence intensity can be used as a guide in distinguishing different types of precipitated phases, even in the absence of obvious crystalline features, widening the available potential lead conditions in the absence of clear "hits." Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Also, brightly fluorescent crystals are readily found against less fluorescent precipitated phases, which under white light illumination may serve to obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries and by having the protein or protein structures be all that show up.
The trace fluorescently labeled crystals will also emit with sufficient intensity to aid in the automation of crystal alignment using relatively low cost optics, further increasing throughput at synchrotrons. This presentation will focus on the methodology for fluorescent labeling, the crystallization results, and the effects of the trace labeling on the crystal quality.

  18. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Amiruddha

    2005-01-01

    X-ray crystallography remains the primary method for determining the structure of macromolecules. The first requirement is to have crystals, and obtaining them is often the rate-limiting step. The numbers of crystallization trials that are set up for any one protein for structural genomics, and the rate at which they are being set up, now overwhelm the ability for strictly human analysis of the results. Automated analysis methods are now being implemented with varying degrees of success, but these typically cannot reliably extract intermediate results. By covalently modifying a subpopulation, ≤1%, of a macromolecule solution with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination the crystals show up as bright objects against a dark background. As crystalline packing is more dense than amorphous precipitate, the fluorescence intensity can be used as a guide in distinguishing different types of precipitated phases, even in the absence of obvious crystalline features, widening the available potential lead conditions in the absence of clear "hits." Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Also, brightly fluorescent crystals are readily found against less fluorescent precipitated phases, which under white light illumination may serve to obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries and by having the protein or protein structures be all that show up.
The trace fluorescently labeled crystals will also emit with sufficient intensity to aid in the automation of crystal alignment using relatively low cost optics, further increasing throughput at synchrotrons. Preliminary experiments show that the presence of the fluorescent probe does not affect the nucleation process or the quality of the X-ray data obtained.

  19. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth

    2004-01-01

    X-ray crystallography remains the primary method for determining the structure of macromolecules. The first requirement is to have crystals, and obtaining them is often the rate-limiting step. The numbers of crystallization trials that are set up for any one protein for structural genomics, and the rate at which they are being set up, now overwhelm the ability for strictly human analysis of the results. Automated analysis methods are now being implemented with varying degrees of success, but these typically cannot reliably extract intermediate results. By covalently modifying a subpopulation, ≤1%, of a macromolecule solution with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination the crystals show up as bright objects against a dark background. As crystalline packing is more dense than amorphous precipitate, the fluorescence intensity can be used as a guide in distinguishing different types of precipitated phases, even in the absence of obvious crystalline features, widening the available potential lead conditions in the absence of clear "hits." Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Also, brightly fluorescent crystals are readily found against less fluorescent precipitated phases, which under white light illumination may serve to obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries and by having the protein or protein structures be all that show up.
The trace fluorescently labeled crystals will also emit with sufficient intensity to aid in the automation of crystal alignment using relatively low cost optics, further increasing throughput at synchrotrons. This presentation will focus on the methodology for fluorescent labeling, the crystallization results, and the effects of the trace labeling on the crystal quality.

  20. Automated standardization technique for an inductively-coupled plasma emission spectrometer

    USGS Publications Warehouse

    Garbarino, John R.; Taylor, Howard E.

    1982-01-01

    The manifold assembly subsystem described permits real-time computer-controlled standardization and quality control of a commercial inductively-coupled plasma atomic emission spectrometer. The manifold assembly consists of a branch-structured glass manifold, a series of microcomputer-controlled solenoid valves, and a reservoir for each standard. Automated standardization involves selective actuation of each solenoid valve, which permits a specific mixed standard solution to be pumped to the nebulizer of the spectrometer. Quality control is based on the evaluation of results obtained for a mixed standard containing 17 analytes, which is measured periodically along with unknown samples. An inaccurate standard evaluation triggers restandardization of the instrument according to a predetermined protocol. Interaction of the computer-controlled manifold assembly hardware with the spectrometer system is outlined. The automated standardization system is compared with the manual procedure with respect to reliability, simplicity, flexibility, and efficiency. © 1982.
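
The quality-control logic (a periodic mixed-standard check whose failure triggers restandardization) can be sketched as a tolerance test; the analyte names, certified values, and 5% tolerance below are illustrative, not from the paper:

```python
def qc_check(measured, certified, tol=0.05):
    """Return the analytes whose measured value deviates from the certified
    value by more than tol (relative); a non-empty result would trigger
    restandardization under the predetermined protocol."""
    return [name for name, m in measured.items()
            if abs(m - certified[name]) / certified[name] > tol]

certified = {"Fe": 1.00, "Cu": 0.50, "Zn": 0.25}     # mg/L, hypothetical
measured  = {"Fe": 1.02, "Cu": 0.43, "Zn": 0.25}     # periodic QC readings
failed = qc_check(measured, certified)
if failed:                                           # restandardize on failure
    print("restandardize:", failed)
```

In the described system the restandardization itself would then be carried out by actuating the solenoid valves that feed each mixed standard to the nebulizer.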

  1. Optimization of Composite Structures with Curved Fiber Trajectories

    NASA Astrophysics Data System (ADS)

    Lemaire, Etienne; Zein, Samih; Bruyneel, Michael

    2014-06-01

    This paper studies the problem of optimizing composite shells manufactured using Automated Tape Layup (ATL) or Automated Fiber Placement (AFP) processes. The optimization procedure relies on a new approach to generating equidistant fiber trajectories based on the Fast Marching Method. Starting with a (possibly curved) reference fiber direction defined on a (possibly curved) meshed surface, the new method allows the fiber orientations resulting from a uniform-thickness layup to be determined. The design variables are the parameters defining the position and shape of the reference curve, which results in very few design variables. Thanks to this efficient parameterization, maximum-stiffness optimization numerical applications are proposed. The shape of the design space is discussed with regard to local and global optimal solutions.
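
A discrete stand-in for the Fast Marching distance field, whose iso-levels give the equidistant trajectories, can be built with Dijkstra on a grid; the real method solves the Eikonal equation on the meshed surface, so this unit-cost grid is only a simplification:

```python
import heapq

def grid_distance(n, sources):
    """Dijkstra on an n x n grid (unit edge cost) as a discrete stand-in for
    the Fast Marching distance field; iso-levels of d give curves equidistant
    from the reference fiber."""
    INF = float("inf")
    d = [[INF] * n for _ in range(n)]
    pq = [(0.0, r, c) for r, c in sources]
    for _, r, c in pq:
        d[r][c] = 0.0
    heapq.heapify(pq)
    while pq:
        dist, r, c = heapq.heappop(pq)
        if dist > d[r][c]:
            continue                             # stale entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n and dist + 1 < d[rr][cc]:
                d[rr][cc] = dist + 1
                heapq.heappush(pq, (dist + 1, rr, cc))
    return d

# Reference "fiber" along the first column; cells with d == k lie on the k-th
# equidistant trajectory of a uniform-thickness layup
d = grid_distance(4, [(r, 0) for r in range(4)])
print(d[0][3])   # 3.0
```

On a curved surface the edge costs would vary with the metric, which is exactly what the Eikonal solver handles.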

  2. Automation of Hessian-Based Tubularity Measure Response Function in 3D Biomedical Images.

    PubMed

    Dzyubak, Oleksandr P; Ritman, Erik L

    2011-01-01

    Blood vessels and nerve trees consist of tubular objects interconnected into complex tree- or web-like structures whose structural scale ranges from 5 μm diameter capillaries to the 3 cm aorta. This large scale range presents two major problems: one is simply making the measurements, and the other is the exponential increase in component numbers with decreasing scale. With the remarkable increase in the volume imaged by, and resolution of, modern-day 3D imagers, manual tracking of the complex multiscale parameters from those large image data sets is almost impossible. In addition, manual tracking is quite subjective and unreliable. We propose a solution for automating an adaptive, non-supervised system for tracking tubular objects based on a multiscale framework and the use of a Hessian-based object shape detector, incorporating the National Library of Medicine Insight Segmentation and Registration Toolkit (ITK) image processing libraries.
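
The Hessian-based tubularity idea can be sketched at a single voxel: the eigenvalues of the local Hessian are sorted by magnitude, and a bright tube shows one near-zero eigenvalue (along the axis) and two large negative ones (across it). The measure and the example Hessians below are illustrative, not the paper's exact response function:

```python
import numpy as np

def tubularity(H, eps=1e-9):
    """Toy Hessian-based tubularity measure. Eigenvalues sorted by magnitude:
    a bright tube has |l1| small and l2, l3 large and negative."""
    lam = sorted(np.linalg.eigvalsh(H), key=abs)   # |l1| <= |l2| <= |l3|
    l1, l2, l3 = lam
    if l2 >= 0 or l3 >= 0:                         # not a bright tubular voxel
        return 0.0
    return (1 - abs(l1) / (abs(l2) + eps)) * abs(l2 * l3) ** 0.5

# Hypothetical Hessians: tube-like (one ~zero, two negative) vs blob-like
H_tube = np.diag([-0.01, -4.0, -4.0])
H_blob = np.diag([-4.0, -4.0, -4.0])
print(tubularity(H_tube) > tubularity(H_blob))   # True
```

In a multiscale framework this response is evaluated over a range of Gaussian smoothing scales and the maximum taken, so capillaries and the aorta both respond at their own scale.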

  3. Automated multi-slice extracellular and patch-clamp experiments using the WinLTP data acquisition system with automated perfusion control

    PubMed Central

    Anderson, William W.; Fitzjohn, Stephen M.; Collingridge, Graham L.

    2012-01-01

    WinLTP is a data acquisition program for studying long-term potentiation (LTP) and other aspects of synaptic function. Earlier versions of WinLTP (J. Neurosci. Methods, 162:346–356, 2007) provided automated electrical stimulation and data acquisition capable of running nearly an entire synaptic plasticity experiment, with the primary exception that perfusion solutions had to be changed manually. This automated stimulation and acquisition was done by using ‘Sweep’, ‘Loop’ and ‘Delay’ events to build scripts using the ‘Protocol Builder’. However, this did not allow automatic changing of many solutions while running multiple slice experiments, or solution changing when this had to be performed rapidly and with accurate timing during patch-clamp experiments. We report here the addition of automated perfusion control to WinLTP. First, perfusion change between sweeps is enabled by adding the ‘Perfuse’ event to Protocol Builder scripting and is used in slice experiments. Second, fast perfusion changes during as well as between sweeps is enabled by using the Perfuse event in the protocol scripts to control changes between sweeps, and also by changing digital or analog output during a sweep and is used for single cell single-line perfusion patch-clamp experiments. The addition of stepper control of tube placement allows dual- or triple-line perfusion patch-clamp experiments for up to 48 solutions. The ability to automate perfusion changes and fully integrate them with the already automated stimulation and data acquisition goes a long way toward complete automation of multi-slice extracellularly recorded and single cell patch-clamp experiments. PMID:22524994
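
A minimal interpreter for a Protocol-Builder-style script shows how Perfuse events interleave with Sweep and Delay events; the event encoding and timings below are invented for illustration, not WinLTP's file format:

```python
def run_protocol(events, sweep_s=2.0):
    """Walk a protocol script of Sweep/Delay/Perfuse events; return the total
    elapsed time and a log of (time, solution) perfusion changes."""
    t, log = 0.0, []
    for ev in events:
        if ev[0] == "Sweep":
            t += sweep_s                     # acquire one sweep
        elif ev[0] == "Delay":
            t += ev[1]                       # wait between sweeps
        elif ev[0] == "Perfuse":
            log.append((t, ev[1]))           # switch perfusion line/solution
    return t, log

t, log = run_protocol([("Perfuse", "aCSF"), ("Sweep",), ("Delay", 30.0),
                       ("Perfuse", "drug"), ("Sweep",)])
print(t, log)   # 34.0 [(0.0, 'aCSF'), (32.0, 'drug')]
```

The fast within-sweep solution changes described in the paper would instead be driven by digital or analog outputs toggled at sample-accurate times inside a sweep.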

  4. Automated lettuce nutrient solution management using an array of ion-selective electrodes

    USDA-ARS?s Scientific Manuscript database

    Automated sensing and control of macronutrients in hydroponic solutions would allow more efficient management of nutrients for crop growth in closed systems. This paper describes the development and evaluation of a computer-controlled nutrient management system with an array of ion-selective electro...
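
The control idea (dose stock solution in proportion to the measured deficit) can be sketched with a simple mass balance; the ion names, electrode readings, and stock strengths below are hypothetical:

```python
def dose_volumes(measured, targets, stock, tank_l):
    """mL of each stock needed to restore targets:
    deficit (mg/L) * tank volume (L) / stock strength (mg/mL)."""
    return {ion: max(0.0, targets[ion] - c) * tank_l / stock[ion]
            for ion, c in measured.items()}

measured = {"NO3": 150.0, "K": 210.0}      # hypothetical ISE readings, mg/L
targets  = {"NO3": 200.0, "K": 200.0}      # setpoints, mg/L
stock    = {"NO3": 100.0, "K": 120.0}      # stock strengths, mg/mL
print(dose_volumes(measured, targets, stock, tank_l=100.0))
```

A real controller would also correct for electrode drift and interferences, which is where much of the difficulty with ISE arrays lies.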

  5. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs and addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool-set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool-set structure. ADTS research and development to date has focused on the development of a language for the specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  6. Automated iodine monitor system. [for aqueous solutions

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The feasibility of a direct spectrophotometric measurement of iodine in water was established, and an iodine colorimeter was built to demonstrate the practicality of this technique. The specificity of the method was verified for an on-line system where a reference solution cannot be used, and a preliminary design is presented for an automated iodine measuring and controlling system meeting the desired specifications. An automated iodine monitor/controller system based on this preliminary design was built, tested, and delivered to the Johnson Space Center.
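
A colorimetric iodine reading reduces to the Beer-Lambert law, A = εlc; the molar absorptivity, path length, and molecular weight below are illustrative values, not the delivered instrument's calibration:

```python
import math

def iodine_mg_l(I0, I, epsilon=746.0, path_cm=1.0, mw=253.8):
    """Iodine concentration from transmitted vs incident intensity via
    Beer-Lambert: A = eps * l * c. epsilon (L/mol/cm) and mw (g/mol for I2)
    are illustrative assumptions."""
    A = math.log10(I0 / I)                 # absorbance from intensities
    c_mol = A / (epsilon * path_cm)        # mol/L
    return c_mol * mw * 1000.0             # mg/L

print(round(iodine_mg_l(100.0, 50.0), 2))
```

A direct measurement of this kind is what lets an on-line monitor run without a reference solution, provided the detection wavelength is specific to iodine.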

  7. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. 
Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
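
The multi-tiered classification architecture reduces to a cascade in which each tier must accept a candidate before the next (usually more expensive) tier runs; the lambda "tiers" below are cheap stand-ins for trained SVMs, and the feature names are invented:

```python
def cascade(classifiers, features):
    """Multi-tiered classification: each tier either rejects the candidate or
    passes it on, so coarse, cheap filters run before fine, costly ones."""
    for tier in classifiers:
        if not tier(features):
            return False
    return True

# Stand-ins for trained SVM tiers: a coarse size gate, then a shape gate
tiers = [lambda f: f["area"] > 50,            # tier 1: plausible object size
         lambda f: f["elongation"] > 2.0]     # tier 2: head-like shape
print(cascade(tiers, {"area": 120, "elongation": 3.1}))   # True
print(cascade(tiers, {"area": 120, "elongation": 1.2}))   # False
```

In the paper's framework each tier would be an SVM trained on features extracted by standard image-processing operations, with later tiers specializing (e.g. head identification, then cell-specific identification).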

  8. Quantification of Dynamic Morphological Drug Responses in 3D Organotypic Cell Cultures by Automated Image Analysis

    PubMed Central

    Härmä, Ville; Schukov, Hannu-Pekka; Happonen, Antti; Ahonen, Ilmari; Virtanen, Johannes; Siitari, Harri; Åkerfelt, Malin; Lötjönen, Jyrki; Nees, Matthias

    2014-01-01

    Glandular epithelial cells differentiate into complex multicellular or acinar structures, when embedded in three-dimensional (3D) extracellular matrix. The spectrum of different multicellular morphologies formed in 3D is a sensitive indicator for the differentiation potential of normal, non-transformed cells compared to different stages of malignant progression. In addition, single cells or cell aggregates may actively invade the matrix, utilizing epithelial, mesenchymal or mixed modes of motility. Dynamic phenotypic changes involved in 3D tumor cell invasion are sensitive to specific small-molecule inhibitors that target the actin cytoskeleton. We have used a panel of inhibitors to demonstrate the power of automated image analysis as a phenotypic or morphometric readout in cell-based assays. We introduce a streamlined stand-alone software solution that supports large-scale high-content screens, based on complex and organotypic cultures. AMIDA (Automated Morphometric Image Data Analysis) allows quantitative measurements of large numbers of images and structures, with a multitude of different spheroid shapes, sizes, and textures. AMIDA supports an automated workflow, and can be combined with quality control and statistical tools for data interpretation and visualization. We have used a representative panel of 12 prostate and breast cancer lines that display a broad spectrum of different spheroid morphologies and modes of invasion, challenged by a library of 19 direct or indirect modulators of the actin cytoskeleton which induce systematic changes in spheroid morphology and differentiation versus invasion. These results were independently validated by 2D proliferation, apoptosis and cell motility assays. We identified three drugs that primarily attenuated the invasion and formation of invasive processes in 3D, without affecting proliferation or apoptosis. Two of these compounds block Rac signalling, one affects cellular cAMP/cGMP accumulation. 
Our approach supports the growing needs for user-friendly, straightforward solutions that facilitate large-scale, cell-based 3D assays in basic research, drug discovery, and target validation. PMID:24810913
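
A morphometric readout of the kind AMIDA computes can be sketched as a circularity score on a binary spheroid mask (4πA/P², 1.0 for a circle and lower for invasive, branched outlines); the boundary-edge perimeter estimate and the toy masks are simplifications:

```python
import math

def circularity(mask):
    """Crude 2D shape metric for a spheroid mask: 4*pi*area/perimeter^2,
    with the perimeter estimated as the number of exposed pixel edges."""
    n, m = len(mask), len(mask[0])
    area, per = 0, 0
    for i in range(n):
        for j in range(m):
            if mask[i][j]:
                area += 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < n and 0 <= jj < m) or not mask[ii][jj]:
                        per += 1               # edge exposed to background
    return 4 * math.pi * area / per ** 2

mask_round = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
mask_strip = [[0, 0, 0, 0, 0, 0], [0, 1, 1, 1, 1, 0], [0, 0, 0, 0, 0, 0]]
print(circularity(mask_round) > circularity(mask_strip))   # True
```

Tracking such scores per structure over a drug panel is what turns spheroid morphology into a quantitative screening readout.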

  9. Automation of large scale transient protein expression in mammalian cells

    PubMed Central

    Zhao, Yuguang; Bishop, Benjamin; Clay, Jordan E.; Lu, Weixian; Jones, Margaret; Daenke, Susan; Siebold, Christian; Stuart, David I.; Yvonne Jones, E.; Radu Aricescu, A.

    2011-01-01

    Traditional mammalian expression systems rely on the time-consuming generation of stable cell lines; this is difficult to accommodate within a modern structural biology pipeline. Transient transfections are a fast, cost-effective solution, but require skilled cell culture scientists, making man-power a limiting factor in a setting where numerous samples are processed in parallel. Here we report a strategy employing a customised CompacT SelecT cell culture robot allowing the large-scale expression of multiple protein constructs in a transient format. Successful protocols have been designed for automated transient transfection of human embryonic kidney (HEK) 293T and 293S GnTI− cells in various flask formats. Protein yields obtained by this method were similar to those produced manually, with the added benefit of reproducibility, regardless of user. Automation of cell maintenance and transient transfection allows the expression of high quality recombinant protein in a completely sterile environment with limited support from a cell culture scientist. The reduction in human input has the added benefit of enabling continuous cell maintenance and protein production, features of particular importance to structural biology laboratories, which typically use large quantities of pure recombinant proteins, and often require rapid characterisation of a series of modified constructs. This automated method for large scale transient transfection is now offered as a Europe-wide service via the P-cube initiative. PMID:21571074

  10. Development of an automated fuzing station for the future armored resupply vehicle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chesser, J.B.; Jansen, J.F.; Lloyd, P.D.

    1995-03-01

    The US Army is developing the Advanced Field Artillery System (AFAS), a next-generation armored howitzer. The Future Armored Resupply Vehicle (FARV) will be its companion ammunition resupply vehicle. The FARV will automate the supply of ammunition and fuel to the AFAS, which will increase capabilities over the current system. One of the functions being considered for automation is ammunition processing. Oak Ridge National Laboratory is developing equipment to demonstrate automated ammunition processing. One of the key operations to be automated is fuzing: the projectiles are initially unfuzed, and a fuze must be inserted and threaded into each projectile as part of the processing. A constraint on the design solution is that the ammunition cannot be modified to simplify automation. The problem was analyzed to determine the alignment requirements. Using the results of the analysis, ORNL designed, built, and tested a test stand to verify the selected design solution.

  11. Medical imaging informatics based solutions for human performance analytics

    NASA Astrophysics Data System (ADS)

    Verma, Sneha; McNitt-Gray, Jill; Liu, Brent J.

    2018-03-01

    For human performance analysis, extensive experimental trials are often conducted to identify the underlying causes or long-term consequences of certain pathologies and to improve motor function by examining the movement patterns of affected individuals. Data collected for human performance analysis include high-speed video, surveys, spreadsheets, force recordings from instrumented surfaces, etc. These datasets are recorded from various standalone sources and are therefore captured in different folder structures and varying formats depending on the hardware configuration. Data integration and synchronization therefore present a huge challenge when handling these multimedia datasets, particularly at large scale. Another challenge faced by researchers is querying large quantities of unstructured data and designing feedback/reporting tools for users who need to use the datasets at various levels. In the past, database server storage solutions have been introduced to store these datasets securely. However, to automate the process of uploading raw files, various file manipulation steps are required. In the current workflow, this file manipulation and structuring is done manually and is not feasible for large amounts of data. By attaching metadata files and data dictionaries to these raw datasets, however, they can provide the information and structure needed for automated server upload. We introduce one such system for metadata creation for unstructured multimedia data based on the DICOM data model design. We discuss the design and implementation of this system and evaluate it with a dataset collected for a movement analysis study. The broader aim of this paper is to present a solution space, achievable with medical imaging informatics designs and methods, for improving the workflow of human performance analysis in a biomechanics research lab.
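
A DICOM-inspired metadata sidecar for an unstructured capture might look like the following; the field names, paths, and values are illustrative, not an actual DICOM tag set or the paper's schema:

```python
import json
import hashlib

def make_sidecar(path, subject, modality, extra=None):
    """Build a DICOM-inspired JSON metadata sidecar for an unstructured
    capture so an upload service can file it without manual restructuring.
    Field names here are illustrative assumptions."""
    meta = {
        "SubjectID": subject,
        "Modality": modality,
        "SourceFile": path,
        "FileID": hashlib.sha1(path.encode()).hexdigest()[:12],  # stable key
    }
    meta.update(extra or {})
    return json.dumps(meta, indent=2)

sidecar = make_sidecar("trial07/jump_cam1.mp4", "S01", "HighSpeedVideo",
                       {"FrameRateHz": 240})
print(sidecar)
```

With such sidecars, the server can index heterogeneous files (video, force plates, surveys) under one subject/session hierarchy and synchronize them by shared trial identifiers.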

  12. The use of methods of structural optimization at the stage of designing high-rise buildings with steel construction

    NASA Astrophysics Data System (ADS)

    Vasilkin, Andrey

    2018-03-01

    The more design solutions an engineer can synthesize at the search stage of high-rise building design, the more likely it is that the final adopted variant will be the most efficient and economical. However, in modern market conditions, and given the complexity and responsibility of high-rise buildings, the designer does not have the time needed to develop, analyze and compare any significant number of options. To solve this problem, it is expedient to exploit the high potential of computer-aided design. To implement an automated search for design solutions, it is proposed to develop computing facilities whose application will significantly increase the productivity of the designer and reduce the complexity of designing. Methods of structural and parametric optimization have been adopted as the basis of these computing facilities. Their efficiency in the synthesis of design solutions is shown, and schemes that illustrate and explain the introduction of structural optimization into the traditional design of steel frames are constructed. To solve the problem of synthesis and comparison of design solutions for steel frames, it is proposed to develop computing facilities that significantly reduce the complexity of search designing, based on the use of methods of structural and parametric optimization.
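
Parametric optimization at its simplest is a feasibility-filtered search over a section catalogue; the section names, areas, load, and allowable stress below are hypothetical, and a real tool would check many more limit states than axial stress:

```python
def lightest_section(sections, load_n, allow_mpa):
    """Pick the lightest steel section whose axial stress load/A stays under
    the allowable value (area is a proxy for weight per unit length)."""
    ok = [(a, name) for name, a in sections.items()
          if load_n / (a * 1e-6) / 1e6 <= allow_mpa]   # A mm^2 -> stress MPa
    return min(ok)[1] if ok else None

# Hypothetical catalogue: section name -> cross-sectional area in mm^2
sections = {"HEA100": 2120.0, "HEA140": 3140.0, "HEA180": 4530.0}
print(lightest_section(sections, load_n=500e3, allow_mpa=235.0))   # HEA140
```

Structural optimization extends the same idea from member sizing to the frame topology itself, which is why the paper pairs the two methods.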

  13. Automated Historical and Real-Time Cyclone Discovery With Multimodal Remote Satellite Measurements

    NASA Astrophysics Data System (ADS)

    Ho, S.; Talukder, A.; Liu, T.; Tang, W.; Bingham, A.

    2008-12-01

    Existing cyclone detection and tracking solutions involve extensive manual analysis of modeled data and field campaign data by teams of experts. We have developed a novel automated global cyclone detection and tracking system that assimilates and shares information from multiple remote satellites. Combining multiple remote satellite measurements in an autonomous manner leverages the strengths of each individual satellite, and the use of multiple satellite data sources also yields significantly improved temporal tracking accuracy for cyclones. Our solution involves automated feature extraction and a machine learning technique based on an ensemble classifier and a Kalman filter for cyclone detection and tracking from multiple heterogeneous satellite data sources. This feature-based methodology, which focuses on automated cyclone discovery, is fundamentally different from, and actually complements, the well-known Dvorak technique for cyclone intensity estimation (which often relies on manual detection of cyclonic regions) from field and remote data. Our solution currently employs QuikSCAT wind measurements and the merged level 3 TRMM precipitation data for automated cyclone discovery; assimilation of other types of remote measurements is ongoing and planned for the near future. Experimental results on historical cyclone datasets demonstrate the superior performance of our automated approach compared to previous work. In our initial analysis, the performance of our detection solution compares favorably against the list of cyclones in the North Atlantic Ocean for the 2005 calendar year reported by the National Hurricane Center (NHC). We have also demonstrated the robustness of our tracking methodology by using multiple heterogeneous satellite data for detection and tracking of three arbitrary historical cyclones in other regions of the world.
Our cyclone detection and tracking methodology can be applied (i) to historical data, to support Earth scientists in climate modeling and the study of cyclone-climate interactions and to obtain a better understanding of the causes and effects of cyclones (e.g. cyclogenesis), and (ii) to automatic cyclone discovery in near real time using streaming satellite data, to support and improve the planning of global cyclone field campaigns. Additional satellite data from GOES and other orbiting satellites can be easily assimilated and integrated into our automated cyclone detection and tracking module to improve the temporal tracking accuracy of cyclones down to ½ hr and to reduce the incidence of false alarms.
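    The tracking half of such a pipeline typically rests on a Kalman filter. A minimal constant-velocity filter for one coordinate of a storm-centre track might look as follows; the paper's actual state model and tuning are not specified, so every parameter here is an assumption.

```python
def kalman_track(zs, dt=1.0, q=1e-3, r=0.5):
    """Smooth one coordinate of a storm-centre track with a 2-state
    (position, velocity) constant-velocity Kalman filter, pure Python.

    zs: noisy position fixes; q: process noise; r: measurement noise."""
    x, v = float(zs[0]), 0.0
    p00, p01, p11 = 1.0, 0.0, 1.0          # state covariance entries
    estimates = [x]
    for z in zs[1:]:
        # predict: x' = x + v*dt, P' = F P F^T + Q
        x += v * dt
        p00 += 2 * dt * p01 + dt * dt * p11 + q
        p01 += dt * p11
        p11 += q
        # update with a position-only measurement
        s = p00 + r                         # innovation variance
        k0, k1 = p00 / s, p01 / s           # Kalman gains
        innov = z - x
        x += k0 * innov
        v += k1 * innov
        p00, p01, p11 = (1 - k0) * p00, (1 - k0) * p01, p11 - k1 * p01
        estimates.append(x)
    return estimates
```

    Running one such filter per coordinate (latitude, longitude) gives a simple multi-sensor track smoother when fixes from several satellites are interleaved.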

  14. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    PubMed

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

    Automation systems are well established in industries and life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand of automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and medical and pharmaceutical fields, as well as research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be controlled regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  15. Defining the drivers for accepting decision making automation in air traffic management.

    PubMed

    Bekier, Marek; Molesworth, Brett R C; Williamson, Ann

    2011-04-01

    Air Traffic Management (ATM) operators are under increasing pressure to improve the efficiency of their operations to cater for forecasted increases in air traffic movements. One solution involves increasing the utilisation of automation within the ATM system. The success of this approach is contingent on Air Traffic Control Operators' (ATCOs') willingness to accept increased levels of automation. The main aim of the present research was to examine the drivers underpinning ATCOs' willingness to accept increased utilisation of automation within their role. Two fictitious scenarios involving the application of two new automated decision-making tools were created. The results of an online survey revealed that traditional predictors of automation acceptance, such as age, trust and job satisfaction, explain between 4 and 7% of the variance. Furthermore, these predictors varied depending on the purpose for which the automation was to be employed. These results are discussed from an applied and theoretical perspective. STATEMENT OF RELEVANCE: Efficiency improvements in ATM are required to cater for forecasted increases in air traffic movements. One solution is to increase the utilisation of automation within Air Traffic Control. The present research examines the drivers underpinning air traffic controllers' willingness to accept increased levels of automation in their role.

  16. Automatic protein structure solution from weak X-ray data

    NASA Astrophysics Data System (ADS)

    Skubák, Pavol; Pannu, Navraj S.

    2013-11-01

    Determining new protein structures from X-ray diffraction data at low resolution or with a weak anomalous signal is a difficult and often impossible task. Here we propose a multivariate algorithm that simultaneously combines the structure-determination steps. In tests on over 140 real data sets from the Protein Data Bank, we show that this combined approach can automatically build models where current algorithms fail, including an anisotropically diffracting 3.88 Å RNA polymerase II data set. The method seamlessly automates the process, is ideal for non-specialists and provides a mathematical framework for successfully combining various sources of information in image processing.

  17. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities.

    PubMed

    Oldham, Athenia L; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S; Duncan, Kathleen E

    2012-11-20

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources.

  18. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities

    PubMed Central

    2012-01-01

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  19. The cobas® 6800/8800 System: a new era of automation in molecular diagnostics.

    PubMed

    Cobb, Bryan; Simon, Christian O; Stramer, Susan L; Body, Barbara; Mitchell, P Shawn; Reisch, Natasa; Stevens, Wendy; Carmona, Sergio; Katz, Louis; Will, Stephen; Liesenfeld, Oliver

    2017-02-01

    Molecular diagnostics is a key component of laboratory medicine. Here, the authors review key triggers of ever-increasing automation in nucleic acid amplification testing (NAAT) with a focus on specific automated Polymerase Chain Reaction (PCR) testing and platforms such as the recently launched cobas® 6800 and cobas® 8800 Systems. The benefits of such automation for different stakeholders including patients, clinicians, laboratory personnel, hospital administrators, payers, and manufacturers are described. Areas Covered: The authors describe how molecular diagnostics has achieved total laboratory automation over time, rivaling clinical chemistry to significantly improve testing efficiency. Finally, the authors discuss how advances in automation decrease the development time for new tests enabling clinicians to more readily provide test results. Expert Commentary: The advancements described enable complete diagnostic solutions whereby specific test results can be combined with relevant patient data sets to allow healthcare providers to deliver comprehensive clinical recommendations in multiple fields ranging from infectious disease to outbreak management and blood safety solutions.

  20. Nomenclature and basic concepts in automation in the clinical laboratory setting: a practical glossary.

    PubMed

    Evangelopoulos, Angelos A; Dalamaga, Maria; Panoutsopoulos, Konstantinos; Dima, Kleanthi

    2013-01-01

    In the early 80s, the word "automation" in the clinical laboratory setting referred only to analyzers. In the late 80s and afterwards, automation found its way into all aspects of the diagnostic process, embracing not only the analytical but also the pre- and post-analytical phases. While laboratories in the eastern world, mainly Japan, paved the way for laboratory automation, US and European laboratories soon realized the benefits and were quick to follow. Clearly, automation and robotics will be a key survival tool in a very competitive and cost-conscious healthcare market. What sets automation technology apart from so many other efficiency solutions is the dramatic savings that it brings to the clinical laboratory. Further standardization will assure the success of this revolutionary new technology. One of the main difficulties laboratory managers and personnel must deal with when studying solutions to reengineer a laboratory is familiarizing themselves with the multidisciplinary and technical terminology of this new and exciting field. The present review/glossary aims to give an overview of the most frequently used terms within the scope of laboratory automation and to put laboratory automation on a sounder linguistic basis.

  1. Gravity-Assist Trajectories to the Ice Giants: An Automated Method to Catalog Mass- Or Time-Optimal Solutions

    NASA Technical Reports Server (NTRS)

    Hughes, Kyle M.; Knittel, Jeremy M.; Englander, Jacob A.

    2017-01-01

    This work presents an automated method of calculating mass (or time) optimal gravity-assist trajectories without a priori knowledge of the flyby-body combination. Since gravity assists are particularly crucial for reaching the outer Solar System, we use the Ice Giants, Uranus and Neptune, as example destinations for this work. Catalogs are also provided that list the most attractive trajectories found over launch dates ranging from 2024 to 2038. The tool developed to implement this method, called the Python EMTG Automated Trade Study Application (PEATSA), iteratively runs the Evolutionary Mission Trajectory Generator (EMTG), a NASA Goddard Space Flight Center in-house trajectory optimization tool. EMTG finds gravity-assist trajectories with impulsive maneuvers using a multiple-shooting structure along with stochastic methods (such as monotonic basin hopping) and may be run with or without an initial guess provided. PEATSA runs instances of EMTG in parallel over a grid of launch dates. After each set of runs completes, the best results within a neighborhood of launch dates are used to seed all other cases in that neighborhood-allowing the solutions across the range of launch dates to improve over each iteration. The results here are compared against trajectories found using a grid-search technique, and PEATSA is found to outperform the grid-search results for most launch years considered.

  2. Gravity-Assist Trajectories to the Ice Giants: An Automated Method to Catalog Mass-or Time-Optimal Solutions

    NASA Technical Reports Server (NTRS)

    Hughes, Kyle M.; Knittel, Jeremy M.; Englander, Jacob A.

    2017-01-01

    This work presents an automated method of calculating mass (or time) optimal gravity-assist trajectories without a priori knowledge of the flyby-body combination. Since gravity assists are particularly crucial for reaching the outer Solar System, we use the Ice Giants, Uranus and Neptune, as example destinations for this work. Catalogs are also provided that list the most attractive trajectories found over launch dates ranging from 2024 to 2038. The tool developed to implement this method, called the Python EMTG Automated Trade Study Application (PEATSA), iteratively runs the Evolutionary Mission Trajectory Generator (EMTG), a NASA Goddard Space Flight Center in-house trajectory optimization tool. EMTG finds gravity-assist trajectories with impulsive maneuvers using a multiple-shooting structure along with stochastic methods (such as monotonic basin hopping) and may be run with or without an initial guess provided. PEATSA runs instances of EMTG in parallel over a grid of launch dates. After each set of runs completes, the best results within a neighborhood of launch dates are used to seed all other cases in that neighborhood---allowing the solutions across the range of launch dates to improve over each iteration. The results here are compared against trajectories found using a grid-search technique, and PEATSA is found to outperform the grid-search results for most launch years considered.
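    The PEATSA loop described above (run all launch dates, then reseed each case with the best result found at nearby dates, and iterate) can be mimicked with a toy stand-in for EMTG. The cost landscape, the `run_case` seeding bonus, and all parameters are illustrative assumptions, not the real optimizer.

```python
def local_cost(day):
    """Toy rugged cost landscape over launch dates; stands in for the
    objective value EMTG would return for one launch date."""
    return (day * 7) % 10 + 1.0

def run_case(day, seed=None):
    """One 'optimizer' run. With a good seed from a neighbouring launch
    date, the run is assumed to converge near the seeded solution."""
    cost = local_cost(day)
    if seed is not None:
        cost = min(cost, seed + 0.2)
    return cost

def peatsa_iterate(days, n_iters=5, radius=2):
    """PEATSA-style loop: run every case, then reseed each case with the
    best result within +/- radius launch days, and repeat."""
    best = {d: run_case(d) for d in days}
    for _ in range(n_iters):
        for d in days:
            neighbour_best = min(best[n] for n in days if abs(n - d) <= radius)
            best[d] = min(best[d], run_case(d, seed=neighbour_best))
    return best
```

    Good solutions propagate outward through the launch-date grid over the iterations, which is why the scheme can beat an unseeded grid search.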

  3. Toward the automated analysis of plasma physics problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mynick, H.E.

    1989-04-01

    A program (CALC) is described which carries out nontrivial plasma physics calculations in a manner intended to emulate the approach of a human theorist. This includes the initial process of gathering the relevant equations from a plasma knowledge base and then determining how to solve them. Solution of the sets of equations governing physics problems, which in general have a nonuniform, irregular structure not amenable to solution by standardized algorithmic procedures, is facilitated by an analysis of the structure of the equations and the relations among them. This often permits decomposition of the full problem into subproblems, and other simplifications in form, which render the resultant subsystems soluble by more standardized tools. CALC's operation is illustrated by a detailed description of its treatment of a sample plasma calculation. 5 refs., 3 figs.
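    The kind of structural decomposition described can be illustrated with a simple peeling strategy: repeatedly extract equations with a single remaining unknown, and leave whatever cannot be peeled as a coupled subproblem. This is a hypothetical sketch of the idea, not CALC's actual algorithm.

```python
def decompose(equations):
    """Order equations into solvable stages.

    equations: dict mapping equation name -> set of variables it involves.
    At each stage, every equation with exactly one still-unknown variable
    is solved and its variables marked known; equations that can never be
    peeled this way are returned together as one coupled subproblem."""
    solved, stages = set(), []
    pending = dict(equations)
    while pending:
        stage = [name for name, vars_ in pending.items()
                 if len(vars_ - solved) == 1]
        if not stage:
            # remaining equations are mutually coupled: one subproblem
            stages.append(sorted(pending))
            break
        for name in stage:
            solved |= pending.pop(name)
        stages.append(sorted(stage))
    return stages
```

    The sequential stages can then be handed to simple solvers one at a time, while the residual coupled block needs a more general (e.g. simultaneous) method, mirroring the division of labour the abstract describes.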

  4. IFLA General Conference, 1986. Pre-Conference Seminar on Automated Systems for Access to Multilingual and Multiscript Library materials: Problems and Solutions. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations and Institutions, The Hague (Netherlands).

    A seminar which considered problems and solutions regarding automated systems for access to multilingual and multiscript library materials was held as a pre-session before the IFLA general conference in 1986. Papers presented include: (1) "Romanized and Transliterated Databases of Asian Language Materials--History, Problems, and…

  5. Automation of a high-speed imaging setup for differential viscosity measurements

    NASA Astrophysics Data System (ADS)

    Hurth, C.; Duane, B.; Whitfield, D.; Smith, S.; Nordquist, A.; Zenhausern, F.

    2013-12-01

    We present the automation of a setup previously used to assess the viscosity of pleural effusion samples and to discriminate between transudates and exudates, an important first step in clinical diagnostics. The presented automation includes the design, testing, and characterization of a vacuum-actuated loading station that handles the 2 mm glass spheres used as sensors, as well as the engineering of an electronic printed circuit board (PCB) incorporating a microcontroller and its synchronization with a commercial high-speed camera operating at 10 000 fps. The present work therefore focuses on the instrumentation-related automation efforts, as the general method and clinical application have been reported earlier [Hurth et al., J. Appl. Phys. 110, 034701 (2011)]. In addition, we validate the performance of the automated setup with a calibration for viscosity measurements using water/glycerol standard solutions and the determination of the viscosity of an "unknown" solution of hydroxyethyl cellulose.
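    For a falling-sphere setup like this, viscosity is classically recovered from the sphere's settling velocity via Stokes' law. The helper below is a generic sketch of that relation (valid only in the creeping-flow regime), not the authors' actual calibration procedure.

```python
def stokes_viscosity(radius_m, rho_sphere, rho_fluid, velocity_m_s, g=9.81):
    """Dynamic viscosity [Pa*s] from a sphere's terminal settling velocity:
    eta = 2 r^2 (rho_sphere - rho_fluid) g / (9 v).
    Assumes creeping flow (low Reynolds number); densities in kg/m^3."""
    return 2.0 * radius_m ** 2 * (rho_sphere - rho_fluid) * g / (9.0 * velocity_m_s)
```

    For a 2 mm glass sphere (radius 1 mm, density ~2500 kg/m^3) sinking at a few mm/s in a glycerol-like liquid, this gives a viscosity on the order of 1 Pa*s, the range where water/glycerol standards are typically used.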

  6. Automation of a high-speed imaging setup for differential viscosity measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurth, C.; Duane, B.; Whitfield, D.

    We present the automation of a setup previously used to assess the viscosity of pleural effusion samples and to discriminate between transudates and exudates, an important first step in clinical diagnostics. The presented automation includes the design, testing, and characterization of a vacuum-actuated loading station that handles the 2 mm glass spheres used as sensors, as well as the engineering of an electronic printed circuit board (PCB) incorporating a microcontroller and its synchronization with a commercial high-speed camera operating at 10 000 fps. The present work therefore focuses on the instrumentation-related automation efforts, as the general method and clinical application have been reported earlier [Hurth et al., J. Appl. Phys. 110, 034701 (2011)]. In addition, we validate the performance of the automated setup with a calibration for viscosity measurements using water/glycerol standard solutions and the determination of the viscosity of an “unknown” solution of hydroxyethyl cellulose.

  7. Adaptive Finite Element Methods for Continuum Damage Modeling

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Tworzydlo, W. W.; Xiques, K. E.

    1995-01-01

    The paper presents an application of adaptive finite element methods to the modeling of low-cycle continuum damage and life prediction of high-temperature components. The major objective is to provide automated and accurate modeling of damaged zones through adaptive mesh refinement and adaptive time-stepping methods. The damage modeling methodology is implemented in the usual way, by embedding damage evolution in the transient nonlinear solution of elasto-viscoplastic deformation problems. This nonlinear boundary-value problem is discretized by adaptive finite element methods. The automated h-adaptive mesh refinements are driven by error indicators based on selected principal variables in the problem (stresses, non-elastic strains, damage, etc.). In the time domain, adaptive time-stepping is used, combined with a predictor-corrector time-marching algorithm; time-step selection is controlled by the required time accuracy. In order to take into account the strong temperature dependency of material parameters, the nonlinear structural solution is coupled with thermal analyses (one-way coupling). Several test examples illustrate the importance and benefits of adaptive mesh refinements in accurate prediction of damage levels and failure time.

  8. Design of automatic leveling and centering system of theodolite

    NASA Astrophysics Data System (ADS)

    Liu, Chun-tong; He, Zhen-Xin; Huang, Xian-xiang; Zhan, Ying

    2012-09-01

    To realize theodolite automation and improve azimuth angle measurement, a theodolite automatic leveling and centering system with leveling-error compensation is designed, covering the overall system solution, the selection of key components, the mechanical structure for leveling and centering, and the system software. The redesigned leveling feet are driven by DC servo motors, and an electronically controlled centering device is installed. Using high-precision tilt sensors for horizontal-skew detection ensures the effectiveness of the leveling-error compensation. The centre of the aiming round mark is located by digital image processing of images from an area-array CCD, so the centering measurement precision reaches the pixel level, which makes accurate centering of the theodolite possible. Finally, experiments are conducted using the automatic leveling and centering system of the theodolite. The results show that the system realizes automatic operation with a high centering accuracy of 0.04 mm. The measurement precision of the orientation angle after leveling-error compensation is improved compared with the traditional method. The automatic leveling and centering system thus satisfies the requirements for both measurement precision and automation.
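    Locating a round mark centre by digital image processing can be sketched, under simple assumptions, as the centroid of all sufficiently dark pixels. The real CCD processing is certainly more elaborate (sub-pixel fitting, distortion correction); this is only the core idea.

```python
def mark_centre(image, threshold=128):
    """Centre of a dark round mark: centroid of all pixels below
    `threshold`. `image` is a list of rows of 0-255 grey values.
    Returns (x, y) in pixel coordinates."""
    sx = sy = n = 0
    for y, row in enumerate(image):
        for x, pix in enumerate(row):
            if pix < threshold:
                sx += x
                sy += y
                n += 1
    if n == 0:
        raise ValueError("no mark found")
    return sx / n, sy / n
```

    Because the centroid averages over many pixels, its precision can be a fraction of a pixel, which is what makes pixel-level (and better) centering feasible.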

  9. Development of an automated large-scale protein-crystallization and monitoring system for high-throughput protein-structure analyses.

    PubMed

    Hiraki, Masahiko; Kato, Ryuichi; Nagai, Minoru; Satoh, Tadashi; Hirano, Satoshi; Ihara, Kentaro; Kudo, Norio; Nagae, Masamichi; Kobayashi, Masanori; Inoue, Michio; Uejima, Tamami; Oda, Shunichiro; Chavas, Leonard M G; Akutsu, Masato; Yamada, Yusuke; Kawasaki, Masato; Matsugaki, Naohiro; Igarashi, Noriyuki; Suzuki, Mamoru; Wakatsuki, Soichi

    2006-09-01

    Protein crystallization remains one of the bottlenecks in crystallographic analysis of macromolecules. An automated large-scale protein-crystallization system named PXS has been developed consisting of the following subsystems, which proceed in parallel under unified control software: dispensing precipitants and protein solutions, sealing crystallization plates, carrying robot, incubators, observation system and image-storage server. A sitting-drop crystallization plate specialized for PXS has also been designed and developed. PXS can set up 7680 drops for vapour diffusion per hour, which includes time for replenishing supplies such as disposable tips and crystallization plates. Images of the crystallization drops are automatically recorded according to a preprogrammed schedule and can be viewed by users remotely using web-based browser software. A number of protein crystals were successfully produced and several protein structures could be determined directly from crystals grown by PXS. In other cases, X-ray quality crystals were obtained by further optimization by manual screening based on the conditions found by PXS.

  10. Exploring the Use of a Test Automation Framework

    NASA Technical Reports Server (NTRS)

    Cervantes, Alex

    2009-01-01

    It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems in the implementation phase of a development project occur, it normally causes the software delivery date to slide. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.

  11. Automated processing of endoscopic surgical instruments.

    PubMed

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  12. Approaches to automated protein crystal harvesting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deller, Marc C., E-mail: mdeller@scripps.edu; Rupp, Bernhard, E-mail: mdeller@scripps.edu

    Approaches to automated and robot-assisted harvesting of protein crystals are critically reviewed. While no true turn-key solutions for automation of protein crystal harvesting are currently available, systems incorporating advanced robotics and micro-electromechanical systems represent exciting developments with the potential to revolutionize the way in which protein crystals are harvested.

  13. Translation: Aids, Robots, and Automation.

    ERIC Educational Resources Information Center

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  14. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musyurka, A. V., E-mail: musyurkaav@burges.rushydro.ru

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) for hydraulic engineering installations at the Bureya HPP, and assuring a reliable process for monitoring hydraulic engineering installations. Project implementation represents a timely solution of problems addressed by the hydraulic engineering installation diagnostics section.

  15. Direct-method SAD phasing with partial-structure iteration: towards automation.

    PubMed

    Wang, J W; Chen, J R; Gu, Y X; Zheng, C D; Fan, H F

    2004-11-01

    The probability formula of direct-method SAD (single-wavelength anomalous diffraction) phasing proposed by Fan & Gu (1985, Acta Cryst. A41, 280-284) contains partial-structure information in the form of a Sim-weighting term. Previously, only the substructure of anomalous scatterers had been included in this term. In the case that the subsequent density modification and model building yield only structure fragments, which do not straightforwardly lead to the complete solution, the partial structure can be fed back into the Sim-weighting term of the probability formula in order to strengthen its phasing power and to benefit the subsequent automatic model building. The procedure has been tested with experimental SAD data from two known proteins with copper and sulfur as the anomalous scatterers.

  16. An ontology-driven tool for structured data acquisition using Web forms.

    PubMed

    Gonçalves, Rafael S; Tu, Samson W; Nyulas, Csongor I; Tierney, Michael J; Musen, Mark A

    2017-08-01

    Structured data acquisition is a common task that is widely performed in biomedicine. However, current solutions for this task are far from providing a means to structure data in such a way that it can be automatically employed in decision making (e.g., in our example application domain of clinical functional assessment, for determining eligibility for disability benefits) based on conclusions derived from acquired data (e.g., assessment of impaired motor function). To use data in these settings, we need it structured in a way that can be exploited by automated reasoning systems, for instance, in the Web Ontology Language (OWL); the de facto ontology language for the Web. We tackle the problem of generating Web-based assessment forms from OWL ontologies, and aggregating input gathered through these forms as an ontology of "semantically-enriched" form data that can be queried using an RDF query language, such as SPARQL. We developed an ontology-based structured data acquisition system, which we present through its specific application to the clinical functional assessment domain. We found that data gathered through our system is highly amenable to automatic analysis using queries. We demonstrated how ontologies can be used to help structuring Web-based forms and to semantically enrich the data elements of the acquired structured data. The ontologies associated with the enriched data elements enable automated inferences and provide a rich vocabulary for performing queries.
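    The "semantically enriched" aggregation and querying described above can be caricatured with plain triples and a filter function standing in for RDF and SPARQL. All function names and form fields below are hypothetical illustrations, not the system's API.

```python
def to_triples(form_data, subject):
    """Flatten one submitted form into (subject, predicate, value) triples,
    mimicking RDF-style aggregation of acquired form data."""
    return {(subject, field, value) for field, value in form_data.items()}

def query(triples, predicate=None, value=None):
    """Tiny SPARQL-like filter: return the subjects whose triples match
    the given predicate and/or value (None acts as a wildcard)."""
    return {s for s, p, v in triples
            if (predicate is None or p == predicate)
            and (value is None or v == value)}
```

    With data in this shape, an eligibility rule such as "patients with impaired motor function" reduces to a single query rather than bespoke parsing of each form, which is the payoff of acquiring the data as ontology-backed triples in the first place.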

  17. Provider automation. Focusing on the big picture.

    PubMed

    Watson, S

    1995-06-01

    St. Vincent's Hospital in Birmingham, Ala., is preparing for a new world of health care by creating an enterprisewide information systems strategy rather than developing automation solutions for departmental "islands."

  18. Catch and Patch: A Pipette-Based Approach for Automating Patch Clamp That Enables Cell Selection and Fast Compound Application.

    PubMed

    Danker, Timm; Braun, Franziska; Silbernagl, Nikole; Guenther, Elke

    2016-03-01

    Manual patch clamp, the gold standard of electrophysiology, represents a powerful and versatile toolbox to stimulate, modulate, and record ion channel activity from membrane fragments and whole cells. The electrophysiological readout can be combined with fluorescent or optogenetic methods and allows for ultrafast solution exchanges using specialized microfluidic tools. A hallmark of manual patch clamp is the intentional selection of individual cells for recording, often an essential prerequisite to generate meaningful data. So far, available automation solutions rely on random cell usage in the closed environment of a chip and thus sacrifice much of this versatility by design. To parallelize and automate the traditional patch clamp technique while preserving the full versatility of the method, we developed an approach to automation, which is based on active cell handling and targeted electrode placement rather than on random processes. This is achieved through an automated pipette positioning system, which guides the tips of recording pipettes with micrometer precision to a microfluidic cell handling device. Using a patch pipette array mounted on a conventional micromanipulator, our automated patch clamp process mimics the original manual patch clamp as closely as possible while achieving a configuration where recordings are obtained from many patch electrodes in parallel. In addition, our implementation is extensible by design to allow the easy integration of specialized equipment such as ultrafast compound application tools. The resulting system offers fully automated patch clamp on purposely selected cells and combines high-quality gigaseal recordings with solution switching on the millisecond timescale.

  19. Automation of NMR structure determination of proteins.

    PubMed

    Altieri, Amanda S; Byrd, R Andrew

    2004-10-01

    The automation of protein structure determination using NMR is coming of age. The tedious processes of resonance assignment, followed by assignment of NOE (nuclear Overhauser enhancement) interactions (now intertwined with structure calculation), assembly of input files for structure calculation, intermediate analyses of incorrect assignments and bad input data, and finally structure validation are all being automated with sophisticated software tools. The robustness of the different approaches is still challenged by problems of completeness and uniqueness; nevertheless, the future is very bright for automation of NMR structure generation to approach the levels found in X-ray crystallography. Currently, nearly fully automated structure determination is possible for small proteins, and the prospect for medium-sized and large proteins is good. Copyright 2004 Elsevier Ltd.

  20. The development of structural x-ray crystallography

    NASA Astrophysics Data System (ADS)

    Woolfson, M. M.

    2018-03-01

    From its birth in 1912, when only the simplest structures could be solved, x-ray structural crystallography is now able to solve macromolecular structures containing many thousands of independent non-hydrogen atoms. This progress has depended on, and been driven by, great technical advances in the development of powerful synchrotron x-ray sources, advanced automated equipment for the collection and storage of large data sets and powerful computers to deal with everything from data processing to running programmes employing complex algorithms for the automatic solution of structures. The sheer number of developments in the subject over the past century makes it impossible for this review to be exhaustive, but it will describe some major developments that will enable the reader to understand how the subject has grown from its humble beginnings to what it is today.

  1. An industrial engineering approach to laboratory automation for high throughput screening

    PubMed Central

    Menke, Karl C.

    2000-01-01

    Across the pharmaceutical industry, there are a variety of approaches to laboratory automation for high throughput screening. At Sphinx Pharmaceuticals, the principles of industrial engineering have been applied to systematically identify and develop those automated solutions that provide the greatest value to the scientists engaged in lead generation. PMID:18924701

  2. Automation Challenges of the 80's: What to Do until Your Integrated Library System Arrives.

    ERIC Educational Resources Information Center

    Allan, Ferne C.; Shields, Joyce M.

    1986-01-01

    A medium-sized aerospace library has developed interim solutions to automation needs by using software and equipment that were available in-house in preparation for an expected integrated library system. Automated processes include an authors' file of items written by employees, journal routing (including routing slips), statistics, journal…

  3. Adapting for Scalability: Automating the Video Assessment of Instructional Learning

    ERIC Educational Resources Information Center

    Roberts, Amy M.; LoCasale-Crouch, Jennifer; Hamre, Bridget K.; Buckrop, Jordan M.

    2017-01-01

    Although scalable programs, such as online courses, have the potential to reach broad audiences, they may pose challenges to evaluating learners' knowledge and skills. Automated scoring offers a possible solution. In the current paper, we describe the process of creating and testing an automated means of scoring a validated measure of teachers'…

  4. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  5. Small Libraries Online: Automating Circulation and Public Access Catalogs. Revised and Updated.

    ERIC Educational Resources Information Center

    Peterson, Christine

    This manual provides information to help libraries in Texas considering an automation project, with special emphasis on smaller libraries. The solutions discussed are microcomputer-based. The manual begins with a discussion of how to prepare for the automation of a library, including planning, approval, collection decisions, policy, and staffing.…

  6. Computational tool for the early screening of monoclonal antibodies for their viscosities

    PubMed Central

    Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L

    2016-01-01

    Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600
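
    The published SCM tool maps charges onto a homology-modeled 3-D surface, which is beyond a short sketch. The toy Python below illustrates only the underlying idea of scanning a sequence for charged patches; the residue charges, window size, and example sequence are illustrative assumptions of mine, not the SCM algorithm.

    ```python
    # Toy sliding-window charge scan over an antibody sequence.
    # NOTE: this is NOT the published SCM algorithm, which computes a
    # spatial charge map on a homology-modeled 3-D structure; it only
    # sketches the idea of locating charged patches from sequence alone.

    RESIDUE_CHARGE = {"D": -1.0, "E": -1.0, "K": +1.0, "R": +1.0, "H": +0.1}

    def window_charges(sequence, window=10):
        """Return (start_index, net_charge) for every window of the sequence."""
        charges = [RESIDUE_CHARGE.get(aa, 0.0) for aa in sequence]
        return [(i, sum(charges[i:i + window]))
                for i in range(len(sequence) - window + 1)]

    def most_negative_patch(sequence, window=10):
        """Window with the most negative net charge (a crude patch flag)."""
        return min(window_charges(sequence, window), key=lambda t: t[1])

    # Illustrative (made-up) heavy-chain fragment with an acidic stretch
    seq = "QVQLVESGGGLVQPGGSLRLSCAASGFTFSDEDDEYMSWVRQAPGK"
    start, charge = most_negative_patch(seq)   # finds the DEDDE region
    ```

    A real screen would compute charges on the modeled variable-region surface; the sequence-only version here merely shows why a quantitative score is amenable to high-throughput automated analysis.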

  7. Distributed computing for macromolecular crystallography

    PubMed Central

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Ballard, Charles

    2018-01-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community. PMID:29533240

  8. Distributed computing for macromolecular crystallography.

    PubMed

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Winn, Martyn; Ballard, Charles

    2018-02-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community.

  9. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  10. CASMI 2013: Identification of Small Molecules by Tandem Mass Spectrometry Combined with Database and Literature Mining

    PubMed Central

    Newsome, Andrew G.; Nikolic, Dejan

    2014-01-01

    The Critical Assessment of Small Molecule Identification (CASMI) contest was initiated in 2012 to evaluate manual and automated strategies for the identification of small molecules from raw mass spectrometric data. The authors participated in both category 1 (molecular formula determination) and category 2 (molecular structure determination) of the second annual CASMI contest (CASMI 2013) using slow but effective manual methods. The provided high resolution mass spectrometric data were interpreted manually using a combination of molecular formula calculators, fragment and neutral loss analysis, literature consultation, manual database searches, deductive logic, and experience. The authors submitted correct formulas as lead candidates for 16 of 16 challenges and submitted correct structure solutions as lead candidates for 14 of 16 challenges. One structure submission (Challenge 3) was very close but not exact (N2-acetylglutaminylisoleucinamide instead of the correct N2-acetylglutaminylleucinamide). A solution for one (Challenge 13) was not submitted due to an inability to reconcile the provided fragmentation pattern with any known structures with the provided molecular composition. PMID:26819877

  11. Automated procedures for sizing aerospace vehicle structures /SAVES/

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  12. Automated tilt series alignment and tomographic reconstruction in IMOD.

    PubMed

    Mastronarde, David N; Held, Susannah R

    2017-02-01

    Automated tomographic reconstruction is now possible in the IMOD software package, including the merging of tomograms taken around two orthogonal axes. Several developments enable the production of high-quality tomograms. When using fiducial markers for alignment, the markers to be tracked through the series are chosen automatically; if there is an excess of markers available, a well-distributed subset is selected that is most likely to track well. Marker positions are refined by applying an edge-enhancing Sobel filter, which results in a 20% improvement in alignment error for plastic-embedded samples and 10% for frozen-hydrated samples. Robust fitting, in which outlying points are given less or no weight in computing the fitting error, is used to obtain an alignment solution, so that aberrant points from the automated tracking can have little effect on the alignment. When merging two dual-axis tomograms, the alignment between them is refined from correlations between local patches; a measure of structure was developed so that patches with insufficient structure to give accurate correlations can now be excluded automatically. We have also developed a script for running all steps in the reconstruction process with a flexible mechanism for setting parameters, and we have added a user interface for batch processing of tilt series to the Etomo program in IMOD. Batch processing is fully compatible with interactive processing and can increase efficiency even when the automation is not fully successful, because users can focus their effort on the steps that require manual intervention. Copyright © 2016 Elsevier Inc. All rights reserved.
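
    Robust fitting of the kind described, in which outlying points are given less or no weight, can be sketched in a few lines. The iteratively reweighted least-squares fit below uses Tukey's bisquare weights with an MAD-based scale estimate; these are standard choices for a sketch, not necessarily the exact scheme IMOD implements.

    ```python
    import statistics

    def robust_line_fit(xs, ys, iterations=10, c=4.685):
        """Fit y = a*x + b, downweighting outliers with Tukey's bisquare.

        Aberrant points receive weight zero once their residual exceeds
        c times the robust scale estimate, so they stop influencing the fit.
        """
        weights = [1.0] * len(xs)
        a = b = 0.0
        for _ in range(iterations):
            # weighted least-squares solution for slope a and intercept b
            sw = sum(weights)
            sx = sum(w * x for w, x in zip(weights, xs))
            sy = sum(w * y for w, y in zip(weights, ys))
            sxx = sum(w * x * x for w, x in zip(weights, xs))
            sxy = sum(w * x * y for w, x, y in zip(weights, xs, ys))
            denom = sw * sxx - sx * sx
            a = (sw * sxy - sx * sy) / denom
            b = (sy - a * sx) / sw
            residuals = [y - (a * x + b) for x, y in zip(xs, ys)]
            # robust scale from the median absolute deviation
            scale = statistics.median(abs(r) for r in residuals) / 0.6745
            if scale < 1e-12:      # essentially perfect fit: stop reweighting
                break
            weights = [(1 - (r / (c * scale)) ** 2) ** 2
                       if abs(r) < c * scale else 0.0
                       for r in residuals]
        return a, b
    ```

    On data lying on a line except for one gross outlier, the outlier is zero-weighted after the first pass and the remaining points reproduce the line exactly, which is the behavior the alignment step relies on.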

  13. Automated protein NMR structure determination using wavelet de-noised NOESY spectra.

    PubMed

    Dancea, Felician; Günther, Ulrich

    2005-11-01

    A major time-consuming step of protein NMR structure determination is the generation of reliable NOESY cross peak lists which usually requires a significant amount of manual interaction. Here we present a new algorithm for automated peak picking involving wavelet de-noised NOESY spectra in a process where the identification of peaks is coupled to automated structure determination. The core of this method is the generation of incremental peak lists by applying different wavelet de-noising procedures which yield peak lists of a different noise content. In combination with additional filters which probe the consistency of the peak lists, good convergence of the NOESY-based automated structure determination could be achieved. These algorithms were implemented in the context of the ARIA software for automated NOE assignment and structure determination and were validated for a polysulfide-sulfur transferase protein of known structure. The procedures presented here should be commonly applicable for efficient protein NMR structure determination and automated NMR peak picking.
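
    As a hedged sketch of the wavelet de-noising idea (a single-level Haar transform with soft thresholding, far simpler than the incremental multi-procedure scheme the paper couples to ARIA), small detail coefficients that are mostly noise can be shrunk to zero while strong features such as real cross peaks survive:

    ```python
    import math

    SQRT2 = math.sqrt(2.0)

    def haar_denoise(signal, threshold):
        """One-level Haar wavelet de-noising with soft thresholding.

        Assumes len(signal) is even. Small detail coefficients (mostly
        noise) are shrunk toward zero; strong features are preserved.
        """
        approx = [(signal[i] + signal[i + 1]) / SQRT2
                  for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / SQRT2
                  for i in range(0, len(signal), 2)]
        # soft threshold: shrink each magnitude by `threshold`, clip at zero
        detail = [math.copysign(max(abs(d) - threshold, 0.0), d)
                  for d in detail]
        # inverse Haar transform
        out = []
        for a, d in zip(approx, detail):
            out.append((a + d) / SQRT2)
            out.append((a - d) / SQRT2)
        return out
    ```

    Varying the threshold yields peak lists of different noise content, which is the knob the incremental peak-list idea in the abstract turns.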

  14. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  15. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  16. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  17. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  18. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  19. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    PubMed

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  20. Mass spectrometry-based monitoring of millisecond protein–ligand binding dynamics using an automated microfluidic platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cong, Yongzheng; Katipamula, Shanta; Trader, Cameron D.

    2016-01-01

    Characterizing protein-ligand binding dynamics is crucial for understanding protein function and developing new therapeutic agents. We have developed a novel microfluidic platform that features rapid mixing of protein and ligand solutions, variable incubation times, and on-chip electrospray ionization to perform label-free, solution-based monitoring of protein-ligand binding dynamics. This platform offers many advantages including automated processing, rapid mixing, and low sample consumption.

  1. Towards Evolving Electronic Circuits for Autonomous Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Haith, Gary L.; Colombano, Silvano P.; Stassinopoulos, Dimitris

    2000-01-01

    The relatively new field of Evolvable Hardware studies how simulated evolution can reconfigure, adapt, and design hardware structures in an automated manner. Space applications, especially those requiring autonomy, are potential beneficiaries of evolvable hardware. For example, robotic drilling from a mobile platform requires high-bandwidth controller circuits that are difficult to design. In this paper, we present automated design techniques based on evolutionary search that could potentially be used in such applications. First, we present a method of automatically generating analog circuit designs using evolutionary search and a circuit construction language. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm, we present experimental results for five design tasks. Second, we investigate the use of coevolution in automated circuit design. We examine fitness evaluation by comparing the effectiveness of four fitness schedules. The results indicate that solution quality is highest with static and co-evolving fitness schedules as compared to the other two dynamic schedules. We discuss these results and offer two possible explanations for the observed behavior: retention of useful information, and alignment of problem difficulty with circuit proficiency.
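
    The evolutionary search described above can be sketched, at toy scale, with a plain genetic algorithm. The example below evolves two device values (R and C) toward a target RC time constant; this stand-in task, the population sizes, and the mutation scheme are my own simplifying assumptions, far simpler than evolving circuit size and topology with a construction language as the paper does.

    ```python
    import math
    import random

    def evolve_rc(target_tau=1e-3, pop_size=40, generations=60, seed=0):
        """Toy genetic algorithm: evolve (R, C) so that R*C matches target_tau.

        A stand-in for circuit evolution; real evolvable-hardware systems
        also evolve topology and device count, not just two device values.
        """
        rng = random.Random(seed)

        def fitness(ind):
            r, c = ind
            return -abs(r * c - target_tau)   # closer to target is better

        def mutate(ind):
            # multiplicative log-normal jitter on each device value
            return tuple(v * math.exp(rng.gauss(0.0, 0.2)) for v in ind)

        # random initial population: R in 1e2..1e6 ohm, C in 1e-9..1e-5 F
        pop = [(10 ** rng.uniform(2, 6), 10 ** rng.uniform(-9, -5))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            survivors = pop[: pop_size // 4]          # truncation selection
            pop = survivors + [mutate(rng.choice(survivors))
                               for _ in range(pop_size - len(survivors))]
        return max(pop, key=fitness)

    best_r, best_c = evolve_rc()
    ```

    Keeping the survivors unchanged (elitism) guarantees the best candidate never degrades between generations, a design choice that also underlies the fitness-schedule comparisons the abstract mentions.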

  2. Automated fiber placement composite manufacturing: The mission at MSFC's Productivity Enhancement Complex

    NASA Technical Reports Server (NTRS)

    Vickers, John H.; Pelham, Larry I.

    1993-01-01

    Automated fiber placement is a manufacturing process used for producing complex composite structures. It is a notable leap to the state-of-the-art in technology for automated composite manufacturing. The fiber placement capability was established at the Marshall Space Flight Center's (MSFC) Productivity Enhancement Complex in 1992 in collaboration with Thiokol Corporation to provide materials and processes research and development, and to fabricate components for many of the Center's Programs. The Fiber Placement System (FPX) was developed as a distinct solution to problems inherent to other automated composite manufacturing systems. This equipment provides unique capabilities to build composite parts in complex 3-D shapes with concave and other asymmetrical configurations. Components with complex geometries and localized reinforcements usually require labor intensive efforts resulting in expensive, less reproducible components; the fiber placement system has the features necessary to overcome these conditions. The mechanical systems of the equipment have the motion characteristics of a filament winder and the fiber lay-up attributes of a tape laying machine, with the additional capabilities of differential tow payout speeds, compaction and cut-restart to selectively place the correct number of fibers where the design dictates. This capability will produce a repeatable process resulting in lower cost and improved quality and reliability.

  3. Automated electrochemical synthesis and photoelectrochemical characterization of Zn1-xCo(x)O thin films for solar hydrogen production.

    PubMed

    Jaramillo, Thomas F; Baeck, Sung-Hyeon; Kleiman-Shwarsctein, Alan; Choi, Kyoung-Shin; Stucky, Galen D; McFarland, Eric W

    2005-01-01

    High-throughput electrochemical methods have been developed for the investigation of Zn1-xCo(x)O films for photoelectrochemical hydrogen production from water. A library of 120 samples containing 27 different compositions (0

  4. Computational Methods for Structural Mechanics and Dynamics, part 1

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson (Editor); Housner, Jerrold M. (Editor); Tanner, John A. (Editor); Hayduk, Robert J. (Editor)

    1989-01-01

    The structural analysis methods research has several goals. One goal is to develop analysis methods that are general. This goal of generality leads naturally to finite-element methods, but the research will also include other structural analysis methods. Another goal is that the methods be amenable to error analysis; that is, given a physical problem and a mathematical model of that problem, an analyst would like to know the probable error in predicting a given response quantity. The ultimate objective is to specify the error tolerances and to use automated logic to adjust the mathematical model or solution strategy to obtain that accuracy. A third goal is to develop structural analysis methods that can exploit parallel processing computers. The structural analysis methods research will focus initially on three types of problems: local/global nonlinear stress analysis, nonlinear transient dynamics, and tire modeling.

  5. Human-Autonomy Teaming: Supporting Dynamically Adjustable Collaboration

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2017-01-01

    This presentation is a technical update for the NATO-STO HFM-247 working group. Our progress on four goals will be discussed. For Goal 1, a conceptual model of HAT is presented. HAT looks to make automation act as more of a teammate, by having it communicate with human operators in a more human, goal-directed, manner which provides transparency into the reasoning behind automated recommendations and actions. This, in turn, permits more trust in the automation when it is appropriate, and less when it is not, allowing a more targeted supervision of automated functions. For Goal 2, we wanted to test these concepts and principles. We present findings from a recent simulation and describe two in progress. Goal 3 was to develop pattern(s) of HAT solution(s). These were originally presented at HCII 2016 and are reviewed. Goal 4 is to develop a re-usable HAT software agent. This is an ongoing effort to be delivered October 2017.

  6. Simultaneous 3D-vibration measurement using a single laser beam device

    NASA Astrophysics Data System (ADS)

    Brecher, Christian; Guralnik, Alexander; Baümler, Stephan

    2012-06-01

    Today's commercial solutions for vibration measurement and modal analysis are 3D-scanning laser Doppler vibrometers, mainly used for open surfaces in the automotive and aerospace industries, and classic three-axial accelerometers, used in civil engineering, in most industrial applications in manufacturing environments, and particularly for partially closed structures. This paper presents a novel measurement approach using a single laser beam device and optical reflectors to simultaneously perform 3D-dynamic measurement as well as geometry measurement of the investigated object. We show the application of this so-called laser tracker for modal testing of structures on a mechanical manufacturing shop floor. A holistic measurement method is developed comprising manual reflector placement, semi-automated geometric modeling of investigated objects and fully automated vibration measurement up to 1000 Hz and down to amplitudes of a few microns. Additionally, a quickly set-up dynamic measurement of moving objects using a tracking technique is presented that uses only the device's own functionalities and requires neither a predefined moving path of the target nor electronic synchronization to the moving object.

  7. Parametric Study of a YAV-8B Harrier in Ground Effect using Time-Dependent Navier-Stokes Computations

    NASA Technical Reports Server (NTRS)

    Pandya, Shishir; Chaderjian, Neal; Ahmad, Jasim; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A process is described which enables the generation of 35 time-dependent viscous solutions for a YAV-8B Harrier in ground effect in one week. Overset grids are used to model the complex geometry of the Harrier aircraft and the interaction of its jets with the ground plane and low-speed ambient flow. The time required to complete this parametric study is drastically reduced through the use of process automation, modern computational platforms, and parallel computing. Moreover, a dual-time-stepping algorithm is described which improves solution robustness. Unsteady flow visualization and a frequency domain analysis are also used to identify and correlate key flow structures with the time variation of lift.

  8. Summaries of press automation conference presented

    NASA Astrophysics Data System (ADS)

    Makhlin, A. Y.; Pokrovskaya, G. M.

    1985-01-01

    The automation and mechanization of cold and hot stamping were discussed. Problems in the comprehensive mechanization and automation of stamping in machine-building development were examined. Automation becomes effective when it is implemented in progressive manufacturing processes with a comprehensive approach to the solution of all problems, beginning with the delivery of initial materials and ending with the transportation of finished products to the warehouse. Production intensification and improvements in the effectiveness of produced output through the comprehensive mechanization and automation of stamping operations are reported.

  9. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33

    PubMed Central

    Round, A. R.; Franke, D.; Moritz, S.; Huchler, R.; Fritsche, M.; Malthan, D.; Klaering, R.; Svergun, D. I.; Roessle, M.

    2008-01-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client–server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841

  10. Visions of Automation and Realities of Certification

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Holloway, Michael C.

    2005-01-01

    Many people envision automation as the solution to many of the problems in aviation and air transportation today, across all sectors: commercial, private, and military. This paper explains why some recent experiences with complex, highly integrated, automated systems suggest that this vision will not be realized unless significant progress is made beyond the current state of the practice in software system development and certification.

  11. Does bacteriology laboratory automation reduce time to results and increase quality management?

    PubMed

    Dauwalder, O; Landrieve, L; Laurent, F; de Montclos, M; Vandenesch, F; Lina, G

    2016-03-01

    Due to reductions in financial and human resources, many microbiological laboratories have merged to build very large clinical microbiology laboratories, which allow the use of fully automated laboratory instruments. For clinical chemistry and haematology, automation has reduced the time to results and improved the management of laboratory quality. The aim of this review was to examine whether fully automated laboratory instruments for microbiology can reduce time to results and impact quality management. This study focused on solutions that are currently available, including the BD Kiestra™ Work Cell Automation and Total Lab Automation and the Copan WASPLab(®). Copyright © 2015 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  12. Gravity-driven pH adjustment for site-specific protein pKa measurement by solution-state NMR

    NASA Astrophysics Data System (ADS)

    Li, Wei

    2017-12-01

    To automate pH adjustment in site-specific protein pKa measurement by solution-state NMR, I present a funnel with two caps for the standard 5 mm NMR tube. The novelty of this simple-to-build and inexpensive apparatus is that it allows automatic gravity-driven pH adjustment within the magnet, and consequently results in a fully automated NMR-monitored pH titration without any hardware modification on the NMR spectrometer.

  13. NMR-based automated protein structure determination.

    PubMed

    Würz, Julia M; Kazemi, Sina; Schmidt, Elena; Bagaria, Anurag; Güntert, Peter

    2017-08-15

    NMR spectra analysis for protein structure determination can now in many cases be performed by automated computational methods. This overview of the computational methods for NMR protein structure analysis presents recent automated methods for signal identification in multidimensional NMR spectra, sequence-specific resonance assignment, collection of conformational restraints, and structure calculation, as implemented in the CYANA software package. These algorithms are sufficiently reliable and integrated into one software package to enable the fully automated structure determination of proteins starting from NMR spectra without manual interventions or corrections at intermediate steps, with an accuracy of 1-2 Å backbone RMSD in comparison with manually solved reference structures. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Building Facade Reconstruction by Fusing Terrestrial Laser Points and Images

    PubMed Central

    Pu, Shi; Vosselman, George

    2009-01-01

    Laser data and optical data have a complementary nature for three-dimensional feature extraction. Efficient integration of the two data sources leads to more reliable and automated extraction of three-dimensional features. This paper presents a semi-automatic building facade reconstruction approach that efficiently combines information from terrestrial laser point clouds and close-range images. A building facade's general structure is discovered and established using the planar features from the laser data. Strong lines in the images are then extracted using the Canny edge detector and the Hough transform, and compared with the current model edges for necessary improvement. Finally, textures with optimal visibility are selected and applied according to accurate image orientations. Solutions to several challenging problems throughout the collaborative reconstruction, such as referencing between laser points and multiple images and automated texturing, are described. The limitations of this approach and remaining work are also discussed. PMID:22408539
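
    The line-extraction step in this abstract (edge detection followed by a Hough transform) can be sketched with a minimal voting accumulator. This is an illustrative stand-in, not the authors' implementation; the integer rho binning and the assumption that points lie within `img_diag` of the origin are simplifications.

```python
import numpy as np

def hough_lines(points, img_diag, n_theta=180):
    """Minimal Hough transform over edge points (as produced by, e.g.,
    a Canny detector): accumulate votes in (theta, rho) space and
    return the strongest line as (theta, rho) with
    rho = x*cos(theta) + y*sin(theta).
    Assumes all points satisfy |rho| <= img_diag."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.arange(-img_diag, img_diag + 1)
    acc = np.zeros((len(thetas), len(rhos)), dtype=int)
    for x, y in points:
        # one vote per theta, in the rho bin this point maps to
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[np.arange(len(thetas)), r + img_diag] += 1
    t_i, r_i = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t_i], rhos[r_i]
```

    For a vertical edge at x = 5, the strongest bin is theta = 0, rho = 5, which is how collinear edge pixels become candidate facade lines for comparison with model edges.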

  15. FORESEE™ User-Centric Energy Automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FORESEE™ is a home energy management system (HEMS) that provides a user-centric energy automation solution for residential building occupants. Built upon advanced control and machine learning algorithms, FORESEE intelligently manages the home appliances and distributed energy resources (DERs) such as photovoltaics and battery storage in a home. Unlike existing HEMS in the market, FORESEE provides a tailored home automation solution for individual occupants by learning and adapting to their preferences on cost, comfort, convenience and carbon. FORESEE improves not only the energy efficiency of the home but also its capability to provide grid services such as demand response. Highly reliable demand response services are likely to be incentivized by utility companies, making FORESEE economically viable for most homes.

  16. Altering users' acceptance of automation through prior automation exposure.

    PubMed

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  17. Automated Euler and Navier-Stokes Database Generation for a Glide-Back Booster

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Mike J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejnil, Edward

    2004-01-01

    The past two decades have seen a sustained increase in the use of high fidelity Computational Fluid Dynamics (CFD) in basic research, aircraft design, and the analysis of post-design issues. As the fidelity of a CFD method increases, the number of cases that can be readily and affordably computed greatly diminishes. However, computer speeds now exceed 2 GHz, hundreds of processors are currently available and more affordable, and advances in parallel CFD algorithms scale more readily with large numbers of processors. All of these factors make it feasible to compute thousands of high fidelity cases. However, there still remains the overwhelming task of monitoring the solution process. This paper presents an approach to automate the CFD solution process. A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment, the NASA Information Power Grid (IPG), using 13 computers located at 4 different geographical sites. Process automation and web-based access to a MySql database greatly reduces the user workload, removing much of the tedium and tendency for user input errors. The AeroDB framework is shown. The user submits/deletes jobs, monitors AeroDB's progress, and retrieves data and plots via a web portal. Once a job is in the database, a job launcher uses an IPG resource broker to decide which computers are best suited to run the job. Job/code requirements, the number of CPUs free on a remote system, and queue lengths are some of the parameters the broker takes into account. The Globus software provides secure services for user authentication, remote shell execution, and secure file transfers over an open network. AeroDB automatically decides when a job is completed. 
Currently, the Cart3D unstructured flow solver is used for the Euler equations, and the Overflow structured overset flow solver is used for the Navier-Stokes equations. Other codes can be readily included into the AeroDB framework.
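
    The brokering step described above (matching job requirements against free CPUs and queue lengths on remote hosts) amounts to a filter-and-rank heuristic. The sketch below is a toy version of that idea; the dictionary field names are assumptions for illustration, not the IPG resource broker's actual schema.

```python
def pick_host(hosts, cpus_needed):
    """Choose a host for a CFD job: keep hosts with enough free CPUs,
    then prefer the shortest queue. Returns the host name, or None if
    no host qualifies."""
    eligible = [h for h in hosts if h["free_cpus"] >= cpus_needed]
    if not eligible:
        return None
    return min(eligible, key=lambda h: h["queue_len"])["name"]
```

    A real broker would also weight code requirements and transfer costs, but the same filter-then-rank shape applies.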

  18. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. 
The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.

  19. Automation of data processing and calculation of retention parameters and thermodynamic data for gas chromatography

    NASA Astrophysics Data System (ADS)

    Makarycheva, A. I.; Faerman, V. A.

    2017-02-01

    An analysis of automation patterns is performed and a programming solution for automating the processing of chromatographic data and their subsequent storage is developed with the help of a software package based on Mathcad and MS Excel spreadsheets. The offered approach allows modification of the data-processing algorithm and does not require the participation of programming experts. It provides calculation of retention times and retention volumes, specific retention volumes, differential molar free energies of adsorption, partial molar solution enthalpies and isosteric heats of adsorption. The developed solution is intended for use in a small research group and was tested on a series of new gas chromatography sorbents. Retention parameters and thermodynamic sorption quantities were calculated for more than 20 analytes. The resulting data are provided in a form suitable for comparative analysis, making it possible to identify sorbents with the most favourable properties for specific analytical tasks.

  20. Evaluating the solution from MrBUMP and BALBES

    PubMed Central

    Keegan, Ronan M.; Long, Fei; Fazio, Vincent J.; Winn, Martyn D.; Murshudov, Garib N.; Vagin, Alexei A.

    2011-01-01

    Molecular replacement is one of the key methods used to solve the problem of determining the phases of structure factors in protein structure solution from X-ray diffraction data. Its success rate has been steadily improving with the development of improved software methods and the increasing number of structures available in the PDB for use as search models. Despite this, in cases where there is low sequence identity between the target-structure sequence and that of its set of possible homologues it can be a difficult and time-consuming chore to isolate and prepare the best search model for molecular replacement. MrBUMP and BALBES are two recent developments from CCP4 that have been designed to automate and speed up the process of determining and preparing the best search models and putting them through molecular replacement. Their intention is to provide the user with a broad set of results using many search models and to highlight the best of these for further processing. An overview of both programs is presented along with a description of how best to use them, citing case studies and the results of large-scale testing of the software. PMID:21460449

  1. Iterative dataset optimization in automated planning: Implementation for breast and rectal cancer radiotherapy.

    PubMed

    Fan, Jiawei; Wang, Jiazhou; Zhang, Zhen; Hu, Weigang

    2017-06-01

    To develop a new automated treatment planning solution for breast and rectal cancer radiotherapy. The automated treatment planning solution developed in this study includes selection of the iteratively optimized training dataset, dose volume histogram (DVH) prediction for the organs at risk (OARs), and automatic generation of clinically acceptable treatment plans. The training dataset is selected by iterative optimization from 40 treatment plans for left-breast and rectal cancer patients who received radiation therapy. A two-dimensional kernel density estimation algorithm (denoted two-parameter KDE), which incorporates two predictive features, was implemented to produce the predicted DVHs. Finally, 10 additional left-breast treatment plans were re-planned using the Pinnacle 3 Auto-Planning (AP) module (version 9.10, Philips Medical Systems) with the objective functions derived from the predicted DVH curves, and the automatically generated re-optimized treatment plans were compared with the original manually optimized plans. By combining the iterative training-dataset selection and the two-parameter KDE prediction algorithm, the proposed automated planning strategy improves the accuracy of the DVH prediction. The automatically generated treatment plans using objectives derived from the predicted DVHs can achieve better dose sparing for some OARs without compromising other metrics of plan quality. The proposed automated treatment planning solution can be used to efficiently evaluate and improve the quality and consistency of treatment plans for intensity-modulated breast and rectal cancer radiation therapy. © 2017 American Association of Physicists in Medicine.
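
    The idea of predicting a new patient's DVH from geometrically similar training plans can be illustrated with a kernel-weighted average over two features. This is a Nadaraya-Watson-style sketch of the concept, not the paper's two-parameter KDE estimator; the feature values, bandwidth, and DVH representation are all assumed for illustration.

```python
import numpy as np

def predict_dvh(features_train, dvh_train, features_new, bandwidth=1.0):
    """Weight each training DVH curve by a Gaussian kernel over the
    distance between its two predictive features and the new case's
    features, then return the weighted average curve."""
    d2 = np.sum((features_train - features_new) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / bandwidth ** 2)
    w /= w.sum()                    # normalise kernel weights
    return w @ dvh_train            # weighted average of DVH curves
```

    The predicted curve can then seed OAR objective functions for the auto-planning optimizer, as the abstract describes.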

  2. Automated Detection and Analysis of Interplanetary Shocks with Real-Time Application

    NASA Astrophysics Data System (ADS)

    Vorotnikov, V.; Smith, C. W.; Hu, Q.; Szabo, A.; Skoug, R. M.; Cohen, C. M.

    2006-12-01

    The ACE real-time data stream provides web-based now-casting capabilities for solar wind conditions upstream of Earth. Our goal is to provide an automated code that finds and analyzes interplanetary shocks as they occur for possible real-time application to space weather nowcasting. Shock analysis algorithms based on the Rankine-Hugoniot jump conditions exist and are in wide-spread use today for the interactive analysis of interplanetary shocks yielding parameters such as shock speed and propagation direction and shock strength in the form of compression ratios. Although these codes can be automated in a reasonable manner to yield solutions not far from those obtained by user-directed interactive analysis, event detection presents an added obstacle and the first step in a fully automated analysis. We present a fully automated Rankine-Hugoniot analysis code that can scan the ACE science data, find shock candidates, analyze the events, obtain solutions in good agreement with those derived from interactive applications, and dismiss false positive shock candidates on the basis of the conservation equations. The intent is to make this code available to NOAA for use in real-time space weather applications. The code has the added advantage of being able to scan spacecraft data sets to provide shock solutions for use outside real-time applications and can easily be applied to science-quality data sets from other missions. Use of the code for this purpose will also be explored.
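
    The event-detection step the abstract identifies as the hard part of full automation can be sketched as a scan for simultaneous density compression and speed jumps between upstream and downstream averaging windows. The thresholds and window length below are illustrative assumptions, not the values used by the ACE pipeline, and a real detector would go on to test candidates against the full Rankine-Hugoniot conservation relations.

```python
import numpy as np

def find_shock_candidates(density, speed, window=5,
                          ratio_min=1.2, dv_min=20.0):
    """Flag indices where the downstream/upstream window averages show
    a density compression ratio >= ratio_min and a speed increase
    >= dv_min (km/s), consistent with a fast forward shock.
    Returns a list of (index, compression_ratio) tuples."""
    candidates = []
    for i in range(window, len(density) - window):
        up_n = density[i - window:i].mean()
        dn_n = density[i:i + window].mean()
        up_v = speed[i - window:i].mean()
        dn_v = speed[i:i + window].mean()
        if dn_n / up_n >= ratio_min and dn_v - up_v >= dv_min:
            candidates.append((i, dn_n / up_n))
    return candidates
```

    Candidates that fail the conservation equations would then be dismissed as false positives, as the abstract describes.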

  3. Algorithme intelligent d'optimisation d'un design structurel de grande envergure

    NASA Astrophysics Data System (ADS)

    Dominique, Stephane

    The implementation of an automated decision-support system in the field of design and structural optimisation can give a significant advantage to any industry working on mechanical designs. Indeed, by providing solution ideas to a designer, or by upgrading existing design solutions while the designer is not at work, such a system may reduce the project cycle time or allow more time to produce a better design. This thesis presents a new approach to automating a design process based on Case-Based Reasoning (CBR), in combination with a new genetic algorithm named Genetic Algorithm with Territorial core Evolution (GATE). This approach was developed in order to reduce the operating cost of the process. However, as the system implementation cost is quite high, the approach is better suited to large-scale design problems, and particularly to design problems that the designer plans to solve for many different specification sets. First, the CBR process uses a databank filled with every known solution to similar design problems. Then, the solutions closest to the current problem in terms of specifications are selected. After this, during the adaptation phase, an artificial neural network (ANN) interpolates among the known solutions to produce an additional solution to the current problem, using the current specifications as inputs. Each solution produced and selected by the CBR is then used to initialize the population of an island of the genetic algorithm. The algorithm optimises the solutions further during the refinement phase. Using progressive refinement, the algorithm starts with only the most important variables for the problem; then, as the optimisation progresses, the remaining variables are gradually introduced, layer by layer. The genetic algorithm used is a new algorithm specifically created during this thesis to solve optimisation problems from the field of mechanical device structural design. 
The algorithm, named GATE, is essentially a real-number genetic algorithm that prevents new individuals from being born too close to previously evaluated solutions. The restricted area becomes smaller or larger during the optimisation to allow global or local search when necessary. In addition, a new search operator named the Substitution Operator is incorporated in GATE. This operator allows an ANN surrogate model to guide the algorithm toward the most promising areas of the design space. The suggested CBR approach and GATE were tested on several simple test problems, as well as on the industrial problem of designing a gas turbine engine rotor disc. The results are compared with those obtained for the same problems by many other popular optimisation algorithms, including (depending on the problem) gradient algorithms, a binary genetic algorithm, a real-number genetic algorithm, a genetic algorithm using multiple-parent crossovers, a differential evolution genetic algorithm, the Hooke & Jeeves generalized pattern search method, and POINTER from the software I-SIGHT 3.5. Results show that GATE is quite competitive, giving the best results for 5 of the 6 constrained optimisation problems. GATE also provided the best results of all on problems produced by a Maximum Set Gaussian landscape generator. Finally, GATE provided a disc 4.3% lighter than the best other tested algorithm (POINTER) for the gas turbine engine rotor disc problem. One drawback of GATE is its lower efficiency on highly multimodal unconstrained problems, for which it gave quite poor results with respect to its implementation cost. To conclude, according to the preliminary results obtained during this thesis, the suggested CBR process, combined with GATE, seems to be a very good candidate for automating and accelerating the structural design of mechanical devices, potentially reducing significantly the cost of industrial preliminary design processes.
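
    GATE's territorial rule, as described in the abstract, rejects newborn individuals that fall within a restricted radius of any previously evaluated solution, with the radius adapted during the run to shift between global and local search. A minimal sketch of that acceptance test, assuming Euclidean distance in a normalised design space:

```python
import numpy as np

def far_enough(candidate, evaluated, radius):
    """Territorial check: accept a newborn individual only if its
    distance to every previously evaluated solution exceeds `radius`.
    `evaluated` is a list of design-variable vectors."""
    if len(evaluated) == 0:
        return True                 # nothing evaluated yet: accept
    d = np.linalg.norm(np.asarray(evaluated) - np.asarray(candidate),
                       axis=1)
    return bool(d.min() > radius)
```

    Shrinking `radius` late in the run permits local refinement around good solutions; enlarging it early forces exploration.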

  4. Clumpak: a program for identifying clustering modes and packaging population structure inferences across K.

    PubMed

    Kopelman, Naama M; Mayzel, Jonathan; Jakobsson, Mattias; Rosenberg, Noah A; Mayrose, Itay

    2015-09-01

    The identification of the genetic structure of populations from multilocus genotype data has become a central component of modern population-genetic data analysis. Application of model-based clustering programs often entails a number of steps, in which the user considers different modelling assumptions, compares results across different predetermined values of the number of assumed clusters (a parameter typically denoted K), examines multiple independent runs for each fixed value of K, and distinguishes among runs belonging to substantially distinct clustering solutions. Here, we present Clumpak (Cluster Markov Packager Across K), a method that automates the postprocessing of results of model-based population structure analyses. For analysing multiple independent runs at a single K value, Clumpak identifies sets of highly similar runs, separating distinct groups of runs that represent distinct modes in the space of possible solutions. This procedure, which generates a consensus solution for each distinct mode, is performed by the use of a Markov clustering algorithm that relies on a similarity matrix between replicate runs, as computed by the software Clumpp. Next, Clumpak identifies an optimal alignment of inferred clusters across different values of K, extending a similar approach implemented for a fixed K in Clumpp and simplifying the comparison of clustering results across different K values. Clumpak incorporates additional features, such as implementations of methods for choosing K and comparing solutions obtained by different programs, models, or data subsets. Clumpak, available at http://clumpak.tau.ac.il, simplifies the use of model-based analyses of population structure in population genetics and molecular ecology. © 2015 John Wiley & Sons Ltd.
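
    The mode-finding step described above, grouping replicate runs via a Markov clustering algorithm applied to a run-to-run similarity matrix, can be sketched with the basic MCL iteration (expansion, then inflation). This is an illustration of the generic MCL idea applied to a similarity matrix, not Clumpak's actual code; the inflation parameter and iteration count are assumptions.

```python
import numpy as np

def markov_cluster(similarity, inflation=2.0, iters=50):
    """Basic MCL: treat the similarity matrix as a flow matrix,
    alternate expansion (matrix squaring) with inflation (elementwise
    power followed by column normalisation), and read cluster labels
    from the attractor row of each column."""
    M = similarity.astype(float)
    M = M / M.sum(axis=0)          # make columns stochastic
    for _ in range(iters):
        M = M @ M                  # expansion: spread flow
        M = M ** inflation         # inflation: strengthen strong flow
        M = M / M.sum(axis=0)
    return M.argmax(axis=0)        # attractor row = cluster label
```

    Runs landing in the same attractor form one "mode", for which a consensus membership matrix can then be computed.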

  5. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  6. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  7. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  8. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  9. Microfabricated Patch Clamp Electrodes for Improved Ion Channel Protein Measurements

    NASA Astrophysics Data System (ADS)

    Klemic, James; Klemic, Kathryn; Reed, Mark; Sigworth, Frederick

    2002-03-01

    Ion channels are trans-membrane proteins that underlie many cell functions including hormone and neurotransmitter release, muscle contraction and cell signaling cascades. Ion channel proteins are commonly characterized via the patch clamp method in which an extruded glass tube containing ionic solution, manipulated by an expert technician, is brought into contact with a living cell to record ionic current through the cell membrane. Microfabricated planar patch electrodes, micromolded in the silicone elastomer poly-dimethylsiloxane (PDMS) from microlithographically patterned structures, have been developed that improve on this method. Microfabrication techniques allow arrays of patch electrodes to be fabricated, increasing the throughput of the measurement technique. Planar patch electrodes readily allow the automation of cell sealing, further increasing throughput. Microfabricated electrode arrays may be readily integrated with microfluidic structures to allow fast, in situ solution exchange. Miniaturization of the electrode geometry should increase both the signal to noise and the bandwidth of the measurement. Microfabricated patch electrode arrays have been fabricated and measurements have been taken.

  10. Force-Free Magnetic Fields Calculated from Automated Tracing of Coronal Loops with AIA/SDO

    NASA Astrophysics Data System (ADS)

    Aschwanden, M. J.

    2013-12-01

    One of the most realistic magnetic field models of the solar corona is a nonlinear force-free field (NLFFF) solution. About a dozen numeric codes exist that compute NLFFF solutions based on extrapolations of photospheric vector magnetograph data. However, since the photosphere and lower chromosphere are not force-free, a suitable correction has to be applied to the lower boundary condition. Despite such "pre-processing" corrections, the resulting theoretical magnetic field lines deviate substantially from observed coronal loop geometries. Here we developed an alternative method that fits an analytical NLFFF approximation to the observed geometry of coronal loops. The 2D coordinates of coronal loop structures observed with AIA/SDO are traced with the "Oriented Coronal CUrved Loop Tracing" (OCCULT-2) code, an automated pattern recognition algorithm that has demonstrated loop-tracing fidelity matching visual perception. A potential magnetic field solution is then derived from a line-of-sight magnetogram observed with HMI/SDO, and an analytical NLFFF approximation is forward-fitted to the twisted geometry of the coronal loops. We demonstrate the performance of this magnetic field modeling method for a number of solar active regions, before and after major flares observed with SDO. The difference between the NLFFF and potential field energies then allows us to compute the free magnetic energy, an upper limit of the energy that is released during a solar flare.

  11. Automated Tape Laying Machine for Composite Structures.

    DTIC Science & Technology

    The invention comprises an automated tape laying machine for laying tape on a composite structure. The tape laying machine has a tape laying head...neatly cut. The automated tape laying device utilizes narrow-width tape to increase machine flexibility and reduce wastage.

  12. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    NASA Astrophysics Data System (ADS)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra, given the variability in both environmental mixture composition and PTFE baselines, remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environments (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of the PTFE background, the goal of the smoothing-spline interpolation is to learn the baseline structure in the background region in order to predict the baseline structure in the analyte region. 
We then validate the model by comparing smoothing-splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification, and (3) thermal optical reflectance (TOR) organic carbon (OC) and elemental carbon (EC) predictions. The discrepancy rate for a four-cluster solution is 10 %. For all functional groups but carboxylic COH the discrepancy is ≤ 10 %. Performance metrics obtained from TOR OC and EC predictions (R2 ≥ 0.94, bias ≤ 0.01 µg m-3, and error ≤ 0.04 µg m-3) are on a par with those obtained from uncorrected and PB-corrected spectra. The proposed protocol leads to visually and analytically similar estimates to those generated by the polynomial method. More importantly, the automated solution allows us and future users to evaluate its analytical reproducibility while minimizing reducible user bias. We anticipate the protocol will enable FT-IR researchers and data analysts to quickly and reliably analyze large amounts of data and connect them to a variety of available statistical learning methods applied to analyte absorbances isolated in atmospheric aerosol samples.

  13. SCOUSE: Semi-automated multi-COmponent Universal Spectral-line fitting Engine

    NASA Astrophysics Data System (ADS)

    Henshaw, J. D.; Longmore, S. N.; Kruijssen, J. M. D.; Davies, B.; Bally, J.; Barnes, A.; Battersby, C.; Burton, M.; Cunningham, M. R.; Dale, J. E.; Ginsburg, A.; Immer, K.; Jones, P. A.; Kendrew, S.; Mills, E. A. C.; Molinari, S.; Moore, T. J. T.; Ott, J.; Pillai, T.; Rathborne, J.; Schilke, P.; Schmiedeke, A.; Testi, L.; Walker, D.; Walsh, A.; Zhang, Q.

    2016-01-01

    The Semi-automated multi-COmponent Universal Spectral-line fitting Engine (SCOUSE) is a spectral line fitting algorithm that fits Gaussian profiles to spectral line emission. It identifies the spatial area over which to fit the data and generates a grid of spectral averaging areas (SAAs). The spatially averaged spectra are fitted according to user-provided tolerance levels, and the best fit is selected using the Akaike Information Criterion, which weights the chi-squared of a best-fitting solution according to the number of free parameters. A more detailed inspection of the spectra can be performed to improve the fit through an iterative process, after which SCOUSE integrates the new solutions into the solution file.
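    The AIC-based choice between candidate fits can be illustrated with a toy example; the synthetic data, initial guesses, and least-squares form of the AIC below are illustrative stand-ins, not SCOUSE's internals:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussians(x, *p):
    """Sum of Gaussian components; p = (amp, centre, width) per component."""
    y = np.zeros_like(x)
    for a, c, w in zip(p[0::3], p[1::3], p[2::3]):
        y += a * np.exp(-0.5 * ((x - c) / w) ** 2)
    return y

def aic(n, rss, k):
    # Least-squares AIC: penalise each extra free parameter by 2.
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(-30.0, 30.0, 300)
truth = gaussians(x, 1.0, -5.0, 3.0, 0.6, 6.0, 2.0)   # two blended components
y = truth + rng.normal(0.0, 0.02, x.size)

best = None
for ncomp, guess in [(1, [1.0, 0.0, 5.0]),
                     (2, [1.0, -5.0, 3.0, 0.5, 5.0, 3.0])]:
    popt, _ = curve_fit(gaussians, x, y, p0=guess)
    rss = np.sum((y - gaussians(x, *popt)) ** 2)
    score = aic(x.size, rss, 3 * ncomp)
    if best is None or score < best[1]:
        best = (ncomp, score)

print(best[0])  # the two-component model should win
```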

  14. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  15. A case for automated tape in clinical imaging.

    PubMed

    Bookman, G; Baune, D

    1998-08-01

    Electronic archiving of radiology images over many years will require many terabytes of storage, with a need for rapid retrieval of these images. As more large PACS installations are implemented, a data crisis emerges. Storing this large amount of data using the traditional method of optical jukeboxes or online disk alone becomes unworkable: the amount of floor space, the number of optical jukeboxes, and the off-line shelf storage required to store the images become unmanageable. With recent advances in tape and tape drives, the use of tape for long-term storage of PACS data has become the preferred alternative. A PACS system consisting of a centrally managed system of RAID disk, software, and, at the heart of the system, tape presents a solution that for the first time solves the problems of multi-modality high-end PACS, non-DICOM image, electronic medical record, and ADT data storage. This paper will examine the installation of the University of Utah, Department of Radiology PACS system and the integration of an automated tape archive. The tape archive is also capable of storing data other than traditional PACS data. The implementation of an automated data archive to serve the many other needs of a large hospital will also be discussed, including the integration of a filmless cardiology department and the backup/archival needs of a traditional MIS department. The need for high bandwidth to tape with a large RAID cache will be examined, along with how, given an interface to a RIS pre-fetch engine, tape can be a superior solution to optical platters or other archival solutions. The data management software will be discussed in detail. The performance and cost of RAID disk cache and automated tape will be compared with a solution that includes optical.

  16. Automating a High School Restroom.

    ERIC Educational Resources Information Center

    Ritner-Heir, Robbin

    1999-01-01

    Discusses how one high school transformed its restrooms into cleaner and more vandal-resistant environments by automating them. Solutions discussed include installing perforated stainless steel panel ceilings, using epoxy-based paint for walls, selecting china commode fixtures instead of stainless steel, installing electronic faucets and sensors,…

  17. Technology Solutions for School Food Service.

    ERIC Educational Resources Information Center

    Begalle, Mary

    2002-01-01

    Considers ways to include schools' food service departments in technology planning. Discusses school food service software applications, considerations and challenges of automating food service operations, and business-to-business Internet solutions. (EV)

  18. Generic and Automated Data Evaluation in Analytical Measurement.

    PubMed

    Adam, Martin; Fleischer, Heidi; Thurow, Kerstin

    2017-04-01

    In recent years, automation has become more and more important in the field of elemental and structural chemical analysis to reduce the high degree of manual operation and processing time as well as human errors. A high number of data points is thereby generated, which requires fast and automated data evaluation. To handle the preprocessed export data from analytical devices and software of various vendors, a standardized solution that requires no programming knowledge is preferable. In modern laboratories, multiple users will use this software on multiple personal computers with different operating systems (e.g., Windows, Macintosh, Linux). Mobile devices such as smartphones and tablets have also gained growing importance. The developed software, Project Analytical Data Evaluation (ADE), is implemented as a web application. To transmit the preevaluated data from the device software to the Project ADE, the exported XML report files are detected and the included data are imported into the entities database using the Data Upload software. Different calculation types of a sample within one measurement series (e.g., method validation) are identified using information tags inside the sample name. The results are presented in tables and diagrams at different information levels (general, detailed for one analyte or sample).
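    A minimal sketch of the XML-import step, assuming a made-up report layout and sample-name tag convention; the real vendor export schemas and the ADE database will differ:

```python
import xml.etree.ElementTree as ET

# Hypothetical exported report; each vendor's schema will look different.
report = """<report>
  <sample name="CAL_STD_1">
    <analyte symbol="Pb" value="12.4" unit="ug/L"/>
    <analyte symbol="Cd" value="0.31" unit="ug/L"/>
  </sample>
</report>"""

root = ET.fromstring(report)
results = {}
for sample in root.iter("sample"):
    # A tag inside the sample name (here an assumed CAL_ prefix) identifies
    # the calculation type, as the abstract describes.
    calc_type = ("calibration" if sample.get("name").startswith("CAL_")
                 else "measurement")
    for analyte in sample.iter("analyte"):
        results[(sample.get("name"), analyte.get("symbol"))] = (
            float(analyte.get("value")), analyte.get("unit"), calc_type)

print(results[("CAL_STD_1", "Pb")])  # → (12.4, 'ug/L', 'calibration')
```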

  19. Simple setup for gas-phase H/D exchange mass spectrometry coupled to electron transfer dissociation and ion mobility for analysis of polypeptide structure on a liquid chromatographic time scale.

    PubMed

    Mistarz, Ulrik H; Brown, Jeffery M; Haselmann, Kim F; Rand, Kasper D

    2014-12-02

    Gas-phase hydrogen/deuterium exchange (HDX) is a fast and sensitive, yet unharnessed analytical approach for providing information on the structural properties of biomolecules, in a complementary manner to mass analysis. Here, we describe a simple setup for ND3-mediated millisecond gas-phase HDX inside a mass spectrometer immediately after ESI (gas-phase HDX-MS) and show utility for studying the primary and higher-order structure of peptides and proteins. HDX was achieved by passing N2-gas through a container filled with aqueous deuterated ammonia reagent (ND3/D2O) and admitting the saturated gas immediately upstream or downstream of the primary skimmer cone. The approach was implemented on three commercially available mass spectrometers and required no or minor fully reversible reconfiguration of gas-inlets of the ion source. Results from gas-phase HDX-MS of peptides using the aqueous ND3/D2O as HDX reagent indicate that labeling is facilitated exclusively through gaseous ND3, yielding similar results to the infusion of purified ND3-gas, while circumventing the complications associated with the use of hazardous purified gases. Comparison of the solution-phase- and gas-phase deuterium uptake of Leu-Enkephalin and Glu-Fibrinopeptide B, confirmed that this gas-phase HDX-MS approach allows for labeling of sites (heteroatom-bound non-amide hydrogens located on side-chains, N-terminus and C-terminus) not accessed by classical solution-phase HDX-MS. The simple setup is compatible with liquid chromatography and a chip-based automated nanoESI interface, allowing for online gas-phase HDX-MS analysis of peptides and proteins separated on a liquid chromatographic time scale at increased throughput. Furthermore, online gas-phase HDX-MS could be performed in tandem with ion mobility separation or electron transfer dissociation, thus enabling multiple orthogonal analyses of the structural properties of peptides and proteins in a single automated LC-MS workflow.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Haishuang; Krysiak, Yaşar; Hoffmann, Kristin

    The crystal structure and disorder phenomena of Al{sub 4}B{sub 2}O{sub 9}, an aluminum borate from the mullite-type family, were studied using automated diffraction tomography (ADT), a recently established method for collection and analysis of electron diffraction data. Al{sub 4}B{sub 2}O{sub 9}, prepared by a sol-gel approach, crystallizes in the monoclinic space group C2/m. The ab initio structure determination based on three-dimensional electron diffraction data from single ordered crystals reveals that edge-connected AlO{sub 6} octahedra expanding along the b axis constitute the backbone. The ordered structure (A) was confirmed by TEM and HAADF-STEM images. Furthermore, disordered crystals with diffuse scattering along the b axis are observed. Analysis of the modulation pattern implies a mean superstructure (AAB) with a threefold b axis, where B corresponds to an A layer shifted by ½a and ½c. Diffraction patterns simulated for the AAB sequence including additional stacking disorder are in good agreement with experimental electron diffraction patterns. - Graphical abstract: Crystal structure and disorder phenomena of B-rich Al{sub 4}B{sub 2}O{sub 9} studied by automated electron diffraction tomography (ADT) and described by diffraction simulation using DISCUS. - Highlights: • Ab-initio structure solution by electron diffraction from single nanocrystals. • Detected modulation corresponding mainly to three-fold superstructure. • Diffuse diffraction streaks caused by stacking faults in disordered crystals. • Observed streaks explained by simulated electron diffraction patterns.

  1. Design of Inhouse Automated Library Systems.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1984-01-01

    Examines six steps inherent to development of in-house automated library system: (1) problem definition, (2) requirement specifications, (3) analysis of alternatives and solutions, (4, 5) design and implementation of hardware and software, and (6) evaluation. Practical method for comparing and weighting options is illustrated and explained. A…

  2. @neurIST complex information processing toolchain for the integrated management of cerebral aneurysms

    PubMed Central

    Villa-Uriol, M. C.; Berti, G.; Hose, D. R.; Marzo, A.; Chiarini, A.; Penrose, J.; Pozo, J.; Schmidt, J. G.; Singh, P.; Lycett, R.; Larrabide, I.; Frangi, A. F.

    2011-01-01

    Cerebral aneurysms are a multi-factorial disease with severe consequences. A core part of the European project @neurIST was the physical characterization of aneurysms to find candidate risk factors associated with aneurysm rupture. The project investigated measures based on morphological, haemodynamic and aneurysm wall structure analyses for more than 300 cases of ruptured and unruptured aneurysms, extracting descriptors suitable for statistical studies. This paper deals with the unique challenges associated with this task, and the implemented solutions. The consistency of results required by the subsequent statistical analyses, given the heterogeneous image data sources and multiple human operators, was met by a highly automated toolchain combined with training. A testimonial of the successful automation is the positive evaluation of the toolchain by over 260 clinicians during various hands-on workshops. The specification of the analyses required thorough investigations of modelling and processing choices, discussed in a detailed analysis protocol. Finally, an abstract data model governing the management of the simulation-related data provides a framework for data provenance and supports future use of data and toolchain. This is achieved by enabling the easy modification of the modelling approaches and solution details through abstract problem descriptions, removing the need of repetition of manual processing work. PMID:22670202

  3. Automated pH Control of Nutrient Solution in a Hydroponic Plant Growth System

    NASA Technical Reports Server (NTRS)

    Smith, B.; Dogan, N.; Aglan, H.; Mortley, D.; Loretan, P.

    1998-01-01

    Over the years, NASA has played an important role in the development of automated nutrient delivery and monitoring systems for growing crops hydroponically for long-term space missions. One example is the set of systems used in the Biomass Production Chamber (BPC) at Kennedy Space Center (KSC). The current KSC monitoring system is based on an engineering workstation using standard analog/digital input/output hardware and custom-written software. The monitoring system uses completely separate sensors to provide a check of control-sensor accuracy and has the ability to graphically display and store data from past experiments so that they are available for data analysis [Fortson, 1992]. In many cases, growing systems have not been fitted with the kind of automated control systems used at KSC. The Center for Food and Environmental Systems for Human Exploration of Space (CFESH), located on the campus of Tuskegee University, has effectively grown sweetpotatoes and peanuts hydroponically for the past five years. However, the pH, electrical conductivity, and volume of the hydroponic nutrient solution have been adjusted only manually, at times when the solution was to be replenished or changed out according to its protocol (e.g., one-week, two-week, or two-day cycle). The pH of the nutrient solution flowing through the channel is neither known nor controlled between these update, change-out, or replenishment periods. Thus, the pH of the nutrient solution is not held at an optimum level over the span of the plant's growth cycle. To solve this dilemma, an automated system for the control and data logging of pH data relative to sweetpotato production using the nutrient film technique (NFT) has been developed. This paper discusses a microprocessor-based system designed to monitor, control, and record the pH of a nutrient solution used for growing sweetpotatoes using NFT.
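    A minimal sketch of the kind of control loop such a microprocessor-based system could run; the setpoint, dead band, and pump names are illustrative assumptions, not values from the paper:

```python
# Hysteresis (bang-bang) pH controller sketch for an NFT nutrient loop.
# SETPOINT and BAND are invented for illustration.
SETPOINT, BAND = 5.8, 0.2   # target pH and dead band

def control_step(ph):
    """Return which reagent pump (if any) to fire for one pH sample."""
    if ph > SETPOINT + BAND:
        return "acid"    # pH too high: dose acid to bring it down
    if ph < SETPOINT - BAND:
        return "base"    # pH too low: dose base to bring it up
    return None          # inside the dead band: log only, no dosing

# One logging pass over three sampled pH values:
log = [(ph, control_step(ph)) for ph in (5.5, 5.8, 6.1)]
```

    The dead band keeps the controller from chattering between acid and base dosing around the setpoint; a real system would also rate-limit dosing and record each sample, as the paper's data logger does.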

  4. Automated Fabrication Technologies for High Performance Polymer Composites

    NASA Technical Reports Server (NTRS)

    Shuart, M. J.; Johnston, N. J.; Dexter, H. B.; Marchello, J. M.; Grenoble, R. W.

    1998-01-01

    New fabrication technologies are being exploited for building high-performance graphite-fiber-reinforced composite structures. Stitched fiber preforms and resin film infusion have been successfully demonstrated for large composite wing structures. Other automated processes being developed include automated placement of tacky, drapable epoxy towpreg, automated heated-head placement of consolidated ribbon/tape, and vacuum-assisted resin transfer molding. These methods have the potential to yield low-cost, high-performance structures by fabricating composite structures to net shape out-of-autoclave.

  5. Doing more with less in the lab.

    PubMed

    Craig, Elinore

    2003-12-01

    Automation offers laboratories the ability to improve patient care, enhance client and employee satisfaction, and increase workload capacity while maintaining a cost-effective department. "The overall objective of any organization's automation project is simple--to do more with less, better," states Davis. "We know our future is on the information systems side," Clarke states. "Vendors' investment in the development and creativity of automation is what is going to drive the future of the laboratory." "Implementing the automation solution was absolutely the right thing to do for Sacred Heart Health," says Wright. "With the transition complete, we are pleased with the results. All we want now is more automation that will enable us to do even more with what we have."

  6. Extension Master Gardener Intranet: Automating Administration, Motivating Volunteers, Increasing Efficiency, and Facilitating Impact Reporting

    ERIC Educational Resources Information Center

    Bradley, Lucy K.; Cook, Jonneen; Cook, Chris

    2011-01-01

    North Carolina State University has incorporated many aspects of volunteer program administration and reporting into an on-line solution that integrates impact reporting into daily program management. The Extension Master Gardener Intranet automates many of the administrative tasks associated with volunteer management, increasing efficiency, and…

  7. Multiple Robots Localization Via Data Sharing

    DTIC Science & Technology

    2015-09-01

    multiple humans, each with specialized skills complementing each other, work to create the solution. Hence, there is a motivation to think in terms of...pygame.Color(255,255,255) COLORBLACK = pygame.Color(0,0,0) F. AUTOMATE.PY The automate.py file is a helper file to assist in running multiple simulation

  8. Applied and implied semantics in crystallographic publishing

    PubMed Central

    2012-01-01

    Background Crystallography is a data-rich, software-intensive scientific discipline with a community that has undertaken direct responsibility for publishing its own scientific journals. That community has worked actively to develop information exchange standards allowing readers of structure reports to access directly, and interact with, the scientific content of the articles. Results Structure reports submitted to some journals of the International Union of Crystallography (IUCr) can be automatically validated and published through an efficient and cost-effective workflow. Readers can view and interact with the structures in three-dimensional visualization applications, and can access the experimental data should they wish to perform their own independent structure solution and refinement. The journals also layer on top of this facility a number of automated annotations and interpretations to add further scientific value. Conclusions The benefits of semantically rich information exchange standards have revolutionised the scholarly publishing process for crystallography, and establish a model relevant to many other physical science disciplines. PMID:22932420

  9. PC-based automation system streamlines operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, J.

    1995-10-01

    The continued emergence of PC-based automation systems in the modern compressor station is driving the need for personnel who have the special skills needed to support them. However, the dilemma is that operating budget restraints limit the overall number of people available to operate and maintain compressor stations. An ideal solution is to deploy automation systems which can be easily understood and supported by existing compressor station personnel. This paper reviews such a system developed by Waukesha-Pearce Industries, Inc.

  10. Automated eukaryotic gene structure annotation using EVidenceModeler and the Program to Assemble Spliced Alignments

    PubMed Central

    Haas, Brian J; Salzberg, Steven L; Zhu, Wei; Pertea, Mihaela; Allen, Jonathan E; Orvis, Joshua; White, Owen; Buell, C Robin; Wortman, Jennifer R

    2008-01-01

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation. PMID:18190707
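    The weighted-consensus idea can be illustrated with a toy example; the evidence types, weights, and coordinates below are invented for illustration and are not EVM's actual configuration or algorithm:

```python
# Toy weighted consensus in the spirit of EVM: each evidence type votes
# for a candidate exon boundary with a configurable weight, and the
# boundary with the highest weighted support wins. All values invented.
weights = {"ab_initio": 1.0, "protein_alignment": 5.0, "transcript": 10.0}
votes = {"ab_initio": 1043, "protein_alignment": 1050, "transcript": 1050}

support = {}
for source, boundary in votes.items():
    support[boundary] = support.get(boundary, 0.0) + weights[source]

consensus = max(support, key=support.get)  # → 1050
```

    The design choice mirrored here is that stronger evidence classes (e.g. aligned transcripts) can outvote weaker ones (e.g. ab initio predictors) without being trusted absolutely.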

  11. Software support in automation of medicinal product evaluations.

    PubMed

    Juric, Radmila; Shojanoori, Reza; Slevin, Lindi; Williams, Stephen

    2005-01-01

    Medicinal product evaluation is one of the most important tasks undertaken by government health departments and their regulatory authorities, in every country in the world. The automation and adequate software support are critical tasks that can improve the efficiency and interoperation of regulatory systems across the world. In this paper we propose a software solution that supports the automation of the (i) submission of licensing applications, and (ii) evaluations of submitted licensing applications, according to regulatory authorities' procedures. The novelty of our solution is in allowing licensing applications to be submitted in any country in the world and evaluated according to any evaluation procedure (which can be chosen by either regulatory authorities or pharmaceutical companies). Consequently, submission and evaluation procedures become interoperable and the associated data repositories/databases can be shared between various countries and regulatory authorities.

  12. Computer-aided liver volumetry: performance of a fully-automated, prototype post-processing solution for whole-organ and lobar segmentation based on MDCT imaging.

    PubMed

    Fananapazir, Ghaneh; Bashir, Mustafa R; Marin, Daniele; Boll, Daniel T

    2015-06-01

    To evaluate the performance of a prototype, fully-automated post-processing solution for whole-liver and lobar segmentation based on MDCT datasets. A polymer liver phantom was used to assess the accuracy of the post-processing applications, comparing phantom volumes determined via Archimedes' principle with MDCT-segmented datasets. For the IRB-approved, HIPAA-compliant study, 25 patients were enrolled. Volumetry performance compared the manual approach with the automated prototype, assessing intraobserver variability and interclass correlation for whole-organ and lobar segmentation using ANOVA comparison. Fidelity of segmentation was evaluated qualitatively. Phantom volume was 1581.0 ± 44.7 mL; manually segmented datasets estimated 1628.0 ± 47.8 mL, a mean overestimation of 3.0%, while automatically segmented datasets estimated 1601.9 ± 0 mL, a mean overestimation of 1.3%. Whole-liver and segmental volumetry demonstrated no significant intraobserver variability for either manual or automated measurements. For whole-liver volumetry, automated measurement repetitions resulted in identical values; reproducible whole-organ volumetry was also achieved with manual segmentation, p(ANOVA) 0.98. For lobar volumetry, automated segmentation improved reproducibility over the manual approach, without significant measurement differences for either methodology, p(ANOVA) 0.95-0.99. Whole-organ and lobar segmentation results from manual and automated segmentation showed no significant differences, p(ANOVA) 0.96-1.00. Assessment of segmentation fidelity found that segments I-IV/VI showed greater segmentation inaccuracies than the remaining right hepatic lobe segments.
Fully-automated whole-liver segmentation was non-inferior to manual approaches, with improved reproducibility and post-processing duration; automated dual-seed lobar segmentation showed a slight tendency to underestimate the right hepatic lobe volume and greater variability in edge detection for the left hepatic lobe compared to manual segmentation.
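    The kind of repeated-measurement ANOVA check reported above can be sketched as follows, with deterministic invented volumes standing in for the study's data:

```python
import numpy as np
from scipy.stats import f_oneway

# Three repetitions of a whole-liver volume measurement (mL); the values
# are invented so that between-repetition differences are tiny relative
# to within-repetition spread, mimicking "no intraobserver variability".
base = np.array([1590.0, 1595.0, 1600.0, 1605.0, 1610.0])
rep1, rep2, rep3 = base, base + 0.5, base - 0.5

stat, p = f_oneway(rep1, rep2, rep3)
# A large p-value here means the repetitions are statistically
# indistinguishable, i.e. the measurement is reproducible.
```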

  13. Application of Hybrid Real-Time Power System Simulator for Designing and Researching of Relay Protection and Automation

    NASA Astrophysics Data System (ADS)

    Borovikov, Yu S.; Sulaymanov, A. O.; Andreev, M. V.

    2015-10-01

    Development, research, and operation of smart grids (SG) with active-adaptive networks (AAS) are pressing tasks today. The planned integration of high-speed FACTS devices greatly complicates the dynamic properties of power systems, and the operating conditions of power-system equipment are changing significantly as a result. This situation creates a new problem: developing and researching relay protection and automation (RPA) that can operate adequately in SGs and adapt to their regimes. The effectiveness of any solution depends on the tools used, namely simulators of electric power systems. Analysis of the best-known and most widely exploited simulators led to the conclusion that they cannot be used for this problem. At Tomsk Polytechnic University, a prototype of a hybrid multiprocessor software and hardware system, the Hybrid Real-Time Power System Simulator (HRTSim), has been developed. Because of its unique features, this simulator can be used for these tasks. This article introduces the concept of developing and researching relay protection and automation using HRTSim.

  14. Automation--down to the nuts and bolts.

    PubMed

    Fix, R J; Rowe, J M; McConnell, B C

    2000-01-01

    Laboratories that once viewed automation as an expensive luxury are now looking to automation as a solution to increase sample throughput, to help ensure data integrity and to improve laboratory safety. The question is no longer, 'Should we automate?', but 'How should we approach automation?' A laboratory may choose from three approaches when deciding to automate: (1) contract with a third party vendor to produce a turnkey system, (2) develop and fabricate the system in-house or (3) some combination of approaches (1) and (2). The best approach for a given laboratory depends upon its available resources. The first lesson to be learned in automation is that no matter how straightforward an idea appears in the beginning, the solution will not be realized until many complex problems have been resolved. Issues dealing with sample vessel manipulation, liquid handling and system control must be addressed before a final design can be developed. This requires expertise in engineering, electronics, programming and chemistry. Therefore, the team concept of automation should be employed to help ensure success. This presentation discusses the advantages and disadvantages of the three approaches to automation. The development of an automated sample handling and control system for the STAR System focused microwave will be used to illustrate the complexities encountered in a seemingly simple project, and to highlight the importance of the team concept to automation no matter which approach is taken. The STAR System focused microwave from CEM Corporation is an open vessel digestion system with six microwave cells. This system is used to prepare samples for trace metal determination. The automated sample handling was developed around an XYZ motorized gantry system. Grippers were specially designed to perform several different functions and to provide feedback to the control software.
Software was written in Visual Basic 5.0 to control the movement of the samples and the operation and monitoring of the STAR microwave. This software also provides a continuous update of the system's status to the computer screen. The system provides unattended preparation of up to 59 samples per run.

  15. ARAS: an automated radioactivity aliquoting system for dispensing solutions containing positron-emitting radioisotopes

    DOE PAGES

    Dooraghi, Alex A.; Carroll, Lewis; Collins, Jeffrey; ...

    2016-03-09

    Automated protocols for measuring and dispensing solutions containing radioisotopes are essential not only for providing a safe environment for radiation workers but also to ensure accuracy of dispensed radioactivity and an efficient workflow. For this purpose, we have designed ARAS, an automated radioactivity aliquoting system for dispensing solutions containing positron-emitting radioisotopes, with particular focus on fluorine-18 (18F). The key to the system is the combination of a radiation detector measuring radioactivity concentration, in line with a peristaltic pump dispensing known volumes. Results show that the combined system keeps volume variation within 5 % for dispensed volumes of 20 μL or greater. For volumes of 20 μL or greater, the delivered radioactivity agrees with the requested amount, as measured independently with a dose calibrator, to within 2 % on average. In conclusion, the integration of the detector and pump in an in-line system leads to a flexible and compact approach that can accurately dispense solutions with radioactivity concentrations ranging from the high values typical of [18F]fluoride directly produced from a cyclotron (~0.1-1 mCi μL -1) to the low values typical of batches of [18F]fluoride-labeled radiotracers intended for preclinical mouse scans (~1-10 μCi μL -1).
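    The dispensing arithmetic the abstract implies can be sketched as follows; the function names and requested activities are illustrative assumptions (the F-18 half-life of 109.77 min is a known physical constant, not a value from the paper):

```python
# Sketch: the in-line detector supplies a radioactivity concentration,
# and the pump delivers the volume needed to reach a requested activity.
# Names and numbers are illustrative, not from ARAS.
def dispense_volume_uL(requested_uCi, concentration_uCi_per_uL):
    return requested_uCi / concentration_uCi_per_uL

# [18F]fluoride near cyclotron-level concentration (0.5 mCi/uL = 500 uCi/uL):
vol = dispense_volume_uL(requested_uCi=10_000.0, concentration_uCi_per_uL=500.0)
# vol is 20.0 uL, at the lower edge of the accuracy regime quoted above

def decay_corrected(activity_uCi, minutes_elapsed, half_life_min=109.77):
    """Correct an F-18 activity for decay between measurement and dispensing."""
    return activity_uCi * 2.0 ** (-minutes_elapsed / half_life_min)
```

    A real system would also decay-correct the detector's concentration reading to the dispense time, which is why the half-life helper is included.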

  16. The historical development and basis of human factors guidelines for automated systems in aeronautical operations

    NASA Technical Reports Server (NTRS)

    Ciciora, J. A.; Leonard, S. D.; Johnson, N.; Amell, J.

    1984-01-01

    In order to derive general design guidelines for automated systems, a study was conducted on the utilization and acceptance of existing automated systems as currently employed in several commercial fields. Four principal study areas were investigated by means of structured interviews and, in some cases, questionnaires. The study areas were aviation, both scheduled airline and general commercial aviation; process control and factory applications; office automation; and automation in the power industry. The results of over eighty structured interviews were analyzed and responses categorized as various human factors issues for use by both designers and users of automated equipment. These guidelines address such items as general physical features of automated equipment; personnel orientation, acceptance, and training; and both personnel and system reliability.

  17. The terminal area automated path generation problem

    NASA Technical Reports Server (NTRS)

    Hsin, C.-C.

    1977-01-01

    The automated terminal area path generation problem in the advanced Air Traffic Control (ATC) system has been studied. Definitions, input, output, and the interrelationships with other ATC functions are discussed. Alternatives in modeling the problem have been identified. Problem formulations and solution techniques are presented. In particular, the solution of a minimum-effort path-stretching problem (path generation on a given schedule) has been carried out using the Newton-Raphson trajectory optimization method. Discussions are presented on the effects of different delivery times, aircraft entry positions, initial guesses on the boundary conditions, etc. Recommendations are made on real-world implementations.
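    A toy Newton-Raphson iteration in the spirit of the path-stretching problem: find the extra path length that makes the arrival time match the schedule. The constant-speed kinematic model and all numbers are invented for illustration, not the paper's formulation:

```python
# Invented single-variable model: constant ground speed, straight path.
V = 120.0          # ground speed, m/s (assumed constant)
L0 = 60_000.0      # nominal path length to the metering fix, m
t_sched = 540.0    # scheduled arrival time at the fix, s

def f(dL):
    """Residual: actual minus scheduled arrival time for stretch dL."""
    return (L0 + dL) / V - t_sched

def fprime(dL):
    return 1.0 / V

dL = 0.0
for _ in range(20):
    dL -= f(dL) / fprime(dL)   # Newton-Raphson step

# dL converges to V * t_sched - L0 = 4800 m of path stretching needed
```

    Because this toy residual is linear in dL, Newton-Raphson converges in one step; the paper's trajectory optimization solves the analogous nonlinear boundary-value problem.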

  18. Macromolecular Crystallization in Microfluidics for the International Space Station

    NASA Technical Reports Server (NTRS)

    Monaco, Lisa A.; Spearing, Scott

    2003-01-01

    At NASA's Marshall Space Flight Center, the Iterative Biological Crystallization (IBC) project has begun development of scientific hardware for macromolecular crystallization on the International Space Station (ISS). Currently, ISS crystallization research is limited to solution recipes that were prepared on the ground prior to launch. The proposed hardware will conduct solution mixing and dispensing on board the ISS, be fully automated, and have imaging functions via remote commanding from the ground. Utilizing microfluidic technology, IBC will allow for on-orbit iterations. The microfluidic LabChip(R) devices that have been developed along with Caliper Technologies will greatly benefit researchers by allowing precise fluid handling of nano/picoliter-sized volumes. IBC will maximize the amount of science return by utilizing the microfluidic approach and will be a valuable tool for structural biologists investigating medically relevant projects.

  19. Main Pipelines Corrosion Monitoring Device

    NASA Astrophysics Data System (ADS)

    Anatoliy, Bazhenov; Galina, Bondareva; Natalia, Grivennaya; Sergey, Malygin; Mikhail, Goryainov

    2017-01-01

The aim of the article is to substantiate a technical solution to the problem of monitoring corrosion changes in oil and gas pipelines using an electromagnetic NDT method. Pipeline wall thinning under operating conditions can lead to perforation and leakage of the transported product outside the pipeline, which in most cases endangers human life and the environment. Monitoring corrosion changes in the pipeline's inner wall under operating conditions is complicated because pipelines are mainly made of structural steels whose conductive and magnetic properties impede passage of the test signal through the entire thickness of the object under study. The proposed technical solution monitors internal corrosion changes in pipes under operating conditions in order to increase pipeline safety through automated prediction of when corrosion will reach critical threshold values.

  20. The design of the automated control system for warehouse equipment under radio-electronic manufacturing

    NASA Astrophysics Data System (ADS)

    Kapulin, D. V.; Chemidov, I. V.; Kazantsev, M. A.

    2017-01-01

    In the paper, the aspects of design, development and implementation of the automated control system for warehousing under the manufacturing process of the radio-electronic enterprise JSC «Radiosvyaz» are discussed. The architecture of the automated control system for warehousing proposed in the paper consists of a server which is connected to the physically separated information networks: the network with a database server, which stores information about the orders for picking, and the network with the automated storage and retrieval system. This principle allows implementing the requirements for differentiation of access, ensuring the information safety and security requirements. Also, the efficiency of the developed automated solutions in terms of optimizing the warehouse’s logistic characteristics is researched.

  1. Automated electrochemical assembly of the protected potential TMG-chitotriomycin precursor based on rational optimization of the carbohydrate building block.

    PubMed

    Nokami, Toshiki; Isoda, Yuta; Sasaki, Norihiko; Takaiso, Aki; Hayase, Shuichi; Itoh, Toshiyuki; Hayashi, Ryutaro; Shimizu, Akihiro; Yoshida, Jun-ichi

    2015-03-20

    The anomeric arylthio group and the hydroxyl-protecting groups of thioglycosides were optimized to construct carbohydrate building blocks for automated electrochemical solution-phase synthesis of oligoglucosamines having 1,4-β-glycosidic linkages. The optimization study included density functional theory calculations, measurements of the oxidation potentials, and the trial synthesis of the chitotriose trisaccharide. The automated synthesis of the protected potential N,N,N-trimethyl-d-glucosaminylchitotriomycin precursor was accomplished by using the optimized building block.

  2. Automated batch fiducial-less tilt-series alignment in Appion using Protomo

    PubMed Central

    Noble, Alex J.; Stagg, Scott M.

    2015-01-01

    The field of electron tomography has benefited greatly from manual and semi-automated approaches to marker-based tilt-series alignment that have allowed for the structural determination of multitudes of in situ cellular structures as well as macromolecular structures of individual protein complexes. The emergence of complementary metal-oxide semiconductor detectors capable of detecting individual electrons has enabled the collection of low dose, high contrast images, opening the door for reliable correlation-based tilt-series alignment. Here we present a set of automated, correlation-based tilt-series alignment, contrast transfer function (CTF) correction, and reconstruction workflows for use in conjunction with the Appion/Leginon package that are primarily targeted at automating structure determination with cryogenic electron microscopy. PMID:26455557

  3. Exploring the Lived Experiences of Program Managers Regarding an Automated Logistics Environment

    ERIC Educational Resources Information Center

    Allen, Ronald Timothy

    2014-01-01

    Automated Logistics Environment (ALE) is a new term used by Navy and aerospace industry executives to describe the aggregate of logistics-related information systems that support modern aircraft weapon systems. The development of logistics information systems is not always well coordinated among programs, often resulting in solutions that cannot…

  4. Method 365.5 Determination of Orthophosphate in Estuarine and Coastal Waters by Automated Colorimetric Analysis

    EPA Science Inventory

    This method provides a procedure for the determination of low-level orthophosphate concentrations normally found in estuarine and/or coastal waters. It is based upon the method of Murphy and Riley1 adapted for automated segmented flow analysis2 in which the two reagent solutions ...

  5. Taking Advantage of Automated Assessment of Student-Constructed Graphs in Science

    ERIC Educational Resources Information Center

    Vitale, Jonathan M.; Lai, Kevin; Linn, Marcia C.

    2015-01-01

    We present a new system for automated scoring of graph construction items that address complex science concepts, feature qualitative prompts, and support a range of possible solutions. This system utilizes analysis of spatial features (e.g., slope of a line) to evaluate potential student ideas represented within graphs. Student ideas are then…

  6. Automated Formative Assessment as a Tool to Scaffold Student Documentary Writing

    ERIC Educational Resources Information Center

    Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt

    2012-01-01

    The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…

  7. Consolidating a Distributed Compound Management Capability into a Single Installation: The Application of Overall Equipment Effectiveness to Determine Capacity Utilization.

    PubMed

    Green, Clive; Taylor, Daniel

    2016-12-01

    Compound management (CM) is a critical discipline enabling hit discovery through the production of assay-ready compound plates for screening. CM in pharma requires significant investments in manpower, capital equipment, repairs and maintenance, and information technology. These investments are at risk from external factors, for example, new technology rendering existing equipment obsolete and strategic site closures. At AstraZeneca, we faced the challenge of evaluating the number of CM sites required to support hit discovery in response to site closures and pressure on our operating budget. We reasoned that overall equipment effectiveness, a tool used extensively in the manufacturing sector, could determine the equipment capacity and appropriate number of sites. We identified automation downtime as the critical component governing capacity, and a connection between automation downtime and the availability of skilled staff. We demonstrated that sufficient production capacity existed in two sites to meet hit discovery demand without the requirement for an additional investment of $7 million in new facilities. In addition, we developed an automated capacity model that incorporated an extended working-day pattern as a solution for reducing automation downtime. The application of this solution enabled the transition to a single site, with an annual cost saving of $2.1 million. © 2015 Society for Laboratory Automation and Screening.
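Overall equipment effectiveness is conventionally the product of three ratios: availability (run time over planned time), performance (ideal cycle time times output over run time) and quality (good output over total output). A minimal sketch of that standard calculation; the shift figures are illustrative, not AstraZeneca's data:

```python
def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
    """Overall equipment effectiveness = availability * performance * quality.
    Units must agree (e.g. minutes for the times, plates for the counts)."""
    run_time = planned_time - downtime
    availability = run_time / planned_time
    performance = ideal_cycle_time * total_count / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Illustrative shift: 480 min planned, 48 min automation downtime,
# 1 min ideal cycle per assay-ready plate, 400 plates produced, 380 usable.
shift_oee = oee(480.0, 48.0, 1.0, 400, 380)  # ~0.79
```

Because downtime enters through the availability term, reducing it (here, via the extended working-day pattern) raises capacity without any new equipment, which is the lever the study exploited.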

  8. A completely automated flow, heat-capacity, calorimeter for use at high temperatures and pressures

    NASA Astrophysics Data System (ADS)

    Rogers, P. S. Z.; Sandarusi, Jamal

    1990-11-01

An automated flow calorimeter has been constructed to measure the isobaric heat capacities of concentrated aqueous electrolyte solutions using a differential calorimetry technique. The calorimeter is capable of operation to 700 K and 40 MPa with a measurement accuracy of 0.03% relative to the heat capacity of the pure reference fluid (water). A novel design encloses the calorimeter within a double set of separately controlled copper adiabatic shields that minimize calorimeter heat losses and precisely control the temperature of the inlet fluids. A multistage preheat train, used to efficiently heat the flowing fluid, includes a counter-current heat exchanger for the inlet and outlet fluid streams in tandem with two calorimeter preheaters. Complete system automation is accomplished with a distributed control scheme using multiple processors, allowing the major control tasks of calorimeter operation and control, data logging and display, and pump control to be performed simultaneously. A sophisticated pumping strategy for the two separate syringe pumps allows continuous fluid delivery. This automation system enables the calorimeter to operate unattended except for the reloading of sample fluids. In addition, automation has allowed the development and implementation of an improved heat-loss calibration method that provides calorimeter calibration with absolute accuracy comparable to the overall measurement precision, even for very concentrated solutions.
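In flow calorimetry generally, heater power is related to the stream by P = m-dot * cp * dT, so when sample and reference streams are driven to the same temperature rise the dT cancels and the sample's heat capacity follows from the power ratio. A sketch under that textbook assumption (the function name and all numbers are illustrative; the instrument's actual reduction also accounts for heat losses via its calibration):

```python
def solution_cp(power_sample, power_water, mdot_sample, mdot_water,
                cp_water=4.1819):
    """Heat capacity of the sample stream relative to the water reference,
    assuming both streams experience the same temperature rise so that
    P = mdot * cp * dT and dT cancels in the ratio.  cp_water in J/(g K)."""
    return cp_water * (power_sample / power_water) * (mdot_water / mdot_sample)
```

With equal mass flows, a sample needing 87.5% of the reference power has cp about 3.66 J/(g K), typical of a concentrated electrolyte solution.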

  9. Using Dissimilarity Metrics to Identify Interesting Designs

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Kiper, James

    2006-01-01

A computer program helps to blend the power of automated-search software, which is able to generate large numbers of design solutions, with the insight of expert designers, who are able to identify preferred designs but do not have time to examine all the solutions. From among the many automated solutions to a given design problem, the program selects a smaller number of solutions that are worthy of scrutiny by the experts in the sense that they are sufficiently dissimilar from each other. The program makes the selection in an interactive process that involves a sequence of data-mining steps interspersed with visual displays of the results of these steps to the experts. At crucial points between steps, the experts provide directives to guide the process. The program uses heuristic search techniques to identify nearly optimal design solutions and uses dissimilarity metrics defined by the experts to characterize the degree to which solutions are interestingly different. The search, data-mining, and visualization features of the program were derived from previously developed risk-management software used to support a risk-centric design methodology.
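One common way to realize "sufficiently dissimilar" selection under a user-supplied metric is greedy max-min (farthest-point) picking: repeatedly add the candidate whose minimum dissimilarity to the already-chosen set is largest. A minimal sketch of that idea, not the program's actual interactive data-mining pipeline:

```python
def select_diverse(solutions, k, dist):
    """Greedily pick k mutually dissimilar solutions under metric dist:
    each step adds the candidate farthest (in min-distance terms) from
    everything chosen so far."""
    chosen = [solutions[0]]
    while len(chosen) < k:
        best = max((s for s in solutions if s not in chosen),
                   key=lambda s: min(dist(s, c) for c in chosen))
        chosen.append(best)
    return chosen
```

On the 1-D toy set [0, 1, 2, 9, 10] with absolute difference as the metric, the first three picks are 0, 10 and 2: the clusters' representatives, which is exactly the "worthy of scrutiny because dissimilar" behaviour described above.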

  10. Automatic mobile device synchronization and remote control system for high-performance medical applications.

    PubMed

    Constantinescu, L; Kim, J; Chan, C; Feng, D

    2007-01-01

The field of telemedicine is in need of generic solutions that harness the power of small, easily carried computing devices to increase efficiency and decrease the likelihood of medical errors. Our study set out to build a framework to bridge the gap between handheld and desktop solutions by developing an automated network protocol that wirelessly propagates application data and images prepared by a powerful workstation to handheld clients for storage, display and collaborative manipulation. To this end, we present the Mobile Active Medical Protocol (MAMP), a framework capable of linking medical workstation solutions to corresponding control interfaces on handheld devices for remote storage, control and display with minimal effort. The ease of use, encapsulation and applicability of this automated solution are designed to provide significant benefits to the rapid development of telemedical solutions. Our results demonstrate that the design of this system allows an acceptable data transfer rate, a usable frame rate for diagnostic solutions and enough flexibility to enable its use in a wide variety of cases. We also present a large-scale multi-modality image viewer as an example application built on MAMP.

  11. Progress on Platforms, Sensors and Applications with Unmanned Aerial Vehicles in soil science and geomorphology

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Suomalainen, Juha; Seeger, Manuel; Keesstra, Saskia; Bartholomeus, Harm; Paron, Paolo

    2014-05-01

    The recent increase of performance and endurance of electronically controlled flying platforms, such as multi-copters and fixed-wing airplanes, and decreasing size and weight of different sensors and batteries leads to increasing popularity of Unmanned Aerial Systems (UAS) for scientific purposes. Modern workflows that implement UAS include guided flight plan generation, 3D GPS navigation for fully automated piloting, and automated processing with new techniques such as "Structure from Motion" photogrammetry. UAS are often equipped with normal RGB cameras, multi- and hyperspectral sensors, radar, or other sensors, and provide a cheap and flexible solution for creating multi-temporal data sets. UAS revolutionized multi-temporal research allowing new applications related to change analysis and process monitoring. The EGU General Assembly 2014 is hosting a session on platforms, sensors and applications with UAS in soil science and geomorphology. This presentation briefly summarizes the outcome of this session, addressing the current state and future challenges of small-platform data acquisition in soil science and geomorphology.

  12. Automation life-cycle cost model

    NASA Technical Reports Server (NTRS)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and uncertainty is a reality for increasingly complex problems, yet few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: assess desirable capabilities (structured into near- and far-term); identify useful existing models/data; identify parameters for utility analysis; define the tool framework; encode a scenario thread for model validation; and provide a transition path for tool development. This report contains all relevant technical progress made on this contractual effort.

  13. Verifying the Modal Logic Cube Is an Easy Task (For Higher-Order Automated Reasoners)

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

Prominent logics, including quantified multimodal logics, can be elegantly embedded in simple type theory (classical higher-order logic). Furthermore, off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about embedded logics. In this paper we focus on reasoning about modal logics and exploit our framework for the automated verification of inclusion and equivalence relations between them. Related work has applied first-order automated theorem provers to the task. Our solution achieves significant improvements, most notably with respect to the elegance and simplicity of the problem encodings as well as with respect to automation performance.

  14. Elimination of biofilm and microbial contamination reservoirs in hospital washbasin U-bends by automated cleaning and disinfection with electrochemically activated solutions.

    PubMed

    Swan, J S; Deasy, E C; Boyle, M A; Russell, R J; O'Donnell, M J; Coleman, D C

    2016-10-01

Washbasin U-bends are reservoirs of microbial contamination in healthcare environments: they are constantly full of water and harbour microbial biofilm. The aim was to develop an effective automated cleaning and disinfection system for U-bends using two solutions generated by electrochemical activation of brine: the disinfectant anolyte (predominantly hypochlorous acid) and catholyte (predominantly sodium hydroxide), which has detergent properties. Initially, three washbasin U-bends were manually filled with catholyte followed by anolyte for 5 min each, once weekly for five weeks. A programmable system was then developed with one washbasin that automated this process. This U-bend received three cycles of 5 min catholyte followed by 5 min anolyte treatment per week for three months. Quantitative bacterial counts from treated and control U-bends were determined on blood agar (CBA), R2A, PAS and PA agars following automated treatment, and on CBA and R2A following manual treatment. The average bacterial density from untreated U-bends throughout the study was >1×10^5 cfu/swab on all media, with Pseudomonas aeruginosa accounting for ∼50% of counts. Manual electrochemically activated (ECA) solution treatment of U-bends reduced counts significantly (<100 cfu/swab) (P<0.01 for CBA; P<0.005 for R2A). Similarly, counts from the automated ECA-treatment U-bend were significantly reduced, with average counts for 35 cycles on CBA, R2A, PAS and PA of 2.1±4.5 (P<0.0001), 13.1±30.1 (P<0.05), 0.7±2.8 (P<0.001) and 0 (P<0.05) cfu/swab, respectively. P. aeruginosa was eliminated from all treated U-bends. Automated ECA treatment of washbasin U-bends consistently minimizes microbial contamination. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Challenges of sulfur SAD phasing as a routine method in macromolecular crystallography.

    PubMed

    Doutch, James; Hough, Michael A; Hasnain, S Samar; Strange, Richard W

    2012-01-01

    The sulfur SAD phasing method allows the determination of protein structures de novo without reference to derivatives such as Se-methionine. The feasibility for routine automated sulfur SAD phasing using a number of current protein crystallography beamlines at several synchrotrons was examined using crystals of trimeric Achromobacter cycloclastes nitrite reductase (AcNiR), which contains a near average proportion of sulfur-containing residues and two Cu atoms per subunit. Experiments using X-ray wavelengths in the range 1.9-2.4 Å show that we are not yet at the level where sulfur SAD is routinely successful for automated structure solution and model building using existing beamlines and current software tools. On the other hand, experiments using the shortest X-ray wavelengths available on existing beamlines could be routinely exploited to solve and produce unbiased structural models using the similarly weak anomalous scattering signals from the intrinsic metal atoms in proteins. The comparison of long-wavelength phasing (the Bijvoet ratio for nine S atoms and two Cu atoms is ~1.25% at ~2 Å) and copper phasing (the Bijvoet ratio for two Cu atoms is 0.81% at ~0.75 Å) for AcNiR suggests that lower data multiplicity than is currently required for success should in general be possible for sulfur phasing if appropriate improvements to beamlines and data collection strategies can be implemented.

  16. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the validity of equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  17. Automation in clinical bacteriology: what system to choose?

    PubMed

    Greub, G; Prod'hom, G

    2011-05-01

With increased activity and reduced financial and human resources, there is a need for automation in clinical bacteriology. Initial processing of clinical samples includes repetitive and fastidious steps. These tasks are suitable for automation, and several instruments are now available on the market, including the WASP (Copan), Previ-Isola (BioMerieux), Innova (Becton-Dickinson) and Inoqula (KIESTRA) systems. These new instruments allow efficient and accurate inoculation of samples, comprising four main steps: (i) selecting the appropriate Petri dish; (ii) inoculating the sample; (iii) spreading the inoculum on agar plates to obtain, upon incubation, well-separated bacterial colonies; and (iv) accurate labelling and sorting of each inoculated medium. The challenge for clinical bacteriologists is to determine the ideal automated system for their own laboratory. Indeed, different solutions will be preferred according to the number and variety of samples, and to the types of sample that will be processed with the automated system. The final choice is troublesome, because audits offered by manufacturers risk being biased towards their own company's solution, and because these automated systems may not be easily tested on site prior to the final decision, owing to the complexity of computer connections between the laboratory information system and the instrument. This article thus summarizes the main parameters that need to be taken into account when choosing the optimal system, and provides some clues to help clinical bacteriologists make their choice. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.

  18. A report on SHARP (Spacecraft Health Automated Reasoning Prototype) and the Voyager Neptune encounter

    NASA Technical Reports Server (NTRS)

    Martin, R. G. (Editor); Atkinson, D. J.; James, M. L.; Lawson, D. L.; Porta, H. J.

    1990-01-01

The development and application of the Spacecraft Health Automated Reasoning Prototype (SHARP) for the operations of the telecommunications systems and link analysis functions in Voyager mission operations are presented. An overview is provided of the design and functional description of the SHARP system as it was applied to Voyager. Some of the current problems and motivations for automation in real-time mission operations are discussed, as are the specific solutions that SHARP provides. The application of SHARP to Voyager telecommunications had the goal of being a proof-of-capability demonstration of artificial intelligence as applied to the problem of real-time monitoring functions in planetary mission operations. As part of achieving this central goal, the SHARP application effort was also required to address the issue of the design of an appropriate software system architecture for a ground-based, highly automated spacecraft monitoring system for mission operations, including methods for: (1) embedding a knowledge-based expert system for fault detection, isolation, and recovery within this architecture; (2) acquiring, managing, and fusing the multiple sources of information used by operations personnel; and (3) providing information-rich displays to human operators who need to exercise the capabilities of the automated system. In this regard, SHARP has provided an excellent example of how advanced artificial intelligence techniques can be smoothly integrated with a variety of conventionally programmed software modules, as well as guidance and solutions for many questions about automation in mission operations.

  19. Glycan Reader: Automated Sugar Identification and Simulation Preparation for Carbohydrates and Glycoproteins

    PubMed Central

    Jo, Sunhwan; Song, Kevin C.; Desaire, Heather; MacKerell, Alexander D.; Im, Wonpil

    2011-01-01

    Understanding how glycosylation affects protein structure, dynamics, and function is an emerging and challenging problem in biology. As a first step toward glycan modeling in the context of structural glycobiology, we have developed Glycan Reader and integrated it into the CHARMM-GUI, http://www.charmm-gui.org/input/glycan. Glycan Reader greatly simplifies the reading of PDB structure files containing glycans through (i) detection of carbohydrate molecules, (ii) automatic annotation of carbohydrates based on their three-dimensional structures, (iii) recognition of glycosidic linkages between carbohydrates as well as N-/O-glycosidic linkages to proteins, and (iv) generation of inputs for the biomolecular simulation program CHARMM with the proper glycosidic linkage setup. In addition, Glycan Reader is linked to other functional modules in CHARMM-GUI, allowing users to easily generate carbohydrate or glycoprotein molecular simulation systems in solution or membrane environments and visualize the electrostatic potential on glycoprotein surfaces. These tools are useful for studying the impact of glycosylation on protein structure and dynamics. PMID:21815173

  20. Automated batch fiducial-less tilt-series alignment in Appion using Protomo.

    PubMed

    Noble, Alex J; Stagg, Scott M

    2015-11-01

    The field of electron tomography has benefited greatly from manual and semi-automated approaches to marker-based tilt-series alignment that have allowed for the structural determination of multitudes of in situ cellular structures as well as macromolecular structures of individual protein complexes. The emergence of complementary metal-oxide semiconductor detectors capable of detecting individual electrons has enabled the collection of low dose, high contrast images, opening the door for reliable correlation-based tilt-series alignment. Here we present a set of automated, correlation-based tilt-series alignment, contrast transfer function (CTF) correction, and reconstruction workflows for use in conjunction with the Appion/Leginon package that are primarily targeted at automating structure determination with cryogenic electron microscopy. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. JPLEX: Java Simplex Implementation with Branch-and-Bound Search for Automated Test Assembly

    ERIC Educational Resources Information Center

    Park, Ryoungsun; Kim, Jiseon; Dodd, Barbara G.; Chung, Hyewon

    2011-01-01

    JPLEX, short for Java simPLEX, is an automated test assembly (ATA) program. It is a mixed integer linear programming (MILP) solver written in Java. It reads in a configuration file, solves the minimization problem, and produces an output file for postprocessing. It implements the simplex algorithm to create a fully relaxed solution and…
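The simplex-plus-branch-and-bound strategy can be illustrated on a toy 0/1 selection problem: a fractional (fully relaxed) solution supplies the upper bound, standing in for the simplex solve, and branching restores integrality. A hedged sketch of the general technique, not JPLEX's actual code or its ATA constraint format:

```python
def knapsack_bb(values, weights, capacity):
    """0/1 item selection by branch-and-bound.  The bound at each node is the
    fractionally relaxed optimum (the role a simplex solver plays in an MILP
    code); branching on include/exclude restores the integrality constraint."""
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
    best = [0]

    def bound(i, val, room):
        # Greedy fill of the remaining room, splitting the last item fractionally.
        for v, w in items[i:]:
            if w <= room:
                room -= w
                val += v
            else:
                val += v * room / w
                break
        return val

    def dfs(i, val, room):
        best[0] = max(best[0], val)
        if i == len(items) or bound(i, val, room) <= best[0]:
            return  # leaf reached, or relaxation cannot beat the incumbent
        v, w = items[i]
        if w <= room:
            dfs(i + 1, val + v, room - w)  # branch: include item i
        dfs(i + 1, val, room)              # branch: exclude item i

    dfs(0, 0, capacity)
    return best[0]
```

In an ATA setting the "items" would be test questions and the objective and constraints would encode information targets and content rules; the relax-then-branch skeleton is the same.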

  2. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. This article outlines a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, together with their respective chemistry and immunochemistry analyzers. Our experience is described here, organized according to the selection process, the important considerations in clinical chemistry automation, and decisions and implementation, followed by conclusions drawn from the experience. The process of selecting chemistry automation, including forming a committee, analyzing workflow, submitting a request for proposal, making site visits, and reaching a final decision, took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and the narrative and tabular suggestions provided.

  3. SPLICE: A program to assemble partial query solutions from three-dimensional database searches into novel ligands

    NASA Astrophysics Data System (ADS)

    Ho, Chris M. W.; Marshall, Garland R.

    1993-12-01

SPLICE is a program that processes partial query solutions retrieved from 3D structural databases to generate novel, aggregate ligands. It is designed to interface with the database searching program FOUNDATION, which retrieves fragments containing any combination of a user-specified minimum number of matching query elements. SPLICE eliminates aspects of structures that are physically incapable of binding within the active site. A systematic rule-based procedure is then performed upon the remaining fragments to ensure receptor complementarity. All modifications are automated and remain transparent to the user. Ligands are then assembled by linking components into composite structures through overlapping bonds. As a control experiment, FOUNDATION and SPLICE were used to reconstruct a known HIV-1 protease inhibitor after it had been fragmented, reoriented, and added to a sham database of fifty different small molecules. To illustrate the capabilities of this program, a 3D search query containing the pharmacophoric elements of an aspartic proteinase-inhibitor crystal complex was searched using FOUNDATION against a subset of the Cambridge Structural Database. One hundred thirty-one compounds were retrieved, each containing any combination of at least four query elements. Compounds were automatically screened and edited for receptor complementarity. Numerous combinations of fragments were discovered that could be linked to form novel structures containing a greater number of pharmacophoric elements than any single retrieved fragment.
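The assembly step, linking components into composite structures through overlapping bonds, can be sketched in miniature with a fragment reduced to a set of bonds and a bond to a pair of atom labels. This is an illustration of the splice idea only; the real program additionally matches 3D geometry and enforces receptor complementarity:

```python
def splice(frag_a, frag_b):
    """Join two fragments into one composite structure if they share an
    overlapping bond; return None when no overlap exists.  A fragment is
    modeled as a set of bonds, a bond as a 2-tuple of atom labels."""
    if not (frag_a & frag_b):
        return None  # no overlapping bond: the fragments cannot be spliced
    return frag_a | frag_b
```

Two fragments sharing the bond ("C2", "N3") merge into a single three-bond structure, mirroring how overlapping retrieved fragments combine into a ligand containing more pharmacophoric elements than either piece alone.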

  4. Automated global structure extraction for effective local building block processing in XCS.

    PubMed

    Butz, Martin V; Pelikan, Martin; Llorà, Xavier; Goldberg, David E

    2006-01-01

    Learning Classifier Systems (LCSs), such as the accuracy-based XCS, evolve distributed problem solutions represented by a population of rules. During evolution, features are specialized, propagated, and recombined to provide increasingly accurate subsolutions. Recently, it was shown that, as in conventional genetic algorithms (GAs), some problems require efficient processing of subsets of features to find problem solutions efficiently. In such problems, standard variation operators of genetic and evolutionary algorithms used in LCSs suffer from potential disruption of groups of interacting features, resulting in poor performance. This paper introduces efficient crossover operators to XCS by incorporating techniques derived from competent GAs: the extended compact GA (ECGA) and the Bayesian optimization algorithm (BOA). Instead of simple crossover operators such as uniform crossover or one-point crossover, ECGA or BOA-derived mechanisms are used to build a probabilistic model of the global population and to generate offspring classifiers locally using the model. Several offspring generation variations are introduced and evaluated. The results show that it is possible to achieve performance similar to runs with an informed crossover operator that is specifically designed to yield ideal problem-dependent exploration, exploiting provided problem structure information. Thus, we create the first competent LCSs, XCS/ECGA and XCS/BOA, that detect dependency structures online and propagate corresponding lower-level dependency structures effectively without any information about these structures given in advance.
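    The abstract describes replacing uniform or one-point crossover with model-based offspring generation: build a probabilistic model of the selected population, then sample new classifiers from it. A minimal univariate sketch of that idea follows; ECGA and BOA additionally learn linkage between positions, which this deliberate simplification omits, and the function names are illustrative only.

```python
import random

def build_model(population):
    """Per-position frequency of 1s over a set of bit-string classifiers.
    (ECGA/BOA additionally learn which positions are linked; this
    univariate version is a deliberate simplification.)"""
    n = len(population[0])
    return [sum(ind[i] for ind in population) / len(population) for i in range(n)]

def sample_offspring(model, rng=random):
    """Generate one offspring by sampling each bit from the model,
    rather than by cutting and splicing two parents."""
    return [1 if rng.random() < p else 0 for p in model]

# Example: two selected classifiers; position 0 is always 1, position 2 always 0
model = build_model([[1, 1, 0], [1, 0, 0]])   # -> [1.0, 0.5, 0.0]
child = sample_offspring(model)               # bits 0 and 2 are preserved
```

    Because fixed positions survive sampling with probability 1, the model-based generator cannot disrupt feature combinations that the whole selected population agrees on, which is the disruption problem the paper addresses.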

  5. Semi-automatic mapping of geological Structures using UAV-based photogrammetric data: An image analysis approach

    NASA Astrophysics Data System (ADS)

    Vasuki, Yathunanthan; Holden, Eun-Jung; Kovesi, Peter; Micklethwaite, Steven

    2014-08-01

    Recent advances in data acquisition technologies, such as Unmanned Aerial Vehicles (UAVs), have led to a growing interest in capturing high-resolution rock surface images. However, because large volumes of data can be captured in a short flight, efficient analysis of these data brings new challenges, especially the time it takes to digitise maps and extract orientation data. We outline a semi-automated method that allows efficient mapping of geological faults using photogrammetric data of rock surfaces, generated from aerial photographs collected by a UAV. Our method harnesses advanced automated image analysis techniques and human data interaction to rapidly map structures and then calculate their dips and dip directions. Geological structures (faults, joints and fractures) are first detected in the primary photographic dataset, and the equivalent three-dimensional (3D) structures are then identified within a 3D surface model generated by structure from motion (SfM). From this information the location, dip and dip direction of each geological structure are calculated. A structure map generated by our semi-automated method obtained a recall rate of 79.8% when compared against a fault map produced using expert manual digitising and interpretation methods. The semi-automated structure map was produced in 10 min, whereas the manual method took approximately 7 h. In addition, the dip and dip direction calculations from our automated method show mean ± standard errors of 1.9° ± 2.2° and 4.4° ± 2.6°, respectively, when compared with field measurements. This shows the potential of our semi-automated method for accurate and efficient mapping of geological structures, particularly at remote, inaccessible or hazardous sites.
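    Once a plane has been fitted to a mapped 3D structure, its dip and dip direction follow from elementary geometry. A minimal sketch, assuming an east-north-up coordinate convention (the paper does not specify its implementation, so the convention and function name are assumptions):

```python
import math

def dip_and_dip_direction(normal):
    """Dip (degrees from horizontal) and dip direction (azimuth, degrees
    clockwise from north) of a plane given its normal vector.
    Convention assumed: x = east, y = north, z = up."""
    nx, ny, nz = normal
    mag = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / mag, ny / mag, nz / mag
    if nz < 0:                       # force the normal to point upward
        nx, ny, nz = -nx, -ny, -nz
    dip = math.degrees(math.acos(nz))                 # tilt from horizontal
    dip_dir = math.degrees(math.atan2(nx, ny)) % 360  # azimuth of steepest descent
    return dip, dip_dir

# A plane dipping 45 degrees toward the east has normal proportional to (1, 0, 1):
# dip_and_dip_direction((1, 0, 1)) gives dip 45, dip direction 90 (east).
```

    The horizontal projection of the upward unit normal points in the dip direction, which is why a single `atan2` suffices.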

  6. InPRO: Automated Indoor Construction Progress Monitoring Using Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Hamledari, Hesam

    In this research, InPRO, an envisioned intelligent robotic solution that employs a series of unmanned aerial vehicles (UAVs) for automated indoor data collection and inspection, is presented. InPRO consists of four stages, namely: 1) automated path planning; 2) autonomous UAV-based indoor inspection; 3) automated computer vision-based assessment of progress; and 4) automated updating of 4D building information models (BIM). The work presented in this thesis addresses the third stage of InPRO. A series of computer vision-based methods that automate the assessment of construction progress using images captured at indoor sites are introduced. The proposed methods employ computer vision and machine learning techniques to detect the components of under-construction indoor partitions. In particular, framing (studs), insulation, electrical outlets, and different states of drywall sheets (installing, plastering, and painting) are automatically detected in digital images. High accuracy rates, real-time performance, and operation without a priori information indicate the methods' promising performance.

  7. A&R challenges for in-space operations. [Automation and Robotic technologies

    NASA Technical Reports Server (NTRS)

    Underwood, James

    1990-01-01

    Automation and robotics (A&R) challenges for in-space operations are examined, with emphasis on the interaction between developing requirements, developing solutions, and design concepts, and on the applicability of automation and robotic technologies. Attention is first given to the use of A&R in establishing outposts on the Moon and Mars. Emphasis is then placed on the requirements for the assembly of transportation systems in low Earth orbit. Concepts of the Space Station which show how the assembly, processing, and checkout of systems in LEO might be accommodated are examined.

  8. End-effector microprocessor

    NASA Technical Reports Server (NTRS)

    Doggett, William R.

    1992-01-01

    The topics are presented in viewgraph form and include: the automated structures assembly facility's current control hierarchy; the automated structures assembly facility's proposed control hierarchy; the end-effector software state transition diagram; a block diagram for the ideal install composite; and conclusions.

  9. Multiparametric Flow System for the Automated Determination of Sodium, Potassium, Calcium, and Magnesium in Large-Volume Parenteral Solutions and Concentrated Hemodialysis Solutions

    PubMed Central

    Pistón, Mariela; Dol, Isabel

    2006-01-01

    A multiparametric flow system based on multicommutation and binary sampling has been designed for the automated determination of sodium, potassium, calcium, and magnesium in large-volume parenteral solutions and concentrated hemodialysis solutions. The goal was to obtain a computer-controlled system capable of determining the four metals without extensive modifications. The system involved the use of five solenoid valves under software control, allowing the establishment of the appropriate flow conditions for each analyte, that is, sample size, dilution, reagent addition, and so forth. Detection was carried out by either flame atomic emission spectrometry (sodium, potassium) or flame atomic absorption spectrometry (calcium, magnesium). The influence of several operating parameters was studied. Validation was carried out by analyzing artificial samples. Figures of merit obtained include linearity, accuracy, precision, and sampling frequency. Linearity was satisfactory: sodium, r² > 0.999 (0.5–3.5 g/L); potassium, r² > 0.996 (50–150 mg/L); calcium, r² > 0.999 (30–120 mg/L); and magnesium, r² > 0.999 (20–40 mg/L). Precision (sr, %, n = 5) was better than 2.1%, and accuracy (evaluated through recovery assays) was in the range 99.8–101.0% (sodium), 100.8–102.5% (potassium), 97.3–101.3% (calcium), and 97.1–99.8% (magnesium). Sampling frequencies (h⁻¹) were 70 (sodium), 75 (potassium), 70 (calcium), and 58 (magnesium). According to the results obtained, the use of an automated multiparametric system based on multicommutation offers several advantages for the quality control of large-volume parenteral solutions and concentrated hemodialysis solutions. PMID:17671619
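    The figures of merit quoted above (linearity as r², accuracy as spike recovery) are standard validation computations. A minimal sketch of both, with hypothetical values rather than the paper's data:

```python
def linearity_r2(x, y):
    """Coefficient of determination for an ordinary least-squares
    calibration line (concentration x vs. instrument response y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def recovery_percent(measured_spiked, measured_blank, spike_added):
    """Recovery (%) from a spike-recovery assay: how much of a known
    added amount is actually measured."""
    return 100.0 * (measured_spiked - measured_blank) / spike_added

# Hypothetical numbers: a 100 mg/L spike measured as 105 with a 5 mg/L blank
# gives 100% recovery; a perfectly linear calibration gives r^2 = 1.
```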

  10. Phaser.MRage: automated molecular replacement

    PubMed Central

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J.; Oeffner, Robert D.; Adams, Paul D.; Read, Randy J.

    2013-01-01

    Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement. PMID:24189240

  11. Phaser.MRage: automated molecular replacement.

    PubMed

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J; Oeffner, Robert D; Adams, Paul D; Read, Randy J

    2013-11-01

    Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement.

  12. Automated CFD Database Generation for a 2nd Generation Glide-Back-Booster

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Michael J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejmil, Edward

    2003-01-01

    A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment using 13 computers located at 4 different geographical sites. Process automation and web-based access to the database greatly reduces the user workload, removing much of the tedium and tendency for user input errors. The database consists of forces, moments, and solution files obtained by varying the Mach number, angle of attack, and sideslip angle. The forces and moments compare well with experimental data. Stability derivatives are also computed using a monotone cubic spline procedure. Flow visualization and three-dimensional surface plots are used to interpret and characterize the nature of computed flow fields.
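    The abstract computes stability derivatives with a monotone cubic spline. A common such scheme is Fritsch-Carlson (PCHIP), whose knot slopes can serve directly as derivative estimates of, say, pitching moment versus angle of attack. A sketch of the slope computation under that assumption (not necessarily AeroDB's exact procedure):

```python
def pchip_slopes(x, y):
    """Knot derivatives of a monotone (Fritsch-Carlson / PCHIP) cubic
    interpolant through (x, y); requires at least three points."""
    n = len(x)
    h = [x[i + 1] - x[i] for i in range(n - 1)]          # interval widths
    d = [(y[i + 1] - y[i]) / h[i] for i in range(n - 1)]  # secant slopes
    m = [0.0] * n
    for i in range(1, n - 1):
        if d[i - 1] * d[i] <= 0:
            m[i] = 0.0            # local extremum: flat slope preserves shape
        else:                     # weighted harmonic mean of adjacent secants
            w1 = 2 * h[i] + h[i - 1]
            w2 = h[i] + 2 * h[i - 1]
            m[i] = (w1 + w2) / (w1 / d[i - 1] + w2 / d[i])
    m[0] = _edge(h[0], h[1], d[0], d[1])        # one-sided, shape-preserving
    m[-1] = _edge(h[-1], h[-2], d[-1], d[-2])
    return m

def _edge(h0, h1, d0, d1):
    """Three-point endpoint slope, clipped to preserve monotonicity."""
    m = ((2 * h0 + h1) * d0 - h0 * d1) / (h0 + h1)
    if m * d0 <= 0:
        return 0.0
    if d0 * d1 < 0 and abs(m) > 3 * abs(d0):
        return 3 * d0
    return m
```

    For noisy tabulated force/moment data this scheme avoids the overshoot of ordinary cubic splines, which matters when the slope itself (the stability derivative) is the quantity of interest.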

  13. A Hybrid Human-Computer Approach to the Extraction of Scientific Facts from the Literature.

    PubMed

    Tchoua, Roselyne B; Chard, Kyle; Audus, Debra; Qin, Jian; de Pablo, Juan; Foster, Ian

    2016-01-01

    A wealth of valuable data is locked within the millions of research articles published each year. Reading and extracting pertinent information from those articles has become an unmanageable task for scientists. This problem hinders scientific progress by making it hard to build on results buried in literature. Moreover, these data are loosely structured, encoded in manuscripts of various formats, embedded in different content types, and are, in general, not machine accessible. We present a hybrid human-computer solution for semi-automatically extracting scientific facts from literature. This solution combines an automated discovery, download, and extraction phase with a semi-expert crowd assembled from students to extract specific scientific facts. To evaluate our approach we apply it to a challenging molecular engineering scenario, extraction of a polymer property: the Flory-Huggins interaction parameter. We demonstrate useful contributions to a comprehensive database of polymer properties.

  14. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and to send data to a JavaScript-based web client.
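    Storing chemistry results as JSON makes them trivially portable across languages, since every mainstream language ships a JSON parser. A sketch of the round trip; the field names here are illustrative, not the actual Open Chemistry schema:

```python
import json

# Hypothetical record for a water calculation; field names are invented
# for illustration and do not reflect the platform's real schema.
water = {
    "name": "water",
    "atoms": {
        "elements": ["O", "H", "H"],
        "coords": [[0.0, 0.0, 0.0],
                   [0.757, 0.586, 0.0],
                   [-0.757, 0.586, 0.0]],
    },
    "properties": {"energy_hartree": -76.026, "method": "B3LYP/6-31G*"},
}

text = json.dumps(water)      # serialize for storage or transport
restored = json.loads(text)   # any JSON-aware client can read this back
```

    A JavaScript web client would consume the same `text` with `JSON.parse`, which is the cross-language property the abstract emphasizes.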

  15. The comparison of the use of holonic and agent-based methods in modelling of manufacturing systems

    NASA Astrophysics Data System (ADS)

    Foit, K.; Banaś, W.; Gwiazda, A.; Hryniewicz, P.

    2017-08-01

    The rapid evolution in the field of industrial automation and manufacturing is often called the 4th Industrial Revolution. Worldwide availability of internet access intensifies competition between manufacturers and creates opportunities for buying materials and parts and for creating partnership networks such as cloud manufacturing, grid manufacturing (MGrid), virtual enterprises, etc. One effect of this evolution is the need for new solutions in the field of manufacturing systems modelling and simulation. During the last decade researchers have developed the agent-based approach to modelling. This methodology has been taken from computer science but adapted to the philosophy of industrial automation and robotization. The operation of an agent-based system depends on the simultaneous acting of different agents that may have different roles. On the other hand, there is the holon-based approach, which uses structures created by holons. It differs from the agent-based structure in some aspects, while other aspects are quite similar in both methodologies. The aim of this paper is to present both methodologies and discuss their similarities and differences. This may help in selecting the optimal modelling method, according to the problem considered and the software resources available.

  16. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  17. EPICS controlled sample mounting robots at the GM/CA CAT.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, O. A.; Benn, R.; Corcoran, S.

    2007-11-11

    GM/CA CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH-funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction [R.F. Fischetti, et al., GM/CA canted undulator beamlines for protein crystallography, Acta Crystallogr. A 61 (2005) C139]. The facility consists of three beamlines: two based on canted undulators and one on a bending magnet. The scientific and technical goals of the CAT emphasize streamlined, efficient throughput for a variety of sample types, sizes and qualities, representing the cutting edge of structural biology research. For this purpose all three beamlines are equipped with ALS-style robots [C.W. Cork, et al., Status of the BCSB automated sample mounting and alignment system for macromolecular crystallography at the Advanced Light Source, SRI-2003, San Francisco, CA, USA, August 25-29, 2003] for automated mounting of cryo-protected macromolecular crystals. This report summarizes software and technical solutions implemented with the first of the three operational robots at beamline 23-ID-B. The automounter's Dewar can hold up to 72 or 96 samples residing in six Rigaku ACTOR magazines or ALS-style pucks, respectively. Mounting of a crystal takes approximately 2 s, during which time the temperature of the crystal is maintained near that of liquid nitrogen.

  18. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  19. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  20. Preliminary Full-Scale Tests of the Center for Automated Processing of Hardwoods' Auto-Image

    Treesearch

    Philip A. Araman; Janice K. Wiedenbeck

    1995-01-01

    Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...

  1. Human factors in the presentation of computer-generated information - Aspects of design and application in automated flight traffic

    NASA Technical Reports Server (NTRS)

    Roske-Hofstrand, Renate J.

    1990-01-01

    The man-machine interface and its influence on the characteristics of computer displays in automated air traffic is discussed. The graphical presentation of spatial relationships and the problems it poses for air traffic control, and the solution of such problems are addressed. Psychological factors involved in the man-machine interface are stressed.

  2. Ensembles generated from crystal structures of single distant homologues solve challenging molecular-replacement cases in AMPLE.

    PubMed

    Rigden, Daniel J; Thomas, Jens M H; Simkovic, Felix; Simpkin, Adam; Winn, Martyn D; Mayans, Olga; Keegan, Ronan M

    2018-03-01

    Molecular replacement (MR) is the predominant route to solution of the phase problem in macromolecular crystallography. Although routine in many cases, it becomes more effortful and often impossible when the available experimental structures typically used as search models are only distantly homologous to the target. Nevertheless, with current powerful MR software, relatively small core structures shared between the target and known structure, of 20-40% of the overall structure for example, can succeed as search models where they can be isolated. Manual sculpting of such small structural cores is rarely attempted and is dependent on the crystallographer's expertise and understanding of the protein family in question. Automated search-model editing has previously been performed on the basis of sequence alignment, in order to eliminate, for example, side chains or loops that are not present in the target, or on the basis of structural features (e.g. solvent accessibility) or crystallographic parameters (e.g. B factors). Here, based on recent work demonstrating a correlation between evolutionary conservation and protein rigidity/packing, novel automated ways to derive edited search models from a given distant homologue over a range of sizes are presented. A variety of structure-based metrics, many readily obtained from online webservers, can be fed to the MR pipeline AMPLE to produce search models that succeed with a set of test cases where expertly manually edited comparators, further processed in diverse ways with MrBUMP, fail. Further significant performance gains result when the structure-based distance geometry method CONCOORD is used to generate ensembles from the distant homologue. To our knowledge, this is the first such approach whereby a single structure is meaningfully transformed into an ensemble for the purposes of MR. Additional cases further demonstrate the advantages of the approach. 
CONCOORD is freely available and computationally inexpensive, so these novel methods offer readily available new routes to solve difficult MR cases.

  3. Ensembles generated from crystal structures of single distant homologues solve challenging molecular-replacement cases in AMPLE

    PubMed Central

    Simpkin, Adam; Mayans, Olga; Keegan, Ronan M.

    2018-01-01

    Molecular replacement (MR) is the predominant route to solution of the phase problem in macromolecular crystallography. Although routine in many cases, it becomes more effortful and often impossible when the available experimental structures typically used as search models are only distantly homologous to the target. Nevertheless, with current powerful MR software, relatively small core structures shared between the target and known structure, of 20–40% of the overall structure for example, can succeed as search models where they can be isolated. Manual sculpting of such small structural cores is rarely attempted and is dependent on the crystallographer’s expertise and understanding of the protein family in question. Automated search-model editing has previously been performed on the basis of sequence alignment, in order to eliminate, for example, side chains or loops that are not present in the target, or on the basis of structural features (e.g. solvent accessibility) or crystallographic parameters (e.g. B factors). Here, based on recent work demonstrating a correlation between evolutionary conservation and protein rigidity/packing, novel automated ways to derive edited search models from a given distant homologue over a range of sizes are presented. A variety of structure-based metrics, many readily obtained from online webservers, can be fed to the MR pipeline AMPLE to produce search models that succeed with a set of test cases where expertly manually edited comparators, further processed in diverse ways with MrBUMP, fail. Further significant performance gains result when the structure-based distance geometry method CONCOORD is used to generate ensembles from the distant homologue. To our knowledge, this is the first such approach whereby a single structure is meaningfully transformed into an ensemble for the purposes of MR. Additional cases further demonstrate the advantages of the approach. 
CONCOORD is freely available and computationally inexpensive, so these novel methods offer readily available new routes to solve difficult MR cases. PMID:29533226

  4. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, concerns have been voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation, which may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to three groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between three levels of task difficulty (automatic, adaptive aiding, manual) on the basis of an electroencephalogram-derived engagement index. The other two tasks remained in automatic mode, with a single automation failure. Participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
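    The EEG-derived engagement index in this line of research is commonly reported as the band-power ratio beta / (alpha + theta). A toy sketch of the index and a difficulty-cycling rule; the switching logic here is illustrative only, not the study's actual algorithm:

```python
def engagement_index(theta, alpha, beta):
    """EEG engagement index, commonly reported as beta / (alpha + theta)."""
    return beta / (alpha + theta)

def next_mode(index_history, current_mode, window=5):
    """Hypothetical adaptive-automation rule: if the recent engagement
    trend falls, hand more of the task to the operator; if it rises,
    allow more automation. Modes ordered from most to least automated."""
    modes = ["automatic", "adaptive", "manual"]
    recent = index_history[-window:]
    if len(recent) < 2:
        return current_mode
    trend = recent[-1] - recent[0]
    i = modes.index(current_mode)
    if trend < 0 and i < len(modes) - 1:
        return modes[i + 1]   # engagement dropping: give operator more to do
    if trend > 0 and i > 0:
        return modes[i - 1]   # engagement rising: automate more
    return current_mode
```

    The inverse coupling (falling engagement triggers more manual control) is what keeps the operator in the loop, which is the core idea behind engagement-index-driven adaptive aiding.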

  5. An overview of measurement solutions for digital systems

    NASA Astrophysics Data System (ADS)

    Lemke, D.

    An overview of digital measurement solutions is presented, summarizing the digital instrumentation currently available on the commercial market. The technology trends that are driving commercial instrumentation suppliers to provide newer, more advanced features and better measurement solutions are reviewed. The implications of developments in design automation for electrical engineers are discussed.

  6. CFD Extraction Tool for TecPlot From DPLR Solutions

    NASA Technical Reports Server (NTRS)

    Norman, David

    2013-01-01

    This invention is a TecPlot macro, a computer program in the TecPlot programming language that processes data from DPLR solutions in TecPlot format. DPLR (Data-Parallel Line Relaxation) is a NASA computational fluid dynamics (CFD) code, and TecPlot is a commercial CFD post-processing tool. The TecPlot data is in SI units (the same as the DPLR output); the invention converts the SI units into British units. The macro modifies the TecPlot data with unit conversions and adds some extra calculations. After the unit conversions, the macro cuts a slice and adds vectors to the current plot for output. The macro can also process surface solutions. Existing solutions use manual conversion and superposition. The conversion is complicated because it must be applied to a range of inter-related scalars and vectors that describe a 2D or 3D flow field. The macro processes the CFD solution to create superpositions and comparisons of scalars and vectors. The existing manual solution is cumbersome, error-prone, slow, and cannot be inserted into an automated process. This invention is quick and easy to use, and can be inserted into an automated data-processing algorithm.
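    The SI-to-British conversion step the macro automates is mechanical but error-prone by hand, precisely because many inter-related quantities must be converted consistently. A sketch for a few flow-field scalars; which quantities the actual macro converts, and the field names used here, are assumptions:

```python
# Standards-based factors: 0.3048 m/ft (exact) and 4.4482216152605 N/lbf.
FT_PER_M = 1.0 / 0.3048
PSF_PER_PA = 0.3048 ** 2 / 4.4482216152605        # N/m^2  -> lbf/ft^2
SLUGFT3_PER_KGM3 = 0.3048 ** 4 / 4.4482216152605  # kg/m^3 -> slug/ft^3

def si_to_british(q):
    """Convert a dict of SI flow-field scalars to British units.
    The set of fields is illustrative, not the macro's actual list."""
    return {
        "velocity_fps": q["velocity_mps"] * FT_PER_M,
        "pressure_psf": q["pressure_pa"] * PSF_PER_PA,
        "temperature_R": q["temperature_k"] * 1.8,   # kelvin -> degrees Rankine
        "density_slugft3": q["density_kgm3"] * SLUGFT3_PER_KGM3,
    }
```

    Deriving the pressure and density factors from the two base constants, rather than typing in rounded values, keeps the converted scalars mutually consistent (e.g. p = rho*R*T still balances after conversion).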

  7. Fatigue and voluntary utilization of automation in simulated driving.

    PubMed

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  8. Automated reporting of pharmacokinetic study results: gaining efficiency downstream from the laboratory.

    PubMed

    Schaefer, Peter

    2011-07-01

    The purpose of bioanalysis in the pharmaceutical industry is to provide 'raw' data about the concentration of a drug candidate and its metabolites as input for studies of drug properties such as pharmacokinetic (PK), toxicokinetic, bioavailability/bioequivalence and other studies. Building a seamless workflow from the laboratory to final reports is an ongoing challenge for IT groups and users alike. In such a workflow, PK automation can provide companies with the means to vastly increase the productivity of their scientific staff while improving the quality and consistency of their reports on PK analyses. This report presents the concept and benefits of PK automation and discusses which features of an automated reporting workflow should be translated into software requirements that pharmaceutical companies can use to select or build an efficient and effective PK automation solution that best meets their needs.

  9. A Highly Flexible, Automated System Providing Reliable Sample Preparation in Element- and Structure-Specific Measurements.

    PubMed

    Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin

    2016-10-01

    Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element- and structure-specific measurements, these automated platforms are not suitable for such analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and a high degree of flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality has been confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise compared to the manual procedure, and as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.

  10. Strong stabilization servo controller with optimization of performance criteria.

    PubMed

    Sarjaš, Andrej; Svečko, Rajko; Chowdhury, Amor

    2011-07-01

    Synthesis of a simple robust controller with a pole placement technique and an H(∞) metric is the method used for control of a servo mechanism with BLDC and BDC electric motors. The method includes solving a polynomial equation on the basis of the chosen characteristic polynomial using the Manabe standard polynomial form and parametric solutions. Parametric solutions are introduced directly into the structure of the servo controller. On the basis of the chosen parametric solutions, the robustness of the closed-loop system is assessed through uncertainty models and assessment of the norm ‖•‖(∞). The design procedure and the optimization are performed with a genetic algorithm, differential evolution (DE). The DE optimization method determines a suboptimal solution throughout the optimization on the basis of a spectrally square polynomial and Šiljak's absolute stability test. The stability of the designed controller during the optimization is checked with Lipatov's stability condition. Both approaches, Šiljak's test and Lipatov's condition, check the robustness and stability characteristics on the basis of the polynomial's coefficients, and are very convenient for automated design of closed-loop control and for application in optimization algorithms such as DE. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
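
A minimal sketch of the differential evolution (DE/rand/1/bin) loop used for such controller-parameter searches. The quadratic stand-in objective is an assumption for illustration only; the paper's actual objective couples a spectrally square polynomial with the Šiljak and Lipatov stability checks.

```python
# Minimal differential evolution sketch (DE/rand/1/bin with greedy selection).
import random

def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct members other than i for the mutation.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)       # force at least one crossover
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)   # clip to bounds
                else:
                    v = pop[i][j]
                trial.append(v)
            s = objective(trial)
            if s <= scores[i]:                # keep the better candidate
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda k: scores[k])
    return pop[best], scores[best]

# Stand-in objective: distance of candidate coefficients from a target set.
target = [2.0, -3.0, 0.5]
obj = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
best_x, best_s = differential_evolution(obj, [(-5.0, 5.0)] * 3)
```

In the paper's setting, `objective` would additionally reject candidates that fail the stability tests, which is convenient precisely because those tests operate directly on the polynomial's coefficients.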

  11. Geophysical imaging of karst features in Missouri

    NASA Astrophysics Data System (ADS)

    Obi, Jeremiah Chukwunonso

    Automated electrical resistivity tomography (ERT), supported with multichannel analysis of surface waves (MASW) and boring data, was used to map karst-related features in Missouri in order to better understand karst processes there. Previous work on karst in Missouri consisted mostly of surficial mapping of bedrock outcrops and joints, which is not enough to define the internal structure of a karst system, since most critical processes in karst occur underground. To understand these processes better, the density, placement and pattern of karst-related features such as solution-widened joints and voids, as well as the top of bedrock, were mapped. In the course of the study, six sites were visited in Missouri: Nixa, the Gasconade River Bridge in Lebanon, Battlefield, Aurora, Protem and Richland. The case studies reflect to a large extent some of the problems inherent in karst terrain, ranging from environmental problems to structural problems, especially sinkhole collapses. The results showed that karst in Missouri is mostly formed as a result of piping of sediments through solution-widened joints, with a pattern showing that the joints/fractures are mostly filled with moist clay-sized materials of low resistivity values. The highest density of mapped solution-widened joints was one in every one hundred and fifty feet; these areas are where intense dissolution is taking place and the bedrock is pervasively fractured. The study also showed that interpreted solution-widened joints trend in different directions and often conform with known structural lineaments in the area. About 40% of sinkhole collapses in the study areas are anthropogenic. Karst in Missouri varies, and can be classified as a combination of kI (juvenile), kIII (mature) and kIV (complex) karsts.

  12. Inspection and Verification of Domain Models with PlanWorks and Aver

    NASA Technical Reports Server (NTRS)

    Bedrax-Weiss, Tania; Frank, Jeremy; Iatauro, Michael; McGann, Conor

    2006-01-01

    When developing a domain model, it seems natural to bring the traditional informal tools of inspection and verification, debuggers and automated test suites, to bear upon the problems that will inevitably arise. Debuggers that allow inspection of registers and memory and stepwise execution have been a staple of software development of all sorts from the very beginning. Automated testing has repeatedly proven its considerable worth, to the extent that an entire design philosophy (Test Driven Development) has been developed around the writing of tests. Unfortunately, while not entirely without their uses, the limitations of these tools and the nature of the complexity of models and the underlying planning systems make the diagnosis of certain classes of problems and the verification of their solutions difficult or impossible. Debuggers provide a good local view of executing code, allowing a fine-grained look at algorithms and data. This view is, however, usually only at the level of the current scope in the implementation language, and the data-inspection capabilities of most debuggers usually consist of print statements. More modern graphical debuggers offer a sort of tree view of data structures, but even this is too low-level and is often inappropriate for the kinds of structures created by planning systems. For instance, goal or constraint networks are at best awkward when visualized as trees, and any non-structural link between data structures, as through a lookup table, isn't captured at all. Further, while debuggers have powerful breakpointing facilities that are suitable for finding specific algorithmic errors, they have little use in the diagnosis of modeling errors.

  13. Additive Construction with Mobile Emplacement (ACME) / Automated Construction of Expeditionary Structures (ACES) Materials Delivery System (MDS)

    NASA Technical Reports Server (NTRS)

    Mueller, R. P.; Townsend, I. I.; Tamasy, G. J.; Evers, C. J.; Sibille, L. J.; Edmunson, J. E.; Fiske, M. R.; Fikes, J. C.; Case, M.

    2018-01-01

    The purpose of the Automated Construction of Expeditionary Structures, Phase 3 (ACES 3) project is to incorporate the Liquid Goods Delivery System (LGDS) into the Dry Goods Delivery System (DGDS) structure to create an integrated and automated Materials Delivery System (MDS) for 3D printing structures with ordinary Portland cement (OPC) concrete. ACES 3 is a prototype for 3-D printing barracks for soldiers in forward bases, here on Earth. The LGDS supports ACES 3 by storing liquid materials, mixing recipe batches of liquid materials, and working with the Dry Goods Feed System (DGFS) previously developed for ACES 2, combining the materials that are eventually extruded out of the print nozzle. Automated Construction of Expeditionary Structures, Phase 3 (ACES 3) is a project led by the US Army Corps of Engineers (USACE) and supported by NASA. The equivalent 3D printing system for construction in space is designated Additive Construction with Mobile Emplacement (ACME) by NASA.

  14. Instrumentation Automation for Concrete Structures: Report 2, Automation Hardware and Retrofitting Techniques, and Report 3, Available Data Collection and Reduction Software

    DTIC Science & Technology

    1987-06-01

    commercial products. ... Typical cutout at a plumbline location where an automated monitoring system has been installed. The sensor used with the... This report provides a description of commercially available sensors, instruments, and ADP equipment that may be selected to fully automate... automated. The automated plumbline monitoring system includes up to twelve sensors, repeaters, a system controller, and a printer. The system may

  15. Robotic space construction

    NASA Technical Reports Server (NTRS)

    Mixon, Randolph W.; Hankins, Walter W., III; Wise, Marion A.

    1988-01-01

    Research at Langley Research Center concerning automated space assembly is reviewed, including a Space Shuttle experiment to test astronaut ability to assemble a repetitive truss structure, testing the use of teleoperated manipulators to construct the Assembly Concept for Construction of Erectable Space Structures I truss, and assessment of the basic characteristics of manipulator assembly operations. Other research topics include the simultaneous coordinated control of dual-arm manipulators and the automated assembly of candidate Space Station trusses. Consideration is given to the construction of an Automated Space Assembly Laboratory to study and develop the algorithms, procedures, special purpose hardware, and processes needed for automated truss assembly.

  16. Atomic structure solution of the complex quasicrystal approximant Al77Rh15Ru8 from electron diffraction data.

    PubMed

    Samuha, Shmuel; Mugnaioli, Enrico; Grushko, Benjamin; Kolb, Ute; Meshi, Louisa

    2014-12-01

    The crystal structure of the novel Al77Rh15Ru8 phase (which is an approximant of decagonal quasicrystals) was determined using modern direct methods (MDM) applied to automated electron diffraction tomography (ADT) data. The Al77Rh15Ru8 E-phase is orthorhombic [Pbma, a = 23.40 (5), b = 16.20 (4) and c = 20.00 (5) Å] and has one of the most complicated intermetallic structures solved solely by electron diffraction methods. Its structural model consists of 78 unique atomic positions in the unit cell (19 Rh/Ru and 59 Al). Precession electron diffraction (PED) patterns and high-resolution electron microscopy (HRTEM) images were used for the validation of the proposed atomic model. The structure of the E-phase is described using hierarchical packing of polyhedra and a single type of tiling in the form of a parallelogram. Based on this description, the structure of the E-phase is compared with that of the ε6-phase formed in Al-Rh-Ru at close compositions.

  17. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

    We have shown that by covalently modifying a subpopulation, less than or equal to 1%, of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and the presence of the probe at low concentrations does not affect the X-ray data quality or the crystallization behavior. The presence of the trace fluorescent label gives a number of advantages when used with high throughput crystallizations. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a dark background. Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Brightly fluorescent crystals are readily found against less bright precipitated phases, which under white light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries, since only the protein or protein structures show up. Fluorescence intensity is a faster search parameter, whether visually or by automated methods, than looking for crystalline features. We are now testing the use of high-fluorescence-intensity regions, in the absence of clear crystalline features or "hits", as a means of determining potential lead conditions. A working hypothesis is that kinetics leading to non-structured phases may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Preliminary experiments with test proteins have resulted in the extraction of a number of crystallization conditions from screening outcomes based solely on the presence of bright fluorescent regions. Subsequent experiments will test this approach using a wider range of proteins. The trace fluorescently labeled crystals will also emit with sufficient intensity to aid in the automation of crystal alignment using relatively low-cost optics, further increasing throughput at synchrotrons.

  18. NVR-BIP: Nuclear Vector Replacement using Binary Integer Programming for NMR Structure-Based Assignments.

    PubMed

    Apaydin, Mehmet Serkan; Çatay, Bülent; Patrick, Nicholas; Donald, Bruce R

    2011-05-01

    Nuclear magnetic resonance (NMR) spectroscopy is an important experimental technique that allows one to study protein structure and dynamics in solution. An important bottleneck in NMR protein structure determination is the assignment of NMR peaks to the corresponding nuclei. Structure-based assignment (SBA) aims to solve this problem with the help of a template protein which is homologous to the target, and has applications in the study of structure-activity relationships and protein-protein and protein-ligand interactions. We formulate SBA as a linear assignment problem with additional nuclear Overhauser effect constraints, which can be solved within nuclear vector replacement's (NVR) framework (Langmead, C., Yan, A., Lilien, R., Wang, L. and Donald, B. (2003) A Polynomial-Time Nuclear Vector Replacement Algorithm for Automated NMR Resonance Assignments. Proc. 7th Annual Int. Conf. Research in Computational Molecular Biology (RECOMB), Berlin, Germany, April 10-13, pp. 176-187. ACM Press, New York, NY. J. Comp. Bio., (2004), 11, pp. 277-298; Langmead, C. and Donald, B. (2004) An expectation/maximization nuclear vector replacement algorithm for automated NMR resonance assignments. J. Biomol. NMR, 29, 111-138). Our approach uses NVR's scoring function and data types and also gives the option of using CH and NH residual dipolar couplings (RDCs), instead of the NH RDCs which NVR requires. We test our technique on NVR's data set as well as on four new proteins. Our results are comparable to NVR's assignment accuracy on NVR's test set, but higher on novel proteins. Our approach allows partial assignments. It is also complete and can return the optimum as well as near-optimum assignments. Furthermore, it allows us to analyze the information content of each data type and is easily extendable to accept new forms of input data, such as additional RDCs.
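
Stripped of the additional constraints, the combinatorial core here is a linear assignment problem: each peak is matched to exactly one residue so that the total cost is minimal. A brute-force sketch with a synthetic cost matrix follows; NVR-BIP derives real costs from chemical shifts and RDCs and solves the problem with binary integer programming (or, for plain assignment, a polynomial-time method such as the Hungarian algorithm) rather than enumeration.

```python
# Toy linear assignment: cost[i][j] = cost of assigning peak i to residue j.
# The matrix values are synthetic, for illustration only.
from itertools import permutations

cost = [
    [4.0, 1.0, 3.0],
    [2.0, 0.5, 5.0],
    [3.0, 2.0, 2.0],
]
n = len(cost)
best_perm, best_total = None, float("inf")
for perm in permutations(range(n)):       # perm[i] = residue for peak i
    total = sum(cost[i][perm[i]] for i in range(n))
    if total < best_total:
        best_perm, best_total = perm, total
```

Enumeration is exponential in the number of peaks, which is why practical SBA tools encode the problem (plus the NOE constraints that plain assignment cannot express) as an integer program.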

  19. A practical approach to automate randomized design of experiments for ligand-binding assays.

    PubMed

    Tsoi, Jennifer; Patel, Vimal; Shih, Judy

    2014-03-01

    Design of experiments (DOE) is utilized in optimizing ligand-binding assays by modeling factor effects. To reduce the analyst's workload and the errors inherent in DOE, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created with statistical software was imported into a custom macro that converts the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization, resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.
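
The design-to-worklist step can be sketched as below. This is a hypothetical illustration: the factor names, levels, and worklist columns ("Run", "SourceWell", "DestWell") are assumptions, since real worklist formats are instrument-specific.

```python
# Sketch: turn a randomized two-factor full-factorial design into
# liquid-handler worklist rows (CSV).  Column layout is illustrative only.
import csv, io, itertools, random

factors = {"coat_conc_ugml": [1.0, 2.0, 4.0],
           "detect_dilution": [1000, 2000]}

runs = list(itertools.product(*factors.values()))   # 3 x 2 = 6 runs
random.Random(42).shuffle(runs)                     # randomize run order

buf = io.StringIO()
w = csv.writer(buf)
w.writerow(["Run", "SourceWell", "DestWell"] + list(factors))
for run_no, levels in enumerate(runs, start=1):
    w.writerow([run_no, f"S{run_no}", f"D{run_no}", *levels])
worklist = buf.getvalue()
```

Generating the randomized order in software and emitting it directly as a worklist removes the transcription step where manual DOE execution typically introduces errors.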

  20. Automatic welding systems for large ship hulls

    NASA Astrophysics Data System (ADS)

    Arregi, B.; Granados, S.; Hascoet, JY.; Hamilton, K.; Alonso, M.; Ares, E.

    2012-04-01

    Welding processes represent about 40% of the total production time in shipbuilding. Although most of the indoor welding work is automated, outdoor operations still require the involvement of numerous operators. Automating hull welding operations is a priority in large shipyards. The objective of the present work is to develop a comprehensive welding system capable of working with several welding layers in an automated way. Automating seam tracking in the welding process presents several difficulties. The proposed solution is the development of a welding machine capable of moving autonomously along the welding seam, controlling both the position of the torch and the welding parameters to adjust the thickness of the weld bead to the actual gap between the hull plates.

  1. Constrained multibody system dynamics: An automated approach

    NASA Technical Reports Server (NTRS)

    Kamman, J. W.; Huston, R. L.

    1982-01-01

    The governing equations for constrained multibody systems are formulated in a manner suitable for their automated, numerical development and solution. The closed-loop problem of multibody chain systems is addressed. The governing equations are developed by modifying dynamical equations obtained from Lagrange's form of d'Alembert's principle. The modification, based upon a solution of the constraint equations obtained through a zero-eigenvalues theorem, is a contraction of the dynamical equations. For a system with n generalized coordinates and m constraint equations, the coefficients in the constraint equations may be viewed as constraint vectors in n-dimensional space. In this setting the system itself is free to move in the n - m directions which are orthogonal to the constraint vectors.
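
The geometric idea can be sketched numerically: the m constraint rows span an m-dimensional subspace, and the free directions form an orthonormal basis of its orthogonal complement, recoverable from the zero singular values of the constraint matrix (equivalently, the zero eigenvalues of BᵀB). A minimal sketch, with an illustrative single constraint, assuming NumPy is available:

```python
# The n - m free directions orthogonal to the constraint vectors, via SVD.
import numpy as np

def free_directions(B, tol=1e-10):
    """Orthonormal basis of {x : B x = 0}, i.e. the n - m free directions
    for an m x n constraint matrix B of rank m."""
    _, s, Vt = np.linalg.svd(B)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T              # columns span the null space of B

# One constraint in 3-D: motion confined to the plane x + y + z = 0.
B = np.array([[1.0, 1.0, 1.0]])
N = free_directions(B)              # 3 x 2: two free directions
```

Projecting the dynamical equations onto the columns of `N` is the "contraction" described above: it eliminates the constraint forces and leaves equations in only the n - m independent directions.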

  2. Automated and Cooperative Vehicle Merging at Highway On-Ramps

    DOE PAGES

    Rios-Torres, Jackeline; Malikopoulos, Andreas A.

    2016-08-05

    Recognition of the necessity of connected and automated vehicles (CAVs) is gaining momentum. CAVs can improve both transportation network efficiency and safety through control algorithms that can harmonically use all existing information to coordinate the vehicles. This paper addresses the problem of optimally coordinating CAVs at merging roadways to achieve smooth traffic flow without stop-and-go driving. Here we present an optimization framework and an analytical closed-form solution that allows online coordination of vehicles at merging zones. The effectiveness of the proposed solution is validated through simulation, and it is shown that coordination of vehicles can significantly reduce both fuel consumption and travel time.

  3. The study of features of the structural organization of the automated information processing system of the collective type

    NASA Astrophysics Data System (ADS)

    Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.

    2018-05-01

    A comparative assessment of the channel capacity of different variants of the structural organization of automated information processing systems is made. A model for assessing information processing time, depending on the type of standard elements and their structural organization, is developed.

  4. Automated crystallographic system for high-throughput protein structure determination.

    PubMed

    Brunzelle, Joseph S; Shafaee, Padram; Yang, Xiaojing; Weigand, Steve; Ren, Zhong; Anderson, Wayne F

    2003-07-01

    High-throughput structural genomic efforts require software that is highly automated, distributive and requires minimal user intervention to determine protein structures. Preliminary experiments were set up to test whether automated scripts could utilize a minimum set of input parameters and produce a set of initial protein coordinates. From this starting point, a highly distributive system was developed that could determine macromolecular structures at a high throughput rate, warehouse and harvest the associated data. The system uses a web interface to obtain input data and display results. It utilizes a relational database to store the initial data needed to start the structure-determination process as well as generated data. A distributive program interface administers the crystallographic programs which determine protein structures. Using a test set of 19 protein targets, 79% were determined automatically.

  5. Airborne electronics for automated flight systems

    NASA Technical Reports Server (NTRS)

    Graves, G. B., Jr.

    1975-01-01

    The increasing importance of airborne electronics for use in automated flight systems is briefly reviewed with attention to both basic aircraft control functions and flight management systems for operational use. The requirements for high levels of systems reliability are recognized. Design techniques are discussed and the areas of control systems, computing and communications are considered in terms of key technical problems and trends for their solution.

  6. The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1984-01-01

    A good office automation system manned by a team of facilitators seeking opportunities to serve end users could go a long way toward defining a DBMS that serves management. The problems of DBMS organization, alternative approaches to solving some of the major problems, problems that may have no solution, and how office automation fits into the development of the manager's management information system are discussed.

  7. Automated Report Generation for Research Data Repositories: From i2b2 to PDF.

    PubMed

    Thiemann, Volker S; Xu, Tingyan; Röhrig, Rainer; Majeed, Raphael W

    2017-01-01

    We developed an automated toolchain to generate reports of i2b2 data. It is based on free open source software and runs on a Java application server. It is successfully used in an ED registry project. The solution is highly configurable and portable to other projects based on i2b2 or compatible factual data sources.

  8. An Analysis on a Negotiation Model Based on Multiagent Systems with Symbiotic Learning and Evolution

    NASA Astrophysics Data System (ADS)

    Hossain, Md. Tofazzal

    This study explores an evolutionary analysis of a negotiation model based on Masbiole (Multiagent Systems with Symbiotic Learning and Evolution), which has been proposed as a new methodology for Multiagent Systems (MAS) based on symbiosis in the ecosystem. In Masbiole, agents evolve in consideration of not only their own benefits and losses, but also the benefits and losses of opponent agents. To aid effective application of Masbiole, we develop a competitive negotiation model where rigorous and advanced intelligent decision-making mechanisms are required for agents to achieve solutions. A negotiation protocol is devised with the aim of developing a set of rules for agents' behavior during evolution. Simulations use a newly developed evolutionary computing technique, called Genetic Network Programming (GNP), which has a directed graph-type gene structure that can develop and design the required intelligent mechanisms for agents. In a typical scenario, competitive negotiation solutions are reached by concessions that are usually predetermined in conventional MAS. In this model, however, not only is concession determined automatically by symbiotic evolution (making the system intelligent, automated, and efficient), but the solution also achieves Pareto optimality automatically.

  9. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.

  10. An automated repair method of water pipe infrastructure using carbon fiber bundles

    NASA Astrophysics Data System (ADS)

    Wisotzkey, Sean; Carr, Heath; Fyfe, Ed

    2011-04-01

    The United States water pipe infrastructure is made up of over 2 million miles of pipe. Due to age and deterioration, a large portion of this pipe is in need of repair to prevent catastrophic failures. Current repair methods generally involve intrusive techniques that are time-consuming and costly, and can also cause major societal impacts. A new automated repair method incorporating innovative carbon fiber technology is in development. This automated method would eliminate the need for trenching and would vastly cut time and labor costs, providing a much more economical pipe repair solution.

  11. Note: Automated electrochemical etching and polishing of silver scanning tunneling microscope tips.

    PubMed

    Sasaki, Stephen S; Perdue, Shawn M; Rodriguez Perez, Alejandro; Tallarida, Nicholas; Majors, Julia H; Apkarian, V Ara; Lee, Joonhee

    2013-09-01

    Fabrication of sharp and smooth Ag tips is crucial in optical scanning probe microscope experiments. To ensure reproducible tip profiles, the polishing process is fully automated using a closed-loop laminar flow system to deliver the electrolytic solution to moving electrodes mounted on a motorized translational stage. The repetitive translational motion is controlled precisely on the μm scale with a stepper motor and screw-thread mechanism. The automated setup allows reproducible control over the tip profile and improves smoothness and sharpness of tips (radius 27 ± 18 nm), as measured by ultrafast field emission.

  12. Automated basin delineation from digital terrain data

    NASA Technical Reports Server (NTRS)

    Marks, D.; Dozier, J.; Frew, J.

    1983-01-01

    While digital terrain grids are now in wide use, accurate delineation of drainage basins from these data is difficult to automate efficiently. A recursive order-N solution to this problem is presented. The algorithm is fast because no point in the basin is checked more than once, and no points outside the basin are considered. Two applications for terrain analysis and one for remote sensing are given to illustrate the method on a basin with high relief in the Sierra Nevada. This technique for automated basin delineation will enhance the utility of digital terrain analysis for hydrologic modeling and remote sensing.
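
The order-N idea can be sketched as an upstream traversal from the outlet in which each cell is visited at most once and no cell outside the basin is ever enqueued. The flow-direction encoding below (a step toward the downstream neighbour) is an illustrative assumption, not the paper's representation.

```python
# Sketch of O(N) basin delineation on a flow-direction grid.
def delineate_basin(flow_dir, outlet):
    """flow_dir[r][c] = (dr, dc) step toward the downstream neighbour,
    or None at the outlet/sinks.  Returns the set of cells draining to
    `outlet`."""
    rows, cols = len(flow_dir), len(flow_dir[0])
    basin = {outlet}
    stack = [outlet]
    while stack:
        r, c = stack.pop()
        # A neighbour is upstream if its flow direction points at (r, c).
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr, dc) == (0, 0) or not (0 <= nr < rows and 0 <= nc < cols):
                    continue
                if (nr, nc) in basin:
                    continue                  # never check a cell twice
                step = flow_dir[nr][nc]
                if step is not None and (nr + step[0], nc + step[1]) == (r, c):
                    basin.add((nr, nc))
                    stack.append((nr, nc))
    return basin

# 3x3 grid in which every border cell drains toward the centre outlet (1, 1).
grid = [[(1, 1),  (1, 0),  (1, -1)],
        [(0, 1),  None,    (0, -1)],
        [(-1, 1), (-1, 0), (-1, -1)]]
basin = delineate_basin(grid, (1, 1))
```

An explicit stack replaces recursion here, but the traversal order and the once-per-cell guarantee are the same as in the recursive formulation.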

  13. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    PubMed

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
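
The method-agreement analysis above uses Passing-Bablok regression, which builds on medians of pairwise slopes. The simpler Theil-Sen estimator below illustrates that idea; it is a stand-in, not Passing-Bablok proper, which additionally shifts the median to correct for slope bias and excludes slopes of -1.

```python
# Robust method-comparison fit via the Theil-Sen median-of-slopes estimator.
import statistics
from itertools import combinations

def theil_sen(x, y):
    """Median of all pairwise slopes, plus a median-based intercept."""
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in combinations(zip(x, y), 2)
              if x2 != x1]
    slope = statistics.median(slopes)
    intercept = statistics.median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# Two methods measuring the same samples: method B reads roughly 5% high,
# so the fitted slope should sit a little above 1 (synthetic data).
method_a = [10.0, 20.0, 30.0, 40.0, 50.0]
method_b = [10.6, 21.0, 31.4, 42.1, 52.4]
slope, intercept = theil_sen(method_a, method_b)
```

In a comparison like the centrifugation study, a slope near 1 and an intercept near 0 indicate that the two processing methods agree; the quoted bias band of -5% to +6% corresponds to slopes in roughly the 0.95-1.06 range.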

  14. MEthods of ASsessing blood pressUre: identifying thReshold and target valuEs (MeasureBP): a review & study protocol.

    PubMed

    Blom, Kimberly C; Farina, Sasha; Gomez, Yessica-Haydee; Campbell, Norm R C; Hemmelgarn, Brenda R; Cloutier, Lyne; McKay, Donald W; Dawes, Martin; Tobe, Sheldon W; Bolli, Peter; Gelfer, Mark; McLean, Donna; Bartlett, Gillian; Joseph, Lawrence; Featherstone, Robin; Schiffrin, Ernesto L; Daskalopoulou, Stella S

    2015-04-01

    Despite progress in automated blood pressure measurement (BPM) technology, there is limited research linking hard outcomes to automated office BPM (OBPM) treatment targets and thresholds. Equivalences for automated BPM devices have been estimated from approximations of standardized manual measurements of 140/90 mmHg. Until outcome-driven targets and thresholds become available for automated measurement methods, deriving evidence-based equivalences between automated methods and standardized manual OBPM is the next best solution. The MeasureBP study group was initiated by the Canadian Hypertension Education Program to close this critical knowledge gap. MeasureBP aims to define evidence-based equivalent values between standardized manual OBPM and automated BPM methods by synthesizing available evidence using a systematic review and individual subject-level data meta-analyses. This manuscript provides a review of the literature and the MeasureBP study protocol. These results will lay the evidence-based foundation to resolve uncertainties within blood pressure guidelines which, in turn, will improve the management of hypertension.

  15. Designing Domain-Specific HUMS Architectures: An Automated Approach

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban

    2004-01-01

    The HUMS automation system automates the design of HUMS architectures. The automated design process involves both selection of solutions from a large space of designs and pure synthesis of designs. Hence, the overall objective is to efficiently search for or synthesize designs, or parts of designs, in the database and to integrate them to form the entire system design. The automation system adopts two approaches to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are endowed with a suite of qualitative and quantitative techniques that enable (a) the selection of matching component instances, (b) the determination of design parameters, (c) the evaluation of candidate designs at the component level and at the system level, (d) cost-benefit analyses, (e) trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design and operation of HUMS systems in order to economically produce optimal, domain-specific designs.

  16. Saving Time with Automated Account Management

    ERIC Educational Resources Information Center

    School Business Affairs, 2013

    2013-01-01

    Thanks to intelligent solutions, schools, colleges, and universities no longer need to manage user account life cycles by using scripts or tedious manual procedures. The solutions house the scripts and manual procedures. Accounts can be automatically created, modified, or deleted in all applications within the school. This article describes how an…

  17. Laboratory Evaluation of Ion-Selective Electrodes for Simultaneous Analysis of Macronutrients in Hydroponic Solution

    USDA-ARS?s Scientific Manuscript database

    Automated sensing of macronutrients in hydroponic solution would allow more efficient management of nutrients for crop growth in closed hydroponic systems. Ion-selective microelectrode technology requires an ion-selective membrane or a solid metal material that responds selectively to one analyte in...

  18. A 'periodic table' for protein structures.

    PubMed

    Taylor, William R

    2002-04-11

    Current structural genomics programs aim systematically to determine the structures of all proteins coded in both human and other genomes, providing a complete picture of the number and variety of protein structures that exist. In the past, estimates have been made on the basis of the incomplete sample of structures currently known. These estimates have varied greatly (between 1,000 and 10,000; see for example refs 1 and 2), partly because of limited sample size but also owing to the difficulties of distinguishing one structure from another. This distinction is usually topological, based on the fold of the protein; however, in strict topological terms (neglecting to consider intra-chain cross-links), protein chains are open strings and hence are all identical. To avoid this trivial result, topologies are determined by considering secondary links in the form of intra-chain hydrogen bonds (secondary structure) and tertiary links formed by the packing of secondary structures. However, small additions to or loss of structure can make large changes to these perceived topologies and such subjective solutions are neither robust nor amenable to automation. Here I formalize both secondary and tertiary links to allow the rigorous and automatic definition of protein topology.

  19. Bonded repair of composite aircraft structures: A review of scientific challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Katnam, K. B.; Da Silva, L. F. M.; Young, T. M.

    2013-08-01

    Advanced composite materials have gained popularity in high-performance structural designs such as aerospace applications that require lightweight components with superior mechanical properties in order to perform in demanding service conditions as well as provide energy efficiency. However, one of the major challenges that the aerospace industry faces with advanced composites - because of their inherent complex damage behaviour - is structural repair. Composite materials are primarily damaged by mechanical loads and/or environmental conditions. If material damage is not extensive, structural repair is the only feasible solution as replacing the entire component is not cost-effective in many cases. Bonded composite repairs (e.g. scarf patches) are generally preferred as they provide enhanced stress transfer mechanisms, joint efficiencies and aerodynamic performance. With an increased usage of advanced composites in primary and secondary aerospace structural components, it is thus essential to have robust, reliable and repeatable structural bonded repair procedures to restore damaged composite components. But structural bonded repairs, especially with primary structures, pose several scientific challenges with the current existing repair technologies. In this regard, the area of structural bonded repair of composites is broadly reviewed - starting from damage assessment to automation - to identify current scientific challenges and future opportunities.

  20. Role Of Impurities On Deformation Of HCP Crystal: A Multi-Scale Approach

    NASA Astrophysics Data System (ADS)

    Bhatia, Mehul Anoopkumar

    Commercially pure (CP) and extra low interstitial (ELI) grade Ti-alloys present excellent corrosion resistance, light weight, and formability, making them attractive materials for expanded use in transportation and medical applications. However, the strength and toughness of CP titanium are affected by relatively small variations in their impurity/solute content (IC), e.g., O, Al, and V. This increase in strength is due to the fact that the solute either increases the critical stress required for the prismatic slip systems ({10-10}) or activates another slip system ((0001), {10-11}). In particular, solute additions such as O can effectively strengthen the alloy but with an attendant loss in ductility by changing the behavior from wavy (cross slip) to planar. In order to understand the underlying behavior of strengthening by solutes, it is important to understand the atomic scale mechanism. This dissertation aims to address this knowledge gap through a synergistic combination of density functional theory (DFT) and molecular dynamics. Further, due to the long-range strain fields of the dislocations and the periodicity of the DFT simulation cells, it is difficult to apply ab initio simulations to study the dislocation core structure. To alleviate this issue, we developed a multiscale quantum mechanics/molecular mechanics approach (QM/MM) to study the dislocation core. We use the developed QM/MM method to study the pipe diffusion along a prismatic edge dislocation core. Complementary to the atomistic simulations, the Semi-discrete Variational Peierls-Nabarro model (SVPN) was also used to analyze the dislocation core structure and mobility. The chemical interaction between the solute/impurity and the dislocation core is captured by the so-called generalized stacking fault energy (GSFE) surface, which was determined from DFT-VASP calculations.
By taking the chemical interaction into consideration the SVPN model can predict the dislocation core structure and mobility in the presence and absence of the solute/impurity and thus reveal the effect of impurity/solute on the softening/hardening behavior in alpha-Ti. Finally, to study the interaction of the dislocation core with other planar defects such as grain boundaries (GB), we develop an automated method to theoretically generate GBs in HCP type materials.

  1. Arcnet(R) On-Fiber -- A Viable Factory Automation Alternative

    NASA Astrophysics Data System (ADS)

    Karlin, Geof; Tucker, Carol S.

    1987-01-01

    Manufacturers need to improve their operating methods and increase their productivity so they can compete successfully in the marketplace. This goal can be achieved through factory automation, and the key to this automation is successful data base management and factory integration. However, large scale factory automation and integration requires effective communications, and this has given rise to an interest in various Local Area Networks, or LANs. In a completely integrated and automated factory, the entire organization must have access to the data base, and all departments and functions must be able to communicate with each other. Traditionally, these departments and functions use incompatible equipment, and the ability to make such equipment communicate presents numerous problems. ARCNET, a token-passing LAN with a significant presence in the office environment today, coupled with fiber optic cable, the cable of the future, provides an effective, low-cost solution to a number of these problems.

  2. Automated Probabilistic Reconstruction of White-Matter Pathways in Health and Disease Using an Atlas of the Underlying Anatomy

    PubMed Central

    Yendiki, Anastasia; Panneck, Patricia; Srinivasan, Priti; Stevens, Allison; Zöllei, Lilla; Augustinack, Jean; Wang, Ruopeng; Salat, David; Ehrlich, Stefan; Behrens, Tim; Jbabdi, Saad; Gollub, Randy; Fischl, Bruce

    2011-01-01

    We have developed a method for automated probabilistic reconstruction of a set of major white-matter pathways from diffusion-weighted MR images. Our method is called TRACULA (TRActs Constrained by UnderLying Anatomy) and utilizes prior information on the anatomy of the pathways from a set of training subjects. By incorporating this prior knowledge in the reconstruction procedure, our method obviates the need for manual interaction with the tract solutions at a later stage and thus facilitates the application of tractography to large studies. In this paper we illustrate the application of the method on data from a schizophrenia study and investigate whether the inclusion of both patients and healthy subjects in the training set affects our ability to reconstruct the pathways reliably. We show that, since our method does not constrain the exact spatial location or shape of the pathways but only their trajectory relative to the surrounding anatomical structures, a set of healthy training subjects can be used to reconstruct the pathways accurately in patients as well as in controls. PMID:22016733

  3. Large Scale Screening of Low Cost Ferritic Steel Designs For Advanced Ultra Supercritical Boiler Using First Principles Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ouyang, Lizhi

    Advanced Ultra Supercritical Boiler (AUSC) technology requires materials that can operate in a corrosive environment at temperatures and pressures as high as 760°C (1400°F) and 5000 psi, respectively, while maintaining good ductility at low temperature. We developed automated simulation software tools to enable fast, large-scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility is currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stacking fault energy can be used to assess relative ductility and creep strength. We implemented software to automate the complex calculations and minimize human input in the tedious screening studies, which involve model structure generation, settings for first principles calculations, and results analysis and reporting. The software developed in the project, and the library of computed mechanical properties of phases found in ferritic steels (many of them complex solid solutions estimated for the first time), will certainly help the development of low cost ferritic steel for AUSC.

  4. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions for solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype that was developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
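    The multiset-based structural comparison described above can be sketched roughly as follows (hypothetical tokenizer and scoring function; the actual SCCS technique is more elaborate): each equation step is tokenized into coefficients, variables and operators, and similarity is scored as the size of the multiset intersection over the multiset union.

```python
import re
from collections import Counter

def tokenize(equation: str):
    """Split an equation string into number, variable and operator tokens."""
    return re.findall(r"\d+|[a-zA-Z]+|[+\-*/=()]", equation.replace(" ", ""))

def multiset_similarity(step: str, reference: str) -> float:
    """Bag-of-tokens similarity between a student's step and a reference
    step: multiset intersection size over multiset union size."""
    a, b = Counter(tokenize(step)), Counter(tokenize(reference))
    inter = sum((a & b).values())   # min of counts per token
    union = sum((a | b).values())   # max of counts per token
    return inter / union if union else 1.0
```

    Under this scheme a step that differs only in whitespace scores 1.0, while a structurally different step scores strictly between 0 and 1, which is what makes the feedback quantitative rather than a binary equivalence check.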

  5. Social networks to biological networks: systems biology of Mycobacterium tuberculosis.

    PubMed

    Vashisht, Rohit; Bhardwaj, Anshu; Osdd Consortium; Brahmachari, Samir K

    2013-07-01

    Contextualizing relevant information to construct a network that represents a given biological process presents a fundamental challenge in the network science of biology. The quality of the network for the organism of interest is critically dependent on the extent of functional annotation of its genome. Most automated annotation pipelines do not account for the unstructured information present in volumes of literature, and hence a large fraction of the genome remains poorly annotated. However, if used, this information could substantially enhance the functional annotation of a genome, aiding the development of a more comprehensive network. Mining unstructured information buried in volumes of literature often requires manual intervention to a great extent and thus becomes a bottleneck for most automated pipelines. In this review, we discuss the potential of scientific social networking as a solution for systematic manual mining of data. Focusing on Mycobacterium tuberculosis as a case study, we discuss our open, innovative approach for the functional annotation of its genome. Furthermore, we highlight the strength of such collated structured data in the context of drug target prediction based on systems-level analysis of the pathogen.

  6. Laboratory automation in clinical bacteriology: what system to choose?

    PubMed

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for the laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Asphalt compatibility testing using the automated Heithaus titration test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pauli, A.T.

    1996-12-31

    The Heithaus titration test, or variations of it, has been used for over 35 years to predict the compatibilities of blends of asphalts from different crude sources. Asphalt compatibility is determined from three calculated parameters that measure the state of peptization of an asphalt or asphalt blend. The parameter p_a is a measure of the peptizability of the asphaltenes. The parameter p_o is a measure of the peptizing power of the maltenes, and the parameter P, derived from the p_a and p_o values, is a measure of the overall state of peptization of the asphalt or asphalt blend. In Heithaus' original procedure, samples of asphalt were dissolved in toluene and titrated with n-heptane in order to initiate flocculation. The onset of flocculation was detected either by photography or by spotting a filter paper with a small amount of the titrated solution. Recently, an 'automated' procedure, after Hotier and Robin, has been developed for use with asphalt. In the automated method, UV-visible spectrophotometric detection measures the onset of flocculation as a peak when percent transmittance is plotted as a function of the volume of titrating solvent added to a solution of asphalt. The automated procedure has proven to be less operator dependent and much faster than the original Heithaus procedure. Results from the automated procedure are consistent with results from the original, 'classical' Heithaus procedure.

  8. Recent advances in automated protein design and its future challenges.

    PubMed

    Setiawan, Dani; Brender, Jeffrey; Zhang, Yang

    2018-04-25

    Protein function is determined by protein structure which is in turn determined by the corresponding protein sequence. If the rules that cause a protein to adopt a particular structure are understood, it should be possible to refine or even redefine the function of a protein by working backwards from the desired structure to the sequence. Automated protein design attempts to calculate the effects of mutations computationally with the goal of more radical or complex transformations than are accessible by experimental techniques. Areas covered: The authors give a brief overview of the recent methodological advances in computer-aided protein design, showing how methodological choices affect final design and how automated protein design can be used to address problems considered beyond traditional protein engineering, including the creation of novel protein scaffolds for drug development. Also, the authors address specifically the future challenges in the development of automated protein design. Expert opinion: Automated protein design holds potential as a protein engineering technique, particularly in cases where screening by combinatorial mutagenesis is problematic. Considering solubility and immunogenicity issues, automated protein design is initially more likely to make an impact as a research tool for exploring basic biology in drug discovery than in the design of protein biologics.

  9. Management Information Systems and Organizational Structure.

    ERIC Educational Resources Information Center

    Cox, Bruce B.

    1987-01-01

    Discusses the context within which office automation takes place by using the models of the Science of Creative Intelligence and Transcendental Meditation. Organizational structures are compared to the phenomenon of the "collective consciousness" and the development of automated information systems from manual methods of organizational…

  10. A Novel Active Imaging Model to Design Visual Systems: A Case of Inspection System for Specular Surfaces

    PubMed Central

    Azorin-Lopez, Jorge; Fuster-Guillo, Andres; Saval-Calvo, Marcelo; Mora-Mora, Higinio; Garcia-Chamizo, Juan Manuel

    2017-01-01

    Visual information is a very well known input from different kinds of sensors. However, most perception problems are individually modeled and tackled. It is necessary to provide a general imaging model that allows us to parametrize different input systems as well as their problems and possible solutions. In this paper, we present an active vision model that considers the imaging system as a whole (including camera, lighting system and the object to be perceived) in order to propose solutions for the perception problems of automated visual systems. As a concrete case study, we instantiate the model in a real and still challenging application: automated visual inspection. It is one of the most widely used quality control systems for detecting defects on manufactured objects. However, it presents problems for specular products. We model these perception problems taking into account the environmental conditions and camera parameters that allow a system to properly perceive the specific object characteristics needed to determine defects on surfaces. The validation of the model has been carried out using simulations, providing an efficient way to perform a large set of tests (different environment conditions and camera parameters) as a preliminary step before experimentation in real manufacturing environments, which are more complex in terms of instrumentation and more expensive. Results prove the success of the model in adjusting scale, viewpoint and lighting conditions to detect structural and color defects on specular surfaces. PMID:28640211

  11. Cloud-Based CT Dose Monitoring using the DICOM-Structured Report: Fully Automated Analysis in Regard to National Diagnostic Reference Levels.

    PubMed

    Boos, J; Meineke, A; Rubbert, C; Heusch, P; Lanzman, R S; Aissa, J; Antoch, G; Kröpil, P

    2016-03-01

    To implement automated CT dose data monitoring using the DICOM-Structured Report (DICOM-SR) in order to benchmark dose-related CT data against national diagnostic reference levels (DRLs). We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data were automatically analyzed in accordance with body region, patient age and the corresponding DRLs for the volumetric computed tomography dose index (CTDIvol) and dose length product (DLP). Data from 36,523 examinations (131,527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3% and 52.8% of the national DRLs, respectively. CTDIvol and DLP reached 43.8% and 43.1% of the national DRLs for abdominal CT (n=10,590), 66.6% and 69.6% for cranial CT (n=16,098), and 37.8% and 44.0% for chest CT (n=10,387), respectively. Overall, the CTDIvol exceeded national DRLs in 1.9% of the examinations, while the DLP exceeded national DRLs in 2.9% of the examinations. Between different CT protocols for the same body region, radiation exposure varied by up to 50% of the DRLs. The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking against national DRLs. Overall, the local dose exposure from CT reached approximately 50% of these DRLs, indicating that updated DRLs as well as protocol-specific DRLs are desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiological departments.
    • The newly developed software based on the DICOM-Structured Report enables large-scale cloud-based CT dose monitoring.
    • The implemented software solution enables automated benchmarking in regard to national DRLs.
    • The local radiation exposure from CT reached approximately 50% of the national DRLs.
    • The cloud-based approach offers great potential for multi-center dose analysis.
    © Georg Thieme Verlag KG Stuttgart · New York.
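    The benchmarking step, expressing each examination's CTDIvol as a percentage of its national DRL and flagging exceedances, can be sketched as follows. The reference values, exam data and function names here are hypothetical illustrations, not the actual DRLs or software used in the study.

```python
# Hypothetical reference levels (mGy); real national DRLs differ by
# country and are updated over time.
DRLS_CTDIVOL = {"abdomen": 20.0, "cranium": 65.0, "chest": 12.0}

def benchmark(region: str, ctdivol: float):
    """Return the percent of the DRL reached and whether the exam exceeds it."""
    drl = DRLS_CTDIVOL[region]
    pct = 100.0 * ctdivol / drl
    return round(pct, 1), ctdivol > drl

# Hypothetical anonymized exam records: (body region, CTDIvol in mGy).
exams = [("abdomen", 8.8), ("cranium", 43.3), ("chest", 4.5), ("abdomen", 21.5)]
flagged = [e for e in exams if benchmark(*e)[1]]  # exams above their DRL
```

    Aggregating the percent values per protocol is what makes the comparison between protocols of the same body region, and against the DRL itself, straightforward.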

  12. Automated electronic reminders to prevent miscommunication among primary medical, surgical and anaesthesia providers: a root cause analysis.

    PubMed

    Freundlich, Robert E; Grondin, Louise; Tremper, Kevin K; Saran, Kelly A; Kheterpal, Sachin

    2012-10-01

    In this case report, the authors present an adverse event possibly caused by miscommunication among three separate medical teams at their hospital. The authors then discuss the hospital's root cause analysis and its proposed solutions, focusing on the subsequent hospital-wide implementation of an automated electronic reminder for abnormal laboratory values that may have helped to prevent similar medical errors.

  13. The automated multi-stage substructuring system for NASTRAN

    NASA Technical Reports Server (NTRS)

    Field, E. I.; Herting, D. N.; Herendeen, D. L.; Hoesly, R. L.

    1975-01-01

    The substructuring capability developed for eventual installation in Level 16 is now operational in a test version of NASTRAN. Its features are summarized. These include the user-oriented, Case Control type control language, the automated multi-stage matrix processing, the independent direct access data storage facilities, and the static and normal modes solution capabilities. A complete problem analysis sequence is presented with card-by-card description of the user input.

  14. Test of the Center for Automated Processing of Hardwoods' Auto-Image Detection and Computer-Based Grading and Cutup System

    Treesearch

    Philip A. Araman; Janice K. Wiedenbeck

    1995-01-01

    Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...

  15. Automated kidney morphology measurements from ultrasound images using texture and edge analysis

    NASA Astrophysics Data System (ADS)

    Ravishankar, Hariharan; Annangi, Pavan; Washburn, Michael; Lanning, Justin

    2016-04-01

    In a typical ultrasound scan, a sonographer measures kidney morphology to assess renal abnormalities. Kidney morphology can also help to discriminate between chronic and acute kidney failure. The caliper placements and volume measurements are often time consuming, and an automated solution would help to improve accuracy, repeatability and throughput. In this work, we developed an automated kidney morphology measurement solution from long-axis ultrasound scans. Automated kidney segmentation is challenging due to wide variability in kidney shape and size, weak contrast of the kidney boundaries, and the presence of strong edges such as the diaphragm and fat layers. To address these challenges and accurately localize and detect kidney regions, we present a two-step algorithm that makes use of edge and texture information in combination with anatomical cues. First, we use an edge analysis technique to localize the kidney region by matching the edge map with predefined templates. To accurately estimate the kidney morphology, we use textural information in a machine learning framework using Haar features and a gradient boosting classifier. We have tested the algorithm on 45 unseen cases, and the performance against ground truth is measured by computing the Dice overlap and the percent error in the major and minor axes of the kidney. The algorithm shows successful performance on 80% of cases.
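    The Dice overlap used above to score a segmentation against ground truth can be computed with a minimal sketch on binary masks (toy data, not the study's images): it is twice the intersection of the two foreground regions divided by the sum of their sizes.

```python
def dice(mask_a, mask_b):
    """Dice overlap between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a = {(r, c) for r, row in enumerate(mask_a) for c, v in enumerate(row) if v}
    b = {(r, c) for r, row in enumerate(mask_b) for c, v in enumerate(row) if v}
    if not a and not b:
        return 1.0  # two empty masks agree perfectly
    return 2 * len(a & b) / (len(a) + len(b))

# Toy 3x3 masks: predicted segmentation vs. ground truth.
pred  = [[0, 1, 1],
         [0, 1, 1],
         [0, 0, 0]]
truth = [[0, 1, 1],
         [0, 1, 0],
         [0, 1, 0]]
```

    A Dice value of 1.0 means perfect overlap and 0.0 means no overlap; for the toy masks above the score falls in between because the two foregrounds only partially agree.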

  16. Analysis of technical university information system

    NASA Astrophysics Data System (ADS)

    Savelyev, N. A.; Boyarkin, M. A.

    2018-05-01

    The paper covers the set and interaction of the existing automated control systems at the Federal State Budgetary Educational Institution of Higher Professional Education "Industrial University of Tyumen". The structural interaction of the existing systems and their functions has been analyzed, which became the basis for identifying a number of system-level and local (module-specific) drawbacks in the automation of the university's activities. The authors suggest a new structure for the automated control system, consisting of three major subsystems: management support; training and methodology support; and distance and supplementary education support. Functionality for each subsystem has been defined in accordance with the educational institution's automation requirements. The suggested structure of the ACS will address the challenges facing the university during reorganization and optimization of the processes of managing the institution's activities as a whole.

  17. Archival storage solutions for PACS

    NASA Astrophysics Data System (ADS)

    Chunn, Timothy

    1997-05-01

    While there are many inhibitors to the widespread diffusion of PACS systems, one has been the lack of robust, cost-effective digital archive storage solutions. Moreover, an automated Nearline solution is key to a central, sharable data repository, enabling many applications such as PACS, telemedicine and teleradiology, and information warehousing and data mining for research such as patient outcome analysis. Selecting the right solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scalability in capacity and performance, configuration architecture and flexibility, subsystem availability and reliability, security requirements, system cost, achievable benefits and cost savings, investment protection, strategic fit and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive media today. Price and performance comparisons are made at different archive capacities, and the effect of file size on storage system throughput is analyzed. The concept of automated migration of images from high-performance, high-cost storage devices to high-capacity, low-cost storage devices is introduced as a viable way to minimize overall storage costs for an archive. The concept of access density is also introduced and applied to the selection of the most cost-effective archive solution.

  18. Real-time OHT Dispatching Mechanism for the Interbay Automated Material Handling System with Shortcuts and Bypasses

    NASA Astrophysics Data System (ADS)

    Pan, Cong; Zhang, Jie; Qin, Wei

    2017-05-01

    As a key to improving the performance of the interbay automated material handling system (AMHS) in 300 mm semiconductor wafer fabrication, the real-time overhead hoist transport (OHT) dispatching problem has received much attention. The problem is first formulated as a special form of the assignment problem, and it is proved that the Hungarian algorithm can yield more than one optimal solution simultaneously. By proposing and rigorously proving two propositions about the characteristics of these solutions, a modified Hungarian algorithm is designed to distinguish among them. Finally, a new real-time OHT dispatching method is designed around the solution obtained by the modified Hungarian algorithm. Discrete-event simulation results show that, compared with the conventional Hungarian-algorithm dispatching method, the proposed method, which chooses the solution with the maximum variance, reduces the average waiting time and average lead time of wafer lots by 4 s each on average, and its performance remains stable across multiple scenarios of the interbay AMHS with different numbers of shortcuts. This research provides an efficient real-time OHT dispatching mechanism for the interbay AMHS with shortcuts and bypasses.
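
    The tie among optimal assignments that motivates the modified Hungarian algorithm can be illustrated with a minimal stdlib sketch. This brute-forces the assignment formulation over permutations instead of using the Hungarian algorithm, and the cost values are hypothetical:

```python
from itertools import permutations

def assign_ohts(cost):
    """Brute-force solve of the OHT-dispatching assignment problem:
    cost[i][j] = estimated empty-travel time for vehicle i to reach lot j.
    Returns (min_total, list of optimal assignments); the point in the
    paper is that several assignments can share the same minimum cost,
    so a tie-breaking rule (maximum variance) is needed."""
    n = len(cost)
    best = None
    solutions = []
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if best is None or total < best:
            best, solutions = total, [perm]
        elif total == best:
            solutions.append(perm)
    return best, solutions

# Hypothetical 2-vehicle, 2-lot instance with two equally good assignments.
cost = [[1, 2], [2, 3]]
total, sols = assign_ohts(cost)
```

    For realistic fleet sizes the brute force is intractable, which is why the paper works with the polynomial-time Hungarian algorithm and proves properties of its multiple optima instead.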

  19. Automated and fast building of three-dimensional RNA structures.

    PubMed

    Zhao, Yunjie; Huang, Yangyu; Gong, Zhou; Wang, Yanjie; Man, Jianfen; Xiao, Yi

    2012-01-01

    Building tertiary structures of non-coding RNAs is required to understand their functions and to design new molecules. Current algorithms for RNA tertiary structure prediction give satisfactory accuracy only for RNAs of small size and simple topology, and many of them require manual manipulation. Here, we present an automated and fast program, 3dRNA, for RNA tertiary structure prediction with reasonable accuracy for RNAs of larger size and complex topology.

  20. Optimization of controlled processes in combined-cycle plant (new developments and researches)

    NASA Astrophysics Data System (ADS)

    Tverskoy, Yu S.; Muravev, I. K.

    2017-11-01

    All modern complex technical systems, including power units of thermal and nuclear power plants, operate within the system-forming structure of a multifunctional automated process control system (APCS). Advances in APCS mathematical support now extend the degree of automation to solving complex optimization problems of equipment heat- and mass-exchange processes in real time. The difficulty of efficient control of a binary (combined-cycle) power unit stems from the need to solve at least three problems jointly. The first concerns the physics of combined-cycle technology. The second is determined by the sensitivity of CCGT operation to changes in regime and climatic factors. The third concerns a precise description of the vector of controlled coordinates of a complex technological object. To obtain a joint solution of this set of interconnected problems, the methodology of generalized thermodynamic analysis and the methods of automatic control theory and mathematical modeling are used. This report presents the results of new developments and studies that improve the principles of process control and the structural synthesis of automatic control systems for power units with combined-cycle plants, providing attainable technical and economic efficiency and operational reliability of the equipment.

  1. Automated sensing of hydroponic macronutrients using a computer-controlled system with an array of ion-selective electrodes

    USDA-ARS?s Scientific Manuscript database

    Hydroponic production systems grow plants without soil, relying on a circulating solution to provide the necessary nutrients. Maintaining an optimum nutrient balance in this solution is important for maximizing crop growth and yield. Particularly in closed hydroponic systems it is important to monit...

  2. Augmenting Space Technology Program Management with Secure Cloud & Mobile Services

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.; Munk, Christopher; Helble, Adelle; Press, Martin T.; George, Cory; Johnson, David

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Game Changing Development (GCD) program manages technology projects across all NASA centers and reports regularly to NASA headquarters on progress. Program stakeholders expect an up-to-date, accurate status and often have questions about the program's portfolio that require a timely response. Historically, reporting, data collection, and analysis were manual processes that were inefficient and prone to error. To address these issues, GCD set out to develop a new business automation solution. In doing so, the program wanted to leverage the latest information technology platforms and decided to combine traditional systems with new cloud-based web services and gaming technology for a novel, interactive user environment. The team also set out to develop a mobile solution for anytime information access. This paper discusses a solution to these challenging goals and how the GCD team succeeded in developing and deploying such a system. The architecture and approach taken have proven effective and robust and can serve as a model for others looking to develop secure, interactive mobile business solutions for government or enterprise business automation.

  3. Applying Semantic Web Services and Wireless Sensor Networks for System Integration

    NASA Astrophysics Data System (ADS)

    Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente

    In environments such as factories, buildings, and homes, automation services tend to change often during their lifetime. Changes arise from business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled quickly and at low cost. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful descriptions of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of perspectives and open issues in applying Wireless Sensor Networks and Semantic Web Services to the integration of automation services.

  4. A computer program for automated flutter solution and matched point determination

    NASA Technical Reports Server (NTRS)

    Bhatia, K. G.

    1973-01-01

    The use of a digital computer program (MATCH) for automated determination of the flutter velocity and the matched-point flutter density is described. The program is based on the use of the modified Laguerre iteration formula to converge to a flutter crossing or a matched-point density. A general description of the computer program is included and the purpose of all subroutines used is stated. The input required by the program and various input options are detailed, and the output description is presented. The program can solve flutter equations formulated with up to 12 vibration modes and obtain flutter solutions for up to 10 air densities. The program usage is illustrated by a sample run, and the FORTRAN program listing is included.
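
    The iteration underlying MATCH can be sketched with the classic (unmodified) Laguerre update for a polynomial root; MATCH applies a modified form of this formula to the flutter problem, and the polynomial here is only an illustration:

```python
import cmath

def laguerre_root(coeffs, x0, n_iter=50, tol=1e-12):
    """Classic Laguerre iteration for one root of the polynomial
    p(x) = coeffs[0]*x^n + ... + coeffs[-1], in complex arithmetic.
    p, p', p'' are evaluated together by Horner's scheme."""
    n = len(coeffs) - 1
    x = complex(x0)
    for _ in range(n_iter):
        p = pd = pdd = 0j
        for c in coeffs:
            pdd = pdd * x + 2 * pd   # accumulates p''(x)
            pd = pd * x + p          # accumulates p'(x)
            p = p * x + c            # accumulates p(x)
        if abs(p) < tol:
            break
        g = pd / p
        h = g * g - pdd / p
        sq = cmath.sqrt((n - 1) * (n * h - g * g))
        # Choose the denominator of larger magnitude for stability.
        d = g + sq if abs(g + sq) >= abs(g - sq) else g - sq
        if d == 0:
            break
        x -= n / d
    return x
```

    For example, starting from x0 = 1 on p(x) = x^2 - 2, the iteration converges to sqrt(2) almost immediately, which reflects Laguerre's cubic convergence near simple roots.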

  5. The value of the Semantic Web in the laboratory.

    PubMed

    Frey, Jeremy G

    2009-06-01

    The Semantic Web is beginning to have an impact on the wider chemical and physical sciences, beyond the earlier-adopting field of bioinformatics. While useful in large-scale, data-driven science with automated processing, these technologies can also help integrate the work of smaller laboratories producing diverse data. Semantics aid discovery and the reliable re-use of data, provide improved provenance, and facilitate automated processing through increased resilience to changes in presentation and reduced ambiguity. The Semantic Web, its tools, and its collections are not yet competitive with well-established solutions to current problems. It is in the reduced cost of instituting solutions to new problems that the versatility of Semantic Web-enabled data and resources will make its mark, once the more general-purpose tools become widely available.

  6. Nonanalytic Laboratory Automation: A Quarter Century of Progress.

    PubMed

    Hawker, Charles D

    2017-06-01

    Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors now offer a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvements in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw it as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate individual tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample-processing department. The options also include total laboratory automation systems that use conveyors to link sample-processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation, identifying a laboratory's specific requirements, and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision as a replacement for human inspections. © 2017 American Association for Clinical Chemistry.

  7. Anchoring protein crystals to mounting loops with hydrogel using inkjet technology.

    PubMed

    Shinoda, Akira; Tanaka, Yoshikazu; Yao, Min; Tanaka, Isao

    2014-11-01

    X-ray crystallography is an important technique for structure-based drug discovery, mainly because it is the only technique that can reveal whether a ligand binds to the target protein as well as where and how it binds. However, ligand screening by X-ray crystallography involves a crystal-soaking experiment, which is usually performed manually. Thus, the throughput is not satisfactory for screening large numbers of candidate ligands. In this study, a technique to anchor protein crystals to mounting loops by using gel and inkjet technology has been developed; the method allows soaking of the mounted crystals in ligand-containing solution. This new technique may assist in the design of a fully automated drug-screening pipeline.

  8. Optimel: Software for selecting the optimal method

    NASA Astrophysics Data System (ADS)

    Popova, Olga; Popov, Boris; Romanov, Dmitry; Evseeva, Marina

    Optimel is software that automates the selection of a solution method from the domain of optimization methods. Optimel offers practical novelty: it saves time and money in exploratory studies whose objective is to select the most appropriate method for solving an optimization problem. It also offers theoretical novelty, because a new method of knowledge structuring was used to construct its domain. The Optimel domain covers an extended set of methods and their properties, which makes it possible to identify the level of scientific study, enhance the user's expertise, broaden the prospects open to the user, and open up new research objectives. Optimel can be used both in scientific research institutes and in educational institutions.

  9. System and method for responding to ground and flight system malfunctions

    NASA Technical Reports Server (NTRS)

    Anderson, Julie J. (Inventor); Fussell, Ronald M. (Inventor)

    2010-01-01

    A system for on-board anomaly resolution for a vehicle has a data repository. The data repository stores data related to different systems, subsystems, and components of the vehicle. The data stored is encoded in a tree-based structure. A query engine is coupled to the data repository. The query engine provides a user and automated interface and provides contextual query to the data repository. An inference engine is coupled to the query engine. The inference engine compares current anomaly data to contextual data stored in the data repository using inference rules. The inference engine generates a potential solution to the current anomaly by referencing the data stored in the data repository.
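
    The described flow (tree-encoded repository, contextual query, rule-based inference over stored cases) might be sketched as follows; all subsystem names, symptoms, and solutions here are hypothetical, not from the patent:

```python
# Tree-based data repository: systems -> subsystems -> stored case data.
repository = {
    "power": {"battery": {"symptom": "low_voltage",
                          "solution": "switch to backup bus"}},
    "comms": {"antenna": {"symptom": "no_signal",
                          "solution": "re-point antenna"}},
}

def query(repo, path):
    """Contextual query: walk the tree along a system/subsystem path."""
    node = repo
    for key in path:
        node = node[key]
    return node

def infer(repo, anomaly):
    """Inference engine sketch: compare current anomaly data against the
    stored cases and return a potential solution, or None."""
    for subsystems in repo.values():
        for case in subsystems.values():
            if case["symptom"] == anomaly:
                return case["solution"]
    return None
```

    A real inference engine would match on structured rules rather than string equality, but the repository/query/inference separation mirrors the architecture the abstract describes.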

  10. Design and performance of an automated radionuclide separator: its application on the determination of ⁹⁹Tc in groundwater.

    PubMed

    Chung, Kun Ho; Choi, Sang Do; Choi, Geun Sik; Kang, Mun Ja

    2013-11-01

    A modular automated radionuclide separator for (99)Tc (MARS Tc-99) has been developed for the rapid and reproducible separation of technetium in groundwater samples. The control software of MARS Tc-99 was developed in the LabView programming language. An automated radiochemical method for separating (99)Tc was developed and validated by the purification of (99m)Tc tracer solution eluted from a commercial (99)Mo/(99m)Tc generator. The chemical recovery and analytical time for this radiochemical method were found to be 96 ± 2% and 81 min, respectively. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Standardized 3D Bioprinting of Soft Tissue Models with Human Primary Cells.

    PubMed

    Rimann, Markus; Bono, Epifania; Annaheim, Helene; Bleisch, Matthias; Graf-Hausner, Ursula

    2016-08-01

    Cells grown in 3D are more physiologically relevant than cells cultured in 2D. To use 3D models in substance testing and regenerative medicine, reproducibility and standardization are important. Bioprinting offers not only automated, standardizable processes but also the production of complex tissue-like structures in an additive manner. We developed an all-in-one bioprinting solution to produce soft tissue models. The holistic approach included (1) a bioprinter in a sterile environment, (2) a light-induced bioink polymerization unit, (3) user-friendly software, (4) the capability to print in standard labware for high-throughput screening, (5) cell-compatible inkjet-based printheads, (6) a cell-compatible ready-to-use BioInk, and (7) standard operating procedures. In a proof-of-concept study, skin was printed as a reference soft tissue model. To produce dermal equivalents, primary human dermal fibroblasts were printed in alternating layers with BioInk and cultured for up to 7 weeks. During long-term culture, the models were remodeled and fully populated with viable and spread fibroblasts. Primary human dermal keratinocytes were seeded on top of the dermal equivalents, and epidermis-like structures formed, as verified with hematoxylin and eosin staining and immunostaining. However, a fully stratified epidermis was not achieved. Nevertheless, this is one of the first reports of an integrative bioprinting strategy for industrial routine application. © 2015 Society for Laboratory Automation and Screening.

  12. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.
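
    The finite-difference approach can be illustrated on the simplest possible case. The sketch below solves toy 1-D steady conduction with the Thomas (tridiagonal) algorithm; this is only an illustration of the discretize-and-solve idea, not HYDRA-II's actual 3-D coupled formulation:

```python
def steady_conduction_1d(n, t_left, t_right):
    """Steady 1-D conduction d2T/dx2 = 0 on n interior nodes with fixed
    boundary temperatures, discretized as -T[i-1] + 2*T[i] - T[i+1] = 0
    (boundary values moved to the right-hand side) and solved with the
    Thomas tridiagonal algorithm."""
    a = [-1.0] * n          # sub-diagonal
    b = [2.0] * n           # diagonal
    c = [-1.0] * n          # super-diagonal
    d = [0.0] * n           # right-hand side
    d[0] += t_left
    d[-1] += t_right
    # Forward elimination.
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # Back substitution.
    t = [0.0] * n
    t[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        t[i] = (d[i] - c[i] * t[i + 1]) / b[i]
    return t
```

    With no heat sources the exact solution is linear between the boundary temperatures, which gives a quick check of the solver.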

  13. About development of automation control systems

    NASA Astrophysics Data System (ADS)

    Myshlyaev, L. P.; Wenger, K. G.; Ivushkin, K. A.; Makarov, V. N.

    2018-05-01

    This paper identifies shortcomings of current approaches to the development of control automation systems and ways to improve them: correct formation of the objects of study and optimization; joint synthesis of control objects and control systems; and increased structural diversity of the elements of control systems. Diagrams of control systems with purposefully variable structure of their elements are presented, along with structures of control algorithms for an object with a purposefully variable structure.

  14. Automated segmentation and recognition of the bone structure in non-contrast torso CT images using implicit anatomical knowledge

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Hayashi, T.; Han, M.; Chen, H.; Hara, T.; Fujita, H.; Yokoyama, R.; Kanematsu, M.; Hoshi, H.

    2009-02-01

    X-ray CT images have been widely used in clinical diagnosis in recent years. A modern CT scanner can generate about 1000 CT slices showing the details of all the human organs within 30 seconds. However, CT image interpretation (manually viewing 500-1000 CT slices per patient on a screen or on films) requires a great deal of time and energy. Therefore, computer-aided diagnosis (CAD) systems that can support CT image interpretation are strongly anticipated. Automated recognition of the anatomical structures in CT images is a basic pre-processing step for such CAD systems. The bone structure is one of these anatomical structures and is very useful as a set of landmarks for predicting the positions of other organs. However, automated recognition of the bone structure is still a challenging issue. This research proposes an automated scheme for segmenting the bone regions and recognizing the bone structure in non-contrast torso CT images. The proposed scheme was applied to 48 torso CT cases, and a subjective evaluation of the experimental results was carried out by an anatomical expert following the anatomical definition. The experimental results showed that the bone structure was recognized correctly in 90% of the CT cases. For quantitative evaluation, the automated recognition results were compared with manual delineations of the bones of the lower limb created by an anatomical expert on 10 randomly selected CT cases. The error (maximum distance in 3D) between the recognition results and the manual inputs ranged from 3 to 8 mm in different parts of the bone regions.

  15. Automated segmentation of thyroid gland on CT images with multi-atlas label fusion and random classification forest

    NASA Astrophysics Data System (ADS)

    Liu, Jiamin; Chang, Kevin; Kim, Lauren; Turkbey, Evrim; Lu, Le; Yao, Jianhua; Summers, Ronald

    2015-03-01

    The thyroid gland plays an important role in clinical practice, especially for radiation therapy treatment planning. For patients with head and neck cancer, radiation therapy requires a precise delineation of the thyroid gland to be spared on the pre-treatment planning CT images to avoid thyroid dysfunction. In the current clinical workflow, the thyroid gland is normally delineated manually by radiologists or radiation oncologists, which is time consuming and error prone. Therefore, a system for automated segmentation of the thyroid is desirable. However, automated segmentation of the thyroid is challenging because the thyroid is inhomogeneous and surrounded by structures that have similar intensities. In this work, the thyroid gland segmentation is initially estimated by a multi-atlas label fusion algorithm. The segmentation is then refined by supervised, statistical-learning-based voxel labeling with a random forest algorithm. Multi-atlas label fusion (MALF) transfers expert-labeled thyroids from atlases to a target image using deformable registration. Errors produced by label transfer are reduced by label fusion, which combines the results produced by all atlases into a consensus solution. Then, a random forest (RF) employs an ensemble of decision trees trained on labeled thyroids to recognize features. The trained forest classifier is applied to the thyroid estimated by the MALF by voxel scanning to assign the class-conditional probability. Voxels from the expert-labeled thyroids in CT volumes are treated as positive classes; background non-thyroid voxels as negatives. We applied this automated thyroid segmentation system to CT scans of 20 patients. The results showed that the MALF achieved an overall 0.75 Dice Similarity Coefficient (DSC) and the RF classification further improved the DSC to 0.81.
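
    A minimal sketch of the two scoring and fusion ingredients, assuming an unweighted majority vote for the fusion step (the paper's MALF is more sophisticated) and the standard Dice coefficient as the overlap metric:

```python
from collections import Counter

def majority_vote(atlas_labels):
    """Simplest form of label fusion: each registered atlas casts a
    per-voxel label vote and the consensus is the majority label.
    atlas_labels: list of equal-length label sequences, one per atlas."""
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*atlas_labels)]

def dice(a, b):
    """Dice Similarity Coefficient between two binary voxel masks,
    DSC = 2|A∩B| / (|A| + |B|). Masks given as sets of voxel indices."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))
```

    Real MALF weights each atlas vote by local registration quality, which is what reduces the label-transfer errors the abstract mentions.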

  16. Toolboxes for a standardised and systematic study of glycans

    PubMed Central

    2014-01-01

    Background: Recent progress in method development for characterising the branched structures of complex carbohydrates has now enabled higher-throughput technology. Automation of structure analysis then calls for software development, since adding meaning to large data collections in reasonable time requires corresponding bioinformatics methods and tools. Current glycobioinformatics resources do cover information on the structure and function of glycans, their interaction with proteins or their enzymatic synthesis. However, this information is partial, scattered and often difficult to find for non-glycobiologists. Methods: Following our diagnosis of the causes of the slow development of glycobioinformatics, we review the "objective" difficulties encountered in defining adequate formats for representing complex entities and developing efficient analysis software. Results: Various solutions already implemented and strategies defined to bridge glycobiology with different fields and to integrate the heterogeneous glyco-related information are presented. Conclusions: Despite the initial stage of our integrative efforts, this paper highlights the rapid expansion of glycomics, the validity of existing resources and the bright future of glycobioinformatics. PMID:24564482

  17. Managing Automation: A Process, Not a Project.

    ERIC Educational Resources Information Center

    Hoffmann, Ellen

    1988-01-01

    Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…

  18. Alert management for home healthcare based on home automation analysis.

    PubMed

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    The rising cost of healthcare for elderly and disabled people can be contained by offering people autonomy at home by means of information technology. In this paper, we present an original, sensorless alert-management solution that discriminates between multimedia and home-automation services and extracts highly regular home activities to serve as virtual sensors for alert management. Results on simulated data, based on a real context, allow us to evaluate our approach before applying it to real data.

  19. TECRA: C2 Application of Adaptive Automation Theory

    DTIC Science & Technology

    2010-03-01

    de Visser, Ewart J.; LeGoullon, Melanie; Horvath, Don; Weltman, Gershon; Amos…

    …Solutions, Inc., Arlington, VA; George Mason University

    …to the RVT. Method: Twelve students from George Mason University (4 males and 8 females) participated in this study and were compensated with

  20. ISS Operations Cost Reductions Through Automation of Real-Time Planning Tasks

    NASA Technical Reports Server (NTRS)

    Hall, Timothy A.

    2011-01-01

    In 2008 the Johnson Space Center s Mission Operations Directorate (MOD) management team challenged their organization to find ways to reduce the costs of International Space station (ISS) console operations in the Mission Control Center (MCC). Each MOD organization was asked to identify projects that would help them attain a goal of a 30% reduction in operating costs by 2012. The MOD Operations and Planning organization responded to this challenge by launching several software automation projects that would allow them to greatly improve ISS console operations and reduce staffing and operating costs. These projects to date have allowed the MOD Operations organization to remove one full time (7 x 24 x 365) ISS console position in 2010; with the plan of eliminating two full time ISS console support positions by 2012. This will account for an overall 10 EP reduction in staffing for the Operations and Planning organization. These automation projects focused on utilizing software to automate many administrative and often repetitive tasks involved with processing ISS planning and daily operations information. This information was exchanged between the ground flight control teams in Houston and around the globe, as well as with the ISS astronaut crew. These tasks ranged from managing mission plan changes from around the globe, to uploading and downloading information to and from the ISS crew, to even more complex tasks that required multiple decision points to process the data, track approvals and deliver it to the correct recipient across network and security boundaries. 
The software solutions leveraged several different technologies, including customized web applications and the implementation of an industry-standard web services architecture between several planning tools, as well as engaging a previously research-level technology (TRL 2-3) developed by Ames Research Center (ARC) that utilizes an intelligent agent-based system to manage and automate file traffic flow, archiving of data, and generation of console logs. This technology, called OCAMS (OCA (Orbital Communication System) Management System), is now considered TRL 9 and is in daily use in the Mission Control Center in support of ISS operations. These solutions have not only allowed for improved efficiency on console; because many of the previously manual data transfers are now automated, many error-prone manual steps have been removed, and the quality of the planning products has improved tremendously. This has also given the Planning Flight Controllers more time to focus on the abstract areas of the job (such as the complexities of planning a mission for six international crew members with a global planning team) instead of being burdened with administrative tasks that took significant time each console shift. The resulting automation solutions have allowed the Operations and Planning organization to realize significant cost savings for the ISS program through 2020, and many of these solutions could be a viable

  1. Integrated Communications and Work Efficiency: Impacts on Organizational Structure and Power.

    ERIC Educational Resources Information Center

    Wigand, Rolf T.

    This paper reviews the work environment surrounding integrated office systems, synthesizes the known effects of automated office technologies, and discusses their impact on work efficiency in office environments. Particular attention is given to the effect of automated technologies on networks, workflow/processes, and organizational structure and…

  2. Peak picking multidimensional NMR spectra with the contour geometry based algorithm CYPICK.

    PubMed

    Würz, Julia M; Güntert, Peter

    2017-01-01

    The automated identification of signals in multidimensional NMR spectra is a challenging task, complicated by signal overlap, noise, and spectral artifacts, for which no universally accepted method is available. Here, we present a new peak picking algorithm, CYPICK, that follows, as far as possible, the manual approach taken by a spectroscopist who analyzes peak patterns in contour plots of the spectrum, but is fully automated. Human visual inspection is replaced by the evaluation of geometric criteria applied to contour lines, such as local extremality, approximate circularity (after appropriate scaling of the spectrum axes), and convexity. The performance of CYPICK was evaluated for a variety of spectra from different proteins by systematic comparison with peak lists obtained by other, manual or automated, peak picking methods, as well as by analyzing the results of automated chemical shift assignment and structure calculation based on input peak lists from CYPICK. The results show that CYPICK yielded peak lists that compare in most cases favorably to those obtained by other automated peak pickers with respect to the criteria of finding a maximal number of real signals, a minimal number of artifact peaks, and maximal correctness of the chemical shift assignments and the three-dimensional structure obtained by fully automated assignment and structure calculation.
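
    The local-extremality criterion can be sketched as a threshold-plus-neighbourhood-maximum test on a gridded spectrum. This is a simplification of CYPICK, which works on contour lines and additionally checks approximate circularity and convexity; the grid and threshold here are hypothetical:

```python
def pick_peaks(spec, threshold):
    """Toy peak picker: flag interior grid points that exceed a noise
    threshold and are strict maxima over their 8-neighbourhood. spec is
    a 2-D list of intensities, as in a 2D NMR spectrum's data matrix."""
    rows, cols = len(spec), len(spec[0])
    peaks = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            v = spec[i][j]
            if v < threshold:
                continue
            neighbours = [spec[i + di][j + dj]
                          for di in (-1, 0, 1) for dj in (-1, 0, 1)
                          if (di, dj) != (0, 0)]
            if all(v > w for w in neighbours):
                peaks.append((i, j))
    return peaks
```

    The threshold plays the role of the noise filter; CYPICK's contour-geometry tests are what reject the overlap artifacts that a bare local-maximum test would keep.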

  3. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    NASA Astrophysics Data System (ADS)

    Acciarri, R.; Adams, C.; An, R.; Anthony, J.; Asaadi, J.; Auger, M.; Bagby, L.; Balasubramanian, S.; Baller, B.; Barnes, C.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Cohen, E.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fadeeva, A. A.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garcia-Gamez, D.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; Hourlier, A.; Huang, E.-C.; James, C.; Jan de Vries, J.; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Piasetzky, E.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; Rudolf von Rohr, C.; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Smith, A.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van De Pontseele, W.; Van de Water, R. G.; Viren, B.; Weber, M.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Yates, L.; Zeller, G. P.; Zennamo, J.; Zhang, C.

    2018-01-01

    The development and operation of liquid-argon time-projection chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.
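    The multi-algorithm philosophy described above can be sketched in a few lines: each small algorithm addresses one task, and a chain of them builds up the event picture. This is only an illustrative toy (the function names, gap/length thresholds, and 1-D "hits" are invented and bear no relation to the actual Pandora SDK API):

```python
# Toy sketch of a multi-algorithm reconstruction chain in the spirit of the
# Pandora approach. All names and thresholds here are hypothetical.

def cluster_by_gap(event, max_gap=1.5):
    """Group 1-D hit positions into clusters, splitting at large gaps."""
    hits = sorted(event["hits"])
    clusters, current = [], [hits[0]]
    for h in hits[1:]:
        if h - current[-1] <= max_gap:
            current.append(h)
        else:
            clusters.append(current)
            current = [h]
    clusters.append(current)
    event["clusters"] = clusters
    return event

def tag_cosmic_like(event, min_length=5.0):
    """Tag long clusters as cosmic-ray-like, short ones as neutrino candidates."""
    event["tags"] = ["cosmic" if c[-1] - c[0] >= min_length else "neutrino"
                     for c in event["clusters"]]
    return event

def run_chain(event, algorithms):
    """Each algorithm addresses one task; together they build the event picture."""
    for alg in algorithms:
        event = alg(event)
    return event

event = {"hits": [0.0, 0.5, 1.2, 2.0, 10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0]}
out = run_chain(event, [cluster_by_gap, tag_cosmic_like])
print(out["tags"])  # one tag per reconstructed cluster
```

    The real chain comprises over one hundred such algorithms, each specialized to a particular topology; the design value is that each stage stays simple and testable in isolation.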

  4. Sensors systems for the automation of operations in the ship repair industry.

    PubMed

    Navarro, Pedro Javier; Muro, Juan Suardíaz; Alcover, Pedro María; Fernández-Isla, Carlos

    2013-09-13

    Hull cleaning before repainting is a key operation in the maintenance of ships. For years, a method to improve this operation has been sought through the robotization of techniques such as grit blasting and ultra-high-pressure water jetting. Despite this, it remains standard practice in shipyards for this process to be carried out manually, because the robotized systems developed so far are too expensive to be widely accepted by shipyards. We have chosen to apply a more conservative and realistic approach to this problem, which has resulted in the development of several solutions designed with different degrees of automation and operating range. These solutions use most of the elements already available in many shipyards, so the installation of additional machinery in the workplace would not be necessary. This paper describes the evolutionary development of sensor systems for the automation of the preparation of ship hull surfaces before painting. This evolution has given rise to the development of new technologies for coating removal.

  5. Slide Set: Reproducible image analysis and batch processing with ImageJ.

    PubMed

    Nanes, Benjamin A

    2015-11-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets common in biology. Here we present the Slide Set plugin for ImageJ, which provides a framework for reproducible image analysis and batch processing. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution.
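    The core idea of Slide Set, a data table driving repeated analysis commands with all parameters recorded, can be caricatured outside ImageJ. The commands, table fields, and pixel data below are invented for illustration and are not the plugin's actual API:

```python
# Toy sketch of table-driven, reproducible batch analysis (not the Slide Set API).
import json

def mean_intensity(pixels, roi):
    """Mean pixel value inside a 1-D region of interest (lo, hi)."""
    lo, hi = roi
    region = pixels[lo:hi]
    return sum(region) / len(region)

def threshold_count(pixels, roi, cutoff):
    """Count pixels in the ROI at or above a threshold."""
    lo, hi = roi
    return sum(1 for p in pixels[lo:hi] if p >= cutoff)

# The data table associates each "image" with its region of interest.
table = [
    {"image": [1, 2, 8, 9, 2, 1], "roi": (1, 5)},
    {"image": [5, 5, 5, 0, 0, 0], "roi": (0, 3)},
]
params = {"cutoff": 5}

results = []
for row in table:  # the same chained commands repeat over every table row
    results.append({
        "mean": mean_intensity(row["image"], row["roi"]),
        "count": threshold_count(row["image"], row["roi"], params["cutoff"]),
    })

# Saving the parameters alongside the results is what makes the run reproducible.
record = json.dumps({"params": params, "results": results})
```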

  6. Sensors Systems for the Automation of Operations in the Ship Repair Industry

    PubMed Central

    Navarro, Pedro Javier; Muro, Juan Suardíaz; Alcover, Pedro María; Fernández-Isla, Carlos

    2013-01-01

    Hull cleaning before repainting is a key operation in the maintenance of ships. For years, a method to improve this operation has been sought through the robotization of techniques such as grit blasting and ultra-high-pressure water jetting. Despite this, it remains standard practice in shipyards for this process to be carried out manually, because the robotized systems developed so far are too expensive to be widely accepted by shipyards. We have chosen to apply a more conservative and realistic approach to this problem, which has resulted in the development of several solutions designed with different degrees of automation and operating range. These solutions use most of the elements already available in many shipyards, so the installation of additional machinery in the workplace would not be necessary. This paper describes the evolutionary development of sensor systems for the automation of the preparation of ship hull surfaces before painting. This evolution has given rise to the development of new technologies for coating removal. PMID:24064601

  7. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    NASA Astrophysics Data System (ADS)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out which will benefit the entire Structural Biology community.

  8. Automated Image Registration Using Morphological Region of Interest Feature Extraction

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; LeMoigne, Jacqueline; Netanyahu, Nathan S.

    2005-01-01

    With the recent explosion in the amount of remotely sensed imagery and the corresponding interest in temporal change detection and modeling, image registration has become increasingly important as a necessary first step in the integration of multi-temporal and multi-sensor data for applications such as the analysis of seasonal and annual global climate changes, as well as land use/cover changes. The task of image registration can be divided into two major components: (1) the extraction of control points or features from images; and (2) the search among the extracted features for the matching pairs that represent the same feature in the images to be matched. Manual control feature extraction can be subjective and extremely time-consuming, and often results in few usable points. Automated feature extraction is a solution to this problem, where desired target features are invariant, and represent evenly distributed landmarks such as edges, corners and line intersections. In this paper, we develop a novel automated registration approach based on the following steps. First, a mathematical morphology (MM)-based method is used to obtain a scale-orientation morphological profile at each image pixel. Next, a spectral dissimilarity metric such as the spectral information divergence is applied for automated extraction of landmark chips, followed by an initial approximate matching. This initial condition is then refined using a hierarchical robust feature matching (RFM) procedure. Experimental results reveal that the proposed registration technique offers a robust solution in the presence of seasonal changes and other interfering factors. Keywords: automated image registration; multi-temporal imagery; mathematical morphology; robust feature matching.
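    The matching step, sliding an extracted landmark chip over a search region and taking the offset of best agreement, can be sketched in one dimension. This uses a plain sum-of-squared-differences score for clarity; the paper's method (spectral information divergence plus hierarchical robust matching) is far richer:

```python
# 1-D sketch of chip matching for registration: find the offset where a small
# "landmark chip" best agrees with the signal (sum of squared differences).
def best_offset(signal, chip):
    scores = []
    for off in range(len(signal) - len(chip) + 1):
        ssd = sum((signal[off + i] - chip[i]) ** 2 for i in range(len(chip)))
        scores.append((ssd, off))
    return min(scores)[1]  # offset with the lowest dissimilarity

signal = [0, 1, 5, 9, 5, 1, 0, 0]
chip = [5, 9, 5]
print(best_offset(signal, chip))  # → 2
```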

  9. Principles of control automation of soil compacting machine operating mechanism

    NASA Astrophysics Data System (ADS)

    Tikhonov, Anatoly Fedorovich; Drozdov, Anatoly

    2018-03-01

    The relevance of the qualitative compaction of soil bases in the erection of embankments and foundations for buildings and structures is discussed. The quality of the compacted gravel and sandy soils provides the bearing capability and, accordingly, the strength and durability of the constructed buildings. It has been established that compaction quality depends on many external factors, such as surface roughness and soil moisture, as well as the granulometry, chemical composition and degree of elasticity of the original fill soil. Analysis of the technological processes of soil base compaction reported in foreign and domestic sources showed that such an important problem as continuous monitoring of the actual degree of soil compaction during machine operation can be solved only with modern means of automation. An effective vibrodynamic method of compacting gravel and sand material for building structure foundations of various applications is justified and proposed. A method of continuous monitoring of soil compaction by measuring the amplitudes and frequencies of harmonic oscillations on the compacted surface is described, which made it possible to identify the basic elements of the monitoring system for the soil compacting machine's operating mechanisms: an accelerometer, a bandpass filter, a vibration harmonics analyzer and an on-board microcontroller. Adjustable parameters have been established to improve the degree of soil compaction and the performance of the soil compacting machine, and the dependences of the adjustable parameters on the overall index, the degree of soil compaction, have been experimentally determined. A structural scheme of automatic control of the soil compacting machine's operating mechanism and its operation algorithm have been developed.
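    The monitoring principle, inferring compaction from the harmonic content of the accelerometer signal, can be sketched with a discrete Fourier transform. The synthetic signal, bin numbers, and the use of a harmonic-amplitude ratio as the stiffness indicator are all assumptions for illustration; the real system uses dedicated filtering and a vibration harmonics analyzer:

```python
import cmath
import math

# Sketch: estimate the amplitudes of the fundamental and second harmonic of a
# (synthetic) accelerometer signal via single DFT bins.
def harmonic_amplitude(samples, k):
    """Amplitude of the k-th DFT bin of a real-valued signal."""
    n = len(samples)
    coeff = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
    return 2 * abs(coeff) / n

n = 64
# Synthetic drum vibration: fundamental (bin 4, amplitude 3) plus a second
# harmonic (bin 8, amplitude 1) that grows as the soil stiffens.
signal = [3.0 * math.cos(2 * math.pi * 4 * t / n)
          + 1.0 * math.cos(2 * math.pi * 8 * t / n) for t in range(n)]

a1 = harmonic_amplitude(signal, 4)   # fundamental
a2 = harmonic_amplitude(signal, 8)   # second harmonic
ratio = a2 / a1                      # crude compaction indicator (assumption)
```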

  10. Automated peak picking and peak integration in macromolecular NMR spectra using AUTOPSY.

    PubMed

    Koradi, R; Billeter, M; Engeli, M; Güntert, P; Wüthrich, K

    1998-12-01

    A new approach for automated peak picking of multidimensional protein NMR spectra with strong overlap is introduced, which makes use of the program AUTOPSY (automated peak picking for NMR spectroscopy). The main elements of this program are a novel function for local noise level calculation, the use of symmetry considerations, and the use of lineshapes extracted from well-separated peaks for resolving groups of strongly overlapping peaks. The algorithm generates peak lists with precise chemical shift and integral intensities, and a reliability measure for the recognition of each peak. The results of automated peak picking of NOESY spectra with AUTOPSY were tested in combination with the combined automated NOESY cross peak assignment and structure calculation routine NOAH implemented in the program DYANA. The quality of the resulting structures was found to be comparable with those from corresponding data obtained with manual peak picking. Copyright 1998 Academic Press.
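    Two of the ingredients named above, a local noise level and automated peak detection, can be caricatured in one dimension. AUTOPSY works on multidimensional spectra with lineshape deconvolution; this sketch only shows the skeleton of the idea, with an invented median-based noise estimate and signal-to-noise cutoff:

```python
# 1-D caricature of automated peak picking with a local noise level
# (illustrative only; not the AUTOPSY algorithm itself).
def local_noise(spectrum, i, half_window=2):
    """Crude local noise estimate: median of a small window around point i."""
    lo = max(0, i - half_window)
    hi = min(len(spectrum), i + half_window + 1)
    window = sorted(spectrum[lo:hi])
    return window[len(window) // 2]

def pick_peaks(spectrum, snr=3.0):
    """Local maxima that rise above snr times the local noise level."""
    peaks = []
    for i in range(1, len(spectrum) - 1):
        if (spectrum[i] > spectrum[i - 1] and spectrum[i] > spectrum[i + 1]
                and spectrum[i] > snr * max(local_noise(spectrum, i), 1e-9)):
            peaks.append(i)
    return peaks

spectrum = [1, 1, 1, 10, 1, 1, 1, 12, 1, 1]
print(pick_peaks(spectrum))  # → [3, 7]
```

    A position-dependent noise level, rather than a single global threshold, is what lets weak but genuine peaks in quiet regions survive while ridges of noise near strong signals are rejected.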

  11. Model-assisted template extraction SRAF application to contact holes patterns in high-end flash memory device fabrication

    NASA Astrophysics Data System (ADS)

    Seoud, Ahmed; Kim, Juhwan; Ma, Yuansheng; Jayaram, Srividya; Hong, Le; Chae, Gyu-Yeol; Lee, Jeong-Woo; Park, Dae-Jin; Yune, Hyoung-Soon; Oh, Se-Young; Park, Chan-Ha

    2018-03-01

    Sub-resolution assist feature (SRAF) insertion techniques have been effectively used for a long time now to increase process latitude in the lithography patterning process. Rule-based SRAF and model-based SRAF are complementary solutions, and each has its own benefits, depending on the objectives of applications and the criticality of the impact on manufacturing yield, efficiency, and productivity. Rule-based SRAF provides superior geometric output consistency and faster runtime performance, but the associated recipe development time can be of concern. Model-based SRAF provides better coverage for more complicated pattern structures in terms of shapes and sizes, with considerably less time required for recipe development, although consistency and performance may be impacted. In this paper, we introduce a new model-assisted template extraction (MATE) SRAF solution, which employs decision tree learning in a model-based solution to provide the benefits of both rule-based and model-based SRAF insertion approaches. The MATE solution is designed to automate the creation of rules/templates for SRAF insertion, and is based on the SRAF placement predicted by model-based solutions. The MATE SRAF recipe provides optimum lithographic quality in relation to various manufacturing aspects in a very short time, compared to traditional methods of rule optimization. Experiments were done using memory device pattern layouts to compare the MATE solution to existing model-based SRAF and pixelated SRAF approaches, based on lithographic process window quality, runtime performance, and geometric output consistency.

  12. Application of automation and information systems to forensic genetic specimen processing.

    PubMed

    Leclair, Benoît; Scholl, Tom

    2005-03-01

    During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.

  13. High-throughput ab-initio dilute solute diffusion database.

    PubMed

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-19

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
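    The quoted figure of merit, a weighted RMS error over activation barriers, is a standard computation; a minimal sketch with invented barrier values (not the paper's data) is:

```python
import math

# Weighted RMS error between predicted and observed activation barriers (eV).
# The numbers below are illustrative, not from the database.
def weighted_rmse(pred, obs, weights):
    num = sum(w * (p - o) ** 2 for p, o, w in zip(pred, obs, weights))
    return math.sqrt(num / sum(weights))

pred = [1.20, 0.85, 2.10]   # e.g. DFT barriers
obs  = [1.10, 0.95, 2.00]   # e.g. experimental barriers
w    = [1.0, 2.0, 1.0]      # hypothetical per-system weights
print(round(weighted_rmse(pred, obs, w), 3))  # → 0.1
```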

  14. Automated fiber placement: Evolution and current demonstrations

    NASA Technical Reports Server (NTRS)

    Grant, Carroll G.; Benson, Vernon M.

    1993-01-01

    The automated fiber placement process has been in development at Hercules since 1980. Fiber placement is being developed specifically for aircraft and other high performance structural applications. Several major milestones have been achieved during process development. These milestones are discussed in this paper. The automated fiber placement process is currently being demonstrated on the NASA ACT program. All demonstration projects to date have focused on fiber placement of transport aircraft fuselage structures. Hercules has worked closely with Boeing and Douglas on these demonstration projects. This paper gives a description of demonstration projects and results achieved.

  15. Protein Side-Chain Resonance Assignment and NOE Assignment Using RDC-Defined Backbones without TOCSY Data

    PubMed Central

    Zeng, Jianyang; Zhou, Pei; Donald, Bruce Randall

    2011-01-01

    One bottleneck in NMR structure determination lies in the laborious and time-consuming process of side-chain resonance and NOE assignments. Compared to the well-studied backbone resonance assignment problem, automated side-chain resonance and NOE assignments are relatively less explored. Most NOE assignment algorithms require nearly complete side-chain resonance assignments from a series of through-bond experiments such as HCCH-TOCSY or HCCCONH. Unfortunately, these TOCSY experiments perform poorly on large proteins. To overcome this deficiency, we present a novel algorithm, called NASCA (NOE Assignment and Side-Chain Assignment), to automate both side-chain resonance and NOE assignments and to perform high-resolution protein structure determination in the absence of any explicit through-bond experiment to facilitate side-chain resonance assignment, such as HCCH-TOCSY. After casting the assignment problem into a Markov Random Field (MRF), NASCA extends and applies combinatorial protein design algorithms to compute optimal assignments that best interpret the NMR data. The MRF captures the contact map information of the protein derived from NOESY spectra, exploits the backbone structural information determined by RDCs, and considers all possible side-chain rotamers. The complexity of the combinatorial search is reduced by using a dead-end elimination (DEE) algorithm, which prunes side-chain resonance assignments that are provably not part of the optimal solution. Then an A* search algorithm is employed to find a set of optimal side-chain resonance assignments that best fit the NMR data. These side-chain resonance assignments are then used to resolve the NOE assignment ambiguity and compute high-resolution protein structures. Tests on five proteins show that NASCA assigns resonances for more than 90% of side-chain protons, and achieves about 80% correct assignments. The final structures computed using the NOE distance restraints assigned by NASCA have backbone RMSD 0.8–1.5 Å from the reference structures determined by traditional NMR approaches. PMID:21706248
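    The dead-end elimination step can be illustrated with a tiny two-position example. This is a simplified Goldstein-style criterion over toy self and pairwise energies (all names and values invented): a candidate r is pruned when some rival r2 beats it regardless of what the other position chooses.

```python
# Simplified Goldstein dead-end elimination over two positions (toy energies).
# If r is worse than r2 by more than the best-case pairwise compensation,
# r is provably not part of the optimal solution and can be pruned.
def dee_prune(self_e, pair_e, pos, other):
    """Return assignments at `pos` that can never appear in the optimum."""
    pruned = set()
    choices = list(self_e[pos])
    for r in choices:
        for r2 in choices:
            if r == r2:
                continue
            gap = self_e[pos][r] - self_e[pos][r2]
            gap += min(pair_e[(r, s)] - pair_e[(r2, s)] for s in self_e[other])
            if gap > 0:          # r is always worse than r2 -> dead end
                pruned.add(r)
                break
    return pruned

self_e = {"A": {"a1": 5.0, "a2": 1.0}, "B": {"b1": 0.0, "b2": 0.0}}
pair_e = {("a1", "b1"): 0.0, ("a1", "b2"): 0.0,
          ("a2", "b1"): 0.5, ("a2", "b2"): 0.5}
print(dee_prune(self_e, pair_e, "A", "B"))  # a1 is provably suboptimal
```

    Pruning of this kind shrinks the space that the subsequent A* search has to explore while provably preserving the optimal assignment.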

  16. Automated position control of a surface array relative to a liquid microjunction surface sampler

    DOEpatents

    Van Berkel, Gary J.; Kertesz, Vilmos; Ford, Michael James

    2007-11-13

    A system and method utilizes an image analysis approach for controlling the probe-to-surface distance of a liquid junction-based surface sampling system for use with mass spectrometric detection. Such an approach enables a hands-free formation of the liquid microjunction used to sample solution composition from the surface and for re-optimization, as necessary, of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system.
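    The re-optimization of the microjunction thickness during a scan is, at heart, a feedback loop on an image-derived distance measurement. A minimal proportional-control sketch (gain, setpoint, and units all invented for illustration, not taken from the patent) is:

```python
# Toy proportional controller for probe-to-surface distance. The image
# analysis step is assumed to yield `measured`; gain and setpoint are invented.
def control_step(measured, setpoint, gain=0.5):
    """Return the height correction to apply for one control cycle."""
    return gain * (setpoint - measured)

height, setpoint = 10.0, 4.0     # arbitrary starting distance and target
for _ in range(20):              # each cycle: measure, correct, repeat
    height += control_step(height, setpoint)
# After enough cycles the probe settles at the setpoint distance.
```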

  17. Dynamic Communication Resource Negotiations

    NASA Technical Reports Server (NTRS)

    Chow, Edward; Vatan, Farrokh; Paloulian, George; Frisbie, Steve; Srostlik, Zuzana; Kalomiris, Vasilios; Apgar, Daniel

    2012-01-01

    Today's advanced network management systems can automate many aspects of the tactical networking operations within a military domain. However, automation of joint and coalition tactical networking across multiple domains remains challenging. Due to potentially conflicting goals and priorities, human agreement is often required before implementation into the network operations. This is further complicated by incompatible network management systems and security policies, rendering it difficult to implement automatic network management, thus requiring manual human intervention to the communication protocols used at various network routers and endpoints. This process of manual human intervention is tedious, error-prone, and slow. In order to facilitate a better solution, we are pursuing a technology which makes network management automated, reliable, and fast. Automating the negotiation of the common network communication parameters between different parties is the subject of this paper. We present the technology that enables inter-force dynamic communication resource negotiations to enable ad hoc interoperation in the field between force domains, without pre-planning. It will also enable a dynamic response to changing conditions within the area of operations. Our solution enables the rapid blending of intra-domain policies so that the forces involved are able to interoperate effectively without overwhelming each other's networks with inappropriate or unwarranted traffic. It will evaluate the policy rules and configuration data for each of the domains, then generate a compatible inter-domain policy and configuration that will update the gateway systems between the two domains.
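    One way to picture the blending of two domains' policy rules into a compatible inter-domain policy is to take the stricter value for every shared parameter. The parameter names and the "stricter = lower limit" rule below are purely hypothetical; the real negotiation protocol involves far more than this:

```python
# Toy illustration of blending two domains' policies: for each parameter both
# domains define, adopt the stricter (lower) limit. All names are invented.
def blend_policies(a, b):
    merged = {}
    for key in a.keys() & b.keys():        # parameters common to both domains
        merged[key] = min(a[key], b[key])  # stricter limit wins (assumption)
    return merged

domain_a = {"max_bandwidth_mbps": 100, "max_priority": 5}
domain_b = {"max_bandwidth_mbps": 40, "max_priority": 7}
merged = blend_policies(domain_a, domain_b)
```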

  18. Automated generation of weld path trajectories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sizemore, John M.; Hinman-Sweeney, Elaine Marie; Ames, Arlo Leroy

    2003-06-01

    AUTOmated GENeration of Control Programs for Robotic Welding of Ship Structure (AUTOGEN) is software that automates the planning and compiling of control programs for robotic welding of ship structure. The software works by evaluating computer representations of the ship design and the manufacturing plan. Based on this evaluation, AUTOGEN internally identifies and appropriately characterizes each weld. Then it constructs the robot motions necessary to accomplish the welds and determines for each the correct assignment of process control values. AUTOGEN generates these robot control programs completely without manual intervention or edits except to correct wrong or missing input data. Most ship structure assemblies are unique or at best manufactured only a few times. Accordingly, the high cost inherent in all previous methods of preparing complex control programs has made robot welding of ship structures economically unattractive to the U.S. shipbuilding industry. AUTOGEN eliminates the cost of creating robot control programs. With programming costs eliminated, capitalization of robots to weld ship structures becomes economically viable. Robot welding of ship structures will result in reduced ship costs, uniform product quality, and enhanced worker safety. Sandia National Laboratories and Northrop Grumman Ship Systems worked with the National Shipbuilding Research Program to develop a means of automated path and process generation for robotic welding. This effort resulted in the AUTOGEN program, which has successfully demonstrated automated path generation and robot control. Although the current implementation of AUTOGEN is optimized for welding applications, the path and process planning capability has applicability to a number of industrial applications, including painting, riveting, and adhesive delivery.

  19. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard.

    PubMed

    Terwilliger, Thomas C; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Moriarty, Nigel W; Zwart, Peter H; Hung, Li Wei; Read, Randy J; Adams, Paul D

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution.
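    The R factor quoted above is the conventional crystallographic agreement statistic, R = Σ|Fobs − Fcalc| / ΣFobs over reflection amplitudes. A minimal sketch with toy amplitudes (not real data, and not PHENIX code) is:

```python
# Conventional crystallographic R factor over toy structure-factor amplitudes.
# The free R factor is the same formula evaluated on a held-out reflection set.
def r_factor(f_obs, f_calc):
    return sum(abs(o - c) for o, c in zip(f_obs, f_calc)) / sum(f_obs)

f_obs  = [100.0, 50.0, 25.0, 25.0]   # observed amplitudes (invented)
f_calc = [ 90.0, 55.0, 20.0, 25.0]   # amplitudes from the model (invented)
print(round(r_factor(f_obs, f_calc), 2))  # → 0.1
```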

  20. Measuring Up: Implementing a Dental Quality Measure in the Electronic Health Record Context

    PubMed Central

    Bhardwaj, Aarti; Ramoni, Rachel; Kalenderian, Elsbeth; Neumann, Ana; Hebballi, Nutan B; White, Joel M; McClellan, Lyle; Walji, Muhammad F

    2015-01-01

    Background: Quality improvement requires quality measures that are validly implementable. In this work, we assessed the feasibility and performance of an automated electronic Meaningful Use dental clinical quality measure (percentage of children who received fluoride varnish). Methods: We defined how to implement the automated measure queries in a dental electronic health record (EHR). Within records identified through automated query, we manually reviewed a subsample to assess the performance of the query. Results: The automated query found 71.0% of patients to have had fluoride varnish compared to 77.6% found using the manual chart review. The automated quality measure performance was 90.5% sensitivity, 90.8% specificity, 96.9% positive predictive value, and 75.2% negative predictive value. Conclusions: Our findings support the feasibility of automated dental quality measure queries in the context of sufficient structured data. Information noted only in the free text rather than in structured data would require natural language processing approaches to effectively query. Practical Implications: To participate in self-directed quality improvement, dental clinicians must embrace the accountability era. Commitment to quality will require enhanced documentation in order to support near-term automated calculation of quality measures. PMID:26562736
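    The reported sensitivity, specificity, PPV and NPV all come from a confusion matrix of the automated query against manual chart review; the computation itself is standard. A sketch with invented counts (not the study's data) is:

```python
# Standard confusion-matrix metrics for validating an automated query against
# manual review. The counts below are illustrative, not from the study.
def metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),   # query finds true positives
        "specificity": tn / (tn + fp),   # query rejects true negatives
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

m = metrics(tp=90, fp=10, tn=80, fn=20)
```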

  1. Rapid search for tertiary fragments reveals protein sequence–structure relationships

    PubMed Central

    Zhou, Jianfu; Grigoryan, Gevorg

    2015-01-01

    Finding backbone substructures from the Protein Data Bank that match an arbitrary query structural motif, composed of multiple disjoint segments, is a problem of growing relevance in structure prediction and protein design. Although numerous protein structure search approaches have been proposed, methods that address this specific task without additional restrictions and on practical time scales are generally lacking. Here, we propose a solution, dubbed MASTER, that is both rapid, enabling searches over the Protein Data Bank in a matter of seconds, and provably correct, finding all matches below a user-specified root-mean-square deviation cutoff. We show that despite the potentially exponential time complexity of the problem, running times in practice are modest even for queries with many segments. The ability to explore naturally plausible structural and sequence variations around a given motif has the potential to synthesize its design principles in an automated manner; so we go on to illustrate the utility of MASTER to protein structural biology. We demonstrate its capacity to rapidly establish structure–sequence relationships, uncover the native designability landscapes of tertiary structural motifs, identify structural signatures of binding, and automatically rewire protein topologies. Given the broad utility of protein tertiary fragment searches, we hope that providing MASTER in an open-source format will enable novel advances in understanding, predicting, and designing protein structure. PMID:25420575
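    The match criterion above is a root-mean-square deviation cutoff over matched backbone coordinates. A minimal RMSD sketch for pre-aligned coordinate sets follows; note that MASTER also finds the optimal rigid-body superposition before measuring RMSD, which is omitted here:

```python
import math

# RMSD between two matched lists of (x, y, z) coordinates. This assumes the
# structures are already superposed; optimal alignment is not performed.
def rmsd(coords_a, coords_b):
    n = len(coords_a)
    total = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(total / n)

a = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
b = [(0, 0, 1), (1, 0, 1), (2, 0, 1)]   # same backbone shifted 1 unit in z
print(rmsd(a, b))  # → 1.0
```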

  2. Toward Effective Shell Modeling of Wrinkled Thin-Film Membranes Exhibiting Stress Concentrations

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Sleight, David W.

    2004-01-01

    Geometrically nonlinear shell finite element analysis has recently been applied to solar-sail membrane problems in order to model the out-of-plane deformations due to structural wrinkling. Whereas certain problems lend themselves to achieving converged nonlinear solutions that compare favorably with experimental observations, solutions to tensioned membranes exhibiting high stress concentrations have been difficult to obtain even with the best nonlinear finite element codes and advanced shell element technology. In this paper, two numerical studies are presented that pave the way to improving the modeling of this class of nonlinear problems. The studies address the issues of mesh refinement and stress-concentration alleviation, and the effects of these modeling strategies on the ability to attain converged nonlinear deformations due to wrinkling. The numerical studies demonstrate that excessive mesh refinement in the regions of stress concentration may be disadvantageous to achieving wrinkled equilibrium states, causing the nonlinear solution to lock in the membrane response mode, while totally discarding the very low-energy bending response that is necessary to cause wrinkling deformation patterns. An element-level, strain-energy density criterion is suggested for facilitating automated, adaptive mesh refinements specifically aimed at the modeling of thin-film membranes undergoing wrinkling deformations.

  3. Closed-form solution of decomposable stochastic models

    NASA Technical Reports Server (NTRS)

    Sjogren, Jon A.

    1990-01-01

    Markov and semi-Markov processes are increasingly being used in the modeling of complex reconfigurable systems (fault tolerant computers). The estimation of the reliability (or some measure of performance) of the system reduces to solving the process for its state probabilities. Such a model may exhibit numerous states and complicated transition distributions, contributing to an expensive and numerically delicate solution procedure. Thus, when a system exhibits a decomposition property, either structurally (autonomous subsystems), or behaviorally (component failure versus reconfiguration), it is desirable to exploit this decomposition in the reliability calculation. In interesting cases there can be failure states which arise from non-failure states of the subsystems. Equations are presented which allow the computation of failure probabilities of the total (combined) model without requiring a complete solution of the combined model. This material is presented within the context of closed-form functional representation of probabilities as utilized in the Symbolic Hierarchical Automated Reliability and Performance Evaluator (SHARPE) tool. The techniques adopted enable one to compute such probability functions for a much wider class of systems at a reduced computational cost. Several examples show how the method is used, especially in enhancing the versatility of the SHARPE tool.
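    SHARPE computes such state probabilities symbolically in closed form; a purely numeric toy, a small discrete-time reliability chain with an absorbing failure state (states and rates invented), shows the kind of quantity involved:

```python
# Toy discrete-time Markov reliability model. SHARPE handles this symbolically
# and in continuous time; this numeric sketch only illustrates the quantity.
def step(probs, transitions):
    """One step: new[j] = sum_i probs[i] * P[i][j]."""
    n = len(probs)
    return [sum(probs[i] * transitions[i][j] for i in range(n))
            for j in range(n)]

# States: 0 = working, 1 = degraded, 2 = failed (absorbing). Invented values.
P = [[0.90, 0.08, 0.02],
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]]

probs = [1.0, 0.0, 0.0]          # start in the working state
for _ in range(3):               # evolve three time steps
    probs = step(probs, P)
reliability = 1.0 - probs[2]     # probability of not yet having failed
```

    The decomposition results described in the record let failure probabilities of a combined model be assembled from subsystem solutions without ever forming, or solving, the full combined chain.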

  4. Automated information and control complex of hydro-gas endogenous mine processes

    NASA Astrophysics Data System (ADS)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    An automated information and control complex designed to prevent accidents related to the aerological situation in underground workings, to account for individual devices issued and returned, to transmit and display measurement data, and to form preemptive solutions is considered. Examples of the automated workplace of an air-gas control operator using individual devices are given. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. The studies of these statistical characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas-control points. An adaptive (multivariant) algorithm for processing measurement information on continuous multidimensional quantities and influencing factors has been developed.

  5. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    PubMed

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high-throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, built on the MS Gadgeteer open framework, that integrates all aspects of microfluidics through a high-current electronic circuit supporting and synchronizing digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits, we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibration of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols posing different computational challenges: prioritized data structures in a genetic algorithm, distributed computation in multiple hill-climbing searches, and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
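    A minimal simulated-annealing loop of the sort the control unit could run against a live sensor reading. The quadratic objective and all constants here are illustrative stand-ins, not the paper's protocol.

```python
import math, random

def objective(x):
    return (x - 3.0) ** 2  # hypothetical calibration-error surface

def anneal(x0, steps=5000, temp0=1.0, seed=42):
    rng = random.Random(seed)
    x = best = x0
    for k in range(steps):
        temp = temp0 * (1.0 - k / steps) + 1e-9   # linear cooling
        cand = x + rng.uniform(-0.5, 0.5)
        delta = objective(cand) - objective(x)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature drops.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = cand
        if objective(x) < objective(best):
            best = x
    return best

best = anneal(0.0)  # converges toward the minimum at x = 3
```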

  6. How smart is your BEOL? productivity improvement through intelligent automation

    NASA Astrophysics Data System (ADS)

    Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Garetto, Anthony

    2017-07-01

    The back end of line (BEOL) workflow in the mask shop still has crucial issues throughout all standard steps, which are inspection, disposition, photomask repair and verification of repair success. All involved tools are typically run by highly trained operators or engineers who set up jobs and recipes, execute tasks, analyze data and make decisions based on the results. No matter how experienced operators are and how well the systems perform, there is one aspect that always limits the productivity and effectiveness of the operation: the human aspect. Human errors can range from seemingly harmless slip-ups to mistakes with serious and direct economic impact, including mask rejects, customer returns and line stops in the wafer fab. Even with the introduction of quality control mechanisms that help to reduce these critical but unavoidable faults, they can never be completely eliminated. Therefore the mask shop BEOL cannot run in the most efficient manner, as unnecessary time and money are spent on processes that remain labor intensive. The best way to address this issue is to automate critical segments of the workflow that are prone to human errors. In fact, manufacturing errors can occur at each BEOL step where operators intervene. These processes comprise image evaluation, setting up tool recipes, data handling and all other tedious but required steps. With the help of smart solutions, operators can work more efficiently and dedicate their time to less mundane tasks. Smart solutions connect tools, taking over the data handling and analysis typically performed by operators and engineers. These solutions not only eliminate the human error factor in the manufacturing process but can provide benefits in terms of shorter cycle times, reduced bottlenecks and prediction of an optimized workflow. In addition, such software solutions consist of building blocks that seamlessly integrate applications and allow customers to use tailored solutions.
To accommodate the variability and complexity in mask shops today, individual workflows can be supported according to the needs of any particular manufacturing line with respect to necessary measurement and production steps. At the same time the efficiency of assets is increased by avoiding unneeded cycle time and waste of resources due to process steps that are not crucial for a given technology. In this paper we present details of which areas of the BEOL can benefit most from intelligent automation, what solutions exist and the quantification of benefits to a mask shop with full automation by the use of a back end of line model.

  7. High throughput screening of CO2 solubility in aqueous monoamine solutions.

    PubMed

    Porcheron, Fabien; Gibert, Alexandre; Mougin, Pascal; Wender, Aurélie

    2011-03-15

    Post-combustion Carbon Capture and Storage technology (CCS) is viewed as an efficient solution to reduce CO(2) emissions of coal-fired power stations. In CCS, an aqueous amine solution is commonly used as a solvent to selectively capture CO(2) from the flue gas. However, this process generates additional costs, mostly from the reboiler heat duty required to release the carbon dioxide from the loaded solvent solution. In this work, we present thermodynamic results of CO(2) solubility in aqueous amine solutions from a 6-reactor High Throughput Screening (HTS) experimental device. This device is fully automated and designed to perform sequential injections of CO(2) within stirred-cell reactors containing the solvent solutions. The gas pressure within each reactor is monitored as a function of time, and the resulting transient pressure curves are transformed into CO(2) absorption isotherms. Solubility measurements are first performed on monoethanolamine, diethanolamine, and methyldiethanolamine aqueous solutions at T = 313.15 K. Experimental results are compared with existing data in the literature to validate the HTS device. In addition, a comprehensive thermodynamic model is used to represent CO(2) solubility variations in different classes of amine structures over a wide range of thermodynamic conditions. This model is used to fit the experimental data and to calculate the cyclic capacity, which is a key parameter for CO(2) process design. Solubility measurements are then performed on a set of 50 monoamines and cyclic capacities are extracted using the thermodynamic model, to assess the potential of these molecules for CO(2) capture.
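    The core of the pressure-to-isotherm transformation is a mole balance on the reactor headspace. The sketch below assumes ideal-gas behaviour; all numbers are illustrative, not the device's actual operating conditions.

```python
# Convert a measured headspace-pressure drop in a closed stirred-cell
# reactor into moles of CO2 absorbed by the solvent (ideal gas assumed).

R = 8.314  # J/(mol*K)

def co2_absorbed(p_initial, p_final, gas_volume, temperature):
    """Moles of CO2 transferred from the gas phase into the solvent."""
    return (p_initial - p_final) * gas_volume / (R * temperature)

# Hypothetical example: a 0.5 bar drop in a 50 mL headspace at 313.15 K.
n = co2_absorbed(p_initial=2.0e5, p_final=1.5e5,
                 gas_volume=50e-6, temperature=313.15)
```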

  8. Automated data collection based on RoboDiff at the ESRF beamline MASSIF-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nurizzo, Didier, E-mail: Didier.nurizzo@esrf.fr; Guichard, Nicolas; McSweeney, Sean

    2016-07-27

    The European Synchrotron Radiation Facility has a long-standing history in the automation of experiments in macromolecular crystallography. MASSIF-1 (Massively Automated Sample Screening and evaluation Integrated Facility), a beamline constructed as part of the ESRF Upgrade Phase I program, has been open to the external user community since July 2014 and offers a unique, completely automated data collection service to both academic and industrial structural biologists.

  9. A Markov Random Field Framework for Protein Side-Chain Resonance Assignment

    NASA Astrophysics Data System (ADS)

    Zeng, Jianyang; Zhou, Pei; Donald, Bruce Randall

    Nuclear magnetic resonance (NMR) spectroscopy plays a critical role in structural genomics, and serves as a primary tool for determining protein structures, dynamics and interactions in physiologically-relevant solution conditions. The current speed of protein structure determination via NMR is limited by the lengthy time required in resonance assignment, which maps spectral peaks to specific atoms and residues in the primary sequence. Although numerous algorithms have been developed to address the backbone resonance assignment problem [68,2,10,37,14,64,1,31,60], little work has been done to automate side-chain resonance assignment [43, 48, 5]. Most previous attempts in assigning side-chain resonances depend on a set of NMR experiments that record through-bond interactions with side-chain protons for each residue. Unfortunately, these NMR experiments have low sensitivity and limited performance on large proteins, which makes it difficult to obtain enough side-chain resonance assignments. On the other hand, it is essential to obtain almost all of the side-chain resonance assignments as a prerequisite for high-resolution structure determination. To overcome this deficiency, we present a novel side-chain resonance assignment algorithm based on alternative NMR experiments measuring through-space interactions between protons in the protein, which also provide crucial distance restraints and are normally required in high-resolution structure determination. We cast the side-chain resonance assignment problem into a Markov Random Field (MRF) framework, and extend and apply combinatorial protein design algorithms to compute the optimal solution that best interprets the NMR data. Our MRF framework captures the contact map information of the protein derived from NMR spectra, and exploits the structural information available from the backbone conformations determined by orientational restraints and a set of discretized side-chain conformations (i.e., rotamers). 
A Hausdorff-based computation is employed in the scoring function to evaluate the probability of side-chain resonance assignments to generate the observed NMR spectra. The complexity of the assignment problem is first reduced by using a dead-end elimination (DEE) algorithm, which prunes side-chain resonance assignments that are provably not part of the optimal solution. Then an A* search algorithm is used to find a set of optimal side-chain resonance assignments that best fit the NMR data. We have tested our algorithm on NMR data for five proteins, including the FF Domain 2 of human transcription elongation factor CA150 (FF2), the B1 domain of Protein G (GB1), human ubiquitin, the ubiquitin-binding zinc finger domain of the human Y-family DNA polymerase Eta (pol η UBZ), and the human Set2-Rpb1 interacting domain (hSRI). Our algorithm assigns resonances for more than 90% of the protons in the proteins, and achieves about 80% correct side-chain resonance assignments. The final structures computed using distance restraints resulting from the set of assigned side-chain resonances have backbone RMSD 0.5 - 1.4 Å and all-heavy-atom RMSD 1.0 - 2.2 Å from the reference structures that were determined by X-ray crystallography or traditional NMR approaches. These results demonstrate that our algorithm can be successfully applied to automate side-chain resonance assignment and high-quality protein structure determination. Since our algorithm does not require any specific NMR experiments for measuring the through-bond interactions with side-chain protons, it can save a significant amount of both experimental cost and spectrometer time, and hence accelerate the NMR structure determination process.
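    The DEE pruning step described above has a compact combinatorial core: an assignment is eliminated when some competitor is provably better in every completion (a Goldstein-style criterion). The toy energies below are invented to show the mechanics, not NMR-derived values.

```python
# Toy dead-end elimination (DEE): prune choice r at position i when some
# competitor t at i has lower energy in the worst case over all
# assignments at the other positions.

def dee_prune(self_e, pair_e):
    """self_e[i][r]: self-energy of choice r at position i.
    pair_e[i][j][r][s]: pairwise energy between r@i and s@j.
    Returns the set of (i, r) pairs that can be eliminated."""
    n = len(self_e)
    pruned = set()
    for i in range(n):
        for r in range(len(self_e[i])):
            for t in range(len(self_e[i])):
                if t == r:
                    continue
                gap = self_e[i][r] - self_e[i][t]
                for j in range(n):
                    if j == i:
                        continue
                    gap += min(pair_e[i][j][r][s] - pair_e[i][j][t][s]
                               for s in range(len(self_e[j])))
                if gap > 0:  # t beats r in every completion
                    pruned.add((i, r))
                    break
    return pruned

# Two positions, two choices each, no pairwise coupling: the high
# self-energy choice at position 0 is provably suboptimal.
zeros = [[0.0, 0.0], [0.0, 0.0]]
pair_e = [[zeros, zeros], [zeros, zeros]]
pruned = dee_prune([[5.0, 0.0], [0.0, 0.0]], pair_e)
```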

  10. Development of a quantum mechanics-based free-energy perturbation method: use in the calculation of relative solvation free energies.

    PubMed

    Reddy, M Rami; Singh, U C; Erion, Mark D

    2004-05-26

    Free-energy perturbation (FEP) is considered the most accurate computational method for calculating relative solvation and binding free-energy differences. Despite some success in applying FEP methods to both drug design and lead optimization, FEP calculations are rarely used in the pharmaceutical industry. One factor limiting the use of FEP is its low throughput, which is attributed in part to the dependence of conventional methods on the user's ability to develop accurate molecular mechanics (MM) force field parameters for individual drug candidates and the time required to complete the process. In an attempt to find an FEP method that could eventually be automated, we developed a method that uses quantum mechanics (QM) for treating the solute, MM for treating the solute surroundings, and the FEP method for computing free-energy differences. The thread technique was used in all transformations and proved to be essential for the successful completion of the calculations. Relative solvation free energies for 10 structurally diverse molecular pairs were calculated, and the results were in close agreement with both the calculated results generated by conventional FEP methods and the experimentally derived values. While considerably more CPU demanding than conventional FEP methods, this method (QM/MM-based FEP) alleviates the need for development of molecule-specific MM force field parameters and therefore may enable future automation of FEP-based calculations. Moreover, calculation accuracy should be improved over conventional methods, especially for calculations reliant on MM parameters derived in the absence of experimental data.
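    The Zwanzig free-energy perturbation identity underlies both the conventional and the QM/MM-based calculations described above: dA = -kT ln < exp(-(U1 - U0)/kT) >_0. The sampled energy gaps below are synthetic Gaussians rather than real solvation energies, which makes the exact answer m - s^2/(2 kT) available for comparison.

```python
import math, random

KT = 0.596  # kcal/mol, roughly 300 K

def fep_estimate(delta_us, kT=KT):
    """Free-energy difference from sampled energy gaps U1 - U0."""
    avg = sum(math.exp(-du / kT) for du in delta_us) / len(delta_us)
    return -kT * math.log(avg)

# Synthetic stand-in for sampled configurations: dU ~ N(1.0, 0.3^2),
# for which the exact result is 1.0 - 0.3^2 / (2 * KT).
rng = random.Random(0)
samples = [rng.gauss(1.0, 0.3) for _ in range(200000)]
dA = fep_estimate(samples)
```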

  11. Deep Learning and Image Processing for Automated Crack Detection and Defect Measurement in Underground Structures

    NASA Astrophysics Data System (ADS)

    Panella, F.; Boehm, J.; Loo, Y.; Kaushik, A.; Gonzalez, D.

    2018-05-01

    This work presents the combination of Deep-Learning (DL) and image processing to produce an automated cracks recognition and defect measurement tool for civil structures. The authors focus on tunnel civil structures and survey and have developed an end to end tool for asset management of underground structures. In order to maintain the serviceability of tunnels, regular inspection is needed to assess their structural status. The traditional method of carrying out the survey is the visual inspection: simple, but slow and relatively expensive and the quality of the output depends on the ability and experience of the engineer as well as on the total workload (stress and tiredness may influence the ability to observe and record information). As a result of these issues, in the last decade there is the desire to automate the monitoring using new methods of inspection. The present paper has the goal of combining DL with traditional image processing to create a tool able to detect, locate and measure the structural defect.

  12. Evaluating different concentrations of hydrogen peroxide in an automated room disinfection system.

    PubMed

    Murdoch, L E; Bailey, L; Banham, E; Watson, F; Adams, N M T; Chewins, J

    2016-09-01

    A comparative study was made on the efficacy of 5, 10 and 35% weight by weight (w/w) hydrogen peroxide solutions when applied using an automated room disinfection system. Six-log biological indicators of methicillin-resistant Staphylococcus aureus (MRSA) and Geobacillus stearothermophilus were produced on stainless steel coupons and placed within a large, sealed, environmentally controlled enclosure. Five percent hydrogen peroxide was distributed throughout the enclosure using a Bioquell hydrogen peroxide vapour generator (BQ-50) for 40 min and left to reside for a further 200 min. Biological indicators were removed at 10-min intervals throughout the first 120 min of the process. The experiment was repeated for the 10 and 35% hydrogen peroxide solutions. The 5% and 10% hydrogen peroxide solutions failed to achieve any reduction of MRSA, but achieved full kill of G. stearothermophilus spores at 70 and 40 min respectively. Thirty-five percent hydrogen peroxide achieved a 6-log reduction of MRSA after 30 min and full kill of G. stearothermophilus at 20 min. The concentration of 5% hydrogen peroxide within the enclosure after the 200-min dwell was measured at 9.0 ppm. This level exceeds the 15-min Short Term Exposure Limit (STEL) for hydrogen peroxide of 2.0 ppm. Users of automated hydrogen peroxide disinfection systems should review system efficacy and room re-entry protocols in light of these results. This research allows hospital infection control teams to consider the impact and risks of using low concentrations of hydrogen peroxide for disinfection within their facilities, and to question automated room disinfection system providers on the efficacy claims they make. The evidence that low-concentration hydrogen peroxide solutions do not rapidly and autonomously break down contradicts the claims made by some hydrogen peroxide equipment providers and raises serious health and safety concerns.
Facilities using hydrogen peroxide systems that claim autonomous break down of hydrogen peroxide should introduce monitoring procedures to ensure rooms are safe for re-entry and patient occupation. © 2016 The Society for Applied Microbiology.
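    The "log reduction" figures above follow first-order disinfection kinetics: with a decimal reduction time D (minutes per 10-fold kill), an n-log reduction takes n*D minutes. The 6-log MRSA kill in 30 min reported for 35% w/w implies D of about 5 min, used below purely as an illustration.

```python
# First-order kill kinetics behind log-reduction reporting.

def time_for_log_reduction(d_value_min, n_logs):
    """Minutes needed for an n-log (10^-n) reduction."""
    return d_value_min * n_logs

def surviving_fraction(d_value_min, t_min):
    """Fraction of the initial population surviving after t minutes."""
    return 10 ** (-t_min / d_value_min)

t = time_for_log_reduction(5.0, 6)   # 30 minutes for a 6-log kill
f = surviving_fraction(5.0, 30.0)    # 1e-6 of the initial population
```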

  13. Automated indirect immunofluorescence evaluation of antinuclear autoantibodies on HEp-2 cells.

    PubMed

    Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried

    2012-01-01

    Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although some automation solutions for IIF incubation are already on the market, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, the automated pattern recognition was compared to conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 out of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability.
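    The performance figures quoted above follow directly from the reported confusion counts: all 272 antibody-positive sera flagged, and 77 of 79 negative sera correctly identified.

```python
# Recompute sensitivity, specificity, and overall concordance from the
# confusion-matrix counts reported in the study.

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    concordance = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, concordance

sens, spec, conc = sens_spec(tp=272, fn=0, tn=77, fp=2)
# sens = 1.0 (100%), spec ~ 0.975 (97.5%), conc ~ 0.994 (99.4%)
```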

  14. Automated Indirect Immunofluorescence Evaluation of Antinuclear Autoantibodies on HEp-2 Cells

    PubMed Central

    Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried

    2012-01-01

    Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although some automation solutions for IIF incubation are already on the market, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, the automated pattern recognition was compared to conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 out of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability. PMID:23251220

  15. The State and Trends of Barcode, RFID, Biometric and Pharmacy Automation Technologies in US Hospitals.

    PubMed

    Uy, Raymonde Charles Y; Kury, Fabricio P; Fontelo, Paul A

    2015-01-01

    The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometric and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating an optimistic growth in the adoption of these patient safety solutions.

  16. Energy Assessment of Automated Mobility Districts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yuche

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displace private automobiles for day-to-day travel in dense activity districts. This project examines such a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). The project reviews several such districts including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impacts anticipated with AMDs.

  17. A robust automated system elucidates mouse home cage behavioral structure

    PubMed Central

    Goulding, Evan H.; Schenk, A. Katrin; Juneja, Punita; MacKay, Adrienne W.; Wade, Jennifer M.; Tecott, Laurence H.

    2008-01-01

    Patterns of behavior exhibited by mice in their home cages reflect the function and interaction of numerous behavioral and physiological systems. Detailed assessment of these patterns thus has the potential to provide a powerful tool for understanding basic aspects of behavioral regulation and their perturbation by disease processes. However, the capacity to identify and examine these patterns in terms of their discrete levels of organization across diverse behaviors has been difficult to achieve and automate. Here, we describe an automated approach for the quantitative characterization of fundamental behavioral elements and their patterns in the freely behaving mouse. We demonstrate the utility of this approach by identifying unique features of home cage behavioral structure and changes in distinct levels of behavioral organization in mice with single gene mutations altering energy balance. The robust, automated, reproducible quantification of mouse home cage behavioral structure detailed here should have wide applicability for the study of mammalian physiology, behavior, and disease. PMID:19106295

  18. Development and verification testing of automation and robotics for assembly of space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  19. On the Relation between Automated Essay Scoring and Modern Views of the Writing Construct

    ERIC Educational Resources Information Center

    Deane, Paul

    2013-01-01

    This paper examines the construct measured by automated essay scoring (AES) systems. AES systems measure features of the text structure, linguistic structure, and conventional print form of essays; as such, the systems primarily measure text production skills. In the current state-of-the-art, AES provide little direct evidence about such matters…

  20. Using support vector machines to detect medical fraud and abuse.

    PubMed

    Francis, Charles; Pepper, Noah; Strong, Homer

    2011-01-01

    This paper examines the architecture and efficacy of Quash, an automated medical bill processing system capable of bill routing and abuse detection. Quash is designed to be used in conjunction with human auditors and a standard bill review software platform to provide a complete cost-containment solution for medical claims. The primary contribution of Quash is to provide a real-world speed-up for medical fraud detection experts in their work. Implementation details and preliminary experimental results are discussed. In this paper we are entirely focused on medical data and billing patterns that occur within the United States, though these results should be applicable to any financial transaction environment in which structured coding data can be mined.
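    A minimal linear support vector machine trained by stochastic sub-gradient descent on the L2-regularized hinge loss, illustrating the classifier family behind a system like Quash. The "billing features" and all parameter values below are synthetic stand-ins, not details from the paper.

```python
# Linear SVM via hinge-loss SGD with L2 shrinkage (pure Python sketch).

def train_svm(data, labels, lam=0.01, eta=0.1, epochs=500):
    """Fit weights w and bias b so that sign(w.x + b) matches labels."""
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            margin = y * (sum(wj * xj for wj, xj in zip(w, x)) + b)
            w = [wj * (1.0 - eta * lam) for wj in w]  # L2 shrinkage
            if margin < 1:  # inside the margin: take a hinge-loss step
                w = [wj + eta * y * xj for wj, xj in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Two linearly separable clusters standing in for coded billing records.
data = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.2]]
labels = [-1, -1, 1, 1]
w, b = train_svm(data, labels)
```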

  1. Developing quality indicators and auditing protocols from formal guideline models: knowledge representation and transformations.

    PubMed

    Advani, Aneel; Goldstein, Mary; Shahar, Yuval; Musen, Mark A

    2003-01-01

    Automated quality assessment of clinician actions and patient outcomes is a central problem in guideline- or standards-based medical care. In this paper we describe a model representation and algorithm for deriving structured quality indicators and auditing protocols from formalized specifications of guidelines used in decision support systems. We apply the model and algorithm to the assessment of physician concordance with a guideline knowledge model for hypertension used in a decision-support system. The properties of our solution include the ability to automatically derive context-specific and case-mix-adjusted quality indicators that can model global or local levels of detail about the guideline, parameterized by the reliability of each indicator or element of the guideline.

  2. The evolution of automated launch processing

    NASA Technical Reports Server (NTRS)

    Tomayko, James E.

    1988-01-01

    The NASA Launch Processing System (LPS) to which attention is presently given has arrived at satisfactory solutions for the distributed-computing, good user interface and dissimilar-hardware interface, and automation-related problems that emerge in the specific arena of spacecraft launch preparations. An aggressive effort was made to apply the lessons learned in the 1960s, during the first attempts at automatic launch vehicle checkout, to the LPS. As the Space Shuttle System continues to evolve, the primary contributor to safety and reliability will be the LPS.

  3. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries

    PubMed Central

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303

  4. Problems of collaborative work of the automated process control system (APCS) and the its information security and solutions.

    NASA Astrophysics Data System (ADS)

    Arakelyan, E. K.; Andryushin, A. V.; Mezin, S. V.; Kosoy, A. A.; Kalinina, Ya V.; Khokhlov, I. S.

    2017-11-01

    A principle for the interaction between the technological protections of the automated process control system (APCS) and the information security system in the case of incorrect execution of a protection algorithm is proposed: the correctness of each technological protection's operation is checked in every specific situation using the functional relationship between the monitored parameters. A methodology for assessing the economic feasibility of developing and implementing such an information security system is also presented.

  5. A smart end-effector for assembly of space truss structures

    NASA Technical Reports Server (NTRS)

    Doggett, William R.; Rhodes, Marvin D.; Wise, Marion A.; Armistead, Maurice F.

    1992-01-01

    A unique facility, the Automated Structures Research Laboratory, is being used to investigate robotic assembly of truss structures. A special-purpose end-effector is used to assemble structural elements into an eight meter diameter structure. To expand the capabilities of the facility to include construction of structures with curved surfaces from straight structural elements of different lengths, a new end-effector has been designed and fabricated. This end-effector contains an integrated microprocessor to monitor actuator operations through sensor feedback. This paper provides an overview of the automated assembly tasks required by this end-effector and a description of the new end-effector's hardware and control software.

  6. Cleaning method and apparatus

    DOEpatents

    Jackson, Darryl D.; Hollen, Robert M.

    1983-01-01

    A new automatable cleaning apparatus, and a method for very thorough and rapid cleaning of the gauze electrodes used in chemical analyses, are described. The method generates very little waste solution, which is important when analyzing radioactive materials, especially in aqueous solutions. The cleaning apparatus can be incorporated into a larger, fully automated controlled-potential coulometric apparatus. About 99.98% of a 5 mg plutonium sample was removed in less than 3 minutes, using only about 60 mL of rinse solution and two main rinse steps.

  7. CORSS: Cylinder Optimization of Rings, Skin, and Stringers

    NASA Technical Reports Server (NTRS)

    Finckenor, J.; Rogers, P.; Otte, N.

    1994-01-01

    Launch vehicle designs typically make extensive use of cylindrical skin-stringer construction. Structural analysis methods are well developed for preliminary design of this type of construction. This report describes an automated, iterative method to obtain a minimum-weight preliminary design. Structural optimization has been researched extensively, and various programs have been written for this purpose. Their complexity and ease of use depend on their generality, the failure modes considered, the methodology used, and the rigor of the analysis performed. This computer program employs closed-form solutions from a variety of well-known structural analysis references and joins them with a commercially available numerical optimizer called the 'Design Optimization Tool' (DOT). Any ring- and stringer-stiffened shell structure of isotropic materials under beam-type loading can be analyzed. Plasticity effects are not included. The program performs a more limited analysis than programs such as PANDA, but it provides an easy and useful preliminary design tool for a large class of structures. This report briefly describes the optimization theory, outlines the development and use of the program, and describes the analysis techniques that are used. Examples of program input and output, as well as the listing of the analysis routines, are included.
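
    The report's closed-form analyses and the DOT optimizer are not reproduced here, but the core idea, searching a design space for the lightest stiffened cylinder that still satisfies a strength constraint, can be sketched. All dimensions, loads, and allowables below are hypothetical, and a coarse grid search stands in for DOT:

    ```python
    import math

    def cylinder_weight(t_skin, a_stringer, n_stringers,
                        radius=1.0, length=3.0, rho=2700.0):
        """Mass (kg) of a stiffened cylinder: skin shell plus stringer volume."""
        skin = 2 * math.pi * radius * length * t_skin
        stringers = n_stringers * a_stringer * length
        return rho * (skin + stringers)

    def axial_stress(load, t_skin, a_stringer, n_stringers, radius=1.0):
        """Smeared axial stress (Pa): load over skin + stringer cross-section."""
        area = 2 * math.pi * radius * t_skin + n_stringers * a_stringer
        return load / area

    def minimize_weight(load=2.0e6, sigma_allow=200e6, n_stringers=40):
        """Coarse grid search standing in for a numerical optimizer like DOT."""
        best = None
        for t in [0.5e-3 * k for k in range(1, 21)]:        # skin 0.5-10 mm
            for a in [10e-6 * k for k in range(0, 21)]:     # stringer 0-200 mm^2
                if axial_stress(load, t, a, n_stringers) > sigma_allow:
                    continue                                # strength violated
                w = cylinder_weight(t, a, n_stringers)
                if best is None or w < best[0]:
                    best = (w, t, a)
        return best
    ```

    A real sizing tool would add ring spacing, local and general buckling checks, and the other failure modes the report lists; the sketch keeps only the strength constraint to show the optimization loop's shape.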

  8. Automated Solvent Seaming of Large Polyimide Membranes

    NASA Technical Reports Server (NTRS)

    Rood, Robert; Moore, James D.; Talley, Chris; Gierow, Paul A.

    2006-01-01

    A solvent-based welding process enables the joining of precise, cast polyimide membranes at their edges to form larger precise membranes. The process creates a homogeneous, optical-quality seam between abutting membranes, with no overlap and with only a very localized area of figure disturbance. The seam retains 90 percent of the strength of the parent material. The process was developed for original use in the fabrication of wide-aperture membrane optics, with areal densities of less than 1 kg/m2, for lightweight telescopes, solar concentrators, antennas, and the like to be deployed in outer space. The process is equally applicable to the fabrication of large precise polyimide membranes for flat or inflatable solar concentrators and antenna reflectors for terrestrial applications. The process is applicable to cast membranes made of CP1 (or equivalent) polyimide. The process begins with the precise fitting together and fixturing of two membrane segments. The seam is formed by applying a metered amount of a doped solution of the same polyimide along the abutting edges of the membrane segments. After the solution has been applied, the fixtured films are allowed to dry and are then cured by convective heating. The weld material is the same as the parent material, so that what is formed is a homogeneous, strong joint that is almost indistinguishable from the parent material. The success of the process is highly dependent on formulation of the seaming solution from the correct proportion of the polyimide in a suitable solvent. In addition, the formation of reliable seams depends on the deposition of a precise amount of the seaming solution along the seam line. To ensure the required precision, deposition is performed by an automated apparatus comprising a modified commercially available, large-format ink-jet print head on an automated positioning table. The print head jets the seaming solution into the seam area at a rate controlled in coordination with the movement of the positioning table.

  9. SISYPHUS: A high performance seismic inversion factory

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In recent years, massively parallel high-performance computers have become standard instruments for solving forward and inverse problems in seismology. The software packages dedicated to forward and inverse waveform modelling and specially designed for such computers (SPECFEM3D, SES3D) have matured and become widely available. These packages achieve significant computational performance and give researchers the opportunity to solve larger problems at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset the performance benefits provided by even the most powerful modern supercomputers. Furthermore, the typical system architecture of modern supercomputing platforms is oriented towards maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for modern massively parallel high-performance computing systems. We address several major aspects of the solution architecture: (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring.
    The inversion state database is a hierarchical structure with branches for the static process setup, inversion iterations, and solver runs, each branch specifying information at the event, station, and channel levels. The workflow management framework is based on an embedded scripting engine that allows definition of various workflow scenarios in a high-level scripting language and provides access to all available inversion components, represented as standard library functions. At present the SES3D wave propagation solver is integrated into the solution; work on interfacing with SPECFEM3D is in progress. A separate framework is designed for interoperability with an optimization module; the workflow manager and optimization process run in parallel and cooperate by exchanging messages according to a specially designed protocol. A library of high-performance modules implementing signal pre-processing, misfit and adjoint computations according to established good practices is included. Monitoring is based on information stored in the inversion state database and at present provides a command-line interface; design of a graphical user interface is in progress. The software design fits well into the common massively parallel system architecture featuring a large number of computational nodes running distributed applications under the control of batch-oriented resource managers. The solution prototype has been implemented on the "Piz Daint" supercomputer provided by the Swiss Supercomputing Centre (CSCS).
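
    The hierarchical inversion-state layout described above (setup, iterations, and solver runs, each resolved down to the event, station, and channel levels) can be sketched with plain data classes. The class and field names below are illustrative, not SISYPHUS's actual schema:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Channel:
        name: str                  # e.g. "BHZ"; names here are illustrative
        misfit: float = 0.0

    @dataclass
    class Station:
        code: str
        channels: list = field(default_factory=list)

    @dataclass
    class Event:
        event_id: str
        stations: list = field(default_factory=list)

    @dataclass
    class SolverRun:
        run_id: int
        events: list = field(default_factory=list)

    @dataclass
    class Iteration:
        number: int
        solver_runs: list = field(default_factory=list)

        def total_misfit(self):
            """Aggregate channel misfits across every run/event/station below."""
            return sum(ch.misfit
                       for run in self.solver_runs
                       for ev in run.events
                       for st in ev.stations
                       for ch in st.channels)
    ```

    The point of such a tree is that iteration-level quantities (the total misfit the optimizer minimizes) are derivable by walking the branches, so every number in the inversion is traceable back to a channel-level record.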

  10. Virtual automation.

    PubMed

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be replaced in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed for this purpose in cooperation with Roche Diagnostics (Barcelona, Spain). This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction in preanalytical errors (from 11.7 to 0.4% of the tubes), and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment, with a significant headcount reduction (15% in our lab).
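
    The middleware's core job, directing each tube to the right instruments, can be sketched as a routing table. PSM's actual rules are not public; the test menu and analyzer names below are entirely hypothetical:

    ```python
    # Hypothetical test-to-analyzer menu (not PSM's real configuration).
    ANALYZER_MENU = {
        "glucose": "chemistry-1",
        "sodium": "chemistry-1",
        "tsh": "immuno-1",
        "hba1c": "chemistry-2",
    }

    def route_sample(sample_id, ordered_tests):
        """Group a sample's orders by target analyzer, as virtual-automation
        middleware would; the returned worklist tells personnel (who replace
        the transport belt) where to carry the tube."""
        worklist = {}
        for test in ordered_tests:
            analyzer = ANALYZER_MENU.get(test, "manual-bench")
            worklist.setdefault(analyzer, []).append(test)
        return worklist
    ```

    Grouping orders per analyzer is also what reduces primary-tube and aliquot counts: one tube can serve every test routed to the same instrument.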

  11. Self-optimizing approach for automated laser resonator alignment

    NASA Astrophysics Data System (ADS)

    Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.

    2012-02-01

    Nowadays, the assembly of laser systems is dominated by manual operations, involving elaborate alignment by means of adjustable mountings. The most challenging problem in laser source manufacturing is price pressure, a result of cost competition originating mainly in Asia. From an economic point of view, automated assembly of laser systems is a better approach for producing more reliable units at lower cost. However, the step from today's manual solutions towards automated assembly requires parallel developments in product design, automation equipment, and assembly processes. This paper briefly introduces the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for robot-based precision assembly, as well as passive and active alignment methods based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of laser resonator assembly. These results as well as future development perspectives are discussed.

  12. Development of an automated asbestos counting software based on fluorescence microscopy.

    PubMed

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.

  13. Ion channel pharmacology under flow: automation via well-plate microfluidics.

    PubMed

    Spencer, C Ian; Li, Nianzhen; Chen, Qin; Johnson, Juliette; Nevill, Tanner; Kammonen, Juha; Ionescu-Zanetti, Cristian

    2012-08-01

    Automated patch clamping addresses the need for high-throughput screening of chemical entities that alter ion channel function. As a result, there is considerable utility in the pharmaceutical screening arena for novel platforms that can produce relevant data both rapidly and consistently. Here we present results that were obtained with an innovative microfluidic automated patch clamp system utilizing a well-plate that eliminates the necessity of internal robotic liquid handling. Continuous recording from cell ensembles, rapid solution switching, and a bench-top footprint enable a number of assay formats previously inaccessible to automated systems. An electro-pneumatic interface was employed to drive the laminar flow of solutions in a microfluidic network that delivered cells in suspension to ensemble recording sites. Whole-cell voltage clamp was applied to linear arrays of 20 cells in parallel utilizing a 64-channel voltage clamp amplifier. A number of unique assays requiring sequential compound applications separated by a second or less, such as rapid determination of the agonist EC(50) for a ligand-gated ion channel or the kinetics of desensitization recovery, are enabled by the system. In addition, the system was validated via electrophysiological characterizations of both voltage-gated and ligand-gated ion channel targets: hK(V)2.1 and human Ether-à-go-go-related gene potassium channels, hNa(V)1.7 and 1.8 sodium channels, and (α1) hGABA(A) receptors and (α1) human nicotinic acetylcholine receptors. Our results show that the voltage dependence, kinetics, and interactions of these channels with pharmacological agents matched reference data. The results from these IonFlux™ experiments demonstrate that the system provides high-throughput automated electrophysiology with enhanced reliability and consistency, in a user-friendly format.
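
    One of the assays mentioned, rapid agonist EC(50) determination, reduces to fitting a concentration-response curve. A minimal sketch using the standard Hill equation, synthetic data, and a grid-search fitter (not the IonFlux analysis software) follows:

    ```python
    def hill(conc, ec50, n=1.0, emax=1.0):
        """Hill equation: fractional response at agonist concentration conc."""
        return emax * conc ** n / (ec50 ** n + conc ** n)

    def fit_ec50(concs, responses, candidates):
        """Least-squares EC50 estimate by grid search over candidate values;
        a real analysis would use a nonlinear fitter instead."""
        def sse(ec50):
            return sum((hill(c, ec50) - r) ** 2 for c, r in zip(concs, responses))
        return min(candidates, key=sse)

    # Synthetic concentration-response data generated from a "true" EC50 of 10.
    concs = [1.0, 3.0, 10.0, 30.0, 100.0]
    responses = [hill(c, 10.0) for c in concs]
    est = fit_ec50(concs, responses, [1.0, 3.0, 10.0, 30.0, 100.0])
    ```

    The system's advantage is that the sequential compound applications needed to collect such a curve can be separated by a second or less, so the whole fit's input data comes from one continuous recording.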

  14. Preparation of crotaline F-ab antivenom (CroFab) with automated mixing methods: in vitro observations.

    PubMed

    Vohra, Rais; Kelner, Michael; Clark, Richard F

    2009-01-01

    Crotaline Polyvalent Ovine Fab antivenom (CroFab, Savage Laboratories and Protherics Inc., Brentwood, TN, USA) preparation requires that the lyophilized powder be manually reconstituted before use. We compared automated methods for driving the product into solution with the standard manual method of reconstitution, and assessed the effect of repeated rinsing of the product vial on the per-vial availability of antivenom. Normal saline (NS, 10 mL) was added to 12 vials of expired CroFab. Vials were assigned in pairs to each of six mixing methods, including one pair mixed manually as recommended by the product package insert. Each vial's contents were diluted to a final volume of 75 mL of normal saline. Protein concentration was measured with a colorimetric assay. The fluid left in each vial was removed and the vial was washed with 10 mL NS. Total protein yield from each step was calculated. There was no significant change in protein yield for three of five automated mixing methods when compared to manual reconstitution. Repeat rinsing of the product vial with an additional 10 mL of fluid added to the protein yield regardless of the mixing method used. We found slightly higher protein yields with all automated methods compared to manual mixing, but only two of five comparisons with the standard mixing method demonstrated statistical significance. However, for all methods tested, the addition of a second rinsing and recovery step increased the amount of protein recovered considerably, presumably by allowing dissolution of protein trapped in the foamy residues. Automated mixing methods and repeat rinsing of the product vial may allow higher protein yields in the preparation of CroFab antivenom.
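
    The per-vial accounting above is simple arithmetic: each recovery step contributes concentration times volume, and the second rinse is whatever it adds on top. A sketch with hypothetical numbers (the study's actual assay values are not given in the abstract):

    ```python
    def step_yield(concentration_mg_per_ml, volume_ml):
        """Protein recovered in one step (mg) = concentration x volume."""
        return concentration_mg_per_ml * volume_ml

    def total_yield(steps):
        """Per-vial total: sum over every recovery and rinse step."""
        return sum(step_yield(c, v) for c, v in steps)

    # Hypothetical assay values for illustration (not from the study):
    first_recovery = (0.60, 75.0)   # mg/mL, mL: the 75 mL diluted recovery
    second_rinse = (0.20, 10.0)     # mg/mL, mL: the 10 mL vial rinse
    total = total_yield([first_recovery, second_rinse])
    rinse_gain = step_yield(*second_rinse) / total
    ```

    With these made-up numbers the rinse contributes a few percent of the total, which is the kind of gain the second rinsing step is credited with.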

  15. High-throughput ab-initio dilute solute diffusion database

    PubMed Central

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-01-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world. PMID:27434308
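
    The quoted 0.176 eV figure is a weighted RMS error over activation barriers; the formula is standard. A sketch (the barrier values below are made up, not from the database):

    ```python
    import math

    def weighted_rmse(calc, expt, weights):
        """Weighted RMS error between calculated and experimental
        activation barriers (eV)."""
        num = sum(w * (c - e) ** 2 for c, e, w in zip(calc, expt, weights))
        return math.sqrt(num / sum(weights))

    # Hypothetical DFT vs experimental barriers (eV) with equal weights:
    rmse = weighted_rmse([1.05, 1.30], [1.10, 1.20], [1.0, 1.0])
    ```

    In the paper's usage the weights let multiple experimental measurements of the same solute-host system count appropriately rather than equally per data point.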

  16. Recent developments in MrBUMP: better search-model preparation, graphical interaction with search models, and solution improvement and assessment.

    PubMed

    Keegan, Ronan M; McNicholas, Stuart J; Thomas, Jens M H; Simpkin, Adam J; Simkovic, Felix; Uski, Ville; Ballard, Charles C; Winn, Martyn D; Wilson, Keith S; Rigden, Daniel J

    2018-03-01

    Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case.

  17. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data.
11 refs., 55 figs., 13 tabs.

  18. Recent developments in MrBUMP: better search-model preparation, graphical interaction with search models, and solution improvement and assessment

    PubMed Central

    Keegan, Ronan M.; McNicholas, Stuart J.; Thomas, Jens M. H.; Simpkin, Adam J.; Uski, Ville; Ballard, Charles C.

    2018-01-01

    Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case. PMID:29533225

  19. Synthesis and stability of hetaerolite, ZnMn2O4, at 25°C

    USGS Publications Warehouse

    Hem, J.D.; Roberson, C.E.; Lind, C.J.

    1987-01-01

    A precipitate of nearly pure hetaerolite, ZnMn2O4, a spinel-structured analog of hausmannite, Mn3O4, was prepared by an irreversible coprecipitation of zinc with manganese at 25°C. The synthesis technique entailed constant slow addition of a dilute solution of Mn2+ and Zn2+ chlorides having a Mn/Zn ratio of 2:1 to a reaction vessel that initially contained distilled deionized water, maintained at a pH of 8.50 by addition of dilute NaOH by an automated pH stat, with continuous bubbling of CO2-free air. The solid was identified by means of X-ray diffraction and transmission electron microscopy and consisted of bipyramidal crystals generally less than 0.10 μm in diameter. Zn2+ ions are able to substitute extensively for Mn2+ ions that occupy tetrahedral sites in the hausmannite structure. Hetaerolite appears to be more stable than hausmannite with respect to spontaneous conversion to γMnOOH. The value of the standard free energy of formation of hetaerolite was estimated from the experimental data to be −289.4 ± 0.8 kcal per mole. Solids intermediate in composition between hetaerolite and hausmannite can be prepared by altering the Mn/Zn ratio in the feed solution.

  20. Application-level regression testing framework using Jenkins

    DOE PAGES

    Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen

    2017-09-26

    Monitoring and testing for regression of large-scale systems such as NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we came up with to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server software, was chosen for its versatility, large user base, and multitude of plugins, including ones for collecting data and plotting test results over time. We also describe our Jenkins deployment to launch and monitor jobs on a remote HPC system, perform authentication with one-time passwords, and integrate with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level system-wide regression testing and monitoring framework for large supercomputer systems.

  1. Application-level regression testing framework using Jenkins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen

    Monitoring and testing for regression of large-scale systems such as NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we came up with to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server software, was chosen for its versatility, large user base, and multitude of plugins, including ones for collecting data and plotting test results over time. We also describe our Jenkins deployment to launch and monitor jobs on a remote HPC system, perform authentication with one-time passwords, and integrate with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level system-wide regression testing and monitoring framework for large supercomputer systems.

  2. Automated sizing of large structures by mixed optimization methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.; Loendorf, D.

    1973-01-01

    A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure is demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.

  3. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS), part 3. Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The performance, design, and verification requirements for the Space Construction Automated Fabrication Experiment (SCAFE) are defined. The SCAFE program defines, develops, and demonstrates the techniques, processes, and equipment required for the automatic fabrication of structural elements in space and for the assembly of such elements into a large, lightweight structure. The program defines a large structural platform to be constructed in orbit using the Space Shuttle as a launch vehicle and construction base.

  4. Fluorescent Applications to Crystallization

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

    By covalently modifying a subpopulation, less than or equal to 1%, of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and tests with model proteins have shown that labeling up to 5 percent of the protein molecules does not affect the X-ray data quality obtained. The presence of the trace fluorescent label gives a number of advantages. Since the label is covalently attached to the protein molecules, it "tracks" the protein's response to the crystallization conditions. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a darker background. Non-protein structures, such as salt crystals, do not show up under fluorescent illumination. Crystals have the highest protein concentration and are readily observed against less bright precipitated phases, which under white-light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries, since only the protein or protein structures show up. Fluorescence intensity is a faster search parameter, whether visually or by automated methods, than looking for crystalline features. Preliminary tests using model proteins indicate that we can use high fluorescence intensity regions, in the absence of clear crystalline features or "hits", as a means for determining potential lead conditions. A working hypothesis is that more rapid amorphous precipitation kinetics may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Experiments are now being carried out to test this approach using a wider range of proteins.
The trace fluorescently labeled crystals will also emit with sufficient intensity to aid in the automation of crystal alignment using relatively low cost optics, further increasing throughput at synchrotrons.

  5. Acoustic-sensor-based detection of damage in composite aircraft structures

    NASA Astrophysics Data System (ADS)

    Foote, Peter; Martin, Tony; Read, Ian

    2004-03-01

    Acoustic emission detection is a well-established method of locating and monitoring crack development in metal structures. The technique has been adapted to test facilities for non-destructive testing applications. Deployment as an operational or on-line automated damage detection technology in vehicles poses greater challenges. A clear requirement of potential end-users of such systems is a level of automation capable of delivering low-level diagnosis information, with output in the form of "go"/"no-go" indications of structural integrity or immediate maintenance actions. This level of automation requires significant data reduction and processing. This paper describes recent trials of acoustic emission detection technology for the diagnosis of damage in composite aerospace structures. The technology comprises low-profile detection sensors using piezoelectric wafers encapsulated in polymer film, and optical sensors. Sensors are bonded to the structure's surface and enable acoustic events from the loaded structure to be located by triangulation. Instrumentation has been developed to capture and parameterise the sensor data in a form suitable for low-bandwidth storage and transmission.
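
    Locating an event by triangulation, as described above, amounts to finding the point whose predicted arrival-time differences across the sensor array best match the measured ones. A minimal 2D sketch with an assumed wave speed and sensor layout (not the trial's actual configuration):

    ```python
    import math

    def locate(sensors, arrival_times, speed, grid_step=0.05, extent=2.0):
        """Grid-search source location on a panel: pick the point whose
        predicted arrival-time differences (relative to sensor 0) best
        match the measured ones."""
        def dist(p, q):
            return math.hypot(p[0] - q[0], p[1] - q[1])
        meas = [t - arrival_times[0] for t in arrival_times]
        best, best_err = None, float("inf")
        steps = round(extent / grid_step) + 1
        for i in range(steps):
            for j in range(steps):
                p = (i * grid_step, j * grid_step)
                d0 = dist(p, sensors[0])
                err = sum(((dist(p, s) - d0) / speed - m) ** 2
                          for s, m in zip(sensors[1:], meas[1:]))
                if err < best_err:
                    best, best_err = p, err
        return best

    # Three sensors on a 2 m x 2 m panel; source at (1.0, 0.5); 5000 m/s.
    sensors = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
    src = (1.0, 0.5)
    times = [math.hypot(src[0] - s[0], src[1] - s[1]) / 5000.0 for s in sensors]
    ```

    Using time differences rather than absolute times means the (unknown) emission instant cancels out, which is why at least three sensors are needed for a 2D position.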

  6. Automated assessment of the remineralization of artificial enamel lesions with polarization-sensitive optical coherence tomography

    PubMed Central

    Lee, Robert C.; Kang, Hobin; Darling, Cynthia L.; Fried, Daniel

    2014-01-01

    Accurate measurement of the highly mineralized transparent surface layer that forms on caries lesions is important for diagnosis of the lesion activity because chemical intervention can slow or reverse the caries process via remineralization. Previous in-vitro and in-vivo studies have demonstrated that polarization-sensitive optical coherence tomography (PS-OCT) can nondestructively image the subsurface lesion structure and the highly mineralized transparent surface zone of caries lesions. The purpose of this study was to develop an approach to automatically process 3-dimensional PS-OCT images and to accurately assess the remineralization process in simulated enamel lesions. Artificial enamel lesions were prepared on twenty bovine enamel blocks using two models to produce varying degrees of demineralization and remineralization. The thickness of the transparent surface layer and the integrated reflectivity of the subsurface lesion were measured using PS-OCT. The automated transparent surface layer detection algorithm was able to successfully detect the transparent surface layers with high sensitivity (0.92) and high specificity (0.97). The estimated thickness of the transparent surface layer showed a strong correlation with polarized light microscopy (PLM) measurements of all regions (R2 = 0.90). The integrated reflectivity, ΔR, and the integrated mineral loss, ΔZ, showed a moderate correlation (R2 = 0.32). This study demonstrates that PS-OCT can automatically measure the changes in artificial enamel lesion structure and severity upon exposure to remineralization solutions. PMID:25401009
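    The sensitivity (0.92) and specificity (0.97) reported above are the standard detection ratios; a minimal sketch with invented detection counts (not the study's data) shows how such figures are computed.

```python
# Sensitivity and specificity as used to evaluate the surface-layer
# detector, computed from hypothetical detection counts.

def sensitivity(tp, fn):
    """True-positive rate: detected layers / all layers present."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: correct rejections / all layer-free regions."""
    return tn / (tn + fp)

# e.g. 46 layers detected, 4 missed; 97 correct rejections, 3 false alarms
sens = sensitivity(tp=46, fn=4)   # 0.92
spec = specificity(tn=97, fp=3)   # 0.97
```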

  7. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acciarri, R.; Adams, C.; An, R.

    The development and operation of Liquid-Argon Time-Projection Chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.

  8. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    DOE PAGES

    Acciarri, R.; Adams, C.; An, R.; ...

    2018-01-29

    The development and operation of Liquid-Argon Time-Projection Chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.

  9. Automated Gravimetric Calibration to Optimize the Accuracy and Precision of TECAN Freedom EVO Liquid Handler

    PubMed Central

    Bessemans, Laurent; Jully, Vanessa; de Raikem, Caroline; Albanese, Mathieu; Moniotte, Nicolas; Silversmet, Pascal; Lemoine, Dominique

    2016-01-01

    High-throughput screening technologies are increasingly integrated into the formulation development process of biopharmaceuticals. The performance of liquid handling systems is dependent on the ability to deliver accurate and precise volumes of specific reagents to ensure process quality. We have developed an automated gravimetric calibration procedure to adjust the accuracy and evaluate the precision of the TECAN Freedom EVO liquid handling system. Volumes from 3 to 900 µL using calibrated syringes and fixed tips were evaluated with various solutions, including aluminum hydroxide and phosphate adjuvants, β-casein, sucrose, sodium chloride, and phosphate-buffered saline. The methodology to set up liquid class pipetting parameters for each solution was to split the process into three steps: (1) screening of predefined liquid classes, including different pipetting parameters; (2) adjustment of accuracy parameters based on a calibration curve; and (3) confirmation of the adjustment. The run of appropriate pipetting scripts, data acquisition, and reports until the creation of a new liquid class in EVOware was fully automated. The calibration and confirmation of the robotic system were simple, efficient, and precise and could accelerate data acquisition for a wide range of biopharmaceutical applications. PMID:26905719
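    The calibration-curve adjustment in step (2) can be sketched as follows: weigh what was actually dispensed, convert mass to volume, fit a linear curve against the commanded volumes, and invert it to correct future commands. The volumes, masses, and density below are invented for illustration, not the study's data.

```python
# Minimal sketch of gravimetric calibration: fit dispensed volume against
# commanded volume and invert the line to correct the command. Numbers
# are hypothetical; mass in mg, volume in µL, density 1.0 mg/µL assumed.

def fit_line(xs, ys):
    """Ordinary least squares: ys ≈ slope*xs + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

targets = [50.0, 100.0, 200.0, 400.0]    # µL commanded
masses = [47.0, 96.0, 194.0, 390.0]      # mg weighed
dispensed = [m / 1.0 for m in masses]    # µL at assumed density
slope, intercept = fit_line(targets, dispensed)
# To actually deliver v µL, command (v - intercept) / slope:
corrected = (100.0 - intercept) / slope
```

    With these toy numbers the fitted curve is dispensed = 0.98·commanded − 2, so delivering a true 100 µL requires commanding about 104 µL.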

  10. Online, offline, realtime: recent developments in industrial photogrammetry

    NASA Astrophysics Data System (ADS)

    Boesemann, Werner

    2003-01-01

    In recent years industrial photogrammetry has emerged from a highly specialized niche technology to a well established tool in industrial coordinate measurement applications, with numerous installations in a significantly growing market of flexible and portable optical measurement systems. This is due to the development of powerful but affordable video and computer technology. The increasing industrial requirements for accuracy, speed, robustness and ease of use of these systems, together with a demand for the highest possible degree of automation, have forced universities and system manufacturers to develop hard- and software solutions to meet these requirements. The presentation will show the latest trends in hardware development, especially new-generation digital and/or intelligent cameras, aspects of image engineering like the use of controlled illumination or projection technologies, and algorithmic and software aspects like automation strategies or new camera models. The basic qualities of digital photogrammetry - portability and flexibility on one hand and fully automated quality control on the other - sometimes lead to certain conflicts in the design of measurement systems for different online, offline, or real-time solutions. The presentation will further show how these tools and methods are combined in different configurations to be able to cover the still growing demands of the industrial end-users.

  11. Photogrammetry in the line: recent developments in industrial photogrammetry

    NASA Astrophysics Data System (ADS)

    Boesemann, Werner

    2003-05-01

    In recent years industrial photogrammetry has emerged from a highly specialized niche technology to a well established tool in industrial coordinate measurement applications, with numerous installations in a significantly growing market of flexible and portable optical measurement systems. This is due to the development of powerful but affordable video and computer technology. The increasing industrial requirements for accuracy, speed, robustness and ease of use of these systems, together with a demand for the highest possible degree of automation, have forced universities and system manufacturers to develop hard- and software solutions to meet these requirements. The presentation will show the latest trends in hardware development, especially new-generation digital and/or intelligent cameras, aspects of image engineering like the use of controlled illumination or projection technologies, and algorithmic and software aspects like automation strategies or new camera models. The basic qualities of digital photogrammetry - portability and flexibility on one hand and fully automated quality control on the other - sometimes lead to certain conflicts in the design of measurement systems for different online, offline or real-time solutions. The presentation will further show how these tools and methods are combined in different configurations to be able to cover the still growing demands of the industrial end-users.

  12. RCrane: semi-automated RNA model building.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  13. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Los Alamos National Laboratory, Mailstop M888, Los Alamos, NM 87545, USA; Lawrence Berkeley National Laboratory, One Cyclotron Road, Building 64R0121, Berkeley, CA 94720, USA; Department of Haematology, University of Cambridge, Cambridge CB2 0XY, England

    The PHENIX AutoBuild Wizard is a highly automated tool for iterative model-building, structure refinement and density modification using RESOLVE or TEXTAL model-building, RESOLVE statistical density modification, and phenix.refine structure refinement. Recent advances in the AutoBuild Wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model completion algorithms, and automated solvent molecule picking. Model completion algorithms in the AutoBuild Wizard include loop-building, crossovers between chains in different models of a structure, and side-chain optimization. The AutoBuild Wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 Å to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density, and relatively independent of resolution.
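    The mean R factor quoted in this record is the standard crystallographic residual, R = Σ| |Fobs| − |Fcalc| | / Σ|Fobs|, comparing observed and model-calculated structure-factor amplitudes. A minimal sketch with invented amplitudes (not data from the AutoBuild runs):

```python
# Crystallographic R factor: the normalized disagreement between
# observed and calculated structure-factor amplitudes. Toy values only.

def r_factor(f_obs, f_calc):
    return sum(abs(o - c) for o, c in zip(f_obs, f_calc)) / sum(f_obs)

f_obs = [100.0, 80.0, 60.0, 40.0]    # hypothetical |Fobs|
f_calc = [90.0, 85.0, 55.0, 42.0]    # hypothetical |Fcalc|
r = r_factor(f_obs, f_calc)          # (10 + 5 + 5 + 2) / 280 ≈ 0.079
```

    The free R factor is the same quantity evaluated on reflections held out of refinement, which is why it runs slightly higher (0.29 vs 0.24 here).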

  14. Automated Subscores for TOEFL iBT[R] Independent Essays. Research Report. ETS RR-11-39

    ERIC Educational Resources Information Center

    Attali, Yigal

    2011-01-01

    The e-rater[R] automated essay scoring system is used operationally in the scoring of TOEFL iBT[R] independent essays. Previous research has found support for a 3-factor structure of the e-rater features. This 3-factor structure has an attractive hierarchical linguistic interpretation with a word choice factor, a grammatical convention within a…

  15. Collaborative Robots and Knowledge Management - A Short Review

    NASA Astrophysics Data System (ADS)

    Mușat, Flaviu-Constantin; Mihu, Florin-Constantin

    2017-12-01

    Because customer requirements for quality, quantity and delivery times at the lowest possible cost are ever more demanding, industry has had to develop automated solutions to meet them. Starting from the automated lines developed by Ford and Toyota, automated and self-sustained production lines are now possible using collaborative robots. By using knowledge management systems, we can improve the development of this area of research in the future. This paper shows the benefits of the smart use of robots performing manipulation activities, which improves workplace ergonomics and the human-machine interaction, assisting in parallel tasks and lowering the physical effort required of humans.

  16. Automated assembly in space

    NASA Technical Reports Server (NTRS)

    Srivastava, Sandanand; Dwivedi, Suren N.; Soon, Toh Teck; Bandi, Reddy; Banerjee, Soumen; Hughes, Cecilia

    1989-01-01

    The installation of robots and their use for assembly in space will create an exciting and promising future for the U.S. Space Program. Assembly in space is complicated and error-prone, and it is not possible unless the various parts and modules are suitably designed for automation. Certain guidelines are developed for part design and for easy precision assembly. Major design problems associated with automated assembly are considered and solutions to resolve these problems are evaluated in the guidelines format. Methods for gripping and methods for part feeding are developed with regard to the absence of gravity in space. The guidelines for part orientation, adjustments, compliances and various assembly constructions are discussed. Design modifications of various fasteners and fastening methods are also investigated.

  17. Existence and uniqueness of solutions to a class of nonlinear-operator-differential equations arising in automated spaceship navigation

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    A proof is given of the existence and uniqueness of the solution to the automatic control problem with a nonlinear state equation of the form y' = f(t,y,u) and nonlinear operator controls u = U(y) acting on the state function y, which satisfies the initial condition y(t) = x(t) for t ≤ 0.
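    A numerical illustration (not the paper's analysis) of the kind of system studied: a state equation y' = f(t, y, u) closed by an operator control u = U(y) acting on the state, integrated with a simple forward-Euler scheme. The dynamics f and control U below are made up.

```python
# Forward-Euler integration of y' = f(t, y, u) with feedback control
# u = U(y). Both f and U are hypothetical examples.

def euler(f, U, y0, t0, t1, n):
    y, t = y0, t0
    h = (t1 - t0) / n
    for _ in range(n):
        y += h * f(t, y, U(y))
        t += h
    return y

f = lambda t, y, u: -y + u    # hypothetical dynamics
U = lambda y: 0.5 * y         # hypothetical operator control
y = euler(f, U, y0=1.0, t0=0.0, t1=1.0, n=10000)
# Closed loop is y' = -0.5*y, so y(1) should approach exp(-0.5) ≈ 0.6065
```

    Existence and uniqueness results of the kind proved in the paper are what guarantee that such a closed-loop trajectory is well defined at all.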

  18. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    PubMed

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system identifies the renal tissue structures through knowledge-based rules employing colour-space transformations and structural-feature extraction from the images; in particular, glomerulus identification is based on a multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A 40-image ground-truth dataset was manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnoses. Experiments involving experienced pathologists demonstrated an average error of 9 percentage points between the automated system's quantification and the pathologists' visual evaluation. Experiments investigating the variability in pathologists, involving samples from 70 kidney patients, also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The correlation between different pathologists' estimates of interstitial fibrosis area improved significantly, demonstrating the effectiveness of the quantification system as a diagnostic aide. Copyright © 2017 Elsevier B.V. All rights reserved.
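    The final quantification step described above reduces to a simple ratio once segmentation is done: fibrotic area as a percentage of total biopsy area. A sketch with toy binary masks (the mask contents and function name are invented):

```python
# Sketch of the quantification step: after non-fibrosis structures are
# eliminated, report fibrosis as a percentage of biopsy area.
# Masks are toy 0/1 grids, not real segmentation output.

def fibrosis_percentage(fibrosis_mask, biopsy_mask):
    fib = sum(sum(row) for row in fibrosis_mask)
    total = sum(sum(row) for row in biopsy_mask)
    return 100.0 * fib / total

biopsy = [[1, 1, 1, 1],
          [1, 1, 1, 1],
          [1, 1, 1, 1]]           # 12 tissue pixels
fibrosis = [[0, 1, 1, 0],
            [0, 1, 0, 0],
            [0, 0, 0, 0]]         # 3 fibrotic pixels
pct = fibrosis_percentage(fibrosis, biopsy)   # 25.0
```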

  19. Automated negotiation in environmental resource management: Review and assessment.

    PubMed

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

    Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, review and compare machine learning techniques in automated negotiation, and provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management; further studies are needed to consolidate the potential of this modeling approach. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Economics of automation for the design-to-mask interface

    NASA Astrophysics Data System (ADS)

    Erck, Wesley

    2009-04-01

    Mask order automation has increased steadily over the years through a variety of individual mask customer implementations. These have been supported by customer-specific software at the mask suppliers to handle the variety of customer output formats. Some customers use the SEMI P10 standard, some use supplier-specific formats, and some use customer-specific formats. Some customers use little automation and depend instead on close customer-supplier relationships. Implementations are varied in quality and effectiveness. A major factor which has delayed the adoption of more advanced and effective solutions has been a lack of understanding of the economic benefits. Some customers think standardized automation mainly benefits the mask supplier in order-entry automation, but this ignores a number of other significant benefits which differ dramatically for each party in the supply chain. This paper discusses the nature of those differing advantages and presents simple models suited to four business cases: integrated device manufacturers (IDM), fabless companies, foundries and mask suppliers. Examples and estimates of the financial advantages for these business types will be shown.

  1. Automated characterization and assembly of individual nanowires for device fabrication.

    PubMed

    Yu, Kaiyan; Yi, Jingang; Shan, Jerry W

    2018-05-15

    The automated sorting and positioning of nanowires and nanotubes is essential to enabling the scalable manufacturing of nanodevices for a variety of applications. However, two fundamental challenges still remain: (i) automated placement of individual nanostructures in precise locations, and (ii) the characterization and sorting of highly variable nanomaterials to construct well-controlled nanodevices. Here, we propose and demonstrate an integrated, electric-field based method for the simultaneous automated characterization, manipulation, and assembly of nanowires (ACMAN) with selectable electrical conductivities into nanodevices. We combine contactless and solution-based electro-orientation spectroscopy and electrophoresis-based motion-control, planning and manipulation strategies to simultaneously characterize and manipulate multiple individual nanowires. These nanowires can be selected according to their electrical characteristics and precisely positioned at different locations in a low-conductivity liquid to form functional nanodevices with desired electrical properties. We validate the ACMAN design by assembling field-effect transistors (FETs) with silicon nanowires of selected electrical conductivities. The design scheme provides a key enabling technology for the scalable, automated sorting and assembly of nanowires and nanotubes to build functional nanodevices.

  2. Study of flutter related computational procedures for minimum weight structural sizing of advanced aircraft

    NASA Technical Reports Server (NTRS)

    Oconnell, R. F.; Hassig, H. J.; Radovcich, N. A.

    1976-01-01

    Results of a study of the development of flutter modules applicable to automated structural design of advanced aircraft configurations, such as a supersonic transport, are presented. Automated structural design is restricted to automated sizing of the elements of a given structural model. It includes a flutter optimization procedure; i.e., a procedure for arriving at a structure with minimum mass for satisfying flutter constraints. Methods of solving the flutter equation and computing the generalized aerodynamic force coefficients in the repetitive analysis environment of a flutter optimization procedure are studied, and recommended approaches are presented. Five approaches to flutter optimization are explained in detail and compared. An approach to flutter optimization incorporating some of the methods discussed is presented. Problems related to flutter optimization in a realistic design environment are discussed and an integrated approach to the entire flutter task is presented. Recommendations for further investigations are made. Results of numerical evaluations, applying the five methods of flutter optimization to the same design task, are presented.

  3. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    PubMed Central

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Zwart, Peter H.; Hung, Li-Wei; Read, Randy J.; Adams, Paul D.

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution. PMID:18094468

  4. Microreactor Cells for High-Throughput X-ray Absorption Spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beesley, Angela; Tsapatsaris, Nikolaos; Weiher, Norbert

    2007-01-19

    High-throughput experimentation has been applied to X-ray absorption spectroscopy as a novel route for increasing research productivity in the catalysis community. Suitable instrumentation has been developed for the rapid determination of the local structure in the metal component of precursors for supported catalysts. An automated analytical workflow was implemented that is much faster than traditional individual spectrum analysis. It allows the generation of structural data in quasi-real time. We describe initial results obtained from the automated high-throughput (HT) data reduction and analysis of a sample library implemented through the 96-well-plate industrial standard. The results show that a fully automated HT-XAS technology based on existing industry standards is feasible and useful for the rapid elucidation of geometric and electronic structure of materials.

  5. Mass Spectra-Based Framework for Automated Structural Elucidation of Metabolome Data to Explore Phytochemical Diversity

    PubMed Central

    Matsuda, Fumio; Nakabayashi, Ryo; Sawada, Yuji; Suzuki, Makoto; Hirai, Masami Y.; Kanaya, Shigehiko; Saito, Kazuki

    2011-01-01

    A novel framework for automated elucidation of metabolite structures in liquid chromatography–mass spectrometer metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied for the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method. PMID:22645535
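    The database-search step described above can be illustrated with a minimal sketch: match a measured accurate mass against a compound library within a parts-per-million tolerance. The library entries, tolerance, and function name below are assumptions for illustration, not the actual KNApSAcK/ReSpect search logic.

```python
# Illustrative accurate-mass lookup: report library compounds whose
# exact mass lies within a ppm tolerance of the measured mass.
# Library contents and tolerance are hypothetical.

def match_mass(measured, library, ppm=5.0):
    """Return names of library compounds within `ppm` of the measured mass."""
    hits = []
    for name, exact in library.items():
        if abs(measured - exact) / exact * 1e6 <= ppm:
            hits.append(name)
    return hits

library = {"kaempferol": 286.0477,
           "quercetin": 302.0427,
           "sinapoyl malate": 340.0794}   # monoisotopic masses (Da)
hits = match_mass(302.0430, library, ppm=5.0)   # quercetin, within ~1 ppm
```

    Real pipelines additionally score tandem-mass (MS/MS) fragment matches before assigning a putative annotation, since many isomers share the same exact mass.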

  6. Automation and Optimization of Multipulse Laser Zona Drilling of Mouse Embryos During Embryo Biopsy.

    PubMed

    Wong, Christopher Yee; Mills, James K

    2017-03-01

    Laser zona drilling (LZD) is a required step in many embryonic surgical procedures, for example, assisted hatching and preimplantation genetic diagnosis. LZD involves the ablation of the zona pellucida (ZP) using a laser while minimizing potentially harmful thermal effects on critical internal cell structures. The objective was to develop a method for the automation and optimization of multipulse LZD, applied to cleavage-stage embryos. A two-stage optimization is used. The first stage uses computer vision algorithms to identify embryonic structures and determines the optimal ablation zone farthest away from critical structures such as blastomeres. The second stage combines a genetic algorithm with a previously reported thermal analysis of LZD to optimize the combination of laser pulse locations and pulse durations. The goal is to minimize the peak temperature experienced by the blastomeres while creating the desired opening in the ZP. A proof of concept of the proposed LZD automation and optimization method is demonstrated through experiments on mouse embryos with positive results, as adequately sized openings are created. Automation of LZD is feasible and is a viable step toward the automation of embryo biopsy procedures. LZD is a common but delicate procedure performed by human operators using subjective methods to gauge proper LZD procedure. Automation of LZD removes human error to increase the success rate of LZD. Although the proposed methods are developed for cleavage-stage embryos, the same methods may be applied to most types of LZD procedures, embryos at different developmental stages, or nonembryonic cells.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lachut, J. S.

    Laboratory tests have been completed to test the validity of automated solubility measurement equipment using sodium nitrate and sodium chloride solutions (see test plan WRPS-1404441, “Validation Testing for Automated Solubility Measurement Equipment”). The sodium nitrate solution results were within 2-3% of the reference values, so the experiment is considered successful using the turbidity meter. The sodium chloride test was done by sight, as the turbidity meter did not work well using sodium chloride. For example, the “clear” turbidity reading was 53 FNU at 80 °C, 107 FNU at 55 °C, and 151 FNU at 20 °C. The sodium chloride did not work because it is granular and large; as the solution was stirred, the granules stayed to the outside of the reactor and just above the stir-bar level, having little impact on the turbidity meter readings as the meter was aimed at the center of the solution. Also, the turbidity meter depth has an impact. The salt tends to remain near the stir-bar level. If the meter is deeper in the slurry, it will read higher turbidity, and if the meter is raised higher in the slurry, it will read lower turbidity (possibly near zero) because it reads the “clear” part of the slurry. The sodium chloride solution results, as measured by sight rather than by turbidity instrument readings, were within 5-6% of the reference values.

  8. Film/Adhesive Processing Module for Fiber-Placement Processing of Composites

    NASA Technical Reports Server (NTRS)

    Hulcher, A. Bruce

    2007-01-01

    An automated apparatus has been designed and constructed that enables the automated lay-up of composite structures incorporating films, foils, and adhesives during the automated fiber-placement process. This apparatus, denoted a film module, could be used to deposit materials in film or thin sheet form either simultaneously when laying down the fiber composite article or in an independent step.

  9. Approaches to automated protein crystal harvesting

    PubMed Central

    Deller, Marc C.; Rupp, Bernhard

    2014-01-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  10. Automated one-step DNA sequencing based on nanoliter reaction volumes and capillary electrophoresis.

    PubMed

    Pang, H M; Yeung, E S

    2000-08-01

    An integrated system with a nano-reactor for cycle-sequencing reaction coupled to on-line purification and capillary gel electrophoresis has been demonstrated. Fifty nanoliters of reagent solution, which includes dye-labeled terminators, polymerase, BSA and template, was aspirated and mixed with the template inside the nano-reactor followed by cycle-sequencing reaction. The reaction products were then purified by a size-exclusion chromatographic column operated at 50 degrees C followed by room temperature on-line injection of the DNA fragments into a capillary for gel electrophoresis. Over 450 bases of DNA can be separated and identified. As little as 25 nl reagent solution can be used for the cycle-sequencing reaction with a slightly shorter read length. Significant savings on reagent cost is achieved because the remaining stock solution can be reused without contamination. The steps of cycle sequencing, on-line purification, injection, DNA separation, capillary regeneration, gel-filling and fluidic manipulation were performed with complete automation. This system can be readily multiplexed for high-throughput DNA sequencing or PCR analysis directly from templates or even biological materials.

  11. Benefits of an automated GLP final report preparation software solution.

    PubMed

    Elvebak, Larry E

    2011-07-01

    The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.

  12. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  13. The CADSS design automation system. [computerized design language for small digital systems

    NASA Technical Reports Server (NTRS)

    Franke, E. A.

    1973-01-01

    This research was designed to implement and extend a previously defined design automation system for the design of small digital structures. A description is included of the higher level language developed to describe systems as a sequence of register transfer operations. The system simulator which is used to determine if the original description is correct is also discussed. The design automation system produces tables describing the state transitions of the system and the operation of all registers. In addition, all Boolean equations specifying system operation are minimized and converted to NAND gate structures. Suggestions for further extensions to the system are also given.

  14. Open-source image reconstruction of super-resolution structured illumination microscopy data in ImageJ

    PubMed Central

    Müller, Marcel; Mönkemöller, Viola; Hennig, Simon; Hübner, Wolfgang; Huser, Thomas

    2016-01-01

    Super-resolved structured illumination microscopy (SR-SIM) is an important tool for fluorescence microscopy. SR-SIM microscopes perform multiple image acquisitions with varying illumination patterns, and reconstruct them to a super-resolved image. In its most frequent, linear implementation, SR-SIM doubles the spatial resolution. The reconstruction is performed numerically on the acquired wide-field image data, and thus relies on a software implementation of specific SR-SIM image reconstruction algorithms. We present fairSIM, an easy-to-use plugin that provides SR-SIM reconstructions for a wide range of SR-SIM platforms directly within ImageJ. For research groups developing their own implementations of super-resolution structured illumination microscopy, fairSIM takes away the hurdle of generating yet another implementation of the reconstruction algorithm. For users of commercial microscopes, it offers an additional, in-depth analysis option for their data independent of specific operating systems. As a modular, open-source solution, fairSIM can easily be adapted, automated and extended as the field of SR-SIM progresses. PMID:26996201

  15. A de novo redesign of the WW domain

    PubMed Central

    Kraemer-Pecore, Christina M.; Lecomte, Juliette T.J.; Desjarlais, John R.

    2003-01-01

    We have used a sequence prediction algorithm and a novel sampling method to design protein sequences for the WW domain, a small β-sheet motif. The procedure, referred to as SPANS, designs sequences to be compatible with an ensemble of closely related polypeptide backbones, mimicking the inherent flexibility of proteins. Two designed sequences (termed SPANS-WW1 and SPANS-WW2), using only naturally occurring l-amino acids, were selected for study and the corresponding polypeptides were prepared in Escherichia coli. Circular dichroism data suggested that both purified polypeptides adopted secondary structure features related to those of the target without the aid of disulfide bridges or bound cofactors. The structure exhibited by SPANS-WW2 melted cooperatively by raising the temperature of the solution. Further analysis of this polypeptide by proton nuclear magnetic resonance spectroscopy demonstrated that at 5°C, it folds into a structure closely resembling a natural WW domain. This achievement constitutes one of a small number of successful de novo protein designs through fully automated computational methods and highlights the feasibility of including backbone flexibility in the design strategy. PMID:14500877

  16. A de novo redesign of the WW domain.

    PubMed

    Kraemer-Pecore, Christina M; Lecomte, Juliette T J; Desjarlais, John R

    2003-10-01

    We have used a sequence prediction algorithm and a novel sampling method to design protein sequences for the WW domain, a small beta-sheet motif. The procedure, referred to as SPANS, designs sequences to be compatible with an ensemble of closely related polypeptide backbones, mimicking the inherent flexibility of proteins. Two designed sequences (termed SPANS-WW1 and SPANS-WW2), using only naturally occurring L-amino acids, were selected for study and the corresponding polypeptides were prepared in Escherichia coli. Circular dichroism data suggested that both purified polypeptides adopted secondary structure features related to those of the target without the aid of disulfide bridges or bound cofactors. The structure exhibited by SPANS-WW2 melted cooperatively by raising the temperature of the solution. Further analysis of this polypeptide by proton nuclear magnetic resonance spectroscopy demonstrated that at 5 degrees C, it folds into a structure closely resembling a natural WW domain. This achievement constitutes one of a small number of successful de novo protein designs through fully automated computational methods and highlights the feasibility of including backbone flexibility in the design strategy.

  17. Human communication needs and organizational productivity: the potential impact of office automation.

    PubMed

    Culnan, M J; Bair, J H

    1983-05-01

    Much of what white collar workers do in offices is communication-related. White collar workers make up the majority of the labor force in the United States today and the majority of current labor costs. Because office automation represents more productive structured techniques for handling both written and oral communication, office automation therefore offers the potential to make organizations more productive by improving organizational communication. This article: (1) defines communication, (2) identifies the potential benefits to be realized from implementing office automation, and (3) offers caveats related to the implementation of office automation systems. Realization of the benefits of office automation depends upon the degree to which new modes of communication may be successfully substituted for traditional modes.

  18. Amazon Forest Structure from IKONOS Satellite Data and the Automated Characterization of Forest Canopy Properties

    Treesearch

    Michael Palace; Michael Keller; Gregory P. Asner; Stephen Hagen; Bobby Braswell

    2008-01-01

    We developed an automated tree crown analysis algorithm using 1-m panchromatic IKONOS satellite images to examine forest canopy structure in the Brazilian Amazon. The algorithm was calibrated on the landscape level with tree geometry and forest stand data at the Fazenda Cauaxi (3.75° S, 48.37° W) in the eastern Amazon, and then compared with forest...

  19. Device For Controlling Crystallization Of Protein

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1993-01-01

    Variable sandwich spacer enables optimization of evaporative driving force that governs crystallization of protein from solution. Mechanically more rigid than hanging-drop and sitting-drop devices. Large oscillations and dislodgment of drop of solution in response to vibrations suppressed by glass plates. Other advantages include: suitable for automated delivery, stable handling, and programmable evaporation of protein solution; controlled configuration enables simple and accurate determination of volume of solution without disrupting crystallization; pH and concentration of precipitant controlled dynamically because pH and concentration coupled to rate of evaporation, controllable via adjustment of gap between plates; and enables variation of ratio between surface area and volume of protein solution. Alternative version, plates oriented vertically instead of horizontally.

  20. Investigation of an expert health monitoring system for aeronautical structures based on pattern recognition and acousto-ultrasonics

    NASA Astrophysics Data System (ADS)

    Tibaduiza-Burgos, Diego Alexander; Torres-Arredondo, Miguel Angel

    2015-08-01

    Aeronautical structures are subjected to damage during their service raising the necessity for periodic inspection and maintenance of their components so that structural integrity and safe operation can be guaranteed. Cost reduction related to minimizing the out-of-service time of the aircraft, together with the advantages offered by real-time and safe-life service monitoring, have led to a boom in the design of inexpensive and structurally integrated transducer networks comprising actuators, sensors, signal processing units and controllers. These kinds of automated systems are normally referred to as smart structures and offer a multitude of new solutions to engineering problems and multi-functional capabilities. It is thus expected that structural health monitoring (SHM) systems will become one of the leading technologies for assessing and assuring the structural integrity of future aircraft. This study is devoted to the development and experimental investigation of an SHM methodology for the detection of damage in real scale complex aeronautical structures. The work focuses on each aspect of the SHM system and highlights the potentialities of the health monitoring technique based on acousto-ultrasonics and data-driven modelling within the concepts of sensor data fusion, feature extraction and pattern recognition. The methodology is experimentally demonstrated on an aircraft skin panel and fuselage panel for which several damage scenarios are analysed. The detection performance in both structures is quantified and presented.

  1. An object-oriented design for automated navigation of semantic networks inside a medical data dictionary.

    PubMed

    Ruan, W; Bürkle, T; Dudeck, J

    2000-01-01

    In this paper we present a data dictionary server for the automated navigation of information sources. The underlying knowledge is represented within a medical data dictionary. The mapping between medical terms and information sources is based on a semantic network. The key aspect of implementing the dictionary server is how to represent the semantic network in a way that is easier to navigate and to operate, i.e. how to abstract the semantic network and to represent it in memory for various operations. This paper describes an object-oriented design based on Java that represents the semantic network in terms of a group of objects. A node and its relationships to its neighbors are encapsulated in one object. Based on such a representation model, several operations have been implemented. They comprise the extraction of parts of the semantic network which can be reached from a given node as well as finding all paths between a start node and a predefined destination node. This solution is independent of any given layout of the semantic structure. Therefore the module, called Giessen Data Dictionary Server can act independent of a specific clinical information system. The dictionary server will be used to present clinical information, e.g. treatment guidelines or drug information sources to the clinician in an appropriate working context. The server is invoked from clinical documentation applications which contain an infobutton. Automated navigation will guide the user to all the information relevant to her/his topic, which is currently available inside our closed clinical network.
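    The path-finding operation described above lends itself to a compact sketch. The following Python fragment is a hypothetical analogue (the Giessen server itself is Java-based, and the node names here are invented): each node object encapsulates its outgoing relations, and a depth-first search enumerates all acyclic paths between a start node and a destination node.

    ```python
    class Node:
        """A term in the semantic network; a node and its relations live in one object."""
        def __init__(self, name):
            self.name = name
            self.neighbors = []  # outgoing relations to other Node objects

        def link(self, other):
            self.neighbors.append(other)

    def all_paths(start, goal, path=None):
        """Depth-first enumeration of all acyclic paths from start to goal."""
        path = (path or []) + [start]
        if start is goal:
            yield [n.name for n in path]
            return
        for nxt in start.neighbors:
            if nxt not in path:  # skip nodes already on the path to avoid cycles
                yield from all_paths(nxt, goal, path)

    # toy network: a diagnosis links to a drug and directly to a guideline;
    # the drug also links to the guideline
    diagnosis, drug, guideline = Node("diagnosis"), Node("drug"), Node("guideline")
    diagnosis.link(drug)
    drug.link(guideline)
    diagnosis.link(guideline)

    paths = list(all_paths(diagnosis, guideline))
    ```

    Because the traversal works only through each node's own neighbor list, it is independent of any particular layout of the semantic structure, which is the property the paper emphasizes.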

  2. Automating usability of ATLAS Distributed Computing resources

    NASA Astrophysics Data System (ADS)

    Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration

    2014-06-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
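    The SAAB inference algorithm itself is not given in the abstract; as a minimal illustration of deciding a storage area's status from its monitoring-test history, one might use a windowed failure-rate rule (the window size, threshold, and status labels below are invented for illustration, not SAAB's actual parameters):

    ```python
    def assess_storage_area(history, window=6, threshold=0.5):
        """Decide a storage area's status from its recent monitoring-test outcomes.

        history: list of booleans, oldest first; True means the test passed.
        Returns "blacklisted" when the failure rate over the last `window`
        tests exceeds `threshold`, otherwise "active".
        """
        recent = history[-window:]
        if not recent:
            return "active"  # no data yet: leave the site in rotation
        failure_rate = recent.count(False) / len(recent)
        return "blacklisted" if failure_rate > threshold else "active"

    # a site that failed 4 of its last 6 tests is taken out of rotation,
    # regardless of its older, healthier history
    status = assess_storage_area([True] * 10 + [False, False, True, False, False, True])
    ```

    A rule of this shape keeps human actions restricted to follow-up, as the paper describes: the routine decision is derived automatically from the test history.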

  3. The art of fault-tolerant system reliability modeling

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1990-01-01

    A step-by-step tutorial of the methods and tools used for the reliability analysis of fault-tolerant systems is presented. Emphasis is on the representation of architectural features in mathematical models. Details of the mathematical solution of complex reliability models are not presented. Instead the use of several recently developed computer programs--SURE, ASSIST, STEM, PAWS--which automate the generation and solution of these models is described.

  4. Constraint Maintenance with Preferences and Underlying Flexible Solution

    NASA Technical Reports Server (NTRS)

    Bresina, John; Jonsson, Ari; Morris, Paul; Rajan, Kanna

    2003-01-01

    This paper describes an aspect of the constraint reasoning mechanism that is part of a ground planning system slated to be used for the Mars Exploration Rovers mission, where two rovers are scheduled to land on Mars in January of 2003. The planning system combines manual planning software from JPL with an automatic planning/scheduling system from NASA Ames Research Center, and is designed to be used in a mixed-initiative mode. Among other things, this means that after a plan has been produced, the human operator can perform extensive modifications under the supervision of the automated system. For each modification to an activity, the automated system must adjust other activities as needed to ensure that constraints continue to be satisfied. Thus, the system must accommodate change in an interactive setting. Performance is of critical importance for interactive use. This is achieved by maintaining an underlying flexible solution to the temporal constraints, while the system presents a fixed schedule to the user. Adjustments are then a matter of constraint propagation rather than completely re-solving the problem. However, this begs the important question of which fixed schedule (among the ones sanctioned by the underlying flexible solution) should be presented to the user. Our approach uses least-change and other preferences as a prism through which the user views the flexible solution.
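    Maintaining a flexible temporal solution so that user edits reduce to constraint propagation is commonly done with a simple temporal network, where propagation is an all-pairs shortest-path computation over a distance matrix of time-point bounds. A minimal sketch, assuming that representation (the activities and bounds below are invented; the paper's actual mechanism may differ in detail):

    ```python
    import itertools

    def propagate(dist):
        """Tighten an STN distance matrix in place via Floyd-Warshall.

        dist[i][j] is the upper bound on (time_j - time_i); shortest-path
        closure tightens every bound implied by the others, which is the
        propagation step that follows a user's edit to one activity.
        """
        n = len(dist)
        for k, i, j in itertools.product(range(n), repeat=3):
            if dist[i][k] + dist[k][j] < dist[i][j]:
                dist[i][j] = dist[i][k] + dist[k][j]
        return dist

    INF = float("inf")
    # three time points: mission start, activity A, activity B
    # A starts 10..20 after start (dist[0][1]=20, dist[1][0]=-10);
    # B starts 0..5 after A (dist[1][2]=5, dist[2][1]=0)
    d = [[0,    20,  INF],
         [-10,  0,   5],
         [INF,  0,   0]]
    propagate(d)
    ```

    After propagation the matrix exposes the implied window for B (10..25 after start), so any fixed schedule the system shows the user can be read off, and re-read after an edit, without re-solving from scratch.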

  5. The 'PhenoBox', a flexible, automated, open-source plant phenotyping solution.

    PubMed

    Czedik-Eysenberg, Angelika; Seitner, Sebastian; Güldener, Ulrich; Koemeda, Stefanie; Jez, Jakub; Colombini, Martin; Djamei, Armin

    2018-04-05

    There is a need for flexible and affordable plant phenotyping solutions for basic research and plant breeding. We demonstrate our open source plant imaging and processing solution ('PhenoBox'/'PhenoPipe') and provide construction plans, source code and documentation to rebuild the system. Use of the PhenoBox is exemplified by studying infection of the model grass Brachypodium distachyon by the head smut fungus Ustilago bromivora, comparing phenotypic responses of maize to infection with a solopathogenic Ustilago maydis (corn smut) strain and effector deletion strains, and studying salt stress response in Nicotiana benthamiana. In U. bromivora-infected grass, phenotypic differences between infected and uninfected plants were detectable weeks before qualitative head smut symptoms. Based on this, we could predict the infection outcome for individual plants with high accuracy. Using a PhenoPipe module for calculation of multi-dimensional distances from phenotyping data, we observe a time after infection-dependent impact of U. maydis effector deletion strains on phenotypic response in maize. The PhenoBox/PhenoPipe system is able to detect established salt stress responses in N. benthamiana. We have developed an affordable, automated, open source imaging and data processing solution that can be adapted to various phenotyping applications in plant biology and beyond. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
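    The abstract does not specify which metric the PhenoPipe multi-dimensional distance module uses; a plain Euclidean distance over normalized trait vectors is one plausible instance (the trait names and values below are invented for illustration):

    ```python
    import math

    def phenotype_distance(a, b):
        """Euclidean distance between two phenotype feature vectors
        (e.g. plant height, projected leaf area, greenness index)."""
        if len(a) != len(b):
            raise ValueError("feature vectors must have equal length")
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # infected vs. mock-treated plant over three normalized traits
    infected = [0.42, 0.10, 0.55]
    control = [0.80, 0.35, 0.60]
    d = phenotype_distance(infected, control)
    ```

    Tracking such a distance over successive imaging time points is what allows a phenotypic divergence to be detected before qualitative symptoms appear, as reported for the U. bromivora infections.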

  6. Cleaning method and apparatus

    DOEpatents

    Jackson, D.D.; Hollen, R.M.

    1981-02-27

    A method of very thoroughly and quickly cleaning a gauze electrode used in chemical analyses is given, as well as an automatic cleaning apparatus which makes use of the method. The method generates very little waste solution, and this is very important in analyzing radioactive materials, especially in aqueous solutions. The cleaning apparatus can be used in a larger, fully automated controlled potential coulometric apparatus. About 99.98% of a 5 mg plutonium sample was removed in less than 3 minutes, using only about 60 ml of rinse solution and two main rinse steps.

  7. Collaboration, Automation, and Information Management at Hanford High Level Radioactive Waste (HLW) Tank Farms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurah, Mirwaise Y.; Roberts, Mark A.

    Washington River Protection Solutions (WRPS), operator of High Level Radioactive Waste (HLW) Tank Farms at the Hanford Site, is taking an over 20-year leap in technology, replacing systems that were monitored with clipboards and obsolete computer systems, as well as solving major operations and maintenance hurdles in the area of process automation and information management. While WRPS is fully compliant with procedures and regulations, the current systems are not integrated and do not share data efficiently, hampering how information is obtained and managed.

  8. Computer vision for microscopy diagnosis of malaria.

    PubMed

    Tek, F Boray; Dempster, Andrew G; Kale, Izzet

    2009-07-13

    This paper reviews computer vision and image analysis studies aiming at automated diagnosis or screening of malaria infection in microscope images of thin blood film smears. Existing works interpret the diagnosis problem differently or propose partial solutions to the problem. A critique of these works is furnished. In addition, a general pattern recognition framework to perform diagnosis, which includes image acquisition, pre-processing, segmentation, and pattern classification components, is described. The open problems are addressed and a perspective of the future work for realization of automated microscopy diagnosis of malaria is provided.

  9. Controlling Wafer Contamination Using Automated On-Line Metrology during Wet Chemical Cleaning

    NASA Astrophysics Data System (ADS)

    Wang, Jason; Kingston, Skip; Han, Ye; Saini, Harmesh; McDonald, Robert; Mui, Rudy

    2003-09-01

    The capabilities of a trace contamination analyzer are discussed and demonstrated. This analytical tool utilizes an electrospray, time-of-flight mass spectrometer (ES-TOF-MS) for fully automated on-line monitoring of wafer cleaning solutions. The analyzer provides rich information on metallic, anionic, cationic, elemental, and organic species through its ability to provide harsh (elemental) and soft (molecular) ionization under both positive and negative modes. It is designed to meet semiconductor process control and yield management needs for the ever increasing complex new chemistries present in wafer fabrication.

  10. Water management requirements for animal and plant maintenance on the Space Station

    NASA Technical Reports Server (NTRS)

    Johnson, C. C.; Rasmussen, D.; Curran, G.

    1987-01-01

    Long-duration Space Station experiments that use animals and plants as test specimens will require increased automation and advanced technologies for water management in order to free scientist-astronauts from routine but time-consuming housekeeping tasks. The three areas that have been identified as requiring water management and that are discussed are: (1) drinking water and humidity condensate of the animals, (2) nutrient solution and transpired water of the plants, and (3) habitat cleaning methods. Automation potential, technology assessment, crew time savings, and resupply penalties are also discussed.

  11. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities.

    PubMed

    Gomez, Carles; Paradells, Josep

    2015-09-10

    Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost.

  12. An evolutionary solution to anesthesia automated record keeping.

    PubMed

    Bicker, A A; Gage, J S; Poppers, P J

    1998-08-01

    In the course of five years the development of an automated anesthesia record keeper has evolved through nearly a dozen stages, each marked by new features and sophistication. Commodity PC hardware and software minimized development costs. Object oriented analysis, programming and design supported the process of change. In addition, we developed an evolutionary strategy that optimized motivation, risk management, and maximized return on investment. Besides providing record keeping services, the system supports educational and research activities and through a flexible plotting paradigm, supports each anesthesiologist's focus on physiological data during and after anesthesia.

  13. Flow-injection system for automated dissolution testing of isoniazid tablets with chemiluminescence detection.

    PubMed

    Li, B; Zhang, Z; Liu, W

    2001-05-30

    A simple and sensitive flow-injection chemiluminescence (CL) system for automated dissolution testing is described and evaluated for monitoring of dissolution profiles of isoniazid tablets. The undissolved suspended particles in the dissolved solution were eliminated via on-line filter. The novel CL system of KIO(4)-isoniazid was also investigated. The sampling frequency of the system was 120 h(-1). The dissolution profiles of isoniazid fast-release tablets from three sources were determined, which demonstrates the stability, great sensitivity, large dynamic measuring range and robustness of the system.

  14. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities

    PubMed Central

    Gomez, Carles; Paradells, Josep

    2015-01-01

    Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost. PMID:26378534

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buche, D. L.; Perry, S.

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.

  16. Leveraging of Open EMR Architecture for Clinical Trial Accrual

    PubMed Central

    Afrin, Lawrence B.; Oates, James C.; Boyd, Caroline K.; Daniels, Mark S.

    2003-01-01

    Accrual to clinical trials is a major bottleneck in scientific progress in clinical medicine. Many methods for identifying potential subjects and improving accrual have been pursued; few have succeeded, and none have proven generally reproducible or scalable. We leveraged the open architecture of the core clinical data repository of our electronic medical record system to prototype a solution for this problem in a manner consistent with contemporary regulations and research ethics. We piloted the solution with a local investigator-initiated trial for which candidate identification was expected to be difficult. Key results in the eleven months of experience to date include automated screening of 7,296,708 lab results from 69,288 patients, detection of 1,768 screening tests of interest, identification of 70 potential candidates who met all further automated criteria, and accrual of three candidates to the trial. Hypotheses for this disappointing impact on accrual, and directions for future research, are discussed. PMID:14728125

  17. Automating Phase Change Lines and Their Labels Using Microsoft Excel(R).

    PubMed

    Deochand, Neil

    2017-09-01

    Many researchers have rallied against drawn-in graphical elements and offered ways to avoid them, especially regarding the insertion of phase change lines (Deochand, Costello, & Fuqua, 2015; Dubuque, 2015; Vanselow & Bourret, 2012). However, few have offered a solution to automating the phase labels, which are often utilized in behavior analytic graphical displays (Deochand et al., 2015). Despite the fact that Microsoft Excel® is extensively utilized by behavior analysts, solutions to resolve issues in our graphing practices are not always apparent or user-friendly. Considering that the insertion of phase change lines and their labels constitutes a repetitious and laborious endeavor, any minimization in the steps to accomplish these graphical elements could offer substantial time-savings to the field. The purpose of this report is to provide an updated way (and templates in the supplemental materials) to add phase change lines with their respective labels, which stay embedded in the graph when they are moved or updated.

  18. Automating the solution of PDEs on the sphere and other manifolds in FEniCS 1.2

    NASA Astrophysics Data System (ADS)

    Rognes, M. E.; Ham, D. A.; Cotter, C. J.; McRae, A. T. T.

    2013-12-01

    Differential equations posed over immersed manifolds are of particular importance in studying geophysical flows; for instance, ocean and atmosphere simulations crucially rely on the capability to solve equations over the sphere. This paper presents the extension of the FEniCS software components to the automated solution of finite element formulations of differential equations defined over general, immersed manifolds. We describe the implementation and, in particular detail, how the required extensions essentially reduce to the extension of the FEniCS form compiler to cover this case. The resulting implementation has all the properties of the FEniCS pipeline and we demonstrate its flexibility by an extensive range of numerical examples covering a number of geophysical benchmark examples and test cases. The results are all in agreement with the expected values. The description here relates to DOLFIN/FEniCS 1.2.
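    FEniCS automates the assembly and solution of finite element systems from a high-level form specification. As a greatly simplified, self-contained sketch of the same assemble-and-solve pipeline (plain Python, not the FEniCS API; 1D on an interval rather than a manifold), consider linear finite elements for -u'' = 1 on (0, 1) with u(0) = u(1) = 0:

```python
def solve_poisson_1d(n):
    """Linear FEM for -u'' = 1 on (0,1), u(0) = u(1) = 0, on a uniform
    mesh of n cells: assemble the tridiagonal stiffness system and
    solve it with the Thomas algorithm. Returns interior nodal values."""
    h = 1.0 / n
    m = n - 1                       # number of interior nodes
    a = [2.0 / h] * m               # stiffness diagonal
    b = [-1.0 / h] * (m - 1)        # symmetric off-diagonal
    f = [h] * m                     # load vector for f(x) = 1
    for i in range(1, m):           # forward elimination
        w = b[i - 1] / a[i - 1]
        a[i] -= w * b[i - 1]
        f[i] -= w * f[i - 1]
    u = [0.0] * m                   # back substitution
    u[-1] = f[-1] / a[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (f[i] - b[i] * u[i + 1]) / a[i]
    return u

u = solve_poisson_1d(8)
```

    For this 1D problem with exact load integration the nodal values coincide with the exact solution u(x) = x(1 - x)/2, which makes the sketch easy to verify; the form compiler extension the paper describes generalizes exactly this assembly step to forms posed on immersed manifolds.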

  19. Automating the solution of PDEs on the sphere and other manifolds in FEniCS 1.2

    NASA Astrophysics Data System (ADS)

    Rognes, M. E.; Ham, D. A.; Cotter, C. J.; McRae, A. T. T.

    2013-07-01

    Differential equations posed over immersed manifolds are of particular importance in studying geophysical flows; for instance, ocean and atmosphere simulations crucially rely on the capability to solve equations over the sphere. This paper presents the extension of the FEniCS software components to the automated solution of finite element formulations of differential equations defined over general, immersed manifolds. We describe the implementation and in particular detail how the required extensions essentially reduce to the extension of the FEniCS form compiler to cover this case. The resulting implementation has all the properties of the FEniCS pipeline and we demonstrate its flexibility by an extensive range of numerical examples covering a number of geophysical benchmark examples and test cases. The results are all in agreement with the expected values. The description here relates to DOLFIN/FEniCS 1.2.

  20. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
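    A rule-based checker of this kind encodes each timing constraint as an independent rule evaluated against the command sequence. A minimal hypothetical sketch (the command names and the 10 s separation are invented for illustration, not taken from CCC):

```python
# Each rule is a predicate over a timed command sequence; adding a new
# spacecraft constraint means adding one rule, not new procedural code.

def min_separation_rule(cmd_a, cmd_b, min_gap):
    """Commands cmd_a and cmd_b must be at least min_gap seconds apart."""
    def check(sequence):  # sequence: list of (time_s, command) tuples
        times_a = [t for t, c in sequence if c == cmd_a]
        times_b = [t for t, c in sequence if c == cmd_b]
        return all(abs(ta - tb) >= min_gap
                   for ta in times_a for tb in times_b)
    return check

rules = [min_separation_rule("THRUSTER_ON", "CAMERA_EXPOSE", 10.0)]
sequence = [(0.0, "THRUSTER_ON"), (5.0, "CAMERA_EXPOSE")]
violations = [i for i, rule in enumerate(rules) if not rule(sequence)]
```

    Keeping the constraints declarative in this way is consistent with the rapid three-month development the abstract reports.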

  1. A mixed optimization method for automated design of fuselage structures.

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.; Loendorf, D.

    1972-01-01

    A procedure for automating the design of transport aircraft fuselage structures has been developed and implemented in the form of an operational program. The structure is designed in two stages. First, an overall distribution of structural material is obtained by means of optimality criteria to meet strength and displacement constraints. Subsequently, the detailed design of selected rings and panels consisting of skin and stringers is performed by mathematical optimization accounting for a set of realistic design constraints. The practicality and computer efficiency of the procedure is demonstrated on cylindrical and area-ruled large transport fuselages.

  2. Protein structure determination by electron diffraction using a single three-dimensional nanocrystal.

    PubMed

    Clabbers, M T B; van Genderen, E; Wan, W; Wiegers, E L; Gruene, T; Abrahams, J P

    2017-09-01

    Three-dimensional nanometre-sized crystals of macromolecules currently resist structure elucidation by single-crystal X-ray crystallography. Here, a single nanocrystal with a diffracting volume of only 0.14 µm³, i.e. no more than 6 × 10⁵ unit cells, provided sufficient information to determine the structure of a rare dimeric polymorph of hen egg-white lysozyme by electron crystallography. This is at least an order of magnitude smaller than was previously possible. The molecular-replacement solution, based on a monomeric polyalanine model, provided sufficient phasing power to show side-chain density, and automated model building was used to reconstruct the side chains. Diffraction data were acquired using the rotation method with parallel beam diffraction on a Titan Krios transmission electron microscope equipped with a novel in-house-designed 1024 × 1024 pixel Timepix hybrid pixel detector for low-dose diffraction data collection. Favourable detector characteristics include the ability to accurately discriminate single high-energy electrons from X-rays and count them, fast readout to finely sample reciprocal space and a high dynamic range. This work, together with other recent milestones, suggests that electron crystallography can provide an attractive alternative in determining biological structures.

  3. Protein structure determination by electron diffraction using a single three-dimensional nanocrystal

    PubMed Central

    Clabbers, M. T. B.; van Genderen, E.; Wiegers, E. L.; Gruene, T.; Abrahams, J. P.

    2017-01-01

    Three-dimensional nanometre-sized crystals of macromolecules currently resist structure elucidation by single-crystal X-ray crystallography. Here, a single nanocrystal with a diffracting volume of only 0.14 µm³, i.e. no more than 6 × 10⁵ unit cells, provided sufficient information to determine the structure of a rare dimeric polymorph of hen egg-white lysozyme by electron crystallography. This is at least an order of magnitude smaller than was previously possible. The molecular-replacement solution, based on a monomeric polyalanine model, provided sufficient phasing power to show side-chain density, and automated model building was used to reconstruct the side chains. Diffraction data were acquired using the rotation method with parallel beam diffraction on a Titan Krios transmission electron microscope equipped with a novel in-house-designed 1024 × 1024 pixel Timepix hybrid pixel detector for low-dose diffraction data collection. Favourable detector characteristics include the ability to accurately discriminate single high-energy electrons from X-rays and count them, fast readout to finely sample reciprocal space and a high dynamic range. This work, together with other recent milestones, suggests that electron crystallography can provide an attractive alternative in determining biological structures. PMID:28876237

  4. Discrimination of Isomers of Released N- and O-Glycans Using Diagnostic Product Ions in Negative Ion PGC-LC-ESI-MS/MS

    NASA Astrophysics Data System (ADS)

    Ashwood, Christopher; Lin, Chi-Hung; Thaysen-Andersen, Morten; Packer, Nicolle H.

    2018-03-01

    Profiling cellular protein glycosylation is challenging due to the presence of highly similar glycan structures that play diverse roles in cellular physiology. As the anomericity and the exact linkage type of a single glycosidic bond can influence glycan function, there is a demand for improved and automated methods to confirm detailed structural features and to discriminate between structurally similar isomers, overcoming a significant bottleneck in the analysis of data generated by glycomics experiments. We used porous graphitized carbon-LC-ESI-MS/MS to separate and detect released N- and O-glycan isomers from mammalian model glycoproteins using negative mode resonance activation CID-MS/MS. By interrogating similar fragment spectra from closely related glycan isomers that differ only in arm position and sialyl linkage, product fragment ions for discrimination between these features were discovered. Using the Skyline software, at least two diagnostic fragment ions of high specificity were validated for automated discrimination of sialylation and arm position in N-glycan structures, and sialylation in O-glycan structures, complementing existing structural diagnostic ions. These diagnostic ions were shown to be useful for isomer discrimination using both linear and 3D ion trap mass spectrometers when analyzing complex glycan mixtures from cell lysates. Skyline was found to serve as a useful tool for automated assessment of glycan isomer discrimination. This platform-independent workflow can potentially be extended to automate the characterization and quantitation of other challenging glycan isomers.
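    The diagnostic-ion test itself reduces to checking whether enough characteristic fragment m/z values appear in a spectrum within tolerance. A hedged sketch (all m/z values below are placeholders, not validated diagnostic ions from the study):

```python
# Illustrative only: the spectrum and diagnostic m/z values are
# invented placeholders, not data from the paper.

def matches_diagnostic_ions(spectrum, diagnostic_mz, tol=0.02, required=2):
    """Call an isomer assignment if at least `required` diagnostic
    product ions are found in the fragment spectrum within +/- tol m/z."""
    hits = sum(
        any(abs(peak - mz) <= tol for peak in spectrum)
        for mz in diagnostic_mz
    )
    return hits >= required

spectrum = [290.09, 306.10, 655.22]      # observed fragment m/z values
isomer_ions = [306.11, 655.23]           # hypothetical diagnostic ions
assigned = matches_diagnostic_ions(spectrum, isomer_ions)
```

    Requiring at least two independent diagnostic ions, as the abstract describes, guards against a single chance mass coincidence.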

  5. Accuracy of determining preoperative cancer extent measured by automated breast ultrasonography.

    PubMed

    Tozaki, Mitsuhiro; Fukuma, Eisuke

    2010-12-01

    The aim of this study was to determine the accuracy of measuring preoperative cancer extent using automated breast ultrasonography (US). This retrospective study consisted of 40 patients with histopathologically confirmed breast cancer. All of the patients underwent automated breast US (ABVS; Siemens Medical Solutions, Mountain View, CA, USA) on the day before the surgery. The sizes of the lesions on US were measured on coronal multiplanar reconstruction images using the ABVS workstation. Histopathological measurement of tumor size included not only the invasive foci but also any in situ component and was used as the gold standard. The discrepancy of the tumor extent between automated breast US and the histological examination was calculated. Automated breast US enabled visualization of the breast carcinomas in all patients. The mean size of the lesions on US was 12 mm (range 4-62 mm). The histopathological diagnosis was ductal carcinoma in situ (DCIS) in seven patients and invasive ductal carcinoma in 33 patients (18 without an intraductal component, 15 with an intraductal component). Lesions ranged in diameter from 4 to 65 mm (mean 16 mm). The accuracy of determination of the tumor extent with a deviation in length of <2 cm was 98% (39/40). Automated breast US is thought to be useful for evaluating tumor extent preoperatively.

  6. Automated single-trial assessment of laser-evoked potentials as an objective functional diagnostic tool for the nociceptive system.

    PubMed

    Hatem, S M; Hu, L; Ragé, M; Gierasimowicz, A; Plaghki, L; Bouhassira, D; Attal, N; Iannetti, G D; Mouraux, A

    2012-12-01

    To assess the clinical usefulness of an automated analysis of event-related potentials (ERPs). Nociceptive laser-evoked potentials (LEPs) and non-nociceptive somatosensory electrically-evoked potentials (SEPs) were recorded in 37 patients with syringomyelia and 21 controls. LEP and SEP peak amplitudes and latencies were estimated using a single-trial automated approach based on time-frequency wavelet filtering and multiple linear regression, as well as a conventional approach based on visual inspection. The amplitudes and latencies of normal and abnormal LEP and SEP peaks were identified reliably using both approaches, with similar sensitivity and specificity. Because the automated approach provided an unbiased solution to account for average waveforms where no ERP could be identified visually, it revealed significant differences between patients and controls that were not revealed using the visual approach. The automated analysis of ERPs characterized reliably and objectively LEP and SEP waveforms in patients. The automated single-trial analysis can be used to characterize normal and abnormal ERPs with a similar sensitivity and specificity as visual inspection. While this does not justify its use in a routine clinical setting, the technique could be useful to avoid observer-dependent biases in clinical research. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
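    The regression step of such single-trial estimation amounts to projecting each trial onto a template waveform; the paper combines this with time-frequency wavelet filtering, which is omitted here. A minimal sketch on synthetic, noise-free data (the template values are invented):

```python
def single_trial_amplitude(trial, template):
    """Least-squares scale of `template` within `trial`:
    amplitude = <trial, template> / <template, template>."""
    num = sum(x * y for x, y in zip(trial, template))
    den = sum(y * y for y in template)
    return num / den

template = [0.0, 1.0, 2.0, 1.0, 0.0]    # idealized evoked-potential peak
trial = [x * 2.5 for x in template]      # synthetic noise-free trial
amp = single_trial_amplitude(trial, template)
```

    Because the estimate is computed for every trial, an amplitude is returned even when no peak is visible by eye, which is what let the automated approach score average waveforms that visual inspection had to leave unassessed.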

  7. Advances in In Situ Inspection of Automated Fiber Placement Systems

    NASA Technical Reports Server (NTRS)

    Juarez, Peter D.; Cramer, K. Elliott; Seebo, Jeffrey P.

    2016-01-01

    Automated Fiber Placement (AFP) systems have been developed to help take advantage of the tailorability of composite structures in aerospace applications. AFP systems allow the repeatable placement of uncured, spool fed, preimpregnated carbon fiber tape (tows) onto substrates in desired thicknesses and orientations. This automated process can incur defects, such as overlapping tow lines, which can severely undermine the structural integrity of the part. Current defect detection and abatement methods are very labor intensive, and still mostly rely on human manual inspection. Proposed is a thermographic in situ inspection technique which monitors tow placement with an on board thermal camera using the preheated substrate as a through transmission heat source. An investigation of the concept is conducted, and preliminary laboratory results are presented. Also included will be a brief overview of other emerging technologies that tackle the same issue.

  8. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Berke, L.; Gallagher, R. H.

    1991-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EEs) are integrated with the global compatibility conditions (CCs) to form the governing set of equations. In IFM the CCs are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.
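    The governing system described above stacks equilibrium rows (EEs) and compatibility rows (CCs) into one square system for the internal forces. A toy numerical sketch (the coefficients are invented for illustration, not a real truss):

```python
def solve_dense(A, b):
    """Gaussian elimination with partial pivoting (pure Python)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            w = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= w * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c]
                              for c in range(k + 1, n))) / M[k][k]
    return x

# Two equilibrium equations plus one compatibility condition for three
# unknown member forces; the CC row has a zero right-hand side.
EE = [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]]
CC = [[1.0, -2.0, 1.0]]
forces = solve_dense(EE + CC, [10.0, 6.0, 0.0])
```

    The point of the stacked form is that the combined matrix is square and solvable without ever choosing redundants, which is what makes the method easy to automate.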

  9. Phaser crystallographic software.

    PubMed

    McCoy, Airlie J; Grosse-Kunstleve, Ralf W; Adams, Paul D; Winn, Martyn D; Storoni, Laurent C; Read, Randy J

    2007-08-01

    Phaser is a program for phasing macromolecular crystal structures by both molecular replacement and experimental phasing methods. The novel phasing algorithms implemented in Phaser have been developed using maximum likelihood and multivariate statistics. For molecular replacement, the new algorithms have proved to be significantly better than traditional methods in discriminating correct solutions from noise, and for single-wavelength anomalous dispersion experimental phasing, the new algorithms, which account for correlations between F+ and F−, give better phases (lower mean phase error with respect to the phases given by the refined structure) than those that use mean F and anomalous differences ΔF. One of the design concepts of Phaser was that it be capable of a high degree of automation. To this end, Phaser (written in C++) can be called directly from Python, although it can also be called using traditional CCP4 keyword-style input. Phaser is a platform for future development of improved phasing methods and their release, including source code, to the crystallographic community.

  10. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EEs) are integrated with the global compatibility conditions (CCs) to form the governing set of equations. In IFM the CCs are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  11. Membrane protein properties revealed through data-rich electrostatics calculations

    PubMed Central

    Guerriero, Christopher J.; Brodsky, Jeffrey L.; Grabe, Michael

    2015-01-01

    The electrostatic properties of membrane proteins often reveal many of their key biophysical characteristics, such as ion channel selectivity and the stability of charged membrane-spanning segments. The Poisson-Boltzmann (PB) equation is the gold standard for calculating protein electrostatics, and the software APBSmem enables the solution of the PB equation in the presence of a membrane. Here, we describe significant advances to APBSmem including: full automation of system setup, per-residue energy decomposition, incorporation of PDB2PQR, calculation of membrane induced pKa shifts, calculation of non-polar energies, and command-line scripting for large scale calculations. We highlight these new features with calculations carried out on a number of membrane proteins, including the recently solved structure of the ion channel TRPV1 and a large survey of 1,614 membrane proteins of known structure. This survey provides a comprehensive list of residues with large electrostatic penalties for being embedded in the membrane potentially revealing interesting functional information. PMID:26118532

  12. Membrane Protein Properties Revealed through Data-Rich Electrostatics Calculations.

    PubMed

    Marcoline, Frank V; Bethel, Neville; Guerriero, Christopher J; Brodsky, Jeffrey L; Grabe, Michael

    2015-08-04

    The electrostatic properties of membrane proteins often reveal many of their key biophysical characteristics, such as ion channel selectivity and the stability of charged membrane-spanning segments. The Poisson-Boltzmann (PB) equation is the gold standard for calculating protein electrostatics, and the software APBSmem enables the solution of the PB equation in the presence of a membrane. Here, we describe significant advances to APBSmem, including full automation of system setup, per-residue energy decomposition, incorporation of PDB2PQR, calculation of membrane-induced pKa shifts, calculation of non-polar energies, and command-line scripting for large-scale calculations. We highlight these new features with calculations carried out on a number of membrane proteins, including the recently solved structure of the ion channel TRPV1 and a large survey of 1,614 membrane proteins of known structure. This survey provides a comprehensive list of residues with large electrostatic penalties for being embedded in the membrane, potentially revealing interesting functional information. Copyright © 2015 Elsevier Ltd. All rights reserved.
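    For reference, the linearized Poisson-Boltzmann equation that such solvers discretize can be written, in Gaussian units, as (a standard textbook form, not quoted from the paper):

```latex
\nabla \cdot \left[ \epsilon(\mathbf{r})\,\nabla \phi(\mathbf{r}) \right]
  \;-\; \bar{\kappa}^{2}(\mathbf{r})\,\phi(\mathbf{r})
  \;=\; -\,4\pi \rho(\mathbf{r})
```

    Here φ is the electrostatic potential, ε(r) the position-dependent dielectric (low inside the protein and membrane slab, high in water), κ̄²(r) the modified Debye-Hückel screening factor of the electrolyte, and ρ(r) the fixed charge density of the solute; the membrane enters through the spatial dependence of ε and κ̄².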

  13. Accelerating materials discovery through the development of polymer databases

    NASA Astrophysics Data System (ADS)

    Audus, Debra

    In our line of business we create chemical solutions for a wide range of applications, such as home and personal care, printing and packaging, automotive and structural coatings, and structural plastics and foams. In this environment, stable and highly automated workflows suitable for handling complex systems are a must. By satisfying these prerequisites, the efficiency of developing new materials can be significantly improved by combining modeling and experimental approaches. This is in line with recent Materials Genome Initiative efforts sponsored by the US administration. From our experience we know that valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work closely together. In my presentation I intend to review approaches to build and parameterize soft matter systems. As an example of our standard workflow, I will show a few applications, which include the design of a stabilizer molecule for dispersing polymer particles and the simulation of polystyrene dispersions.

  14. Elimination sequence optimization for SPAR

    NASA Technical Reports Server (NTRS)

    Hogan, Harry A.

    1986-01-01

    SPAR is a large-scale computer program for finite element structural analysis. The program allows the user to specify the order in which the joints of a structure are eliminated, since this order can significantly influence solution performance, in terms of both storage requirements and computer time. An efficient elimination sequence can improve performance by over 50% for some problems. Obtaining such sequences, however, requires the expertise of an experienced user and can take hours of tedious effort to effect. Thus, an automatic elimination-sequence optimizer would enhance productivity by reducing the analyst's problem-definition time and by lowering computer costs. Two possible methods for automating the elimination sequence specification were examined. Several algorithms based on graph-theory representations of sparse matrices were studied, with mixed results: significant improvement in program performance was achieved, but sequencing by an experienced user still yields substantially better results. These initial results provide encouraging evidence that the potential benefits of such an automatic sequencer would be well worth the effort.
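    One family of graph-theory heuristics for this problem orders nodes so as to shrink matrix bandwidth. A sketch of the classic Cuthill-McKee ordering in pure Python (the five-node graph is invented; SPAR's actual algorithms are not reproduced here):

```python
from collections import deque

def cuthill_mckee(adj):
    """Cuthill-McKee ordering: breadth-first search from a minimum-degree
    node, visiting neighbours in order of increasing degree.
    adj maps each node to the set of its neighbours."""
    order, seen = [], set()
    for start in sorted(adj, key=lambda n: len(adj[n])):
        if start in seen:
            continue
        seen.add(start)
        queue = deque([start])
        while queue:
            n = queue.popleft()
            order.append(n)
            for nb in sorted(adj[n] - seen, key=lambda m: len(adj[m])):
                seen.add(nb)
                queue.append(nb)
    return order

def bandwidth(adj, order):
    """Maximum index distance between connected nodes under `order`."""
    pos = {n: i for i, n in enumerate(order)}
    return max(abs(pos[a] - pos[b]) for a in adj for b in adj[a])

# A path graph whose numeric labels are scrambled relative to the path.
adj = {0: {2}, 2: {0, 4}, 4: {2, 1}, 1: {4, 3}, 3: {1}}
before = bandwidth(adj, sorted(adj))        # numeric node order
after = bandwidth(adj, cuthill_mckee(adj))  # reordered
```

    On this scrambled path graph the numeric ordering gives bandwidth 3, while Cuthill-McKee recovers the path structure and bandwidth 1, illustrating how reordering alone can cut the storage and factorization cost of an elimination.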

  15. Novel 2D Triple-Resonance NMR Experiments for Sequential Resonance Assignments of Proteins

    NASA Astrophysics Data System (ADS)

    Ding, Keyang; Gronenborn, Angela M.

    2002-06-01

    We present 2D versions of the popular triple-resonance HN(CO)CACB, HN(COCA)CACB, HN(CO)CAHA, and HN(COCA)CAHA experiments, commonly used for sequential resonance assignments of proteins. These experiments provide information about correlations between amide proton and nitrogen chemical shifts and the α- and β-carbon and α-proton chemical shifts within and between amino acid residues. Using these 2D spectra, sequential resonance assignments of HN, N, Cα, Cβ, and Hα nuclei are easily achieved. The resolution of these spectra is identical to the well-resolved 2D 15N-1H HSQC and H(NCO)CA spectra, with slightly reduced sensitivity compared to their 3D and 4D versions. These types of spectra are ideally suited for exploitation in automated assignment procedures and thereby constitute a fast and efficient means for NMR structure determination of small and medium-sized proteins in solution in structural genomics programs.

  16. Design of Flight Vehicle Management Systems

    NASA Technical Reports Server (NTRS)

    Meyer, George; Aiken, Edwin W. (Technical Monitor)

    1994-01-01

    As the operation of large systems becomes ever more dependent on extensive automation, the need for an effective solution to the problem of design and validation of the underlying software becomes more critical. Large systems possess much detailed structure, typically hierarchical, and they are hybrid. Information processing at the top of the hierarchy is by means of formal logic and sentences; on the bottom it is by means of simple scalar differential equations and functions of time; and in the middle it is by an interacting mix of nonlinear multi-axis differential equations and automata, and functions of time and discrete events. The lecture will address the overall problem as it relates to flight vehicle management, describe the middle level, and offer a design approach that is based on Differential Geometry and Discrete Event Dynamic Systems Theory.

  17. Nonlinear Control and Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Meyer, George; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    As the operation of large systems becomes ever more dependent on extensive automation, the need for an effective solution to the problem of design and validation of the underlying software becomes more critical. Large systems possess much detailed structure, typically hierarchical, and they are hybrid. Information processing at the top of the hierarchy is by means of formal logic and sentences; on the bottom it is by means of simple scalar differential equations and functions of time; and in the middle it is by an interacting mix of nonlinear multi-axis differential equations and automata, and functions of time and discrete events. The lecture will address the overall problem as it relates to flight vehicle management, describe the middle level, and offer a design approach that is based on Differential Geometry and Discrete Event Dynamic Systems Theory.

  18. Matching of energetic, mechanic and control characteristics of positioning actuator

    NASA Astrophysics Data System (ADS)

    Nosova, N. Y.; Misyurin, S. Yu.; Kreinin, G. V.

    2017-12-01

    The problem of the preliminary choice of parameters for the power channel of an automated drive is discussed. The drive of a mechatronic complex is divided into two main units: power and control. The first determines the energy capabilities and, as a rule, the overall dimensions of the complex. Sufficient capacity of the power unit is a necessary condition for the successful solution of control tasks without excessive complication of the control system structure. The preliminary selection of parameters is carried out based on the condition of providing the necessary drive power. The proposed approach is based on: study of a sufficiently developed, but not excessive, dynamic model of the power block with the help of a conditional test control system; transition to a normalized model with the formation of similarity criteria; and construction of the synthesis procedure.

  19. Developing Quality Indicators and Auditing Protocols from Formal Guideline Models: Knowledge Representation and Transformations

    PubMed Central

    Advani, Aneel; Goldstein, Mary; Shahar, Yuval; Musen, Mark A.

    2003-01-01

    Automated quality assessment of clinician actions and patient outcomes is a central problem in guideline- or standards-based medical care. In this paper we describe a model representation and algorithm for deriving structured quality indicators and auditing protocols from formalized specifications of guidelines used in decision support systems. We apply the model and algorithm to the assessment of physician concordance with a guideline knowledge model for hypertension used in a decision-support system. The properties of our solution include the ability to derive automatically (1) context-specific and (2) case-mix-adjusted quality indicators that (3) can model global or local levels of detail about the guideline (4) parameterized by defining the reliability of each indicator or element of the guideline. PMID:14728124

  20. The big data challenges of connectomics.

    PubMed

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2014-11-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as genomics did at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.

  1. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    NASA Technical Reports Server (NTRS)

    Khambatta, Cyrus F.

    2007-01-01

    A technique for the automated development of scenarios for use in Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios, with the intent of reducing the time this currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  2. A modular approach for automated sample preparation and chemical analysis

    NASA Technical Reports Server (NTRS)

    Clark, Michael L.; Turner, Terry D.; Klingler, Kerry M.; Pacetti, Randolph

    1994-01-01

    Changes in international relations, especially within the past several years, have dramatically affected the programmatic thrusts of the U.S. Department of Energy (DOE). The DOE now is addressing the environmental cleanup required as a result of 50 years of nuclear arms research and production. One major obstacle in the remediation of these areas is the chemical determination of potentially contaminated material using currently acceptable practices. Process bottlenecks and exposure to hazardous conditions pose problems for the DOE. One proposed solution is the application of modular automated chemistry using Standard Laboratory Modules (SLM) to perform Standard Analysis Methods (SAM). The Contaminant Analysis Automation (CAA) Program has developed standards and prototype equipment that will accelerate the development of modular chemistry technology and is transferring this technology to private industry.

  3. Spinoff 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.

  4. RootGraph: a graphic optimization tool for automated image analysis of plant roots

    PubMed Central

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.

    2015-01-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process, is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880
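
    A root skeleton can be treated as a weighted graph, with the primary root recovered as the longest geodesic path from the seed point. The sketch below illustrates that idea with a plain Dijkstra search; the adjacency-dict layout and the `primary_root_tip` helper are illustrative assumptions, not RootGraph's actual algorithm.

```python
import heapq

def dijkstra(graph, source):
    """Shortest path lengths from source over a weighted adjacency dict."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def primary_root_tip(graph, seed):
    """Pick the skeleton node farthest (along the graph) from the seed point."""
    dist = dijkstra(graph, seed)
    return max(dist, key=dist.get)
```

    On a toy skeleton where one branch is clearly longest, the farthest node from the seed is returned as the primary root tip; lateral roots would then be the remaining branches hanging off that path.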

  5. Driving out errors through tight integration between software and automation.

    PubMed

    Reifsteck, Mark; Swanson, Thomas; Dallas, Mary

    2006-01-01

    A clear case has been made for using clinical IT to improve medication safety, particularly bar-code point-of-care medication administration and computerized practitioner order entry (CPOE) with clinical decision support. The equally important role of automation has been overlooked. When the two are tightly integrated, with pharmacy information serving as a hub, the distinctions between software and automation become blurred. A true end-to-end medication management system drives out errors from the dockside to the bedside. Presbyterian Healthcare Services in Albuquerque has been building such a system since 1999, beginning by automating pharmacy operations to support bar-coded medication administration. Encouraged by those results, it then began layering on software to further support clinician workflow and improve communication, culminating with the deployment of CPOE and clinical decision support. This combination, plus a hard-wired culture of safety, has resulted in a dramatically lower mortality and harm rate that could not have been achieved with a partial solution.

  6. The State and Trends of Barcode, RFID, Biometric and Pharmacy Automation Technologies in US Hospitals

    PubMed Central

    Uy, Raymonde Charles Y.; Kury, Fabricio P.; Fontelo, Paul A.

    2015-01-01

    The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating optimistic growth in the adoption of these patient safety solutions. PMID:26958264

  7. Robotic Automation of In Vivo Two-Photon Targeted Whole-Cell Patch-Clamp Electrophysiology.

    PubMed

    Annecchino, Luca A; Morris, Alexander R; Copeland, Caroline S; Agabi, Oshiorenoya E; Chadderton, Paul; Schultz, Simon R

    2017-08-30

    Whole-cell patch-clamp electrophysiological recording is a powerful technique for studying cellular function. While in vivo patch-clamp recording has recently benefited from automation, it is normally performed "blind," meaning that throughput for sampling some genetically or morphologically defined cell types is unacceptably low. One solution to this problem is to use two-photon microscopy to target fluorescently labeled neurons. Combining this with robotic automation is difficult, however, as micropipette penetration induces tissue deformation, moving target cells from their initial location. Here we describe a platform for automated two-photon targeted patch-clamp recording, which solves this problem by making use of a closed loop visual servo algorithm. Our system keeps the target cell in focus while iteratively adjusting the pipette approach trajectory to compensate for tissue motion. We demonstrate platform validation with patch-clamp recordings from a variety of cells in the mouse neocortex and cerebellum. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
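
    The closed-loop visual servo idea (measure the target cell's apparent offset each imaging frame, then correct a fraction of it on the pipette trajectory) can be sketched in one dimension. The gain value and frame-by-frame interface below are illustrative assumptions, not the published controller.

```python
def visual_servo(target_track, start=0.0, gain=0.5):
    """Iteratively steer a pipette coordinate toward a (possibly moving) target.

    target_track: target position observed at each imaging frame.
    gain: fraction of the measured error corrected per frame.
    """
    position = start
    trajectory = [position]
    for observed_target in target_track:
        error = observed_target - position
        position += gain * error  # proportional correction toward the cell
        trajectory.append(position)
    return trajectory
```

    Because each frame re-measures the target, the loop tracks a cell even as tissue deformation moves it; with a stationary target the residual error shrinks geometrically with the gain.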

  8. Enabling Automated Dynamic Demand Response: From Theory to Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frincu, Marc; Chelmis, Charalampos; Aman, Saima

    2015-07-14

    Demand response (DR) is a technique used in smart grids to shape customer load during peak hours. Automated DR offers utilities fine-grained control and a high degree of confidence in the outcome. However, the impact on the customer's comfort means this technique is more suited to industrial and commercial settings than to residential homes. In this paper we propose a system for achieving automated controlled DR in a heterogeneous environment. We present some of the main issues arising in building such a system, including privacy, customer satisfiability, reliability, and fast decision turnaround, with emphasis on the solutions we proposed. Based on the lessons learned from empirical results, we describe an integrated automated system for controlled DR on the USC microgrid. Results show that while the accuracy of our prediction and customer selection techniques varies on a per-building, per-event basis, it performs well on average when considering several events and buildings.

  9. Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.

    2002-01-01

    Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early-stage sensory and information acquisition functions. The present research examines the effects of AA across the entire range of information processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA depends on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when it is applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, than when it is applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA as compared with completely manual control. These results are discussed in terms of implications for AA design for aviation.

  10. Initial Assessment and Modeling Framework Development for Automated Mobility Districts: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Yi; Young, Stanley E; Garikapati, Venu

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This paper examines a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). This paper reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impacts anticipated with AMDs.

  11. Is partially automated driving a bad idea? Observations from an on-road study.

    PubMed

    Banks, Victoria A; Eriksson, Alexander; O'Donoghue, Jim; Stanton, Neville A

    2018-04-01

    The automation of longitudinal and lateral control has enabled drivers to become "hands and feet free," but they are required to remain in an active monitoring state and to resume manual control if required. This represents the single largest function-allocation problem in vehicle automation, as the literature suggests that humans are notoriously inefficient at prolonged monitoring tasks. To further explore whether partially automated driving solutions can appropriately support the driver in this new monitoring role, video observations were collected as part of an on-road study using a Tesla Model S operated in Autopilot mode. A thematic analysis of the video data suggests that drivers are not being properly supported in adhering to their new monitoring responsibilities and instead demonstrate behaviour indicative of complacency and over-trust. These attributes may encourage drivers to take more risks whilst out on the road. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Validation of an automated fluorescein method for determining bromide in water

    USGS Publications Warehouse

    Fishman, M. J.; Schroder, L.J.; Friedman, L.C.

    1985-01-01

    Surface, atmospheric precipitation and deionized water samples were spiked with µg l-1 concentrations of bromide, and the solutions stored in polyethylene and polytetrafluoroethylene bottles. Bromide was determined periodically for 30 days. Automated fluorescein and ion chromatography methods were used to determine bromide in these prepared samples. Analysis of the data by the paired t-test indicates that the two methods are not significantly different at a probability of 95% for samples containing from 0.015 to 0.5 mg l-1 of bromide. The correlation coefficient for the same sets of paired data is 0.9987. Recovery data, except for the surface water samples to which 0.005 mg l-1 of bromide was added, range from 89 to 112%. There appears to be no loss of bromide from solution in either type of container.
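
    The statistical comparison described above (a paired t-test plus a correlation coefficient over measurements made by both methods on the same samples) takes only a few lines of code. The numbers in the test below are illustrative stand-ins, not the study's measurements.

```python
import math

def paired_t_statistic(x, y):
    """t statistic for paired samples: mean difference over its standard error."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)

def pearson_r(x, y):
    """Pearson correlation coefficient between the two methods' results."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)
```

    For six paired bromide determinations, an absolute t statistic below the two-sided 95% critical value for 5 degrees of freedom (2.571) indicates no significant difference between the methods, matching the paper's conclusion at the 95% level.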

  13. Adaptive Automation Triggered by EEG-Based Mental Workload Index: A Passive Brain-Computer Interface Application in Realistic Air Traffic Control Environment.

    PubMed

    Aricò, Pietro; Borghini, Gianluca; Di Flumeri, Gianluca; Colosimo, Alfredo; Bonelli, Stefano; Golfetti, Alessia; Pozzi, Simone; Imbert, Jean-Paul; Granger, Géraud; Benhacene, Raïlane; Babiloni, Fabio

    2016-01-01

    Adaptive Automation (AA) is a promising approach to keep the task workload demand within appropriate levels, avoiding both underload and overload conditions and hence enhancing the overall performance and safety of the human-machine system. The main issue in the use of AA is how to trigger the AA solutions without affecting the operative task. In this regard, passive Brain-Computer Interface (pBCI) systems are a good candidate to activate automation, since they are able to gather information about the covert behavior (e.g., mental workload) of a subject by analyzing their neurophysiological signals (i.e., brain activity), without interfering with the ongoing operational activity. We proposed a pBCI system able to trigger AA solutions integrated in a realistic Air Traffic Management (ATM) research simulator developed and hosted at ENAC (École Nationale de l'Aviation Civile of Toulouse, France). Twelve Air Traffic Controller (ATCO) students were involved in the experiment and asked to perform ATM scenarios with and without the support of the AA solutions. Results demonstrated the effectiveness of the proposed pBCI system, since it enabled the AA mostly during the high-demanding conditions (i.e., overload situations), inducing a reduction of the mental workload under which the ATCOs were operating. On the contrary, as desired, the AA was not activated when the workload level was under the threshold, preventing low-demand conditions that could bring the operator toward potentially dangerous underload.
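
    The trigger logic (activate automation only while a smoothed workload index sits above a threshold, leave it off otherwise) reduces to a few lines. The window length and threshold below are illustrative assumptions, not the study's calibrated values.

```python
from collections import deque

def aa_trigger(workload_samples, threshold=0.7, window=3):
    """Return one boolean per sample: True when adaptive automation should be on.

    A short moving average smooths the workload index so that brief
    fluctuations in the neurophysiological signal do not toggle automation.
    """
    recent = deque(maxlen=window)
    active = []
    for sample in workload_samples:
        recent.append(sample)
        active.append(sum(recent) / len(recent) > threshold)
    return active
```

    A sustained rise in the index switches automation on one or two samples after onset (the smoothing lag), which is the intended behavior: the system reacts to overload conditions, not to single noisy samples.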

  14. Adaptive Automation Triggered by EEG-Based Mental Workload Index: A Passive Brain-Computer Interface Application in Realistic Air Traffic Control Environment

    PubMed Central

    Aricò, Pietro; Borghini, Gianluca; Di Flumeri, Gianluca; Colosimo, Alfredo; Bonelli, Stefano; Golfetti, Alessia; Pozzi, Simone; Imbert, Jean-Paul; Granger, Géraud; Benhacene, Raïlane; Babiloni, Fabio

    2016-01-01

    Adaptive Automation (AA) is a promising approach to keep the task workload demand within appropriate levels, avoiding both underload and overload conditions and hence enhancing the overall performance and safety of the human-machine system. The main issue in the use of AA is how to trigger the AA solutions without affecting the operative task. In this regard, passive Brain-Computer Interface (pBCI) systems are a good candidate to activate automation, since they are able to gather information about the covert behavior (e.g., mental workload) of a subject by analyzing their neurophysiological signals (i.e., brain activity), without interfering with the ongoing operational activity. We proposed a pBCI system able to trigger AA solutions integrated in a realistic Air Traffic Management (ATM) research simulator developed and hosted at ENAC (École Nationale de l'Aviation Civile of Toulouse, France). Twelve Air Traffic Controller (ATCO) students were involved in the experiment and asked to perform ATM scenarios with and without the support of the AA solutions. Results demonstrated the effectiveness of the proposed pBCI system, since it enabled the AA mostly during the high-demanding conditions (i.e., overload situations), inducing a reduction of the mental workload under which the ATCOs were operating. On the contrary, as desired, the AA was not activated when the workload level was under the threshold, preventing low-demand conditions that could bring the operator toward potentially dangerous underload. PMID:27833542

  15. Use of Archived Information by the United States National Data Center

    NASA Astrophysics Data System (ADS)

    Junek, W. N.; Pope, B. M.; Roman-Nieves, J. I.; VanDeMark, T. F.; Ichinose, G. A.; Poffenberger, A.; Woods, M. T.

    2012-12-01

    The United States National Data Center (US NDC) is responsible for monitoring international compliance with nuclear test ban treaties, acquiring data and data products from the International Data Center (IDC), and distributing data according to established policy. The archive of automated and reviewed event solutions residing at the US NDC is a valuable resource for assessing and improving the performance of signal detection, event formation, location, and discrimination algorithms. Numerous research initiatives are currently underway, focused on optimizing these processes using historic waveform data and alphanumeric information. Identification of optimum station processing parameters is routinely performed through the analysis of archived waveform data. Station-specific detector tuning studies produce and compare receiver operating characteristics for multiple detector configurations (e.g., detector type, filter passband) to identify an optimum set of processing parameters with an acceptable false alarm rate. Large aftershock sequences can inundate automated phase association algorithms with numerous detections that are closely spaced in time, which increases the number of false and/or mixed associations in automated event solutions and increases analyst burden. Archived waveform data and alphanumeric information are being exploited to develop an aftershock processor that will construct association templates to assist the Global Association (GA) application, reduce the number of false and merged phase associations, and lessen analyst burden. Statistical models are being developed and evaluated for potential use by the GA application in identifying and rejecting unlikely preliminary event solutions. Other uses of archived data at the US NDC include improved event locations using empirical travel time corrections and discrimination via a statistical framework known as the event classification matrix (ECM).
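
    Detector tuning of the kind described (comparing receiver operating characteristics across configurations) amounts to sweeping a detection threshold over scores from known signal and noise segments. The score lists and the acceptable-false-alarm criterion below are illustrative stand-ins for real detector output, not US NDC data.

```python
def roc_points(signal_scores, noise_scores, thresholds):
    """(false-alarm rate, detection rate) pairs for each candidate threshold."""
    points = []
    for t in thresholds:
        tpr = sum(s >= t for s in signal_scores) / len(signal_scores)
        fpr = sum(s >= t for s in noise_scores) / len(noise_scores)
        points.append((fpr, tpr))
    return points

def best_threshold(signal_scores, noise_scores, thresholds, max_fpr=0.1):
    """Highest detection rate whose false-alarm rate stays acceptable."""
    candidates = [(tpr, -fpr, t)
                  for t, (fpr, tpr) in zip(
                      thresholds,
                      roc_points(signal_scores, noise_scores, thresholds))
                  if fpr <= max_fpr]
    return max(candidates)[2] if candidates else None
```

    Running this per station and per configuration (detector type, filter passband) and comparing the resulting curves is the essence of the tuning studies mentioned above.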

  16. Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis

    PubMed Central

    Garrison, Kathleen A.; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J.; Aziz-Zadeh, Lisa S.

    2015-01-01

    Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant’s structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant’s non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design. PMID:26441816

  17. Structure determination of helical filaments by solid-state NMR spectroscopy

    PubMed Central

    Ahmed, Mumdooh; Spehr, Johannes; König, Renate; Lünsdorf, Heinrich; Rand, Ulfert; Lührs, Thorsten; Ritter, Christiane

    2016-01-01

    The controlled formation of filamentous protein complexes plays a crucial role in many biological systems and represents an emerging paradigm in signal transduction. The mitochondrial antiviral signaling protein (MAVS) is a central signal transduction hub in innate immunity that is activated by a receptor-induced conversion into helical superstructures (filaments) assembled from its globular caspase activation and recruitment domain. Solid-state NMR (ssNMR) spectroscopy has become one of the most powerful techniques for determining atomic resolution structures of protein fibrils. However, for helical filaments, the determination of the correct symmetry parameters has remained a significant hurdle for any structural technique and could thus far not be precisely derived from ssNMR data. Here, we solved the atomic resolution structure of helical MAVS CARD filaments exclusively from ssNMR data. We present a generally applicable approach that systematically explores the helical symmetry space by efficient modeling of the helical structure restrained by interprotomer ssNMR distance restraints. Together with classical automated NMR structure calculation, this allowed us to faithfully determine the symmetry that defines the entire assembly. To validate our structure, we probed the protomer arrangement by solvent paramagnetic resonance enhancement, analysis of chemical shift differences relative to the solution NMR structure of the monomer, and mutagenesis. We provide detailed information on the atomic contacts that determine filament stability and describe mechanistic details on the formation of signaling-competent MAVS filaments from inactive monomers. PMID:26733681

  18. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    PubMed Central

    Gouret, Philippe; Vitiello, Vérane; Balandraud, Nathalie; Gilles, André; Pontarotti, Pierre; Danchin, Etienne GJ

    2005-01-01

    Background Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes, which consists of detecting the position and structure of genes and inferring their function (as well as other features of genomes). Structural and functional annotation both require the complex chaining of numerous software tools, algorithms and methods under the supervision of a biologist. The automation of these pipelines is necessary to manage the huge amounts of data released by sequencing projects. Several pipelines already automate some of this complex chaining but still require substantial input from biologists for supervising and controlling the results at various steps. Results Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable of substituting for human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which, for example, makes key decisions, checks intermediate results and refines the dataset). The quality of the results produced by FIGENIX is comparable to that obtained by expert biologists, with a drastic gain in time and the avoidance of errors due to manual manipulation of the data. Conclusion The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could easily be adapted to new or more specialized pipelines, such as the annotation of miRNAs, the classification of complex multigenic families, and the annotation of regulatory elements and other genomic features of interest. PMID:16083500

  19. Adaptive elastic segmentation of brain MRI via shape-model-guided evolutionary programming.

    PubMed

    Pitiot, Alain; Toga, Arthur W; Thompson, Paul M

    2002-08-01

    This paper presents a fully automated segmentation method for medical images. The goal is to localize and parameterize a variety of types of structure in these images for subsequent quantitative analysis. We propose a new hybrid strategy that combines a general elastic template matching approach and an evolutionary heuristic. The evolutionary algorithm uses prior statistical information about the shape of the target structure to control the behavior of a number of deformable templates. Each template, modeled in the form of a B-spline, is warped in a potential field which is itself dynamically adapted. Such a hybrid scheme proves to be promising: by maintaining a population of templates, we cover a large domain of the solution space under the global guidance of the evolutionary heuristic, and thoroughly explore interesting areas. We address key issues of automated image segmentation systems. The potential fields are initially designed based on the spatial features of the edges in the input image, and are subjected to spatially adaptive diffusion to guarantee the deformation of the template. This also improves its global consistency and convergence speed. The deformation algorithm can modify the internal structure of the templates to allow a better match. We investigate in detail the preprocessing phase that the images undergo before they can be used more effectively in the iterative elastic matching procedure: a texture classifier, trained via linear discriminant analysis of a learning set, is used to enhance the contrast of the target structure with respect to surrounding tissues. We show how these techniques interact within a statistically driven evolutionary scheme to achieve a better tradeoff between template flexibility and sensitivity to noise and outliers. We focus on understanding the features of template matching that are most beneficial in terms of the achieved match. 
Examples from simulated and real image data are discussed, with considerations of algorithmic efficiency.
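
    The evolutionary heuristic at the core of the scheme (maintain a population of candidate template parameters, perturb them, keep the fittest) can be reduced to a minimal sketch. The Gaussian mutation and elitist selection below are generic evolutionary-programming choices standing in for the paper's shape-model-guided operators, and the quadratic fitness in the test is a stand-in for the template-match energy.

```python
import random

def evolve(fitness, initial, generations=200, pop_size=10, sigma=0.5, seed=0):
    """Minimise `fitness` over a real-valued parameter vector.

    Each generation perturbs the current best with Gaussian noise and keeps
    the fittest candidate; elitism guarantees monotone improvement.
    """
    rng = random.Random(seed)
    best = list(initial)
    for _ in range(generations):
        population = [[g + rng.gauss(0.0, sigma) for g in best]
                      for _ in range(pop_size)]
        population.append(best)  # elitism: the incumbent always competes
        best = min(population, key=fitness)
    return best
```

    In the paper's setting the parameter vector would encode B-spline control points and the fitness would combine the potential-field match with the statistical shape prior; here a toy quadratic bowl suffices to show the population converging on its minimum.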

  20. Efficient, Multi-Scale Designs Take Flight

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.

  1. Automated building of organometallic complexes from 3D fragments.

    PubMed

    Foscato, Marco; Venkatraman, Vishwesh; Occhipinti, Giovanni; Alsberg, Bjørn K; Jensen, Vidar R

    2014-07-28

    A method for the automated construction of three-dimensional (3D) molecular models of organometallic species in design studies is described. Molecular structure fragments derived from crystallographic structures and accurate molecular-level calculations are used as 3D building blocks in the construction of multiple molecular models of analogous compounds. The method allows for precise control of stereochemistry and geometrical features that may otherwise be very challenging, or even impossible, to achieve with commonly available generators of 3D chemical structures. The new method was tested in the construction of three sets of active or metastable organometallic species of catalytic reactions in the homogeneous phase. The performance of the method was compared with those of commonly available methods for automated generation of 3D models, demonstrating higher accuracy of the prepared 3D models in general and, in particular, a much wider range of chemical structures that can be built automatically, with capabilities far beyond standard organic and main-group chemistry.

  2. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    NASA Astrophysics Data System (ADS)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task in planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that need to be done manually or using semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, a geometric description of the structural elements is needed in order to obtain a structural model consisting of beam axes and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a quadratic cross section and no bending. Deformations or damage such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.
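
    Extracting a beam axis from a laser-scan point cluster is, at its simplest, a principal-component fit: the axis is the direction of largest variance of the points. The sketch below assumes the beam's points have already been segmented out of the scan (the segmentation and connection detection are the hard parts and are not shown).

```python
import numpy as np

def beam_axis(points):
    """Centroid and unit direction of the dominant axis of a 3-D point set."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # principal direction = eigenvector of the covariance matrix
    # with the largest eigenvalue
    cov = np.cov((pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    axis = eigvecs[:, -1]
    return centroid, axis / np.linalg.norm(axis)
```

    For a straight beam the fit is robust to scan noise; the abstract's caveat applies here too, since bent or damaged beams violate the single-axis assumption and would need a segmented or curved-axis model.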

  3. Dynamic Weather Routes: A Weather Avoidance Concept for Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    McNally, B. David; Love, John

    2011-01-01

    The integration of convective weather modeling with trajectory automation for conflict detection, trial planning, direct routing, and auto resolution has uncovered a concept that could help controllers, dispatchers, and pilots identify improved weather routes that result in significant savings in flying time and fuel burn. Trajectory automation continuously and automatically monitors aircraft in flight to find those that could potentially benefit from improved weather reroutes. Controllers, dispatchers, and pilots then evaluate reroute options to assess their suitability given current weather and traffic. In today's operations, aircraft fly convective weather avoidance routes that were often implemented hours before the aircraft approach the weather, and automation does not exist to automatically monitor traffic for improved weather routes that open up due to changing weather conditions. The automation concept runs in real time and employs two key steps. First, a direct routing algorithm automatically identifies flights with large dog legs in their routes and therefore potentially large savings in flying time. These are common - and usually necessary - during convective weather operations, and analysis of Fort Worth Center traffic shows many aircraft with shortcuts that indicate savings on the order of 10 flying minutes. The second and most critical step is to apply trajectory automation with weather modeling to determine what savings could be achieved by modifying the direct route such that it avoids weather and traffic and is acceptable to controllers and flight crews. Initial analysis of Fort Worth Center traffic suggests savings of roughly 50% of the direct route savings could be achievable. The core concept is to apply trajectory automation with convective weather modeling in real time to identify a reroute that is free of weather and traffic conflicts and offers enough time and fuel savings to be considered.
The concept is interoperable with today's integrated FMS/datalink. Auxiliary (lat/long) waypoints define a minimum-delay reroute between the current position and a downstream capture fix beyond the weather. These auxiliary waypoints can be uplinked to equipped aircraft and auto-loaded into the FMS. Alternatively, for unequipped aircraft, auxiliary waypoints can be replaced by nearby named fixes, though this could reduce the potential savings. The presentation includes an overview of the automation approach and focuses on several cases in terms of potential savings, reroute complexity, best auxiliary-waypoint solution vs. named-fix solution, and other metrics.
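The dog-leg screening in the first step amounts to comparing a flight's filed route length against the great-circle distance between its endpoints. A minimal sketch of that comparison, with illustrative waypoints rather than data from the study:

```python
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles between two (deg) points."""
    r_nm = 3440.065  # mean Earth radius in nautical miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_nm * math.asin(math.sqrt(a))

def dogleg_savings_nm(route):
    """Filed-route length minus the direct distance from first to last fix.

    `route` is a list of (lat, lon) waypoints; a large value flags a
    flight worth evaluating for a weather- and traffic-free direct reroute.
    """
    filed = sum(haversine_nm(*route[i], *route[i + 1])
                for i in range(len(route) - 1))
    direct = haversine_nm(*route[0], *route[-1])
    return filed - direct

# A route with a pronounced dog leg shows large potential savings.
route = [(32.9, -97.0), (35.0, -97.0), (33.4, -94.0)]
print(round(dogleg_savings_nm(route), 1))
```

In the concept described above, a candidate flagged this way would still need the trajectory-automation step to verify the direct route clears weather and traffic.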

  4. A continuously growing web-based interface structure databank

    NASA Astrophysics Data System (ADS)

    Erwin, N. A.; Wang, E. I.; Osysko, A.; Warner, D. H.

    2012-07-01

    The macroscopic properties of materials can be significantly influenced by the presence of microscopic interfaces. The complexity of these interfaces coupled with the vast configurational space in which they reside has been a long-standing obstacle to the advancement of true bottom-up material behavior predictions. In this vein, atomistic simulations have proven to be a valuable tool for investigating interface behavior. However, before atomistic simulations can be utilized to model interface behavior, meaningful interface atomic structures must be generated. The generation of structures has historically been carried out disjointly by individual research groups, and thus, has constituted an overlap in effort across the broad research community. To address this overlap and to lower the barrier for new researchers to explore interface modeling, we introduce a web-based interface structure databank (www.isdb.cee.cornell.edu) where users can search, download and share interface structures. The databank is intended to grow via two mechanisms: (1) interface structure donations from individual research groups and (2) an automated structure generation algorithm which continuously creates equilibrium interface structures. In this paper, we describe the databank, the automated interface generation algorithm, and compare a subset of the autonomously generated structures to structures currently available in the literature. To date, the automated generation algorithm has been directed toward aluminum grain boundary structures, which can be compared with experimentally measured population densities of aluminum polycrystals.

  5. The WHO 2016 verbal autopsy instrument: An international standard suitable for automated analysis by InterVA, InSilicoVA, and Tariff 2.0

    PubMed Central

    Chandramohan, Daniel; Clark, Samuel J.; Jakob, Robert; Leitao, Jordana; Rao, Chalapati; Riley, Ian; Setel, Philip W.

    2018-01-01

Background Verbal autopsy (VA) is a practical method for determining probable causes of death at the population level in places where systems for medical certification of cause of death are weak. VA methods suitable for use in routine settings, such as civil registration and vital statistics (CRVS) systems, have developed rapidly in the last decade. These developments have been part of a growing global momentum to strengthen CRVS systems in low-income countries. With this momentum has come pressure for continued research and development of VA methods and the need for a single standard VA instrument on which multiple automated diagnostic methods can be developed. Methods and findings In 2016, partners harmonized a WHO VA standard instrument that fully incorporates the indicators necessary to run currently available automated diagnostic algorithms. The WHO 2016 VA instrument, together with validated approaches to analyzing VA data, offers countries solutions to improving information about patterns of cause-specific mortality, and offers the opportunity to harmonize the automated diagnostic algorithms in the future. Conclusions Despite all improvements in design and technology, VA is only recommended where medical certification of cause of death is not possible. The method can nevertheless provide sufficient information to guide public health priorities in communities in which physician certification of deaths is largely unavailable. PMID:29320495

  6. An Automated, Experimenter-Free Method for the Standardised, Operant Cognitive Testing of Rats

    PubMed Central

    Rivalan, Marion; Munawar, Humaira; Fuchs, Anna; Winter, York

    2017-01-01

Animal models of human pathology are essential for biomedical research. However, a recurring issue in the use of animal models is the poor reproducibility of behavioural and physiological findings within and between laboratories, and the most critical factor influencing this issue remains the experimenter themselves. One solution is the use of procedures devoid of human intervention. We present a novel approach to the experimenter-free testing of cognitive abilities in rats, combining undisturbed group housing with automated, standardized and individual operant testing. This experimenter-free system consisted of an automated operant system (Bussey-Saksida rat touch screen) connected via an automated animal sorter (PhenoSys) to a home cage containing group-living rats. The automated animal sorter, which is based on radio-frequency identification (RFID) technology, functioned as a mechanical replacement for the experimenter. Rats learnt to enter the operant chamber regularly and individually, and remained there only for the duration of the experimental session. Self-motivated rats acquired the complex touch-screen task of trial-unique non-matching to location (TUNL) in half the time reported for animals that were manually placed into the operant chamber. Rat performance was similar between the two groups within our laboratory, and comparable to previously published results obtained elsewhere. This reproducibility, both within and between laboratories, confirms the validity of the approach. In addition, automation reduced daily experimental time by 80%, eliminated animal handling, and reduced equipment cost. This automated, experimenter-free setup is a promising tool for testing a large variety of functions with full automation in future studies. PMID:28060883

  7. Automating expert role to determine design concept in Kansei Engineering

    NASA Astrophysics Data System (ADS)

    Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd

    2016-02-01

Affect has become imperative to product quality. In the field of affective design, Kansei Engineering (KE) has been recognized as a technology that enables the discovery of consumers' emotions and the formulation of guidelines for designing products that win consumers in a competitive market. Albeit a powerful technology, KE has no rule of thumb for its analysis and interpretation process: KE expertise is required to determine sets of related Kansei and the significant concept of emotion. Many research endeavors are handicapped by the limited number of available and accessible KE experts. This work simulates the role of experts using the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility of KE. The algorithm is designed to learn the process from training datasets taken from previous KE research. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype for automating the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE researchers will benefit from this system when automatically determining significant design concepts.

  8. Combination of structured illumination and single molecule localization microscopy in one setup

    NASA Astrophysics Data System (ADS)

    Rossberger, Sabrina; Best, Gerrit; Baddeley, David; Heintzmann, Rainer; Birk, Udo; Dithmar, Stefan; Cremer, Christoph

    2013-09-01

Understanding the positional and structural aspects of biological nanostructures simultaneously is as much a challenge as a desideratum. In recent years, highly accurate (20 nm) positional information of optically isolated targets down to the nanometer range has been obtained using single molecule localization microscopy (SMLM), while highly resolved (100 nm) spatial information has been achieved using structured illumination microscopy (SIM). In this paper, we present a high-resolution fluorescence microscope setup which combines the advantages of SMLM with those of SIM in order to provide high-precision localization and structural information in a single setup. Furthermore, the combination of the wide-field SIM image with the SMLM data allows us to identify artifacts produced during the visualization process of SMLM data, and potentially also during the reconstruction process of SIM images. We describe the SMLM-SIM combo and its software, and apply the instrument in a first proof of principle to the same region of H3K293 cells to achieve SIM images with high structural resolution (in the 100 nm range) in overlay with the highly accurate position information of localized single fluorophores. Thus, with its robust control software, efficient switching between the SMLM and SIM modes, and fully automated, user-friendly acquisition and evaluation software, the SMLM-SIM combo is superior to existing solutions.

  9. A modular assembling platform for manufacturing of microsystems by optical tweezers

    NASA Astrophysics Data System (ADS)

    Ksouri, Sarah Isabelle; Aumann, Andreas; Ghadiri, Reza; Prüfer, Michael; Baer, Sebastian; Ostendorf, Andreas

    2013-09-01

Due to the increased complexity, in terms of both materials and geometries, of microsystems, new assembling techniques are required. Assembling techniques from the semiconductor industry are often very specific and cannot fulfill all specifications of more complex microsystems. Therefore, holographic optical tweezers are applied to manipulate structures in the micrometer range with high flexibility and precision. As is well known, non-spherical assemblies can be trapped and controlled by laser light and assembled with an additional light-modulator application, in which the incident laser beam is rearranged into flexible light patterns in order to generate multiple spots. The complementary building blocks are generated by a two-photon polymerization (2PP) process. The possibilities of manufacturing arbitrary microstructures and the potential of optical tweezers lead to the idea of combining manufacturing techniques with manipulation processes into "microrobotic" processes. This work presents the manipulation of complex generated microstructures with optical tools as well as a storage solution for 2PP assemblies. A sample holder has been developed for the manual feeding of 2PP building blocks. Furthermore, a modular assembling platform with a dedicated storage system has been constructed for an `all-in-one' 2PP manufacturing process. The long-term objective is to automate the feeding and storage of several different 2PP micro-assemblies in order to realize an automated assembly process.

  10. Automated Construction of Molecular Active Spaces from Atomic Valence Orbitals.

    PubMed

    Sayfutyarova, Elvira R; Sun, Qiming; Chan, Garnet Kin-Lic; Knizia, Gerald

    2017-09-12

We introduce the atomic valence active space (AVAS), a simple and well-defined automated technique for constructing active orbital spaces for use in multiconfiguration and multireference (MR) electronic structure calculations. Concretely, the technique constructs active molecular orbitals capable of describing all relevant electronic configurations emerging from a targeted set of atomic valence orbitals (e.g., the metal d orbitals in a coordination complex). This is achieved via a linear transformation of the occupied and unoccupied orbital spaces from an easily obtainable single-reference wave function (such as from a Hartree-Fock or Kohn-Sham calculation) based on projectors onto the targeted atomic valence orbitals. We discuss the premises, theory, and implementation of the idea and test several of its variations. To investigate the performance and accuracy, we calculate the excitation energies for various transition-metal complexes in typical application scenarios. Additionally, we follow the homolytic bond breaking process of a Fenton reaction along its reaction coordinate. While the described AVAS technique is not a universal solution to the active space problem, its premises are fulfilled in many application scenarios of transition-metal chemistry and bond dissociation processes. In these cases the technique makes MR calculations easier to execute, easier to reproduce by any user, and simplifies the determination of the appropriate size of the active space required for accurate results.
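The projector-and-rotation idea behind AVAS can be illustrated in a few lines of linear algebra. This sketch assumes an orthonormal basis, so the projector onto the targeted valence orbitals is a simple 0/1 matrix; a real implementation works with the AO overlap matrix and rotates the occupied and unoccupied blocks separately:

```python
import numpy as np

def avas_split(c_orb, p_target, thresh=0.1):
    """Split orbitals into active/inactive by overlap with target orbitals.

    c_orb:    (n_basis, n_orb) orthonormal orbital coefficients
    p_target: (n_basis, n_basis) projector onto the targeted atomic
              valence orbitals (orthonormal basis assumed for brevity)
    Returns rotated coefficients and a boolean "active" mask.
    """
    m = c_orb.T @ p_target @ c_orb       # overlap metric within the block
    eigval, eigvec = np.linalg.eigh(m)   # eigenvalues lie in [0, 1]
    c_rot = c_orb @ eigvec               # rotate orbitals within the block
    return c_rot, eigval > thresh

# Toy 4-function basis: the target space is the first two basis functions.
p = np.zeros((4, 4))
p[0, 0] = p[1, 1] = 1.0
c = np.linalg.qr(np.random.default_rng(0).normal(size=(4, 4)))[0]
c_rot, active = avas_split(c, p)
print(active.sum())  # orbitals with significant target character
```

Because the toy orbital set spans the full space, exactly two rotated orbitals carry eigenvalue ~1 and are flagged active, mirroring how AVAS isolates the configurations reachable from the targeted valence orbitals.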

  11. Automating a Detailed Cognitive Task Analysis for Structuring Curriculum

    DTIC Science & Technology

    1991-08-01

Title: Automating a Detailed Cognitive Task Analysis for Structuring Curriculum. Activities: To date we have completed task... The Institute for Management Sciences. Although the particular application of the modified GOMS cognitive task analysis technique under development is... Automating a Detailed Cognitive Task Analysis for Structuring Curriculum, Research Plan, Year 1: Task 1.0 Design; Task 1.1 Conduct...

  12. Use of noncrystallographic symmetry for automated model building at medium to low resolution.

    PubMed

    Wiegels, Tim; Lamzin, Victor S

    2012-04-01

A novel method is presented for the automatic detection of noncrystallographic symmetry (NCS) in macromolecular crystal structure determination which does not require the derivation of molecular masks or the segmentation of density. It was found that, throughout structure determination, the NCS-related parts may be differently pronounced in the electron density. This often results in the modelling of molecular fragments of variable length and accuracy, especially during automated model-building procedures. These fragments were used to identify NCS relations in order to aid automated model building and refinement. In a number of test cases, higher completeness and greater accuracy of the obtained structures were achieved, specifically at crystallographic resolutions of 2.3 Å or poorer. In the best case, the method allowed the building of up to 15% more residues automatically and a tripling of the average length of the built fragments.
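Deriving an NCS operator from two tentatively related fragments reduces to a least-squares superposition of matched coordinates. A sketch using the standard Kabsch algorithm (a generic technique, not necessarily the paper's exact procedure), shown recovering a known rotation between synthetic fragment copies:

```python
import numpy as np

def kabsch(p, q):
    """Least-squares rotation r and translation t mapping points p onto q.

    p, q: (n, 3) arrays of matched C-alpha coordinates from two
    tentatively NCS-related fragments.
    """
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    h = (p - pc).T @ (q - qc)              # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against improper rotation
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = qc - r @ pc
    return r, t

# Verify: recover a known rotation relating two synthetic fragment copies.
rng = np.random.default_rng(1)
frag = rng.normal(size=(20, 3))
angle = np.pi / 3
r_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
copy = frag @ r_true.T + np.array([5.0, -2.0, 1.0])
r, t = kabsch(frag, copy)
print(np.allclose(r, r_true))  # True
```

In an NCS-detection setting, a low superposition residual between two built fragments would support accepting (r, t) as a candidate NCS operator to be checked against the electron-density map.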

  13. Knowledge structure representation and automated updates in intelligent information management systems

    NASA Technical Reports Server (NTRS)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  14. Building a framework to manage trust in automation

    NASA Astrophysics Data System (ADS)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has long been believed that human trust is a primary determinant of human-automation interactions, and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation, executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior- and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  15. Accurate cytogenetic biodosimetry through automated dicentric chromosome curation and metaphase cell selection

    PubMed Central

    Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H.M.; Rogan, Peter K.

    2017-01-01

Accurate digital image analysis of abnormal microscopic structures relies on high quality images and on minimizing the rates of false positive (FP) and false negative objects in images. Cytogenetic biodosimetry detects dicentric chromosomes (DCs) that arise from exposure to ionizing radiation, and determines the radiation dose received based on DC frequency. Improvements in automated DC recognition increase the accuracy of dose estimates by reclassifying FP DCs as monocentric chromosomes or chromosome fragments. We also present image segmentation methods to rank high quality digital metaphase images and eliminate suboptimal metaphase cells. A set of chromosome morphology segmentation methods selectively filtered out FP DCs arising primarily from sister chromatid separation, chromosome fragmentation, and cellular debris. This reduced FPs by an average of 55% and was highly specific to these abnormal structures (≥97.7%) in three samples. Additional filters selectively removed images with incomplete, highly overlapped, or missing metaphase cells, or with poor overall chromosome morphologies that increased FP rates. Image selection is optimized and FP DCs are minimized by combining multiple feature-based segmentation filters and a novel image sorting procedure based on the known distribution of chromosome lengths. Applying the same image segmentation filtering procedures to both calibration and test samples reduced the average dose estimation error from 0.4 Gy to <0.2 Gy, obviating the need to first manually review these images. This reliable and scalable solution enables batch processing for multiple samples of unknown dose, and meets current requirements for triage radiation biodosimetry of high quality metaphase cell preparations. PMID:29026522
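Once the DC frequency per cell is established, dose is conventionally estimated by inverting a linear-quadratic calibration curve, Y = c + alpha*D + beta*D^2. A sketch of that inversion with placeholder coefficients (illustrative only, not fitted values from the paper):

```python
import math

def dose_from_dc_frequency(y, c=0.001, alpha=0.03, beta=0.06):
    """Invert the calibration curve Y = c + alpha*D + beta*D**2 for dose D.

    y is the observed dicentric frequency (DCs per cell); c, alpha, beta
    are illustrative placeholder coefficients. Returns the estimated
    absorbed dose in Gy (the physically meaningful positive root).
    """
    disc = alpha ** 2 + 4.0 * beta * (y - c)  # quadratic discriminant
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# Round-trip check: the frequency produced at 2 Gy maps back to ~2 Gy.
y_2gy = 0.001 + 0.03 * 2 + 0.06 * 4
print(round(dose_from_dc_frequency(y_2gy), 3))  # 2.0
```

Because FP DCs inflate y directly, the segmentation filtering described above lowers the dose-estimation error by keeping the observed frequency close to the true one.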

  16. Human brain atlas for automated region of interest selection in quantitative susceptibility mapping: application to determine iron content in deep gray matter structures.

    PubMed

    Lim, Issel Anne L; Faria, Andreia V; Li, Xu; Hsu, Johnny T C; Airan, Raag D; Mori, Susumu; van Zijl, Peter C M

    2013-11-15

    The purpose of this paper is to extend the single-subject Eve atlas from Johns Hopkins University, which currently contains diffusion tensor and T1-weighted anatomical maps, by including contrast based on quantitative susceptibility mapping. The new atlas combines a "deep gray matter parcellation map" (DGMPM) derived from a single-subject quantitative susceptibility map with the previously established "white matter parcellation map" (WMPM) from the same subject's T1-weighted and diffusion tensor imaging data into an MNI coordinate map named the "Everything Parcellation Map in Eve Space," also known as the "EvePM." It allows automated segmentation of gray matter and white matter structures. Quantitative susceptibility maps from five healthy male volunteers (30 to 33 years of age) were coregistered to the Eve Atlas with AIR and Large Deformation Diffeomorphic Metric Mapping (LDDMM), and the transformation matrices were applied to the EvePM to produce automated parcellation in subject space. Parcellation accuracy was measured with a kappa analysis for the left and right structures of six deep gray matter regions. For multi-orientation QSM images, the Kappa statistic was 0.85 between automated and manual segmentation, with the inter-rater reproducibility Kappa being 0.89 for the human raters, suggesting "almost perfect" agreement between all segmentation methods. Segmentation seemed slightly more difficult for human raters on single-orientation QSM images, with the Kappa statistic being 0.88 between automated and manual segmentation, and 0.85 and 0.86 between human raters. Overall, this atlas provides a time-efficient tool for automated coregistration and segmentation of quantitative susceptibility data to analyze many regions of interest. These data were used to establish a baseline for normal magnetic susceptibility measurements for over 60 brain structures of 30- to 33-year-old males. 
Correlating the average susceptibility with age-based iron concentrations in gray matter structures measured by Hallgren and Sourander (1958) allowed interpolation of the average iron concentration of several deep gray matter regions delineated in the EvePM. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Human brain atlas for automated region of interest selection in quantitative susceptibility mapping: application to determine iron content in deep gray matter structures

    PubMed Central

    Lim, Issel Anne L.; Faria, Andreia V.; Li, Xu; Hsu, Johnny T.C.; Airan, Raag D.; Mori, Susumu; van Zijl, Peter C. M.

    2013-01-01

    The purpose of this paper is to extend the single-subject Eve atlas from Johns Hopkins University, which currently contains diffusion tensor and T1-weighted anatomical maps, by including contrast based on quantitative susceptibility mapping. The new atlas combines a “deep gray matter parcellation map” (DGMPM) derived from a single-subject quantitative susceptibility map with the previously established “white matter parcellation map” (WMPM) from the same subject’s T1-weighted and diffusion tensor imaging data into an MNI coordinate map named the “Everything Parcellation Map in Eve Space,” also known as the “EvePM.” It allows automated segmentation of gray matter and white matter structures. Quantitative susceptibility maps from five healthy male volunteers (30 to 33 years of age) were coregistered to the Eve Atlas with AIR and Large Deformation Diffeomorphic Metric Mapping (LDDMM), and the transformation matrices were applied to the EvePM to produce automated parcellation in subject space. Parcellation accuracy was measured with a kappa analysis for the left and right structures of six deep gray matter regions. For multi-orientation QSM images, the Kappa statistic was 0.85 between automated and manual segmentation, with the inter-rater reproducibility Kappa being 0.89 for the human raters, suggesting “almost perfect” agreement between all segmentation methods. Segmentation seemed slightly more difficult for human raters on single-orientation QSM images, with the Kappa statistic being 0.88 between automated and manual segmentation, and 0.85 and 0.86 between human raters. Overall, this atlas provides a time-efficient tool for automated coregistration and segmentation of quantitative susceptibility data to analyze many regions of interest. These data were used to establish a baseline for normal magnetic susceptibility measurements for over 60 brain structures of 30- to 33-year-old males. 
Correlating the average susceptibility with age-based iron concentrations in gray matter structures measured by Hallgren and Sourander (1958) allowed interpolation of the average iron concentration of several deep gray matter regions delineated in the EvePM. PMID:23769915
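The agreement statistic used in both reports above is Cohen's kappa, which discounts the agreement two raters would reach by chance. A minimal implementation with toy structure labels (illustrative, not the study's data):

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: two-rater agreement corrected for chance.

    labels_a, labels_b: equal-length lists of category labels, e.g.
    per-voxel structure assignments from automated vs. manual parcellation.
    """
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_exp = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Toy example: two raters labelling ten voxels into two structures.
a = ["caudate"] * 5 + ["putamen"] * 5
b = ["caudate"] * 4 + ["putamen"] * 6
print(round(cohens_kappa(a, b), 2))  # 0.8
```

On the conventional scale, values above about 0.8 are read as "almost perfect" agreement, which is how the kappa of 0.85-0.89 between automated and manual segmentation is interpreted above.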

  18. Surveying Florida MPO readiness to incorporate innovative technologies into long range transportation plans : draft final report.

    DOT National Transportation Integrated Search

    2016-08-01

There is optimism that Automated Vehicles (AVs) can improve the safety of the transportation system, reduce congestion, increase reliability, and offer improved mobility solutions to all segments of the population, including the transportation-disadva...

  19. Biblios Hawaii.

    ERIC Educational Resources Information Center

    Gotanda, Masae; Bourne, Charles P.

    A feasibility study identified the information requirements and alternative solutions for the Hawaii State Library System. On recommendation of the library service directors, the Book Inventory Building and Library Oriented System (BIBLOS) was purchased and installed. The system presently provides for automated acquisitions, orders, accounts,…

  20. Measuring up: Implementing a dental quality measure in the electronic health record context.

    PubMed

    Bhardwaj, Aarti; Ramoni, Rachel; Kalenderian, Elsbeth; Neumann, Ana; Hebballi, Nutan B; White, Joel M; McClellan, Lyle; Walji, Muhammad F

    2016-01-01

    Quality improvement requires using quality measures that can be implemented in a valid manner. Using guidelines set forth by the Meaningful Use portion of the Health Information Technology for Economic and Clinical Health Act, the authors assessed the feasibility and performance of an automated electronic Meaningful Use dental clinical quality measure to determine the percentage of children who received fluoride varnish. The authors defined how to implement the automated measure queries in a dental electronic health record. Within records identified through automated query, the authors manually reviewed a subsample to assess the performance of the query. The automated query results revealed that 71.0% of patients had fluoride varnish compared with the manual chart review results that indicated 77.6% of patients had fluoride varnish. The automated quality measure performance results indicated 90.5% sensitivity, 90.8% specificity, 96.9% positive predictive value, and 75.2% negative predictive value. The authors' findings support the feasibility of using automated dental quality measure queries in the context of sufficient structured data. Information noted only in free text rather than in structured data would require using natural language processing approaches to effectively query electronic health records. To participate in self-directed quality improvement, dental clinicians must embrace the accountability era. Commitment to quality will require enhanced documentation to support near-term automated calculation of quality measures. Copyright © 2016 American Dental Association. Published by Elsevier Inc. All rights reserved.
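The reported query-performance figures follow from the standard confusion-matrix definitions, comparing the automated query against manual chart review as the reference. A sketch with illustrative counts (not the study's actual tallies):

```python
def query_performance(tp, fp, tn, fn):
    """Standard validation metrics for an automated record query
    against manual chart review. tp/fp/tn/fn are raw counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives found
        "specificity": tn / (tn + fp),  # true negatives found
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts chosen to land near the values reported above.
m = query_performance(tp=181, fp=6, tn=59, fn=19)
print({k: round(v, 3) for k, v in m.items()})
```

A low NPV relative to the other metrics, as reported above, typically indicates that care documented only in free text is missed by the structured-data query.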

  1. A fragmentation and reassembly method for ab initio phasing.

    PubMed

    Shrestha, Rojan; Zhang, Kam Y J

    2015-02-01

Ab initio phasing with de novo models has become a viable approach for structure solution from protein crystallographic diffraction data. This approach takes advantage of the known protein sequence information, predicts de novo models and uses them for structure determination by molecular replacement. However, even state-of-the-art de novo modelling methods are limited in the accuracy of the models they predict, which is sometimes insufficient for use as templates in successful molecular replacement. A fragment-assembly phasing method has been developed that starts from an ensemble of low-accuracy de novo models, disassembles them into fragments, places them independently in the crystallographic unit cell by molecular replacement and then reassembles them into a whole structure that can provide sufficient phase information to enable complete structure determination by automated model building. Tests on ten protein targets showed that the method could solve structures for eight of these targets, even though the predicted de novo models could not be used as templates for successful molecular replacement: the best model for each target is on average more than 4.0 Å away from the native structure. The method has extended the applicability of the ab initio phasing approach using de novo models, and can be used to solve structures when the best de novo models are still of low accuracy.

  2. Towards Automated Structure-Based NMR Resonance Assignment

    NASA Astrophysics Data System (ADS)

    Jang, Richard; Gao, Xin; Li, Ming

We propose a general framework for solving the structure-based NMR backbone resonance assignment problem. The core is a novel 0-1 integer programming model that can start from a complete or partial assignment, generate multiple assignments, and model not only the assignment of spins to residues, but also pairwise dependencies consisting of pairs of spins to pairs of residues. It is still a challenge for automated resonance assignment systems to perform the assignment directly from spectra without any manual intervention. To test the feasibility of this for structure-based assignment, we integrated our system with our automated peak picking and sequence-based resonance assignment system to obtain an assignment for the protein TM1112 with 91% recall and 99% precision without manual intervention. Since using a known structure has the potential to allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data, we work towards the goal of automated structure-based assignment using only such labeled data. Our system reduced the assignment error of Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which to our knowledge is the most error-tolerant method for this problem, by fivefold on average. By using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for Ubiquitin, where the type prediction accuracy is 83%, we achieved 91% assignment accuracy, compared to the 59% accuracy that was obtained without correcting for typing errors.
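The flavor of the 0-1 model, unary spin-to-residue scores plus pairwise bonuses for spins placed on consecutive residues, can be illustrated by exhaustive search on a toy instance. Real problem sizes require an integer-programming solver, and all scores below are invented:

```python
from itertools import permutations

def assign_spins(unary, pairwise, n):
    """Exhaustive solver for a tiny spin-to-residue assignment.

    unary[s][r]   : score for placing spin s on residue r
    pairwise[s][t]: bonus when spins s, t land on consecutive residues
                    (a stand-in for the pairwise terms of the 0-1 IP).
    Returns the best permutation (perm[s] = residue of spin s) and its score.
    """
    best, best_score = None, float("-inf")
    for perm in permutations(range(n)):
        score = sum(unary[s][perm[s]] for s in range(n))
        score += sum(pairwise[s][t]
                     for s in range(n) for t in range(n)
                     if perm[t] == perm[s] + 1)
        if score > best_score:
            best, best_score = perm, score
    return best, best_score

# Three spins whose pairwise (NOESY-like) evidence chains spin 0 -> 1 -> 2.
unary = [[1.0, 0.2, 0.1], [0.2, 0.9, 0.3], [0.1, 0.2, 1.1]]
pairwise = [[0, 0.5, 0], [0, 0, 0.5], [0, 0, 0]]
best, best_score = assign_spins(unary, pairwise, 3)
print(best)  # (0, 1, 2)
```

The pairwise terms are what let connectivity evidence override a weak unary score, which is the property the integer-programming formulation above exploits at scale.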

  3. Supervised and Unsupervised Learning Technology in the Study of Rodent Behavior

    PubMed Central

    Gris, Katsiaryna V.; Coutu, Jean-Philippe; Gris, Denis

    2017-01-01

Quantifying behavior is a challenge for scientists studying neuroscience, ethology, psychology, pathology, etc. Until recently, behavior was mostly assessed through qualitative descriptions of postures or labor-intensive counting of bouts of individual movements. Many prominent behavioral scientists conducted studies describing the postures of mice and rats, depicting, step by step, eating, grooming, courting, and other behaviors. Automated video assessment technologies permit scientists to quantify daily behavioral patterns/routines, social interactions, and postural changes in an unbiased manner. Here, we extensively reviewed published research on the topic of the structural blocks of behavior and proposed a structure of behavior based on the latest publications. We discuss the importance of defining a clear structure of behavior to allow professionals to write viable algorithms. We present a discussion of the technologies that are used in automated video assessment of behavior in mice and rats, consider the advantages and limitations of supervised and unsupervised learning, and present the latest scientific discoveries that were made using automated video assessment. In conclusion, we propose that the automated quantitative approach to evaluating animal behavior is the future of understanding the effect of brain signaling, pathologies, genetic content, and environment on behavior. PMID:28804452

  4. Automated segmentation of midbrain structures with high iron content.

    PubMed

    Garzón, Benjamín; Sitnikov, Rouslan; Bäckman, Lars; Kalpouzos, Grégoria

    2018-04-15

    The substantia nigra (SN), the subthalamic nucleus (STN), and the red nucleus (RN) are midbrain structures of ample interest in many neuroimaging studies, which may benefit from the availability of automated segmentation methods. The high iron content of these structures awards them high contrast in quantitative susceptibility mapping (QSM) images. We present a novel segmentation method that leverages the information of these images to produce automated segmentations of the SN, STN, and RN. The algorithm builds a map of spatial priors for the structures by non-linearly registering a set of manually-traced training labels to the midbrain. The priors are used to inform a Gaussian mixture model of the image intensities, with smoothness constraints imposed to ensure anatomical plausibility. The method was validated on manual segmentations from a sample of 40 healthy younger and older subjects. Average Dice scores were 0.81 (0.05) for the SN, 0.66 (0.14) for the STN, and 0.88 (0.04) for the RN in the left hemisphere, and similar values were obtained for the right hemisphere. In all structures, the volumes of manual and automatically obtained segmentations were significantly correlated. The algorithm showed lower accuracy on R2* and T2-weighted Fluid Attenuated Inversion Recovery (FLAIR) images, which are also sensitive to iron content. To illustrate an application of the method, we show that the automated segmentations were comparable to the manual ones in detecting age-related differences in putative iron content. Copyright © 2017 Elsevier Inc. All rights reserved.
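    The prior-informed Gaussian mixture at the heart of the method can be sketched in miniature: a voxel's class posterior combines a spatially registered prior with a Gaussian intensity likelihood. The example below is a minimal one-voxel illustration with invented class means, variances, and priors; the published algorithm adds smoothness constraints and full 3D registration, omitted here:

```python
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical classes with QSM-like intensity models: iron-rich tissue
# (here "RN") shows higher susceptibility values than background.
classes = {"background": (0.0, 0.02), "RN": (0.12, 0.03)}

def posterior(intensity, priors):
    """Combine a registered spatial prior with the intensity likelihood,
    as in a prior-informed Gaussian mixture (smoothness terms omitted)."""
    w = {k: priors[k] * gauss(intensity, mu, s) for k, (mu, s) in classes.items()}
    z = sum(w.values())
    return {k: v / z for k, v in w.items()}

# A voxel with high susceptibility and a moderate spatial prior for RN:
p = posterior(0.11, {"background": 0.6, "RN": 0.4})
print(max(p, key=p.get))
```

Even with a lower spatial prior, the strong intensity evidence dominates the posterior for this voxel, which is exactly the behavior that gives iron-rich structures their segmentation advantage in QSM.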

  5. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameters sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
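    A standard ingredient of the solution verification described above is estimating the observed order of convergence and a mesh-independent value from quantities of interest computed on systematically refined meshes (Richardson extrapolation). A minimal sketch with illustrative numbers, not values from the paper:

```python
import math

# Hypothetical quantity of interest (e.g. a peak component temperature)
# from three meshes with a constant refinement ratio r = 2.
f_coarse, f_medium, f_fine = 84.0, 81.0, 80.25
r = 2.0

# Observed order of convergence from the three solutions.
p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Richardson extrapolation: estimate of the mesh-independent solution.
f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1)

print(round(p, 2))
print(round(f_exact, 2))
```

The gap between `f_fine` and `f_exact` then serves as a discretization-error estimate for the fine-mesh solution, one input to the overall uncertainty budget.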

  6. An automated system for liquid-liquid extraction in monosegmented flow analysis

    PubMed Central

    Facchin, Ileana; Pasquini, Celio

    1997-01-01

    An automated system to perform liquid-liquid extraction in monosegmented flow analysis is described. The system is controlled by a microcomputer that can track the localization of the aqueous monosegmented sample in the manifold. Optical switches are employed to sense the gas-liquid interface of the air bubbles that define the monosegment. The logical level changes, generated by the switches, are flagged by the computer through a home-made interface that also contains the analogue-to-digital converter for signal acquisition. The sequence of operations, necessary for a single extraction or for concentration of the analyte in the organic phase, is triggered by these logical transitions. The system was evaluated for extraction of Cd(II), Cu(II) and Zn(II) and concentration of Cd(II) from aqueous solutions at pH 9.9 (NH3/NH4Cl buffer) into chloroform containing PAN (1-(2-pyridylazo)-2-naphthol). The results show a mean repeatability of 3% (rsd) for a 2.0 mg l-1 Cd(II) solution and a linear increase of the concentration factor for a 0.5 mg l-1 Cd(II) solution observed for up to nine extraction cycles. PMID:18924792
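    The reported linear growth of the concentration factor with the number of extraction cycles can be captured by a toy mass-balance model: on each cycle a fresh aqueous segment transfers a fixed fraction of its analyte into the same organic plug. All volumes and the per-cycle efficiency below are assumed for illustration only:

```python
# Toy model of analyte enrichment in monosegmented flow extraction.
V_aq, V_org = 200.0, 50.0   # segment volumes in microlitres (assumed)
E = 0.95                    # extraction efficiency per cycle (assumed)

def concentration_factor(cycles):
    """Organic-phase concentration relative to the original aqueous
    concentration after repeated extractions into the same plug."""
    return cycles * E * V_aq / V_org

print([round(concentration_factor(n), 1) for n in (1, 3, 9)])
```

Linearity holds as long as the organic phase stays far from saturation, consistent with the linear trend the authors observed over nine cycles.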

  7. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform's capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform's potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  8. Remote voice training: A case study on space shuttle applications, appendix C

    NASA Technical Reports Server (NTRS)

    Mollakarimi, Cindy; Hamid, Tamin

    1990-01-01

    The Tile Automation System includes applications of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. An integrated set of rapid prototyping testbeds was developed which include speech recognition and synthesis, laser imaging systems, distributed Ada programming environments, distributed relational data base architectures, distributed computer network architectures, multi-media workbenches, and human factors considerations. Remote voice training in the Tile Automation System is discussed. The user is prompted over a headset by synthesized speech for the training sequences. The voice recognition units and the voice output units are remote from the user and are connected by Ethernet to the main computer system. A supervisory channel is used to monitor the training sequences. Discussions include the training approaches as well as the human factors problems and solutions for this system utilizing remote training techniques.

  9. Automated translating beam profiler for in situ laser beam spot-size and focal position measurements

    NASA Astrophysics Data System (ADS)

    Keaveney, James

    2018-03-01

    We present a simple and convenient, high-resolution solution for automated laser-beam profiling with axial translation. The device is based on a Raspberry Pi computer, Pi Noir CMOS camera, stepper motor, and commercial translation stage. We also provide software to run the device. The CMOS sensor is sensitive over a large wavelength range between 300 and 1100 nm and can be translated over 25 mm along the beam axis. The sensor head can be reversed without changing its axial position, allowing for a quantitative estimate of beam overlap with counter-propagating laser beams. Although not limited to this application, the intended use for this device is the automated measurement of the focal position and spot-size of a Gaussian laser beam. We present example data of one such measurement to illustrate device performance.
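    The intended measurement, recovering the focal position and spot size of a Gaussian beam from radii recorded along the translation axis, amounts to fitting the beam-waist formula w(z) = w0·sqrt(1 + ((z - z0)/zR)²). The sketch below uses synthetic data and a brute-force grid search; the wavelength and beam parameters are assumptions, not values from the paper:

```python
import math

WAVELENGTH = 780e-9  # metres; assumed probe wavelength

def waist(z, w0, z0):
    """Gaussian-beam 1/e^2 radius at axial position z (lengths in m)."""
    zr = math.pi * w0 ** 2 / WAVELENGTH  # Rayleigh range
    return w0 * math.sqrt(1 + ((z - z0) / zr) ** 2)

# Synthetic "camera" radii measured along 25 mm of stage travel.
true_w0, true_z0 = 50e-6, 12.5e-3
zs = [i * 1e-3 for i in range(26)]
ws = [waist(z, true_w0, true_z0) for z in zs]

def residual(params):
    w0, z0 = params
    return sum((waist(z, w0, z0) - w) ** 2 for z, w in zip(zs, ws))

# Least-squares grid search over spot size (um) and focus (0.5 mm steps);
# a real fit would use a nonlinear optimizer, but the grid keeps the
# sketch dependency-free.
candidates = [(w0 * 1e-6, z0 * 0.5e-3)
              for w0 in range(30, 71) for z0 in range(51)]
w0_fit, z0_fit = min(candidates, key=residual)
print(w0_fit, z0_fit)
```

The reversible sensor head mentioned in the abstract lets the same fit be repeated for a counter-propagating beam without losing the common axial origin, which is what makes the overlap estimate quantitative.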

  10. Automated translating beam profiler for in situ laser beam spot-size and focal position measurements.

    PubMed

    Keaveney, James

    2018-03-01

    We present a simple and convenient, high-resolution solution for automated laser-beam profiling with axial translation. The device is based on a Raspberry Pi computer, Pi Noir CMOS camera, stepper motor, and commercial translation stage. We also provide software to run the device. The CMOS sensor is sensitive over a large wavelength range between 300 and 1100 nm and can be translated over 25 mm along the beam axis. The sensor head can be reversed without changing its axial position, allowing for a quantitative estimate of beam overlap with counter-propagating laser beams. Although not limited to this application, the intended use for this device is the automated measurement of the focal position and spot-size of a Gaussian laser beam. We present example data of one such measurement to illustrate device performance.

  11. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; ...

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform's capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform's potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  12. Promoting autonomy in a smart home environment with a smarter interface.

    PubMed

    Brennan, C P; McCullagh, P J; Galway, L; Lightbody, G

    2015-01-01

    In the not too distant future, the median population age will tend towards 65, an age at which dependency on others increases. Most older people want to remain autonomous and self-sufficient for as long as possible. As environments become smarter, home automation solutions can be provided to support this aspiration. The technology discussed within this paper focuses on providing a home automation system that can be controlled by most users regardless of mobility restrictions, and hence it may be applicable to older people. It comprises a hybrid Brain-Computer Interface (BCI), a home automation user interface, and actuators. In the first instance, our system is controlled with conventional computer input, which is then replaced with eye tracking and finally with a combination of BCI and eye tracking. The systems have been assessed in terms of information throughput; benefits and limitations are evaluated.

  13. High-dimensional neural network potentials for solvation: The case of protonated water clusters in helium

    NASA Astrophysics Data System (ADS)

    Schran, Christoph; Uhl, Felix; Behler, Jörg; Marx, Dominik

    2018-03-01

    The design of accurate helium-solute interaction potentials for the simulation of chemically complex molecules solvated in superfluid helium has long been a cumbersome task due to the rather weak but strongly anisotropic nature of the interactions. We show that this challenge can be met by using a combination of an effective pair potential for the He-He interactions and a flexible high-dimensional neural network potential (NNP) for describing the complex interaction between helium and the solute in a pairwise additive manner. This approach yields an excellent agreement with a mean absolute deviation as small as 0.04 kJ mol-1 for the interaction energy between helium and both hydronium and Zundel cations compared with coupled cluster reference calculations with an energetically converged basis set. The construction and improvement of the potential can be performed in a highly automated way, which opens the door for applications to a variety of reactive molecules to study the effect of solvation on the solute as well as the solute-induced structuring of the solvent. Furthermore, we show that this NNP approach yields very convincing agreement with the coupled cluster reference for properties like many-body spatial and radial distribution functions. This holds for the microsolvation of the protonated water monomer and dimer by a few helium atoms up to their solvation in bulk helium as obtained from path integral simulations at about 1 K.
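    The pairwise-additive construction described above can be illustrated with a stand-in pair function: the total helium-solute interaction energy is a sum of per-helium pair terms. In the published work each term comes from a trained high-dimensional neural network potential; here a toy Lennard-Jones-like form with invented parameters takes its place:

```python
def pair_energy(r_nm):
    """Stand-in He-solute pair term in kJ/mol (assumed LJ-like form;
    a trained NNP evaluated on the local environment replaces this)."""
    sigma, eps = 0.30, 0.08
    x = (sigma / r_nm) ** 6
    return 4 * eps * (x * x - x)

def interaction_energy(distances_nm):
    """Pairwise-additive total: one term per helium atom."""
    return sum(pair_energy(r) for r in distances_nm)

# Three helium atoms around a solute (distances in nm, assumed):
total = interaction_energy([0.33, 0.35, 0.40])
print(total)
```

The additive decomposition is what lets the trained pair term transfer from small clusters to bulk-helium path integral simulations: the cost grows linearly with the number of helium atoms.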

  14. High-dimensional neural network potentials for solvation: The case of protonated water clusters in helium.

    PubMed

    Schran, Christoph; Uhl, Felix; Behler, Jörg; Marx, Dominik

    2018-03-14

    The design of accurate helium-solute interaction potentials for the simulation of chemically complex molecules solvated in superfluid helium has long been a cumbersome task due to the rather weak but strongly anisotropic nature of the interactions. We show that this challenge can be met by using a combination of an effective pair potential for the He-He interactions and a flexible high-dimensional neural network potential (NNP) for describing the complex interaction between helium and the solute in a pairwise additive manner. This approach yields an excellent agreement with a mean absolute deviation as small as 0.04 kJ mol-1 for the interaction energy between helium and both hydronium and Zundel cations compared with coupled cluster reference calculations with an energetically converged basis set. The construction and improvement of the potential can be performed in a highly automated way, which opens the door for applications to a variety of reactive molecules to study the effect of solvation on the solute as well as the solute-induced structuring of the solvent. Furthermore, we show that this NNP approach yields very convincing agreement with the coupled cluster reference for properties like many-body spatial and radial distribution functions. This holds for the microsolvation of the protonated water monomer and dimer by a few helium atoms up to their solvation in bulk helium as obtained from path integral simulations at about 1 K.

  15. Automated software to determine thermal diffusivity of oil-gas mixture

    NASA Astrophysics Data System (ADS)

    Khismatullin, A. S.

    2018-05-01

    The paper presents automated software to determine the thermal diffusivity of an oil-gas mixture. A series of laboratory tests covering transformer oil cooling in a power transformer tank was conducted. The paper also describes diagrams of the temperature-time dependence of bubbling. Thermal diffusivity coefficients are experimentally determined. The paper considers the mathematical problem of heat flow distribution in a rectangular parallelepiped, alongside the solution of the heat conduction equation in a power transformer tank, which is modelled as a rectangular parallelepiped. A device for temperature monitoring in the tank is described in detail, together with the relay control diagram that ensures temperature monitoring to protect the transformer against overheating.
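    The heat conduction problem mentioned above reduces, in its simplest form, to dT/dt = a · d²T/dx², which can be integrated with an explicit finite-difference scheme. The sketch below is 1D with an assumed diffusivity and assumed boundary temperatures; the paper treats the full 3D parallelepiped tank:

```python
# Explicit finite-difference sketch of 1D heat conduction.
a = 8.0e-8               # thermal diffusivity of the mixture, m^2/s (assumed)
dx = 0.01                # grid spacing, m
dt = 0.2 * dx * dx / a   # well inside the 1D stability limit dt <= dx^2/(2a)

T = [20.0] * 11          # initial uniform temperature, deg C (assumed)
T[0] = 80.0              # heated wall held at fixed temperature (assumed)

def step(T):
    """One explicit Euler step; endpoint temperatures stay fixed."""
    Tn = T[:]
    for i in range(1, len(T) - 1):
        Tn[i] = T[i] + a * dt / dx ** 2 * (T[i - 1] - 2 * T[i] + T[i + 1])
    return Tn

for _ in range(200):
    T = step(T)
print(T[1])   # temperature one node away from the heated wall
```

Fitting the measured temperature-time curves against such a model for different values of `a` is one way to back out a diffusivity coefficient, which is the role the experimental bubbling diagrams play in the paper.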

  16. National Aeronautics and Space Administration Manned Spacecraft Center data base requirements study

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A study was conducted to evaluate the types of data that the Manned Spacecraft Center (MSC) should automate in order to make available essential management and technical information to support MSC's various functions and missions. In addition, the software and hardware capabilities to best handle the storage and retrieval of this data were analyzed. Based on the results of this study, recommendations are presented for a unified data base that provides a cost effective solution to MSC's data automation requirements. The recommendations are projected through a time frame that includes the earth orbit space station.

  17. Parmodel: a web server for automated comparative modeling of proteins.

    PubMed

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models, as well as crystallographers to evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  18. Use of dye to distinguish salt and protein crystals under microcrystallization conditions

    NASA Technical Reports Server (NTRS)

    Cosenza, Larry (Inventor); Gester, Thomas E. (Inventor); Bray, Terry L. (Inventor); DeLucas, Lawrence J. (Inventor); Hamrick, David T. (Inventor)

    2007-01-01

    An improved method of screening crystal growth conditions is provided wherein molecules are crystallized from solutions containing dyes. These dyes are selectively incorporated into, or associated with, crystals of a particular character, rendering such crystals colored and improving their detection. A preferred method involves the use of dyes in protein solutions overlayed by oil. Use of oil allows the use of small volumes of solution and facilitates the screening of large numbers of crystallization conditions in arrays using automated devices that dispense appropriate solutions to generate crystallization trials, overlay crystallization trials with an oil, provide conditions conducive to crystallization, and enhance detection of dyed (colored) or undyed (uncolored) crystals that result.

  19. Transition from Legacy to Connectivity Solution for Infrastructure Control of Smart Municipal Systems

    NASA Astrophysics Data System (ADS)

    Zabasta, A.; Kunicina, N.; Kondratjevs, K.

    2017-06-01

    Collaboration between heterogeneous systems and architectures is not an easy problem in the automation domain. At present, utilities and suppliers encounter real problems due to underestimated costs of technical solutions, frustration in selecting technical solutions relevant to local needs, and incompatibilities among a plethora of protocols and associated solutions. The paper presents research on the creation of an architecture for smart municipal systems in a local cloud of services that applies SOA and IoT approaches. The authors have developed a broker that applies orchestration services and resides on a gateway, which provides adapter and protocol translation functions and applies a tool for wiring together hardware devices, APIs, and online services.

  20. Media-fill simulation tests in manual and robotic aseptic preparation of injection solutions in syringes.

    PubMed

    Krämer, Irene; Federici, Matteo; Kaiser, Vanessa; Thiesen, Judith

    2016-04-01

    The purpose of this study was to evaluate the contamination rate of media-fill products either prepared automatically with a robotic system (APOTECAchemo™) or prepared manually at cytotoxic workbenches in the same cleanroom environment and by experienced operators. Media fills were complemented by microbiological environmental control in the critical zones and used to validate the cleaning and disinfection procedures of the robotic system. The aseptic preparation of patient-individual ready-to-use injection solutions was simulated by using double-concentrated tryptic soy broth as growth medium, water for injection, and plastic syringes as primary packaging materials. Media fills were either prepared automatically (500 units) in the robot or manually (500 units) in cytotoxic workbenches in the same cleanroom over a period of 18 working days. The test solutions were incubated at room temperature (22℃) over 4 weeks. Products were visually inspected for turbidity after a 2-week and a 4-week period. Following incubation, growth promotion tests were performed with Staphylococcus epidermidis. During the media-fill procedures, passive air monitoring was performed with settle plates and surface monitoring with contact plates at predefined locations, as well as fingerprints. The plates were incubated for 5-7 days at room temperature, followed by 2-3 days at 30-35℃, and the colony forming units (cfu) were counted after both periods. The robot was cleaned and disinfected according to the established standard operating procedure on two working days prior to the media-fill session, while on six other working days only six critical components were sanitized at the end of the media-fill sessions. Every day, UV irradiation was operated for 4 h after finishing work. None of the 1000 media-fill products prepared in the two different settings showed turbidity after the incubation period, indicating no contamination with microorganisms. All products remained uniform, clear, and light-amber solutions. In addition, the reliability of the nutrient medium and the process was demonstrated by positive growth promotion tests with S. epidermidis. During automated preparation, the recommended limits of < 1 cfu per settle/contact plate set for cleanroom Grade A zones were not exceeded in the carousel and working area, but were exceeded in the loading area of the robot. During manual preparation, the number of cfu detected on settle/contact plates inside the workbenches lay far below the limits. The number of cfu detected on fingertips exceeded the limit several times during manual preparation, but not during automated preparation. There was no difference in the microbial contamination rate depending on the extent of cleaning and disinfection of the robot. Extensive media-fill tests simulating manual and automated preparation of ready-to-use cytotoxic injection solutions revealed the same level of sterility for both procedures. The results of the supplemental environmental controls confirmed that the aseptic procedures are well controlled. As there was no difference in the microbial contamination rates depending on the extent of cleaning and disinfection of the robot, the results were used to adapt the respective standard operating procedures. © The Author(s) 2014.
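    With 0 contaminated units observed out of 1000 media fills, a useful follow-up number (not computed in the abstract) is an upper confidence bound on the true contamination rate, obtained from the classical "rule of three":

```python
# Zero failures in n independent trials: solve (1 - p)^n = 0.05 for p to
# get the upper bound of a one-sided 95% confidence interval on the
# contamination rate (approximately 3/n, hence "rule of three").
n = 1000
p_upper = 1 - 0.05 ** (1 / n)
print(f"upper 95% bound on contamination rate: {p_upper:.4%}")
```

So even a perfectly clean run of 1000 units only bounds the underlying contamination rate below roughly 0.3%, which is why media-fill campaigns of this size are the norm for process validation.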
