Science.gov

Sample records for automated multistep genetic

  1. Design of a new automated multi-step outflow test apparatus

    NASA Astrophysics Data System (ADS)

    Figueras, J.; Gribb, M. M.; McNamara, J. P.

    2006-12-01

Modeling flow and transport in the vadose zone requires knowledge of the soil hydraulic properties. Laboratory studies involving vadose zone soils typically include use of the multistep outflow (MSO) method, which can provide information about wetting and drying soil-moisture and hydraulic conductivity curves from a single test. However, manual MSO testing is time consuming and measurement errors can be easily introduced. A computer-automated system has been designed to allow convenient measurement of soil-water characteristic curves. Computer-controlled solenoid valves are used to regulate the pressure inside Tempe cells to drain soil samples, and outflow volumes are measured with a pressure transducer. The electronic components of the system are controlled using LabVIEW software. This system has been optimized for undisturbed core samples. System performance has been evaluated by comparing results from undisturbed samples subjected first to manual MSO testing and then automated testing. The automated and manual MSO tests yielded similar drying soil-water characteristic curves. These curves are further compared to in-situ measurements and those obtained using pedotransfer functions for a semi-arid watershed.

  2. A miniature integrated device for automated multistep genetic assays

    PubMed Central

    Anderson, Rolfe C.; Su, Xing; Bogdan, Gregory J.; Fenton, Jeffery

    2000-01-01

    A highly integrated monolithic device was developed that automatically carries out a complex series of molecular processes on multiple samples. The device is capable of extracting and concentrating nucleic acids from milliliter aqueous samples and performing microliter chemical amplification, serial enzymatic reactions, metering, mixing and nucleic acid hybridization. The device, which is smaller than a credit card, can manipulate over 10 reagents in more than 60 sequential operations and was tested for the detection of mutations in a 1.6 kb region of the HIV genome from serum samples containing as few as 500 copies of the RNA. The elements in this device are readily linked into complex, flexible and highly parallel analysis networks for high throughput sample preparation or, conversely, for low cost portable DNA analysis instruments in point-of-care medical diagnostics, environmental testing and defensive biological agent detection. PMID:10871383

  3. Automating multi-step paper-based assays using integrated layering of reagents.

    PubMed

    Jahanshahi-Anbuhi, Sana; Kannan, Balamurali; Pennings, Kevin; Monsur Ali, M; Leung, Vincent; Giang, Karen; Wang, Jingyun; White, Dawn; Li, Yingfu; Pelton, Robert H; Brennan, John D; Filipe, Carlos D M

    2017-02-28

    We describe a versatile and simple method to perform sequential reactions on paper analytical devices by stacking dry pullulan films on paper, where each film contains one or more reagents or acts as a delay layer. Exposing the films to an aqueous solution of the analyte leads to sequential dissolution of the films in a temporally controlled manner followed by diffusive mixing of the reagents, so that sequential reactions can be performed. The films can be easily arranged for lateral flow assays or for spot tests (reactions take place sequentially in the z-direction). We have tested the general feasibility of the approach using three different model systems to demonstrate different capabilities: 1) pH ramping from low to high and high to low to demonstrate timing control; 2) rapid ready-to-use two-step Simon's assays on paper for detection of drugs of abuse utilizing a 2-layer stack containing two different reagents to demonstrate the ability to perform assays in the z-direction; and 3) sequential cell lysing and colorimetric detection of an intracellular bacterial enzyme, to demonstrate the ability of the method to perform sample preparation and analysis in the form of a spot assay. Overall, these studies demonstrate the potential of stacked pullulan films as useful components to enable multi-step assays on simple paper-based devices.

  4. Multi-step excitation energy transfer engineered in genetic fusions of natural and synthetic light-harvesting proteins.

    PubMed

    Mancini, Joshua A; Kodali, Goutham; Jiang, Jianbing; Reddy, Kanumuri Ramesh; Lindsey, Jonathan S; Bryant, Donald A; Dutton, P Leslie; Moser, Christopher C

    2017-02-01

    Synthetic proteins designed and constructed from first principles with minimal reference to the sequence of any natural protein have proven robust and extraordinarily adaptable for engineering a range of functions. Here for the first time we describe the expression and genetic fusion of a natural photosynthetic light-harvesting subunit with a synthetic protein designed for light energy capture and multi-step transfer. We demonstrate excitation energy transfer from the bilin of the CpcA subunit (phycocyanin α subunit) of the cyanobacterial photosynthetic light-harvesting phycobilisome to synthetic four-helix-bundle proteins accommodating sites that specifically bind a variety of selected photoactive tetrapyrroles positioned to enhance energy transfer by relay. The examination of combinations of different bilin, chlorin and bacteriochlorin cofactors has led to identification of the preconditions for directing energy from the bilin light-harvesting antenna into synthetic protein-cofactor constructs that can be customized for light-activated chemistry in the cell.

  5. Programming cells: towards an automated 'Genetic Compiler'.

    PubMed

    Clancy, Kevin; Voigt, Christopher A

    2010-08-01

    One of the visions of synthetic biology is to be able to program cells using a language that is similar to that used to program computers or robotics. For large genetic programs, keeping track of the DNA on the level of nucleotides becomes tedious and error prone, requiring a new generation of computer-aided design (CAD) software. To push the size of projects, it is important to abstract the designer from the process of part selection and optimization. The vision is to specify genetic programs in a higher-level language, which a genetic compiler could automatically convert into a DNA sequence. Steps towards this goal include: defining the semantics of the higher-level language, algorithms to select and assemble parts, and biophysical methods to link DNA sequence to function. These will be coupled to graphic design interfaces and simulation packages to aid in the prediction of program dynamics, optimize genes, and scan projects for errors.

  6. Automated computer vision interpretation for physical and genetic mapping experiments

    SciTech Connect

    Pathak, D.K.; Perlin, M.W.

    1994-09-01

Much of the high-throughput data currently generated in the molecular genetics laboratory is in the form of two-dimensional images. In physical mapping, a high-density gridded filter containing thousands of YACs or cosmids can be hybridized against a single labeled probe, such as the IRS-PCR products of a radiation hybrid (RH). In genetic mapping, hundreds of polymorphic dinucleotide repeat PCR experiments can be multiplexed into distinct lanes, size ranges, and fluorescent colors on a single run of an Applied Biosystems (ABI) 373A automated DNA sequencer. For all its advantages, such high-throughput data introduces a new fundamental bottleneck: the greatly increased time, expense, and error of scoring these assays when relying solely on the human visual system. To address this bottleneck, we have developed a novel computer vision automation architecture for high-throughput data interpretation in the molecular genetics laboratory. A flexible knowledge-based approach is used to systematically detect and analyze signal features, motivated by how human experts perform the interpretation. This architecture enables customization to similar hybridization- and gel-based tasks. Our prototype system has thus far been tested on both hybridization data and ABI gel images.

  7. AutoBioCAD: full biodesign automation of genetic circuits.

    PubMed

    Rodrigo, Guillermo; Jaramillo, Alfonso

    2013-05-17

    Synthetic regulatory networks with prescribed functions are engineered by assembling a reduced set of functional elements. We could also assemble them computationally if the mathematical models of those functional elements were predictive enough in different genetic contexts. Only after achieving this will we have libraries of models of biological parts able to provide predictive dynamical behaviors for most circuits constructed with them. We thus need tools that can automatically explore different genetic contexts, in addition to being able to use such libraries to design novel circuits with targeted dynamics. We have implemented a new tool, AutoBioCAD, aimed at the automated design of gene regulatory circuits. AutoBioCAD loads a library of models of genetic elements and implements evolutionary design strategies to produce (i) nucleotide sequences encoding circuits with targeted dynamics that can then be tested experimentally and (ii) circuit models for testing regulation principles in natural systems, providing a new tool for synthetic biology. AutoBioCAD can be used to model and design genetic circuits with dynamic behavior, thanks to the incorporation of stochastic effects, robustness, qualitative dynamics, multiobjective optimization, or degenerate nucleotide sequences, all facilitating the link with biological part/circuit engineering.

  8. MALDI-TOF MS analysis of soluble PEG based multi-step synthetic reaction mixtures with automated detection of reaction failure.

    PubMed

    Enjalbal, Christine; Ribière, Patrice; Lamaty, Frédéric; Yadav-Bhatnagar, Neerja; Martinez, Jean; Aubagnac, Jean-Louis

    2005-05-01

Macromolecules of tunable solubility, used to mimic inert insoluble materials while maintaining solution conditions, enabled efficient supported organic chemistry and facilitated in situ reaction monitoring. To satisfy the high throughput requirements of automated synthetic processes, organic syntheses carried out on bifunctional polyethylene glycol polymers (PEG(3400)-OH) were monitored step-by-step by matrix assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS). A protocol was designed to control the ionization mechanism of such polymers exhibiting high affinity for alkali metal cations. Automated, rapid, and reliable data interpretation was performed by an in-house developed Visual Basic application relying on the sodiated ion accurate monoisotopic mass measurement. The methodology was illustrated through the monitoring of a six-step synthetic scheme.

  9. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
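The generate-test-breed loop described in this record can be sketched as a minimal (serial) GA. This is an illustrative sketch only; the function name, parameters, and the OneMax toy fitness are ours, not the paper's:

```python
import random

def genetic_search(fitness, n_bits=20, pop_size=30, generations=100,
                   p_mut=0.01, seed=0):
    """Minimal generational GA: truncation selection, one-point
    crossover, per-bit mutation; returns the best individual seen."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # select the better half as parents (truncation selection)
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        pop = []
        while len(pop) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)            # one-point crossover
            child = [bit ^ (rng.random() < p_mut)     # bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            pop.append(child)
        best = max(pop + [best], key=fitness)
    return best

# OneMax toy problem: fitness is simply the number of 1 bits.
best = genetic_search(fitness=sum)
print(sum(best))
```

In a master-slave parallelization of this loop, only the `fitness` calls are farmed out to slave nodes; the master keeps performing selection, crossover, and mutation, which is why the communication bandwidth the paper mentions stays low.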

  10. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.

  11. Integrating GIS and genetic algorithms for automating land partitioning

    NASA Astrophysics Data System (ADS)

    Demetriou, Demetris; See, Linda; Stillwell, John

    2014-08-01

    Land consolidation is considered to be the most effective land management planning approach for controlling land fragmentation and hence improving agricultural efficiency. Land partitioning is a basic process of land consolidation that involves the subdivision of land into smaller sub-spaces subject to a number of constraints. This paper explains the development of a module called LandParcelS (Land Parcelling System) that integrates geographical information systems and a genetic algorithm to automate the land partitioning process by designing and optimising land parcels in terms of their shape, size and value. This new module has been applied to two land blocks that are part of a larger case study area in Cyprus. Partitioning is carried out by guiding a Thiessen polygon process within ArcGIS and it is treated as a multiobjective problem. The results suggest that a step forward has been made in solving this complex spatial problem, although further research is needed to improve the algorithm. The contribution of this research extends land partitioning and space partitioning in general, since these approaches may have relevance to other spatial processes that involve single or multi-objective problems that could be solved in the future by spatial evolutionary algorithms.
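The guided-Thiessen-polygon idea in this record can be illustrated with a small sketch: a GA moves seed points, each candidate layout is turned into parcels by nearest-seed (discrete Thiessen/Voronoi) assignment on a grid, and fitness rewards equal parcel areas. The grid size, single objective, and all names are our assumptions; the paper's module works inside ArcGIS with multiple objectives (shape, size, and value):

```python
import random

def regions(seeds, w, h):
    """Discrete Thiessen (Voronoi) partition: assign every grid cell to
    its nearest seed and return the resulting parcel areas."""
    sizes = [0] * len(seeds)
    for x in range(w):
        for y in range(h):
            d = [(x - sx) ** 2 + (y - sy) ** 2 for sx, sy in seeds]
            sizes[d.index(min(d))] += 1
    return sizes

def imbalance(seeds, w=30, h=20):
    """Single stand-in objective: spread of parcel areas."""
    sizes = regions(seeds, w, h)
    return max(sizes) - min(sizes)

rng = random.Random(3)
k, w, h = 4, 30, 20
pop = [[(rng.uniform(0, w), rng.uniform(0, h)) for _ in range(k)]
       for _ in range(20)]
for _ in range(40):
    pop.sort(key=imbalance)
    elites = pop[:5]                       # keep the 5 best layouts
    pop = elites + [
        [(min(max(sx + rng.gauss(0, 1.0), 0), w),
          min(max(sy + rng.gauss(0, 1.0), 0), h))
         for sx, sy in rng.choice(elites)]  # mutate an elite layout
        for _ in range(15)
    ]
best = min(pop, key=imbalance)
print(regions(best, w, h))
```

With this seed the evolved seed positions should yield parcels whose areas are far more balanced than a random layout; adding shape and value terms to the objective turns this into the multiobjective problem the paper treats.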

  12. Assessing genetic diversity in a sugarcane germplasm collection using an automated AFLP analysis.

    PubMed

    Besse, P; Taylor, G; Carroll, B; Berding, N; Burner, D; McIntyre, C L

    1998-10-01

An assessment of genetic diversity within and between Saccharum, Old World Erianthus sect. Ripidium, and North American E. giganteus (S. giganteum) was conducted using Amplified Fragment Length Polymorphism (AFLP(TM)) markers. An automated gel scoring system (GelCompar(TM)) was successfully used to analyse the complex AFLP patterns obtained in sugarcane and its relatives. Similarity coefficient calculations and clustering revealed a genetic structure for Saccharum and Erianthus sect. Ripidium that was identical to the one previously obtained using other molecular marker types, showing the appropriateness of AFLP markers and the associated automated analysis in assessing genetic diversity in sugarcane. A genetic structure that correlated with cytotype (2n=30, 60, 90) was revealed within the North American species, E. giganteus (S. giganteum). Complex relationships among Saccharum, Erianthus sect. Ripidium, and North American E. giganteus were revealed and are discussed in the light of a similar study which involved RAPD markers.

  13. Genetic Influences on Cognitive Function Using the Cambridge Neuropsychological Test Automated Battery

    ERIC Educational Resources Information Center

    Singer, Jamie J.; MacGregor, Alex J.; Cherkas, Lynn F.; Spector, Tim D.

    2006-01-01

    The genetic relationship between intelligence and components of cognition remains controversial. Conflicting results may be a function of the limited number of methods used in experimental evaluation. The current study is the first to use CANTAB (The Cambridge Neuropsychological Test Automated Battery). This is a battery of validated computerised…

  14. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    ERIC Educational Resources Information Center

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  15. Genetic design automation: engineering fantasy or scientific renewal?

    PubMed

    Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean

    2012-02-01

    The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype.

  16. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.

  17. Complex Genetics of Behavior: BXDs in the Automated Home-Cage.

    PubMed

    Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B

    2017-01-01

This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility for gene-finding studies underlying distinct behavioral phenotypes; however, such a resource poses a challenge in behavioral phenotyping. To address the specifics of large-scale screening we describe: (1) how to assess mouse behavior systematically across a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.

  18. Using digital electronic design flow to create a Genetic Design Automation tool.

    PubMed

    Gendrault, Y; Madec, M; Wlotzko, V; Andraud, M; Lallement, C; Haiech, J

    2012-01-01

Synthetic bio-systems are becoming increasingly complex, and their development is lengthy and expensive. In the same way, in microelectronics, the design process of very complex circuits has benefited from many years of experience and is now partly automated through Electronic Design Automation (EDA) tools. Both areas present analogies that can be used to create a Genetic Design Automation tool inspired by the EDA tools used in digital electronics. Such a tool would allow moving from a totally manual design of bio-systems to computer-assisted design. This ambitious project is presented in this paper, with a deep focus on the tool that automatically generates models of bio-systems directly usable in electronic simulators.

  19. Automated identification of pathways from quantitative genetic interaction data

    PubMed Central

    Battle, Alexis; Jonikas, Martin C; Walter, Peter; Weissman, Jonathan S; Koller, Daphne

    2010-01-01

    High-throughput quantitative genetic interaction (GI) measurements provide detailed information regarding the structure of the underlying biological pathways by reporting on functional dependencies between genes. However, the analytical tools for fully exploiting such information lag behind the ability to collect these data. We present a novel Bayesian learning method that uses quantitative phenotypes of double knockout organisms to automatically reconstruct detailed pathway structures. We applied our method to a recent data set that measures GIs for endoplasmic reticulum (ER) genes, using the unfolded protein response as a quantitative phenotype. The results provided reconstructions of known functional pathways including N-linked glycosylation and ER-associated protein degradation. It also contained novel relationships, such as the placement of SGT2 in the tail-anchored biogenesis pathway, a finding that we experimentally validated. Our approach should be readily applicable to the next generation of quantitative GI data sets, as assays become available for additional phenotypes and eventually higher-level organisms. PMID:20531408

  20. Deadlock-free genetic scheduling algorithm for automated manufacturing systems based on deadlock control policy.

    PubMed

    Xing, KeYi; Han, LiBin; Zhou, MengChu; Wang, Feng

    2012-06-01

Deadlock-free control and scheduling are vital for optimizing the performance of automated manufacturing systems (AMSs) with shared resources and route flexibility. Based on the Petri net models of AMSs, this paper embeds the optimal deadlock avoidance policy into a genetic algorithm and develops a novel deadlock-free genetic scheduling algorithm for AMSs. A possible solution of the scheduling problem is coded as a chromosome representation that is a permutation with repetition of parts. By using the one-step look-ahead method in the optimal deadlock control policy, the feasibility of a chromosome is checked, and infeasible chromosomes are amended into feasible ones, which can be easily decoded into a feasible deadlock-free schedule. The chromosome representation, together with the polynomial complexity of the checking and amending procedures, strongly supports the cooperative genetic search over these scheduling problems.
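The permutation-with-repetition encoding can be sketched in a few lines. Note the paper amends infeasible chromosomes using its one-step look-ahead deadlock policy; the `repair` below only restores part multiplicities after crossover, a simpler generic stand-in, and all names are ours:

```python
from collections import Counter

def decode(chromosome):
    """Permutation with repetition: the k-th occurrence of part p
    encodes operation k of part p."""
    seen = Counter()
    ops = []
    for p in chromosome:
        seen[p] += 1
        ops.append((p, seen[p]))
    return ops

def repair(chromosome, ops_per_part):
    """Crossover can leave a child with the wrong number of copies of
    some parts; replace surplus occurrences with the missing parts while
    preserving gene order.  Assumes len(chromosome) equals the total
    operation count."""
    need = Counter(ops_per_part)
    fill = sorted((need - Counter(chromosome)).elements())  # under-represented parts
    out, seen = [], Counter()
    for p in chromosome:
        seen[p] += 1
        out.append(fill.pop(0) if seen[p] > need[p] else p)
    return out

# Part 1 needs 2 operations, part 2 needs 2, but crossover produced [1,1,2,1]:
child = repair([1, 1, 2, 1], {1: 2, 2: 2})
print(child, decode(child))   # → [1, 1, 2, 2] [(1, 1), (1, 2), (2, 1), (2, 2)]
```

The decoded pairs are then dispatched in order, which is where the paper's deadlock-avoidance check would decide whether each operation may actually fire.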

  21. Influence of parameter settings in automated scoring of AFLPs on population genetic analysis.

    PubMed

    Herrmann, Marc; Holderegger, Rolf; Van Strien, Maarten J

    2013-01-01

The use of procedures for the automated scoring of amplified fragment length polymorphism (AFLP) fragments has recently increased. Corresponding software not only automatically scores the presence or absence of AFLP fragments, but also allows an evaluation of how different settings of scoring parameters influence subsequent population genetic analyses. In this study, we used the automated scoring package rawgeno to evaluate how five scoring parameters influence the number of polymorphic bins and estimates of pairwise genetic differentiation between populations (F(st)). Steps were implemented in R to automatically run the scoring process in rawgeno for a set of different parameter combinations. While we found the scoring parameters minimum bin width and minimum number of samples per bin to have only weak influence on pairwise F(st) values, maximum bin width and bin reproducibility had much stronger effects. The minimum average bin fluorescence scoring parameter affected F(st) values only moderately. At a range of scoring parameters around the default settings of rawgeno, the number of polymorphic bins as well as pairwise F(st) values stayed rather constant. This study thus shows that the particulars of AFLP scoring, be it manual or automated, can have profound effects on subsequent population genetic analysis.

  22. Single-Cell Genetic Analysis Using Automated Microfluidics to Resolve Somatic Mosaicism.

    PubMed

    Szulwach, Keith E; Chen, Peilin; Wang, Xiaohui; Wang, Jing; Weaver, Lesley S; Gonzales, Michael L; Sun, Gang; Unger, Marc A; Ramakrishnan, Ramesh

    2015-01-01

    Somatic mosaicism occurs throughout normal development and contributes to numerous disease etiologies, including tumorigenesis and neurological disorders. Intratumor genetic heterogeneity is inherent to many cancers, creating challenges for effective treatments. Unfortunately, analysis of bulk DNA masks subclonal phylogenetic architectures created by the acquisition and distribution of somatic mutations amongst cells. As a result, single-cell genetic analysis is becoming recognized as vital for accurately characterizing cancers. Despite this, methods for single-cell genetics are lacking. Here we present an automated microfluidic workflow enabling efficient cell capture, lysis, and whole genome amplification (WGA). We find that ~90% of the genome is accessible in single cells with improved uniformity relative to current single-cell WGA methods. Allelic dropout (ADO) rates were limited to 13.75% and variant false discovery rates (SNV FDR) were 4.11x10(-6), on average. Application to ER-/PR-/HER2+ breast cancer cells and matched normal controls identified novel mutations that arose in a subpopulation of cells and effectively resolved the segregation of known cancer-related mutations with single-cell resolution. Finally, we demonstrate effective cell classification using mutation profiles with 10X average exome coverage depth per cell. Our data demonstrate an efficient automated microfluidic platform for single-cell WGA that enables the resolution of somatic mutation patterns in single cells.

  23. On stiffly stable implicit linear multistep methods.

    NASA Technical Reports Server (NTRS)

    Cooke, C. H.

    1972-01-01

    The motivation to increase the step size with no degradation of numerical accuracy and stability has led to the discovery of particular members of the class of stiffly stable implicit linear multistep algorithms. Sufficient conditions for a consistent linear multistep method to be stiffly stable are given. These conditions involve properties of the stability mapping from the extended complex plane onto itself.
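The step-size limitation that motivates stiff stability can be seen with the two simplest linear multistep methods. In this sketch (ours, not the paper's), forward Euler and backward Euler are applied to the stiff test equation y' = lam*y with lam = -1000:

```python
# Stiff test equation y' = lam*y, y(0) = 1, with lam = -1000.  The exact
# solution decays almost instantly, yet with step h = 0.01 the product
# h*lam = -10 lies far outside forward Euler's stability region
# |1 + h*lam| < 1, while backward Euler is stable for every h > 0.
lam, h, steps = -1000.0, 0.01, 50

y_explicit = 1.0
y_implicit = 1.0
for _ in range(steps):
    y_explicit = (1 + h * lam) * y_explicit   # forward Euler:  |growth factor| = 9
    y_implicit = y_implicit / (1 - h * lam)   # backward Euler: |growth factor| = 1/11

print(abs(y_explicit) > 1e10, abs(y_implicit) < 1e-10)   # prints: True True
```

Stiffly stable higher-order implicit multistep methods (e.g. the BDF family) extend this behavior beyond first order, which is what allows the step size to grow without loss of stability.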

  24. Multistep synthesis of a radiolabeled imaging probe using integrated microfluidics.

    PubMed

    Lee, Chung-Cheng; Sui, Guodong; Elizarov, Arkadij; Shu, Chengyi Jenny; Shin, Young-Shik; Dooley, Alek N; Huang, Jiang; Daridon, Antoine; Wyatt, Paul; Stout, David; Kolb, Hartmuth C; Witte, Owen N; Satyamurthy, Nagichettiar; Heath, James R; Phelps, Michael E; Quake, Stephen R; Tseng, Hsian-Rong

    2005-12-16

Microreactor technology has shown potential for optimizing synthetic efficiency, particularly in preparing sensitive compounds. We achieved the synthesis of an [18F]fluoride-radiolabeled molecular imaging probe, 2-deoxy-2-[18F]fluoro-D-glucose ([18F]FDG), in an integrated microfluidic device. Five sequential processes ([18F]fluoride concentration, water evaporation, radiofluorination, solvent exchange, and hydrolytic deprotection) proceeded with high radiochemical yield and purity and with shorter synthesis time relative to conventional automated synthesis. Multiple doses of [18F]FDG for positron emission tomography imaging studies in mice were prepared. These results, which constitute a proof of principle for automated multistep syntheses at the nanogram to microgram scale, could be generalized to a range of radiolabeled substrates.

  25. Automated, quantitative cognitive/behavioral screening of mice: for genetics, pharmacology, animal cognition and undergraduate instruction.

    PubMed

    Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-02-26

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be

  6. Modelling molecule-surface interactions--an automated quantum-classical approach using a genetic algorithm.

    PubMed

    Herbers, Claudia R; Johnston, Karen; van der Vegt, Nico F A

    2011-06-14

    We present an automated and efficient method to develop force fields for molecule-surface interactions. A genetic algorithm (GA) is used to parameterise a classical force field so that the classical adsorption energy landscape of a molecule on a surface matches the corresponding landscape from density functional theory (DFT) calculations. The procedure performs a sophisticated search in the parameter phase space and converges very quickly. The method is capable of fitting a significant number of structures and corresponding adsorption energies. Water on a ZnO(0001) surface was chosen as a benchmark system but the method is implemented in a flexible way and can be applied to any system of interest. In the present case, pairwise Lennard-Jones (LJ) and Coulomb potentials are used to describe the molecule-surface interactions. In the course of the fitting procedure, the LJ parameters are refined in order to reproduce the adsorption energy landscape. The classical model is capable of describing a wide range of energies, which is essential for a realistic description of a fluid-solid interface.
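
The GA-based fitting loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the reference data, parameter ranges, and GA settings (elitism, blend crossover, Gaussian mutation) are all assumptions, and a synthetic Lennard-Jones curve stands in for the DFT adsorption energies.

```python
import random

def lj_energy(r, epsilon, sigma):
    """12-6 Lennard-Jones pair energy."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

# Synthetic stand-in for the DFT adsorption-energy landscape.
TRUE_EPS, TRUE_SIG = 0.65, 3.2
distances = [2.9 + 0.1 * i for i in range(15)]
dft_energies = [lj_energy(r, TRUE_EPS, TRUE_SIG) for r in distances]

def fitness(params):
    """Negative sum-of-squares mismatch to the reference energies."""
    eps, sig = params
    return -sum((lj_energy(r, eps, sig) - e) ** 2
                for r, e in zip(distances, dft_energies))

def evolve(pop_size=60, generations=120, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 2.0), rng.uniform(2.0, 4.5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]              # elitism: keep best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(a[i] + b[i]) / 2.0 for i in range(2)]   # blend crossover
            if rng.random() < 0.3:                            # Gaussian mutation
                k = rng.randrange(2)
                child[k] += rng.gauss(0.0, 0.05)
            children.append(tuple(child))
        pop = elite + children
    return max(pop, key=fitness)

best_eps, best_sig = evolve()
```

Because the elite individuals are carried over unchanged each generation, the best parameter pair found so far is never lost, so the squared-error mismatch decreases monotonically.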

  7. Automated synthesis of both the topology and numerical parameters for seven patented optical lens systems using genetic programming

    NASA Astrophysics Data System (ADS)

    Jones, Lee W.; Al-Sakran, Sameer H.; Koza, John R.

    2005-08-01

    This paper describes how genetic programming was used as an automated invention machine to synthesize both the topology and numerical parameters for seven previously patented optical lens systems, including one aspherical system and one issued in the 21st century. Two of the evolved optical lens systems infringe the claims of the patents and the others are novel solutions that satisfy the design goals stated in the patents. The automatic synthesis was done "from scratch"--that is, without starting from a pre-existing good design and without pre-specifying the number of lenses, the topological layout of the lenses, or the numerical parameters of the lenses. Genetic programming is a form of evolutionary computation used to automatically solve problems. It starts from a high-level statement of what needs to be done and progressively breeds a population of candidate individuals over many generations using the principle of Darwinian natural selection and genetic recombination. The paper describes how genetic programming created eyepieces that duplicated the functionality of seven previously patented lens systems. The seven designs were created in a substantially similar and routine way, suggesting that the use of genetic programming in the automated design of both the topology and numerical parameters for optical lens systems may have widespread utility.

  8. Multistep Methods for Integrating the Solar System

    DTIC Science & Technology

    1988-07-01

    Technical Report 1055: Multistep Methods for Integrating the Solar System, by Panayotis A. Skordos, MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA 02139. The report describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology, supported by the Advanced Research Projects

  9. Linear Multistep Methods for Integrating Reversible Differential Equations

    NASA Astrophysics Data System (ADS)

    Evans, N. Wyn; Tremaine, Scott

    1999-10-01

    This paper studies multistep methods for the integration of reversible dynamical systems, with particular emphasis on the planar Kepler problem. It has previously been shown by Cano & Sanz-Serna that reversible linear multisteps for first-order differential equations are generally unstable. Here we report on a subset of these methods, the zero-growth methods, that evade these instabilities. We provide an algorithm for identifying these rare methods. We find and study all zero-growth, reversible multisteps with six or fewer steps. This select group includes two well-known second-order multisteps (the trapezoidal and explicit midpoint methods), as well as three new fourth-order multisteps, one of which is explicit. Variable time steps can be readily implemented without spoiling the reversibility. Tests on Keplerian orbits show that these new reversible multisteps work well on orbits with low or moderate eccentricity, although at least 100 steps per radian are required for stability.
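
As a minimal illustration of one of the methods named above, the two-step explicit midpoint rule z_{n+1} = z_{n-1} + 2h f(z_n) can be applied to a circular orbit of the planar Kepler problem. The Euler bootstrap for the first step and the step count are choices of this sketch, not details taken from the paper.

```python
import math

def kepler_accel(x, y):
    """Acceleration for the planar Kepler problem with GM = 1."""
    r3 = (x * x + y * y) ** 1.5
    return -x / r3, -y / r3

def explicit_midpoint(state0, h, n_steps):
    """Two-step explicit midpoint rule: z_{n+1} = z_{n-1} + 2 h f(z_n).

    A reversible linear multistep method; the first step is bootstrapped
    with a single Euler step.
    """
    def f(z):
        x, y, vx, vy = z
        ax, ay = kepler_accel(x, y)
        return (vx, vy, ax, ay)

    prev = state0
    curr = tuple(s + h * d for s, d in zip(state0, f(state0)))  # Euler bootstrap
    for _ in range(n_steps - 1):
        deriv = f(curr)
        nxt = tuple(p + 2.0 * h * d for p, d in zip(prev, deriv))
        prev, curr = curr, nxt
    return curr

# Circular orbit of radius 1: speed 1, period 2*pi.
state = (1.0, 0.0, 0.0, 1.0)
h = 2 * math.pi / 2000          # about 318 steps per radian, above the stability floor
final = explicit_midpoint(state, h, 2000)
radius = math.hypot(final[0], final[1])
```

With roughly 318 steps per radian, well above the roughly 100 steps per radian the paper reports as necessary, the orbital radius stays close to 1 over a full period.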

  10. Automated workflow-based exploitation of pathway databases provides new insights into genetic associations of metabolite profiles

    PubMed Central

    2013-01-01

    Background Genome-wide association studies (GWAS) have identified many common single nucleotide polymorphisms (SNPs) that associate with clinical phenotypes, but these SNPs usually explain just a small part of the heritability and have relatively modest effect sizes. In contrast, SNPs that associate with metabolite levels generally explain a higher percentage of the genetic variation and demonstrate larger effect sizes. Still, the discovery of SNPs associated with metabolite levels is challenging, since testing all metabolites measured in typical metabolomics studies against all SNPs carries a severe multiple-testing penalty. We have developed an automated workflow approach that utilizes prior knowledge of biochemical pathways present in databases like KEGG and BioCyc to generate a smaller SNP set relevant to each metabolite. This paper explores the opportunities and challenges in the analysis of GWAS of metabolomic phenotypes and provides novel insights into the genetic basis of metabolic variation through the re-analysis of published GWAS datasets. Results Re-analysis of the published GWAS dataset from Illig et al. (Nature Genetics, 2010) using a pathway-based workflow (http://www.myexperiment.org/packs/319.html) confirmed previously identified hits and identified a new locus of human metabolic individuality, associating aldehyde dehydrogenase 1 family member L1 (ALDH1L1) with serine/glycine ratios in blood. Replication in an independent GWAS dataset of phospholipids (Demirkan et al., PLoS Genetics, 2012) identified two novel loci supported by additional literature evidence: GPAM (glycerol-3-phosphate acyltransferase) and CBS (cystathionine beta-synthase). In addition, the workflow approach provided novel insight into the affected pathways and the relevance of some of these gene-metabolite pairs in disease development and progression. Conclusions We demonstrate the utility of automated exploitation of background knowledge present in pathway databases for the analysis of GWAS

  11. Multistep sintering to synthesize fast lithium garnets

    NASA Astrophysics Data System (ADS)

    Xu, Biyi; Duan, Huanan; Xia, Wenhao; Guo, Yiping; Kang, Hongmei; Li, Hua; Liu, Hezhou

    2016-01-01

    A multistep sintering schedule is developed to synthesize Li7La3Zr2O12 (LLZO) doped with 0.2 mol% Al3+. The effect of the sintering steps on the phase, relative density and ionic conductivity of Al-doped LLZO has been evaluated using powder X-ray diffraction (XRD), scanning electron microscopy (SEM), 27Al magic-angle spinning nuclear magnetic resonance (NMR) spectroscopy and electrochemical impedance spectroscopy (EIS). The results show that by holding the sample at 900 °C for 6 h, a mixture of tetragonal and cubic garnet phases is obtained; by continuing to hold at 1100 °C for 6 h, the tetragonal phase completely transforms into the cubic phase; and by holding at 1200 °C, the relative density increases without decomposition of the cubic phase. The Al-LLZO pellets after multistep sintering exhibit a cubic phase, a relative density of 94.25% and an ionic conductivity of 4.5 × 10-4 S cm-1 at room temperature. Based on these observations, a sintering model is proposed and discussed.

  12. Synthesis of Silver Nanostructures by Multistep Methods

    PubMed Central

    Zhang, Tong; Song, Yuan-Jun; Zhang, Xiao-Yang; Wu, Jing-Yuan

    2014-01-01

    The shape of plasmonic nanostructures such as silver and gold is vital to their physical and chemical properties and potential applications. Recently, the preparation of complex, functionally rich nanostructures by multistep chemical methods has become a hotspot of research. In this review we introduce three typical multistep methods to prepare silver nanostructures with well-controlled shapes: the double reductant method, etching techniques and the construction of core-shell nanostructures. The growth mechanism of the double reductant method is that different favorable facets of silver nanocrystals are produced in different reductants, which can be used to prepare complex nanostructures such as nanoflags with ultranarrow resonant bandwidths, as well as silver nanostructures that are difficult to prepare by other methods. The etching technique can selectively remove nanoparticles to achieve shape control and is widely used for the synthesis of nanoflowers and hollow nanostructures. The construction of core-shell nanostructures is another tool to control shape and size. These three methods can prepare various silver nanostructures with well-controlled shapes that exhibit unique optical properties, such as strong surface-enhanced Raman scattering (SERS) signals and localized surface plasmon resonance (LSPR) effects, and that have potential applications in many areas. PMID:24670722

  13. Differential genetic regulation of motor activity and anxiety-related behaviors in mice using an automated home cage task.

    PubMed

    Kas, Martien J H; de Mooij-van Malsen, Annetrude J G; Olivier, Berend; Spruijt, Berry M; van Ree, Jan M

    2008-08-01

    Traditional behavioral tests, such as the open field test, measure an animal's responsiveness to a novel environment. However, it is generally difficult to assess whether the behavioral response obtained from these tests relates to the expression level of motor activity and/or to avoidance of anxiogenic areas. Here, an automated home cage environment for mice was designed to obtain independent measures of motor activity levels and of sheltered feeding preference during three consecutive days. Chronic treatment with the anxiolytic drug chlordiazepoxide (5 and 10 mg/kg/day) in C57BL/6J mice reduced sheltered feeding preference without altering motor activity levels. Furthermore, two distinct chromosome substitution strains, derived from C57BL/6J (host strain) and A/J (donor strain) inbred strains, expressed either increased sheltering preference in females (chromosome 15) or reduced motor activity levels in females and males (chromosome 1) when compared to C57BL/6J. Longitudinal behavioral monitoring revealed that these phenotypic differences were maintained after adaptation to the home cage. Thus, by using new automated behavioral phenotyping approaches, behavior can be dissociated into distinct behavioral domains (e.g., anxiety-related and motor activity domains) with different underlying genetic origin and pharmacological responsiveness.

  14. An automated diagnosis system of liver disease using artificial immune and genetic algorithms.

    PubMed

    Liang, Chunlin; Peng, Lingxi

    2013-04-01

    The rise of health care costs is one of the world's most important problems, and disease prediction is a vibrant research area. Researchers have approached this problem using various techniques such as support vector machines and artificial neural networks. This study exploits the immune system's characteristics of learning and memory to solve the problem of liver disease diagnosis. The proposed system applies a combination of two methods, artificial immune and genetic algorithms, to diagnose liver disease. The system architecture is based on an artificial immune system, and the learning procedure adopts a genetic algorithm to intervene in the evolution of the antibody population. The experiments use two benchmark datasets acquired from the well-known UCI machine learning repository. The obtained diagnosis accuracies are very promising compared with other diagnosis systems in the literature. These results suggest that this system may be a useful automatic diagnosis tool for liver disease.

  15. Spi-1/PU.1 transgenic mice develop multistep erythroleukemias.

    PubMed Central

    Moreau-Gachelin, F; Wendling, F; Molina, T; Denis, N; Titeux, M; Grimber, G; Briand, P; Vainchenker, W; Tavitian, A

    1996-01-01

    Insertional mutagenesis of the spi-1 gene is associated with the emergence of malignant proerythroblasts during Friend virus-induced acute erythroleukemia. To determine the role of spi-1/PU.1 in the genesis of leukemia, we generated spi-1 transgenic mice. In one founder line the transgene was overexpressed as an unexpected-size transcript in various mouse tissues. Homozygous transgenic animals gave rise to live-born offspring, but 50% of the animals developed a multistep erythroleukemia within 1.5 to 6 months of birth whereas the remainder survived without evidence of disease. At the onset of the disease, mice became severely anemic. Their hematopoietic tissues were massively invaded with nontumorigenic proerythroblasts that express a high level of Spi-1 protein. These transgenic proerythroblasts are partially blocked in differentiation and strictly dependent on erythropoietin for their proliferation both in vivo and in vitro. A complete but transient regression of the disease was observed after erythrocyte transfusion, suggesting that the constitutive expression of spi-1 is related to the block of the differentiation of erythroid precursors. At relapse, erythropoietin-independent malignant proerythroblasts arose. Growth factor autonomy could be partially explained by the autocrine secretion of erythropoietin; however, other genetic events appear to be necessary to confer the full malignant phenotype. These results reveal that overexpression of spi-1 is essential for malignant erythropoiesis and does not alter other hematopoietic lineages. PMID:8628313

  16. Automated microscopy system for detection and genetic characterization of fetal nucleated red blood cells on slides

    NASA Astrophysics Data System (ADS)

    Ravkin, Ilya; Temov, Vladimir

    1998-04-01

    The detection and genetic analysis of fetal cells in maternal blood will permit noninvasive prenatal screening for genetic defects. Applied Imaging has developed and is currently evaluating a system for semiautomatic detection of fetal nucleated red blood cells on slides and acquisition of their DNA probe FISH images. The specimens are blood smears from pregnant women (9 - 16 weeks gestation) enriched for nucleated red blood cells (NRBC). The cells are identified by using labeled monoclonal antibodies directed to different types of hemoglobin chains (gamma, epsilon); the nuclei are stained with DAPI. The Applied Imaging system has been implemented with both Olympus BX and Nikon Eclipse series microscopes which were equipped with transmission and fluorescence optics. The system includes the following motorized components: stage, focus, transmission, and fluorescence filter wheels. A video camera with light integration (COHU 4910) permits low light imaging. The software capabilities include scanning, relocation, autofocusing, feature extraction, facilities for operator review, and data analysis. Detection of fetal NRBCs is achieved by employing a combination of brightfield and fluorescence images of nuclear and cytoplasmic markers. The brightfield and fluorescence images are all obtained with a single multi-bandpass dichroic mirror. A Z-stack of DNA probe FISH images is acquired by moving focus and switching excitation filters. This stack is combined to produce an enhanced image for presentation and spot counting.

  17. A taste of individualized medicine: physicians’ reactions to automated genetic interpretations

    PubMed Central

    Lærum, Hallvard; Bremer, Sara; Bergan, Stein; Grünfeld, Thomas

    2014-01-01

    The potential of pharmacogenomics is well documented, and functionality exploiting this knowledge is about to be introduced into electronic medical records. To explore physicians’ reactions to automatic interpretations of genetic tests, we built a prototype with a simple interpretive algorithm. The algorithm was adapted to the needs of physicians handling immunosuppressive treatment during organ transplantation. Nine physicians were observed expressing their thoughts while using the prototype for two patient scenarios. The computer screen and audio were recorded, and the qualitative results triangulated with responses to a survey instrument. The physicians’ reactions to the prototype were very positive; they clearly trusted the results and the theory behind them. The explanation of the algorithm was prominently placed in the user interface for transparency, although this design led to considerable confusion. Background information and references should be available, but considerably less prominent than the result and recommendation. PMID:24001515

  18. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
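
A toy version of the GA/DSM combination might look as follows. The 6x6 matrix, the cost function (a crossing penalty plus a cluster-size penalty), and the GA settings are illustrative assumptions; the actual tool described above is an Excel macro operating on spreadsheet DSMs.

```python
import random

# Hypothetical 6-element dependency structure matrix (1 = dependency).
DSM = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
N = len(DSM)

def cost(assign):
    """Penalize dependencies that cross cluster boundaries, plus a size
    penalty so the trivial everything-in-one-cluster layout is not optimal."""
    crossing = sum(DSM[i][j] for i in range(N) for j in range(N)
                   if DSM[i][j] and assign[i] != assign[j])
    sizes = [assign.count(c) for c in set(assign)]
    return 10 * crossing + sum(s * s for s in sizes)

def ga_cluster(n_clusters=2, pop_size=60, generations=80, seed=7):
    rng = random.Random(seed)
    pop = [[rng.randrange(n_clusters) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N)             # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                # point mutation
                child[rng.randrange(N)] = rng.randrange(n_clusters)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = ga_cluster()
```

On this block-diagonal example the minimum cost is reached by grouping elements 0-2 in one cluster and 3-5 in the other, which is the clustering a design manager would want the tool to surface.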

  19. Multistep, effective drug distribution within solid tumors

    PubMed Central

    Shemi, Amotz; Khvalevsky, Elina Zorde; Gabai, Rachel Malka; Domb, Abraham; Barenholz, Yechezkel

    2015-01-01

    The distribution of drugs within solid tumors presents a long-standing barrier for efficient cancer therapies. Tumors are highly resistant to diffusion, and the lack of blood and lymphatic flows suppresses convection. Prolonged, continuous intratumoral drug delivery from a miniature drug source offers an alternative to both systemic delivery and intratumoral injection. Presented here is a model of drug distribution from such a source, in a multistep process. At delivery onset the drug mainly affects the closest surroundings. Such ‘priming’ enables drug penetration to successive cell layers. Tumor ‘void volume’ (volume not occupied by cells) increases, facilitating lymphatic perfusion. The drug is then transported by hydraulic convection downstream along interstitial fluid pressure (IFP) gradients, away from the tumor core. After a week tumor cell death occurs throughout the entire tumor and IFP gradients are flattened. Then, the drug is transported mainly by ‘mixing’, powered by physiological bulk body movements. Steady state is achieved and the drug covers the entire tumor over several months. Supporting measurements are provided from the LODER™ system, releasing siRNA against mutated KRAS over months in pancreatic cancer in-vivo models. LODER™ was also successfully employed in a recent Phase 1/2 clinical trial with pancreatic cancer patients. PMID:26416413

  20. Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy.

    PubMed

    Welikala, R A; Fraz, M M; Dehmeshki, J; Hoppe, A; Tah, V; Mann, S; Williamson, T H; Barman, S A

    2015-07-01

    Proliferative diabetic retinopathy (PDR) is a condition that carries a high risk of severe visual impairment. The hallmark of PDR is the growth of abnormal new vessels. In this paper, an automated method for the detection of new vessels from retinal images is presented. This method is based on a dual classification approach. Two vessel segmentation approaches are applied to create two separate binary vessel maps, each of which holds vital information. Local morphology features are measured from each binary vessel map to produce two separate 4-D feature vectors. Independent classification is performed for each feature vector using a support vector machine (SVM) classifier. The system then combines these individual outcomes to produce a final decision. This is followed by the creation of additional features to generate 21-D feature vectors, which feed into a genetic algorithm based feature selection approach with the objective of finding feature subsets that improve the performance of the classification. Sensitivity and specificity results using a dataset of 60 images are 0.9138 and 0.9600, respectively, on a per patch basis and 1.000 and 0.975, respectively, on a per image basis.
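
The genetic-algorithm feature-selection stage can be sketched independently of the classifier. In this toy version a hand-made scoring function stands in for the SVM cross-validation score, and the feature count and GA settings are assumptions, not values from the paper.

```python
import random

N_FEATURES = 8

def score(bits):
    """Hypothetical stand-in for the SVM cross-validation score: features 2
    and 5 carry signal, and every extra selected feature adds a small penalty."""
    chosen = {i for i, b in enumerate(bits) if b}
    hits = len(chosen & {2, 5})
    return hits - 0.1 * (len(chosen) - hits)

def ga_select(pop_size=30, generations=40, seed=3):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        elite = pop[: pop_size // 2]              # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, N_FEATURES)    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                # bit-flip mutation
                k = rng.randrange(N_FEATURES)
                child[k] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=score)

best = ga_select()
selected = {i for i, b in enumerate(best) if b}
```

In the paper the same search is run over 21-D feature vectors with the SVM's cross-validated performance as the fitness; the bitstring encoding and genetic operators carry over unchanged.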

  1. A generalized theory of chromatography and multistep liquid extraction

    NASA Astrophysics Data System (ADS)

    Chizhkov, V. P.; Boitsov, V. N.

    2017-03-01

    A generalized theory of chromatography and multistep liquid extraction is developed. The principles of highly efficient processes for fine preparative separation of binary mixture components on a fixed sorbent layer are discussed.

  2. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    PubMed

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was setup in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix of 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler leading to the reduction of manual work, and increased quality and throughput.

  3. Multi-step motion planning: Application to free-climbing robots

    NASA Astrophysics Data System (ADS)

    Bretl, Timothy Wolfe

    This dissertation addresses the problem of planning the motion of a multi-limbed robot to "free-climb" vertical rock surfaces. Free-climbing relies on natural features and friction (such as holes or protrusions) rather than special fixtures or tools. It requires strength, but more importantly it requires deliberate reasoning: not only must the robot decide how to adjust its posture to reach the next feature without falling, it must plan an entire sequence of steps, where each one might have future consequences. This process of reasoning is called multi-step planning. A multi-step planning framework is presented for computing non-gaited, free-climbing motions. This framework derives from an analysis of a free-climbing robot's configuration space, which can be decomposed into constraint manifolds associated with each state of contact between the robot and its environment. An understanding of the adjacency between manifolds motivates a two-stage strategy that uses a candidate sequence of steps to direct the subsequent search for motions. Three algorithms are developed to support the framework. The first algorithm reduces the amount of time required to plan each potential step, a large number of which must be considered over an entire multi-step search. It extends the probabilistic roadmap (PRM) approach based on an analysis of the interaction between balance and the topology of closed kinematic chains. The second algorithm addresses a problem with the PRM approach, that it is unable to distinguish challenging steps (which may be critical) from impossible ones. This algorithm detects impossible steps explicitly, using automated algebraic inference and machine learning. The third algorithm provides a fast constraint checker (on which the PRM approach depends), in particular a test of balance at the initially unknown number of sampled configurations associated with each step. It is a method of incremental precomputation, fast because it takes advantage of the sample

  4. A Multistep Synthesis for an Advanced Undergraduate Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Chang Ji; Peters, Dennis G.

    2006-01-01

    Multistep syntheses are often important components of the undergraduate organic laboratory experience, and a three-step synthesis of 5-(2-sulfhydrylethyl) salicylaldehyde is described. The experiment is useful as a special project for an advanced undergraduate organic chemistry laboratory course and offers opportunities for students to master a…

  5. INDES User's guide multistep input design with nonlinear rotorcraft modeling

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The INDES computer program, a multistep input design program used as part of a data processing technique for rotorcraft systems identification, is described. Flight test inputs based on INDES improve the accuracy of parameter estimates. The input design algorithm, program input, and program output are presented.

  6. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... process. 15.202 Section 15.202 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... concept, past performance, and limited pricing information). At a minimum, the notice shall...

  7. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... process. 15.202 Section 15.202 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... concept, past performance, and limited pricing information). At a minimum, the notice shall...

  8. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... process. 15.202 Section 15.202 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... concept, past performance, and limited pricing information). At a minimum, the notice shall...

  9. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... process. 15.202 Section 15.202 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... concept, past performance, and limited pricing information). At a minimum, the notice shall...

  10. The Application of Baum-Welch Algorithm in Multistep Attack

    PubMed Central

    Zhang, Yanxue; Zhao, Dongmei; Liu, Jinxing

    2014-01-01

    The biggest difficulty in applying hidden Markov models to multistep attacks is the determination of observations. Research on determining observations is still lacking and shows a certain degree of subjectivity. In this regard, we integrate attack intentions with the hidden Markov model (HMM) and propose a method for forecasting multistep attacks based on the HMM. First, we train the existing hidden Markov model(s) with the Baum-Welch algorithm. Then we recognize the alerts belonging to attack scenarios with the Forward algorithm. Finally, we forecast the next possible attack sequence with the Viterbi algorithm. The results of simulation experiments show that trained hidden Markov models outperform untrained ones in both recognition and prediction. PMID:24991642
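
The prediction step described above rests on the standard Viterbi recursion. The sketch below uses a toy two-state "attack phase" HMM with made-up probabilities, not parameters from the paper.

```python
import math

# Toy two-state attack-phase HMM (all numbers are assumptions).
# States: 0 = "probe", 1 = "exploit". Observations: 0 = scan alert, 1 = shell alert.
start = [0.8, 0.2]
trans = [[0.6, 0.4],
         [0.1, 0.9]]
emit = [[0.9, 0.1],
        [0.2, 0.8]]

def viterbi(obs):
    """Most likely hidden-state sequence, computed in log space."""
    n = len(start)
    delta = [math.log(start[s] * emit[s][obs[0]]) for s in range(n)]
    back = []                                   # backpointers per time step
    for o in obs[1:]:
        ptrs, new = [], []
        for s in range(n):
            p = max(range(n), key=lambda q: delta[q] + math.log(trans[q][s]))
            new.append(delta[p] + math.log(trans[p][s] * emit[s][o]))
            ptrs.append(p)
        delta = new
        back.append(ptrs)
    # Trace back the best path from the best final state.
    state = max(range(n), key=lambda s: delta[s])
    path = [state]
    for ptrs in reversed(back):
        state = ptrs[state]
        path.append(state)
    return path[::-1]

path = viterbi([0, 0, 1, 1])
```

Given two scan alerts followed by two shell alerts, the decoded phase sequence moves from "probe" to "exploit", which is the kind of next-step inference the paper uses for attack forecasting.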

  11. Optimal installation locations for automated external defibrillators in Taipei 7-Eleven stores: using GIS and a genetic algorithm with a new stirring operator.

    PubMed

    Huang, Chung-Yuan; Wen, Tzai-Hung

    2014-01-01

    Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.

  12. Fully automated sample preparation microsystem for genetic testing of hereditary hearing loss using two-color multiplex allele-specific PCR.

    PubMed

    Zhuang, Bin; Gan, Wupeng; Wang, Shuaiqin; Han, Junping; Xiang, Guangxin; Li, Cai-Xia; Sun, Jing; Liu, Peng

    2015-01-20

    A fully automated microsystem consisting of a disposable DNA extraction and PCR microchip, as well as a compact control instrument, has been successfully developed for genetic testing of hereditary hearing loss from human whole blood. DNA extraction and PCR were integrated into a single 15-μL reaction chamber, where a piece of filter paper was embedded for capturing genomic DNA, followed by in-situ PCR amplification without elution. Diaphragm microvalves actuated by external solenoids together with a "one-way" fluidic control strategy operated by a modular valve positioner and a syringe pump were employed to control the fluids and to seal the chamber during thermal cycling. Fully automated DNA extractions from as little as 0.3 μL of human whole blood followed by amplifications of 59-bp β-actin fragments can be completed on the microsystem in about 100 min. Negative control tests that were performed between blood sample analyses proved the successful elimination of any contamination or carryover in the system. To more critically test the microsystem, a two-color multiplex allele-specific PCR (ASPCR) assay for detecting c.176_191del16, c.235delC, and c.299_300delAT mutations in GJB2 gene that accounts for hereditary hearing loss was constructed. Two allele-specific primers, one labeled with TAMRA for wild type and the other with FAM for mutation, were designed for each locus. DNA extraction from blood and ASPCR were performed on the microsystem, followed by an electrophoretic analysis on a portable microchip capillary electrophoresis system. Blood samples from a healthy donor and five persons with genetic mutations were all accurately analyzed with only two steps in less than 2 h.

  13. Automated discovery of structural features of the optic nerve head on the basis of image and genetic data

    NASA Astrophysics Data System (ADS)

    Christopher, Mark; Tang, Li; Fingert, John H.; Scheetz, Todd E.; Abramoff, Michael D.

    2014-03-01

    Evaluation of optic nerve head (ONH) structure is a commonly used clinical technique for both diagnosis and monitoring of glaucoma. Glaucoma is associated with characteristic changes in the structure of the ONH. We present a method for computationally identifying ONH structural features using both imaging and genetic data from a large cohort of participants at risk for primary open angle glaucoma (POAG). Using 1054 participants from the Ocular Hypertension Treatment Study, ONH structure was measured by application of a stereo correspondence algorithm to stereo fundus images. In addition, the genotypes of several known POAG genetic risk factors were considered for each participant. ONH structural features were discovered using both a principal component analysis approach to identify the major modes of variance within structural measurements and a linear discriminant analysis approach to capture the relationship between genetic risk factors and ONH structure. The identified ONH structural features were evaluated based on the strength of their associations with genotype and development of POAG by the end of the OHTS study. ONH structural features with strong associations with genotype were identified for each of the genetic loci considered. Several identified ONH structural features were significantly associated (p < 0.05) with the development of POAG after Bonferroni correction. Further, incorporation of genetic risk status was found to substantially increase the performance of early POAG prediction. These results suggest that incorporating both imaging and genetic data into ONH structural modeling significantly improves the ability to explain POAG-related changes to ONH structure.
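
The principal-component step described above (finding the major modes of variance in per-participant structural measurements) can be sketched in a few lines. The random measurement matrix below merely stands in for the stereo-derived ONH measurements; it is not the authors' data or pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))  # 100 participants x 12 structural measurements

Xc = X - X.mean(axis=0)                    # center each measurement column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)            # variance fraction per mode
scores = Xc @ Vt.T                         # per-participant feature scores

print(explained[:3])                       # leading modes of variance
```

The rows of `Vt` are the structural "modes"; the `scores` are the low-dimensional features one would then relate to genotype (e.g., via linear discriminant analysis, as the abstract describes).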

  14. Event-triggered logical flow control for comprehensive process integration of multi-step assays on centrifugal microfluidic platforms.

    PubMed

    Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens

    2014-07-07

    The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling in the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to the completion of the preceding liquid-transfer event, i.e. completely independently of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from
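
The event-triggered logic described here (each laboratory unit operation fires only when its upstream liquid-transfer events complete, with serial, parallel, and AND-conditional relationships) has a natural software analogue as a small dependency scheduler. The step names below are hypothetical, chosen only to mirror the operations the abstract mentions.

```python
def run_protocol(steps):
    """steps: {name: list of prerequisite step names}. Returns firing order."""
    done, order = set(), []
    while len(done) < len(steps):
        ready = [s for s, deps in steps.items()
                 if s not in done and all(d in done for d in deps)]
        if not ready:
            raise RuntimeError("deadlock: unmet dependencies")
        for s in sorted(ready):   # parallel-ready steps fire in the same pass
            done.add(s)
            order.append(s)
    return order

# Serial chain with one AND-conditional: 'mix' waits for both loading steps.
protocol = {
    "load_A": [], "load_B": [],
    "mix": ["load_A", "load_B"],   # AND-conditional
    "meter": ["mix"],
    "route": ["meter"],
}
print(run_protocol(protocol))
```

On the physical disc the "events" are dissolvable-film valve openings rather than function calls, but the control-flow graph is the same kind of object.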

  15. Genetics

    MedlinePlus

    ... Inheritance; Heterozygous; Inheritance patterns; Heredity and disease; Heritable; Genetic markers ... The chromosomes are made up of strands of genetic information called DNA. Each chromosome contains sections of ...

  16. Power transmission coefficients for multi-step index optical fibres.

    PubMed

    Aldabaldetreku, Gotzon; Zubia, Joseba; Durana, Gaizka; Arrue, Jon

    2006-02-20

    The aim of the present paper is to provide a single analytical expression of the power transmission coefficient for leaky rays in multi-step index (MSI) fibres. This expression is valid for all tunnelling and refracting rays and allows us to evaluate numerically the power attenuation along an MSI fibre of an arbitrary number of layers. We validate our analysis by comparing the results obtained for limit cases of MSI fibres with those corresponding to step-index (SI) and graded-index (GI) fibres. We also make a similar comparison between this theoretical expression and the use of the WKB solutions of the scalar wave equation.
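
As a rough numerical illustration of what a power transmission coefficient implies (with assumed values, not numbers from the paper): a leaky ray that keeps a fraction T of its power at each core-cladding reflection attenuates geometrically with fibre length.

```python
import math

def leaky_ray_power(P0, T, reflections_per_m, length_m):
    """Power remaining after length_m of fibre, given the power transmission
    coefficient T (fraction of power retained per reflection)."""
    return P0 * T ** (reflections_per_m * length_m)

# Illustrative numbers: T = 0.999 per reflection, 200 reflections per metre.
P = leaky_ray_power(1.0, 0.999, 200, 10)   # power left after 10 m
loss_db = -10 * math.log10(P)              # total attenuation in dB
print(P, loss_db)
```

In an MSI fibre the per-reflection coefficient itself varies with ray invariants and layer structure, which is what the paper's analytical expression supplies.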

  17. Microwaves in drug discovery and multi-step synthesis.

    PubMed

    Alexandre, François-René; Domon, Lisianne; Frère, Stéphane; Testard, Alexandra; Thiéry, Valérie; Besson, Thierry

    2003-01-01

    The interest of microwaves in drug discovery and multi-step synthesis is presented with the aim of describing our strategy. These studies are connected with our work on the synthesis of original heterocyclic compounds with potential pharmaceutical value. Reactions in the presence of solvent and solvent-free syntheses can be realised under a variety of conditions; selected results are given for some of these and, where available, compared with results obtained under the same solvent-free conditions but with classical heating.

  18. Modeling biology with HDL languages: a first step toward a genetic design automation tool inspired from microelectronics.

    PubMed

    Gendrault, Yves; Madec, Morgan; Lallement, Christophe; Haiech, Jacques

    2014-04-01

    Nowadays, synthetic biology is a hot research topic. Progress is made each day toward more complex artificial biological functions, biodevices, and biosystems. Up to now, these systems have been handmade by bioengineers, which requires strong technical skills and leads to non-reusable developments. Other scientific fields that share the same design approach, such as microelectronics, have already overcome several of these issues, and their designers succeed in building extremely complex systems with many evolved functions. In systems engineering, and more specifically in microelectronics, the development of the domain has been driven both by the improvement of technological processes and by electronic design automation tools. The work presented in this paper paves the way for the adaptation of microelectronics design tools to synthetic biology. Considering the similarities and differences between synthetic biology and microelectronics, the milestones of this adaptation are described. The first one concerns the modeling of biological mechanisms. To do so, a new formalism is proposed, based on an extension of the generalized Kirchhoff laws to biology. In this way, all biological mechanisms can be described with languages widely used in microelectronics. Our approach is successfully validated on specific examples drawn from the literature.

  19. DNA Compatible Multistep Synthesis and Applications to DNA Encoded Libraries.

    PubMed

    Satz, Alexander Lee; Cai, Jianping; Chen, Yi; Goodnow, Robert; Gruber, Felix; Kowalczyk, Agnieszka; Petersen, Ann; Naderi-Oboodi, Goli; Orzechowski, Lucja; Strebel, Quentin

    2015-08-19

    Complex mixtures of DNA encoded small molecules may be readily interrogated via high-throughput sequencing. These DNA encoded libraries (DELs) are commonly used to discover molecules that interact with pharmaceutically relevant proteins. The chemical diversity displayed by the library is key to successful discovery of potent, novel, and drug-like chemical matter. The small molecule moieties of DELs are generally synthesized through a multistep process, and each chemical step is accomplished while the small molecule remains attached to its encoding DNA oligomer. Hence, library chemical diversity is often limited to DNA compatible synthetic reactions. Herein, protocols for 24 reactions are provided that have been optimized for high-throughput production of DELs. These protocols detail the multistep synthesis of benzimidazoles, imidazolidinones, quinazolinones, isoindolinones, thiazoles, and imidazopyridines. Additionally, protocols are provided for a diverse range of useful chemical reactions including BOC deprotection (under pH neutral conditions), carbamylation, and Sonogashira coupling. Last, step-by-step protocols for synthesizing functionalized DELs from trichloronitropyrimidine and trichloropyrimidine scaffolds are detailed.

  20. Adaptation to vocal expressions reveals multistep perception of auditory emotion.

    PubMed

    Bestelmeyer, Patricia E G; Maurage, Pierre; Rouger, Julien; Latinus, Marianne; Belin, Pascal

    2014-06-11

    The human voice carries speech as well as important nonlinguistic signals that influence our social interactions. Among these cues that impact our behavior and communication with other people is the perceived emotional state of the speaker. A theoretical framework for the neural processing stages of emotional prosody has suggested that auditory emotion is perceived in multiple steps (Schirmer and Kotz, 2006) involving low-level auditory analysis and integration of the acoustic information followed by higher-level cognition. Empirical evidence for this multistep processing chain, however, is still sparse. We examined this question using functional magnetic resonance imaging and a continuous carry-over design (Aguirre, 2007) to measure brain activity while volunteers listened to non-speech-affective vocalizations morphed on a continuum between anger and fear. Analyses dissociated neuronal adaptation effects induced by similarity in perceived emotional content between consecutive stimuli from those induced by their acoustic similarity. We found that bilateral voice-sensitive auditory regions as well as right amygdala coded the physical difference between consecutive stimuli. In contrast, activity in bilateral anterior insulae, medial superior frontal cortex, precuneus, and subcortical regions such as bilateral hippocampi depended predominantly on the perceptual difference between morphs. Our results suggest that the processing of vocal affect recognition is a multistep process involving largely distinct neural networks. Amygdala and auditory areas predominantly code emotion-related acoustic information while more anterior insular and prefrontal regions respond to the abstract, cognitive representation of vocal affect.

  1. Multistep carcinogenesis in the formation of basal cell carcinoma of the skin

    SciTech Connect

    Gailani, M.; Leffell, D.; Ziegler, A.

    1994-09-01

    Basal cell carcinoma of the skin (BCC) is the most common cancer in humans, a slow growing tumor whose incidence strongly correlates with exposure to UV light. Although the molecular basis of BCC formation is not well understood, loss of heterozygosity (LOH) for markers on chromosome 9q in 70% of BCCs suggests that inactivation of a tumor suppressor on 9q22 is an important early step. UV induced mutations in the p53 gene have also been found in over 50% of sporadic BCCs. We analyzed 18 sporadic BCCs for allelic loss on chromosome 9 and point mutations in the p53 gene and attempted to correlate genetic alteration with pathological subtype and relative UV light exposure. Eight of eighteen tumors (44%) showed LOH on chromosome 9 as well as point mutation of the p53 gene, three of eighteen tumors (17%) showed mutation of the p53 gene without LOH on chromosome 9, five of eighteen tumors (28%) showed LOH on chromosome 9 without evidence of mutation in the p53 gene, and two of eighteen tumors (11%) showed neither LOH on chromosome 9 nor mutation in the p53 gene. Tumor pathology showed no obvious correlation between mutation and tumor aggressiveness. However, one tumor of a unique, aggressive growth subtype had no genetic alteration, suggesting a different genetic mechanism in this particular subgroup. Of the tumors from areas of greatest sun exposure, 38% showed both mutations. The data suggest a strong correlation between inactivation of a tumor suppressor gene on chromosome 9 and mutation in the p53 gene, though the sequence of events cannot be determined. Because carcinogenesis is a multistep process and genetic injury from UV light is only one factor, further correlation with degree of tumor differentiation may clarify the genetic process in BCCs.

  2. Vitamin B12 transport from food to the body's cells--a sophisticated, multistep pathway.

    PubMed

    Nielsen, Marianne J; Rasmussen, Mie R; Andersen, Christian B F; Nexø, Ebba; Moestrup, Søren K

    2012-05-01

    Vitamin B(12) (B(12); also known as cobalamin) is a cofactor in many metabolic processes; deficiency of this vitamin is associated with megaloblastic anaemia and various neurological disorders. In contrast to many prokaryotes, humans and other mammals are unable to synthesize B(12). Instead, a sophisticated pathway for specific uptake and transport of this molecule has evolved. Failure in the gastrointestinal part of this pathway is the most common cause of nondietary-induced B(12) deficiency disease. However, although less frequent, defects in cellular processing and further downstream steps in the transport pathway are also known culprits of functional B(12) deficiency. Biochemical and genetic approaches have identified novel proteins in the B(12) transport pathway--now known to involve more than 15 gene products--delineating a coherent pathway for B(12) trafficking from food to the body's cells. Some of these gene products are specifically dedicated to B(12) transport, whereas others embrace additional roles, which explains the heterogeneity in the clinical picture of the many genetic disorders causing B(12) deficiency. This Review describes basic and clinical features of this multistep pathway with emphasis on gastrointestinal transport of B(12) and its importance in clinical medicine.

  3. An ordinary differential equation model for the multistep transformation to cancer.

    PubMed

    Spencer, Sabrina L; Berryman, Matthew J; García, José A; Abbott, Derek

    2004-12-21

    Cancer is viewed as a multistep process whereby a normal cell is transformed into a cancer cell through the acquisition of mutations. We reduce the complexities of cancer progression to a simple set of underlying rules that govern the transformation of normal cells to malignant cells. In doing so, we derive an ordinary differential equation model that explores how the balance of angiogenesis, cell death rates, genetic instability, and replication rates gives rise to different kinetics in the development of cancer. The key predictions of the model are that cancer develops fastest through a particular ordering of mutations and that mutations in genes that maintain genomic integrity would be the most deleterious type of mutations to inherit. In addition, we perform a sensitivity analysis on the parameters included in the model to determine the probable contribution of each. This paper presents a novel approach to viewing the genetic basis of cancer from a systems biology perspective and provides the groundwork for other models that can be directly tied to clinical and molecular data.
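
A minimal sketch of a staged-transformation ODE model in this spirit, assuming a simple normal → premalignant → cancer compartment chain with illustrative rate constants (these are not the authors' equations or parameters), integrated by forward Euler:

```python
def simulate(mu=1e-3, k=1e-2, r=0.1, dt=0.01, T=100.0):
    """Normal cells N mutate at rate mu into a premalignant pool M, which
    converts at rate k into cancer cells C replicating at net rate r."""
    N, M, C = 1e6, 0.0, 0.0
    t = 0.0
    while t < T:
        dN = -mu * N              # loss of normal cells to first mutation
        dM = mu * N - k * M       # premalignant pool: inflow minus conversion
        dC = k * M + r * C        # cancer pool: conversion plus replication
        N += dN * dt
        M += dM * dt
        C += dC * dt
        t += dt
    return N, M, C

N, M, C = simulate()
print(N, M, C)
```

Changing the order in which rate constants are "switched on" over time is how one would probe the paper's claim that a particular ordering of mutations yields the fastest progression.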

  4. Autonomous Multistep Organic Synthesis in a Single Isothermal Solution Mediated by a DNA Walker

    PubMed Central

    He, Yu; Liu, David R.

    2010-01-01

    Multistep synthesis in the laboratory typically requires numerous reaction vessels, each containing a different set of reactants. In contrast, cells are capable of performing highly efficient and selective multistep biosynthesis under mild conditions with all reactants simultaneously present in solution. If the latter approach could be applied in the laboratory, it may improve the ease, speed, and efficiency of multistep reaction sequences. Here we show that a DNA mechanical device— a DNA walker moving along a DNA track— can be used to perform a series of amine acylation reactions in a single solution without any external intervention. The multistep products generated by this primitive ribosome mimetic are programmed by the sequence of the DNA track, are unrelated to the structure of DNA, and are formed with speeds and overall yields significantly greater than those previously achieved by multistep DNA-templated small-molecule synthesis. PMID:20935654

  5. Multistep prediction of physiological tremor for surgical robotics applications.

    PubMed

    Veluvolu, Kalyana C; Tatinati, Sivanagaraja; Hong, Sun-Mog; Ang, Wei Tech

    2013-11-01

    Accurate canceling of physiological tremor is extremely important in robotics-assisted surgical instruments/procedures. The performance of robotics-based hand-held surgical devices degrades in real time due to the presence of phase delay in sensors (hardware) and filtering (software) processes. Effective tremor compensation requires zero-phase lag in the filtering process so that the filtered tremor signal can be used to regenerate an opposing motion in real time. A delay as small as 20 ms degrades the performance of the human-machine interface. To overcome this phase delay, we employ multistep prediction in this paper. Combined with the existing tremor estimation methods, the procedure improves the overall accuracy of tremor estimation by 60% compared to single-step prediction methods in the presence of phase delay. Experimental results with the developed methods for 1-DOF tremor estimation highlight the improvement.

  6. Global model including multistep ionizations in helium plasmas

    NASA Astrophysics Data System (ADS)

    Oh, Seung-Ju; Lee, Hyo-Chang; Chung, Chin-Wook

    2016-12-01

    Particle and power balance equations including stepwise ionizations are derived and solved for helium plasmas. In the balance equations, two metastable states (the 2¹S singlet and the 2³S triplet) are considered, and the following results are obtained. The plasma density increases linearly with absorbed power, while the electron temperature remains nearly constant. It is also found that the contribution of multi-step ionization relative to single-step ionization is in the range of 8%-23% as the gas pressure increases from 10 mTorr to 100 mTorr. Compared to the results in argon plasma, there is little variation in the collisional energy loss per electron-ion pair created (ɛc) with absorbed power and gas pressure, due to the small collision cross section and higher inelastic collision threshold energy.

  7. Multi-step prediction of physiological tremor for robotics applications.

    PubMed

    Veluvolu, K C; Tatinati, S; Hong, S M; Ang, W T

    2013-01-01

    The real-time performance of surgical robotic devices depends mainly on the phase delay introduced by sensors and the filtering process. A phase delay of 16-20 ms is unavoidable in these robotic procedures due to the hardware low-pass filter present in sensors and the pre-filtering required in later stages of cancellation. To overcome this phase delay, we employ multi-step prediction with the band-limited multiple Fourier linear combiner (BMFLC) and autoregressive (AR) methods. Results show that the overall accuracy is improved by 60% for tremor estimation compared to single-step prediction methods in the presence of phase delay. Experimental results with the proposed methods for 1-DOF tremor estimation highlight the improvement.
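
Multi-step AR prediction of the kind used here can be sketched as follows: fit AR coefficients by least squares, then iterate the one-step predictor across the delay window. The sampling rate, model order, and 10 Hz test signal below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit: coeffs[i] multiplies the (i+1)-step-old sample."""
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_multistep(x, coeffs, k):
    """Iterate the one-step AR predictor k steps past the end of x."""
    hist = list(x[-len(coeffs):])                # chronological, oldest first
    for _ in range(k):
        nxt = float(np.dot(coeffs, hist[::-1]))  # most recent sample first
        hist.append(nxt)
        hist.pop(0)
    return hist[-1]

dt = 0.001                                  # 1 kHz sampling (assumed)
t = np.arange(0, 2, dt)
x = np.sin(2 * np.pi * 10 * t)              # 10 Hz tremor-like signal
coeffs = fit_ar(x, p=8)
ahead = predict_multistep(x, coeffs, k=20)  # bridge a 20 ms phase delay
print(ahead)
```

The predicted sample 20 ms ahead is what a compensation loop would act on, so the filter's phase lag no longer shows up at the instrument tip.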

  8. Multi-objective genetic algorithm for the automated planning of a wireless sensor network to monitor a critical facility

    NASA Astrophysics Data System (ADS)

    Jourdan, Damien B.; de Weck, Olivier L.

    2004-09-01

    This paper examines the optimal placement of nodes for a Wireless Sensor Network (WSN) designed to monitor a critical facility in a hostile region. The sensors are dropped from an aircraft, and they must be connected (directly or via hops) to a High Energy Communication Node (HECN), which serves as a relay from the ground to a satellite or a high-altitude aircraft. The sensors are assumed to have fixed communication and sensing ranges. The facility is modeled as circular and served by two roads. This simple model is used to benchmark the performance of the optimizer (a Multi-Objective Genetic Algorithm, or MOGA) in creating WSN designs that provide clear assessments of movements in and out of the facility, while minimizing both the likelihood of sensors being discovered and the number of sensors to be dropped. The algorithm is also tested on two other scenarios; in the first one the WSN must detect movements in and out of a circular area, and in the second one it must cover uniformly a square region. The MOGA is shown again to perform well on those scenarios, which shows its flexibility and possible application to more complex mission scenarios with multiple and diverse targets of observation.
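
At the core of any multi-objective GA is Pareto ranking. A minimal sketch of non-dominated filtering over two of the minimization objectives named above (likelihood of sensor discovery, number of sensors dropped), with made-up objective values:

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only the non-dominated candidate designs."""
    return [d for i, d in enumerate(designs)
            if not any(dominates(o, d)
                       for j, o in enumerate(designs) if j != i)]

# (discovery likelihood, number of sensors) per candidate WSN layout
candidates = [(0.30, 12), (0.10, 20), (0.25, 12), (0.40, 25), (0.10, 25)]
front = pareto_front(candidates)
print(front)
```

A MOGA evolves the population toward this front while a third objective (assessment quality of the facility coverage) would be traded off in the same way.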

  9. Evidence that the RdeA protein is a component of a multistep phosphorelay modulating rate of development in Dictyostelium.

    PubMed Central

    Chang, W T; Thomason, P A; Gross, J D; Newell, P C

    1998-01-01

    We have isolated an insertional mutant of Dictyostelium discoideum that aggregated rapidly and formed spores and stalk cells within 14 h of development instead of the normal 24 h. We have shown by parasexual genetics that the insertion is in the rdeA locus and have cloned the gene. It encodes a predicted 28 kDa protein (RdeA) that is enriched in charged residues and is very hydrophilic. Constructs with the DNA for the c-Myc epitope or for the green fluorescent protein indicate that RdeA is not compartmentalized. RdeA displays homology around a histidine residue at amino acid 65 with members of the H2 module family of phosphotransferases that participate in multistep phosphoryl relays. Replacement of this histidine rendered the protein inactive. The mutant is complemented by transformation with the Ypd1 gene of Saccharomyces cerevisiae, itself an H2 module protein. We propose that RdeA is part of a multistep phosphorelay system that modulates the rate of development. PMID:9582274

  10. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  11. Stable hydrogen isotopic analysis of nanomolar molecular hydrogen by automatic multi-step gas chromatographic separation.

    PubMed

    Komatsu, Daisuke D; Tsunogai, Urumu; Kamimura, Kanae; Konno, Uta; Ishimura, Toyoho; Nakagawa, Fumiko

    2011-11-15

    We have developed a new automated analytical system that employs a continuous flow isotope ratio mass spectrometer to determine the stable hydrogen isotopic composition (δD) of nanomolar quantities of molecular hydrogen (H2) in an air sample. This method improves on previous methods, attaining simpler and lower-cost analyses, especially by avoiding the use of expensive or special devices, such as a Toepler pump, a cryogenic refrigerator, and a special evacuation system to keep the temperature of a coolant under reduced pressure. Instead, the system allows H2 purification from the air matrix via automatic multi-step gas chromatographic separation using the coolants of both liquid nitrogen (77 K) and liquid nitrogen + ethanol (158 K) under 1 atm pressure. The analytical precision of the δD determination using the developed method was better than 4‰ for >5 nmol injections (250 mL STP for a 500 ppbv air sample) and better than 15‰ for 1 nmol injections, regardless of the δD value, within 1 h for one sample analysis. Using the developed system, the δD values of H2 can be quantified for atmospheric samples as well as samples of representative sources and sinks, including those containing small quantities of H2, such as H2 in soil pores or aqueous environments, for which there is currently little δD data available. As an example of such trace H2 analyses, we report here the isotope fractionations during H2 uptake by soils in a static chamber. The δD values of H2 in these H2-depleted environments can be useful in constraining the budgets of atmospheric H2 by applying an isotope mass balance model.
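
The isotope mass balance mentioned in the closing sentence reduces, in its simplest form, to a flux-weighted mean of source δD values. The fluxes and δD figures below are illustrative assumptions, not the paper's data.

```python
def delta_mix(fluxes, deltas):
    """delta-D of a mixture: flux-weighted mean of the sources' delta-D (per mil).

    fluxes: source strengths (any consistent unit); deltas: matching deltaD values.
    """
    total = sum(fluxes)
    return sum(f * d for f, d in zip(fluxes, deltas)) / total

# Two hypothetical H2 sources: an isotopically light one (-270 per mil)
# and an isotopically heavy one (+130 per mil), mixed 40:60.
print(delta_mix([0.4, 0.6], [-270.0, 130.0]))
```

Measured δD in under-sampled reservoirs (soil pores, aqueous environments) constrains the unknown fluxes in exactly this kind of balance.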

  12. Properties of true quaternary fission of nuclei with allowance for its multistep and sequential character

    NASA Astrophysics Data System (ADS)

    Kadmensky, S. G.; Titova, L. V.; Bulychev, A. O.

    2015-07-01

    An analysis of basic mechanisms of binary and ternary fission of nuclei led to the conclusion that true ternary and quaternary fission of nuclei has a sequential two-step (three-step) character, where, at the first step, a fissile nucleus emits a third light particle (third and fourth light particles) under shakeup effects associated with a nonadiabatic character of its collective deformation motion, whereupon the residual nucleus undergoes fission to two fission fragments. Owing to this, the formulas derived earlier for the widths with respect to sequential two- and three-step decays of nuclei in constructing the theory of two-step two-proton decays and multistep decays in chains of genetically related nuclei could be used to describe the relative yields and angular and energy distributions of third and fourth light particles emitted in (α, α), (t, t), and (α, t) pairs upon the true quaternary spontaneous fission of 252Cf and thermal-neutron-induced fission of 235U and 233U target nuclei. Mechanisms that explain a sharp decrease in the yield of particles appearing second in time and entering into the composition of light-particle pairs that originate from true quaternary fission of nuclei in relation to the yields of analogous particles in true ternary fission of nuclei are proposed.

  13. Properties of true quaternary fission of nuclei with allowance for its multistep and sequential character

    SciTech Connect

    Kadmensky, S. G. Titova, L. V.; Bulychev, A. O.

    2015-07-15

    An analysis of basic mechanisms of binary and ternary fission of nuclei led to the conclusion that true ternary and quaternary fission of nuclei has a sequential two-step (three-step) character, where, at the first step, a fissile nucleus emits a third light particle (third and fourth light particles) under shakeup effects associated with a nonadiabatic character of its collective deformation motion, whereupon the residual nucleus undergoes fission to two fission fragments. Owing to this, the formulas derived earlier for the widths with respect to sequential two- and three-step decays of nuclei in constructing the theory of two-step two-proton decays and multistep decays in chains of genetically related nuclei could be used to describe the relative yields and angular and energy distributions of third and fourth light particles emitted in (α, α), (t, t), and (α, t) pairs upon the true quaternary spontaneous fission of 252Cf and thermal-neutron-induced fission of 235U and 233U target nuclei. Mechanisms that explain a sharp decrease in the yield of particles appearing second in time and entering into the composition of light-particle pairs that originate from true quaternary fission of nuclei in relation to the yields of analogous particles in true ternary fission of nuclei are proposed.

  14. Multistep processing and stress reduction in CVD diamond films

    NASA Astrophysics Data System (ADS)

    Nijhawan, Sumit

    A serious impediment in the utility of diamond films is the large internal stresses that develop during growth. These stresses generally have thermal and growth components. The thermal component is determined by the mismatch in thermal expansion coefficients of film and substrate while the growth component may arise from several possible mechanisms during CVD growth. These growth stresses tend to be particularly large in diamond. The objective of this work is to understand and reduce the growth stresses in diamond films by tailoring the CVD process. Continuous, polycrystalline diamond films were deposited on Si by microwave plasma-assisted CVD. Very high internal stresses (>2 GPA) consisting of growth and thermal components were observed. The growth component is tensile and increases with growth time. We were able to reduce the evolution of growth stresses considerably by multistep processing of our films. An intermediate annealing step was included between successive growth periods. It is important to note that the annealing step must be conducted at key points during the growth process in order to effectively reduce stress. Maximum reduction in stress is achieved only if the sample is annealed when the diamond grains are partially coalesced (after 2--3 hours of growth). Annealing of continuous films does not produce a significant reduction in stress. The origin of growth stress in our films is attributed to non-equilibrated initial atomic positions during impingement and the successive relaxations to minimize interfacial energies. The film quality was monitored using Raman spectroscopy and electron microscopy. Based on our experimental results and analyses, it is hypothesized that rearrangements of strained boundary structures during the anneal can lower the interfacial energy change during subsequent growth and produce less stress. Multistep processing was also used to enhance diamond nucleation on Ni. 
An annealing pretreatment step that consists of saturating

  15. Statistical properties of multistep enzyme-mediated reactions

    SciTech Connect

    Nemenman, Ilya; Sinitsyn, Nikolai A; De Ronde, Wiet H; Daniels, Bryan C; Mugler, Andrew

    2008-01-01

    Enzyme-mediated reactions may proceed through multiple intermediate conformational states before creating a final product molecule, and one often wishes to identify such intermediate structures from observations of the product creation. In this paper, we address this problem by solving the chemical master equations for various enzymatic reactions. We devise a perturbation theory analogous to that used in quantum mechanics that allows us to determine the first (mean) and the second (variance) cumulants of the distribution of created product molecules as a function of the substrate concentration and the kinetic rates of the intermediate processes. The mean product flux V = d⟨n⟩/dt (or 'dose-response' curve) and the Fano factor F = variance/⟨n⟩ are both realistically measurable quantities, and while the mean flux can often appear the same for different reaction types, the Fano factor can be quite different. This suggests both qualitative and quantitative ways to discriminate between different reaction schemes, and we explore this possibility in the context of four sample multistep enzymatic reactions. We argue that measuring both the mean flux and the Fano factor can not only discriminate between reaction types, but can also provide some detailed information about the internal, unobserved kinetic rates, and this can be done without measuring single-molecule transition events.
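
The claim that the Fano factor distinguishes reaction schemes can be checked with a toy stochastic simulation (this is not the paper's perturbation theory): if each product requires two sequential exponential steps, product counts are sub-Poissonian, with a Fano factor near 1/2 rather than the Poisson value of 1. The rates and window below are arbitrary.

```python
import random

def product_count(k1, k2, T, rng):
    """Number of products completed in time T, each needing two exponential
    steps with rates k1 and k2 (Erlang-distributed cycle time)."""
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(k1) + rng.expovariate(k2)  # one full cycle
        if t > T:
            return n
        n += 1

rng = random.Random(42)
counts = [product_count(2.0, 2.0, 50.0, rng) for _ in range(4000)]
mean = sum(counts) / len(counts)
fano = sum((c - mean) ** 2 for c in counts) / len(counts) / mean
print(mean, fano)
```

A single-step (Poisson) scheme with the same mean flux would give a Fano factor of 1, so the two schemes are indistinguishable by mean flux alone but separate cleanly in the Fano factor, which is the paper's point.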

  16. Comprehensive Control of Networked Control Systems with Multistep Delay

    PubMed Central

    Jiang, Jie

    2014-01-01

    In networked control systems with multi-step delay, long time-delays cause vacant sampling and make controller design difficult. In order to solve these problems, comprehensive control methods are proposed in this paper. Time-delay compensation control and linear-quadratic-Gaussian (LQG) optimal control are adopted, and the system switches between the two controllers according to its state. The LQG optimal controller is used with probability 1 − α in the normal state, and is shown to render the system mean-square exponentially stable. The time-delay compensation controller is used with probability α in the abnormal state to compensate for vacant sampling and long time-delay. In addition, a buffer window is established at the actuator to store recent control inputs, which are used to estimate the control state of the present sampling period in vacant-sampling cases. The comprehensive control methods simplify controller design and are easier to implement in engineering practice. The performance of the system is also improved. Simulation results verify the validity of the proposed theory. PMID:25101322

  17. Multistep hopping and extracellular charge transfer in microbial redox chains.

    PubMed

    Pirbadian, Sahand; El-Naggar, Mohamed Y

    2012-10-28

    Dissimilatory metal-reducing bacteria are microorganisms that gain energy by transferring respiratory electrons to extracellular solid-phase electron acceptors. In addition to its importance for physiology and natural environmental processes, this form of metabolism is being investigated for energy conversion and fuel production in bioelectrochemical systems, where microbes are used as biocatalysts at electrodes. One proposed strategy to accomplish this extracellular charge transfer involves forming a conductive pathway to electrodes by incorporating redox components on outer cell membranes and along extracellular appendages known as microbial nanowires within biofilms. To describe extracellular charge transfer in microbial redox chains, we employed a model based on incoherent hopping between sites in the chain and an interfacial treatment of electrochemical interactions with the surrounding electrodes. Based on this model, we calculated the current-voltage (I-V) characteristics and found the results to be in good agreement with I-V measurements across and along individual microbial nanowires produced by the bacterium Shewanella oneidensis MR-1. Based on our analysis, we propose that multistep hopping in redox chains constitutes a viable strategy for extracellular charge transfer in microbial biofilms.

  18. Droplet-based microsystem for multi-step bioreactions.

    PubMed

    Wang, Fang; Burns, Mark A

    2010-06-01

    A droplet-based microfluidic platform was used to perform on-chip droplet generation, merging and mixing for applications in multi-step reactions and assays. Submicroliter-sized droplets can be produced separately from three identical droplet-generation channels and merged together in a single chamber. Three different mixing strategies were used for mixing the merged droplet. For pure diffusion, the reagents were mixed in approximately 10 min. Using flow around the stationary droplet to induce circulatory flow within the droplet, the mixing time was decreased to approximately one minute. The shortest mixing time (10 s) was obtained with bidirectional droplet motion between the chamber and channel, and optimization could result in a total time of less than 1 s. We also tested this on-chip droplet generation and manipulation platform using a two-step thermal cycled bioreaction: nested TaqMan PCR. With the same concentration of template DNA, the two-step reaction in a well-mixed merged droplet reaches its cycle threshold approximately 6 cycles earlier than the same reaction in a diffusively mixed droplet, and approximately 40 cycles earlier than droplet-based regular (single-step) TaqMan PCR.

  19. Solvent recyclability in a multistep direct liquefaction process

    SciTech Connect

    Hetland, M.D.; Rindt, J.R.

    1995-12-31

    Direct liquefaction research at the Energy & Environmental Research Center (EERC) has, for a number of years, concentrated on developing a direct liquefaction process specifically for low-rank coals (LRCs) through the use of hydrogen-donating solvents and solvents similar to coal-derived liquids, the water/gas shift reaction, and lower-severity reaction conditions. The underlying assumption of all of the research was that advantage could be taken of the reactivity and specific qualities of LRCs to produce a tetrahydrofuran (THF)-soluble material that might be easier to upgrade than the soluble residuum produced during direct liquefaction of high-rank coals. A multistep approach was taken to produce the THF-soluble material, consisting of (1) preconversion treatment to prepare the coal for solubilization, (2) solubilization of the coal in the solvent, and (3) polishing to complete solubilization of the remaining material. The product of these three steps can then be upgraded during a traditional hydrotreatment step. The results of the EERC's research indicated that additional studies to develop this process more fully were justified. Two areas were targeted for further research: (1) determination of the recyclability of the solvent used during solubilization and (2) determination of the minimum severity required for hydrotreatment of the liquid product. The current project was funded to investigate these two areas.

  20. Characterization and Application of Xylene Monooxygenase for Multistep Biocatalysis

    PubMed Central

    Bühler, Bruno; Witholt, Bernard; Hauer, Bernhard; Schmid, Andreas

    2002-01-01

    Xylene monooxygenase of Pseudomonas putida mt-2 catalyzes multistep oxidations of one methyl group of toluene and xylenes. Recombinant Escherichia coli expressing the monooxygenase genes xylM and xylA catalyzes the oxygenation of toluene, pseudocumene, the corresponding alcohols, and the corresponding aldehydes, all by a monooxygenation type of reaction (B. Bühler, A. Schmid, B. Hauer, and B. Witholt, J. Biol. Chem. 275:10085-10092, 2000). Using E. coli expressing xylMA, we investigated the kinetics of this one-enzyme three-step biotransformation. We found that unoxidized substrates like toluene and pseudocumene inhibit the second and third oxygenation steps and that the corresponding alcohols inhibit the third oxygenation step. These inhibitions might promote the energetically more favorable alcohol and aldehyde dehydrogenations in the wild type. Growth of E. coli was strongly affected by low concentrations of pseudocumene and its products. Toxicity and solubility problems were overcome by the use of a two-liquid-phase system with bis(2-ethylhexyl)phthalate as the carrier solvent, allowing high overall substrate and product concentrations. In a fed-batch-based two-liquid-phase process with pseudocumene as the substrate, we observed the consecutive accumulation of aldehyde, acid, and alcohol. Our results indicate that, depending on the reaction conditions, product formation could be directed to one specific product. PMID:11823191

  1. Genetics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The genus Capsicum represents one of several well characterized Solanaceous genera. A wealth of classical and molecular genetics research is available for the genus. Information gleaned from its cultivated relatives, tomato and potato, provide further insight for basic and applied studies. Early ...

  2. Genetics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Maintaining genetic variation in wild populations of Arctic organisms is fundamental to the long-term persistence of high latitude biodiversity. Variability is important because it provides options for species to respond to changing environmental conditions and novel challenges such as emerging path...

  3. Stability with large step sizes for multistep discretizations of stiff ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Majda, George

    1986-01-01

    One-leg and multistep discretizations of variable-coefficient linear systems of ODEs having both slow and fast time scales are investigated analytically. The stability properties of these discretizations are obtained independent of ODE stiffness and compared. The results of numerical computations are presented in tables, and it is shown that for large step sizes the stability of one-leg methods is better than that of the corresponding linear multistep methods.

  4. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to the high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility, in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  5. Stochastic modeling of biochemical systems with multistep reactions using state-dependent time delay

    PubMed Central

    Wu, Qianqian; Tian, Tianhai

    2016-01-01

    To deal with the growing scale of molecular systems, sophisticated modelling techniques have been designed in recent years to reduce the complexity of mathematical models. Among them, a widely used approach is delayed reaction for simplifying multistep reactions. However, recent research results suggest that a delayed reaction with constant time delay is unable to describe multistep reactions accurately. To address this issue, we propose a novel approach using state-dependent time delay to approximate multistep reactions. We first use stochastic simulations to calculate the time delay arising from multistep reactions exactly. Then we design algorithms to calculate the time delay based on system dynamics precisely. To demonstrate the power of the proposed method, two processes of mRNA degradation are used to investigate the function of time delay in determining system dynamics. In addition, a multistep pathway of metabolic synthesis is used to explore the potential of the proposed method to simplify multistep reactions with nonlinear reaction rates. Simulation results suggest that the state-dependent time delay is a promising and accurate approach to reduce model complexity and decrease the number of unknown parameters in the models. PMID:27553753
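
    The reason a constant delay cannot represent a multistep chain can be seen directly: the completion time of N sequential first-order steps is Erlang-distributed, with a spread that a fixed delay ignores. A minimal sketch (illustrative only, not the authors' algorithm) samples that completion-time distribution:

```python
import random

def multistep_completion_time(rates, rng):
    """Time for one molecule to traverse N sequential first-order steps,
    each with an exponentially distributed waiting time."""
    return sum(rng.expovariate(k) for k in rates)

rng = random.Random(1)
rates = [2.0, 2.0, 2.0, 2.0]   # four intermediate steps, rate 2 each
samples = [multistep_completion_time(rates, rng) for _ in range(5000)]
mean = sum(samples) / len(samples)                           # theory: sum(1/k) = 2.0
var = sum((s - mean) ** 2 for s in samples) / len(samples)   # theory: sum(1/k^2) = 1.0
print(f"mean delay ~ {mean:.2f}, variance ~ {var:.2f}")
```

    A constant-delay model reproduces only the mean (2.0) and discards the variance, which is the inaccuracy the state-dependent-delay approach is designed to correct.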

  6. A new theory for multistep discretizations of stiff ordinary differential equations: Stability with large step sizes

    NASA Technical Reports Server (NTRS)

    Majda, G.

    1985-01-01

    A large set of variable coefficient linear systems of ordinary differential equations which possess two different time scales, a slow one and a fast one, is considered. A small parameter epsilon characterizes the stiffness of these systems. A system of o.d.e.s. in this set is approximated by a general class of multistep discretizations which includes both one-leg and linear multistep methods. Sufficient conditions are determined under which each solution of a multistep method is uniformly bounded, with a bound which is independent of the stiffness of the system of o.d.e.s., when the step size resolves the slow time scale, but not the fast one. This property is called stability with large step sizes. The theory presented lets one compare properties of one-leg methods and linear multistep methods when they approximate variable coefficient systems of stiff o.d.e.s. In particular, it is shown that one-leg methods have better stability properties with large step sizes than their linear multistep counterparts. The theory also allows one to relate the concept of D-stability to the usual notions of stability and stability domains and to the propagation of errors for multistep methods which use large step sizes.
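
    The notion of stability with large step sizes can be illustrated with a small experiment (a sketch, not taken from the paper; the one-leg versus linear-multistep distinction analyzed here emerges for variable-coefficient systems, whereas for the scalar problem below both methods are stable). With h·|λ| = 100, i.e. a step that resolves only the slow forcing cos(t), both the one-leg (implicit midpoint) rule and its linear trapezoidal counterpart stay bounded and track the slow solution:

```python
import math

def one_leg_midpoint(lam, h, t_end, y0):
    """One-leg (implicit midpoint) rule for y' = lam*(y - cos(t)),
    solved in closed form since the problem is linear in y."""
    y, t = y0, 0.0
    while t < t_end - 1e-12:
        tm = t + h / 2  # the one-leg rule evaluates f at the midpoint only
        y = (y * (1 + h * lam / 2) - h * lam * math.cos(tm)) / (1 - h * lam / 2)
        t += h
    return y

def linear_trapezoidal(lam, h, t_end, y0):
    """Linear-multistep counterpart (trapezoidal rule) for the same problem."""
    y, t = y0, 0.0
    while t < t_end - 1e-12:
        rhs = y * (1 + h * lam / 2) - h * lam / 2 * (math.cos(t) + math.cos(t + h))
        y = rhs / (1 - h * lam / 2)
        t += h
    return y

lam, h = -1000.0, 0.1   # stiff: |h*lam| = 100, yet h resolves the slow scale
y_ol = one_leg_midpoint(lam, h, 20.0, 0.0)
y_tr = linear_trapezoidal(lam, h, 20.0, 0.0)
print(y_ol, y_tr, math.cos(20.0))  # both stay near the slow solution cos(t)
```

    An explicit method at this step size would amplify errors by a factor of roughly |1 + h·lam| = 99 per step and overflow almost immediately, which is what "stability with large step sizes" rules out.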

  7. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers in the conduct of their daily work, and to assess the human acceptance difficulties which may accompany the transition to a significantly changing work environment. The improved productivity and communications which result from application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated and these will be explored in detail.

  8. Multistep process of squamous differentiation in tracheobronchial epithelial cells in vitro: analogy with epidermal differentiation.

    PubMed Central

    Jetten, A M

    1989-01-01

    The lung, in particular the bronchial epithelium, is a major site for tumor formation in humans. Environmental factors, such as cigarette smoke, in conjunction with genetic factors are important determinants in this disease. Malignant cells exhibit alterations in their control of proliferation and differentiation. It is believed that the acquisition of defects in the regulation of these processes is important in the process of carcinogenesis. A clear insight into the basic mechanisms of the regulation of proliferation and differentiation is required to understand the molecular mechanisms involved in tumor development and in other pathological conditions. Studies using in vitro cell culture systems of tracheobronchial epithelial cells provide useful models in which to study the regulation of differentiation and proliferation. The clonogenic cells derived from the tracheobronchial epithelium are pluripotent: They have self-renewal capacity and can differentiate along either a normal, mucosecretory, or a squamous cell pathway. Squamous differentiation in tracheobronchial epithelial cells has many morphological, biochemical, and regulatory properties in common with epidermal differentiation. This pathway of differentiation is a multistep process consisting of at least three stages. In the initial stage, cells become committed to terminal cell division. This is followed by the expression of the squamous differentiated phenotype and finally cornification. Various factors, such as several growth factors, retinoids, calcium ions, and phorbol esters, regulate the program of differentiation at different stages. Studies have indicated that the controls of proliferation and differentiation are interrelated. Cell lines established from tracheobronchial epithelial cells expressing SV40 large T-antigen, as well as carcinoma cell lines, exhibit altered responses to growth and differentiation regulatory factors. Alterations in the commitment to terminal cell division must be a

  9. Automated liquid operation method for microfluidic heterogeneous immunoassay.

    PubMed

    Yi, Hui; Pan, Jian-Zhang; Shi, Xiao-Tong; Fang, Qun

    2013-02-15

    In this work, an automated liquid operation method for multistep heterogeneous immunoassay toward point of care testing (POCT) was proposed. A miniaturized peristaltic pump was developed to control the flow direction, flow time and flow rate in the microliter range according to a program. The peristaltic pump has the advantages of simple structure, small size, low cost, and easy to build and use. By coupling the peristaltic pump with an antibody-coated capillary and a reagent-preloaded cartridge, the complicated liquid handling operation for heterogeneous immunoassay, including sample metering and introduction, multistep reagent introduction and rinsing, could be triggered by an action and accomplished automatically in 12 min. The analytical performance of the present immunoassay system was demonstrated in the measurement of human IgG with fluorescence detection. A detection limit of 0.68 μg/mL IgG and a dynamic range of 2-300 μg/mL were obtained.

  10. Office automation.

    PubMed

    Arenson, R L

    1986-03-01

    By now, the term "office automation" should have more meaning for those readers who are not intimately familiar with the subject. Not all of the preceding material pertains to every department or practice, but certainly, word processing and simple telephone management are key items. The size and complexity of the organization will dictate the usefulness of electronic mail and calendar management, and the individual radiologist's personal needs and habits will determine the usefulness of the home computer. Perhaps the most important ingredient for success in the office automation arena relates to the ability to integrate information from various systems in a simple and flexible manner. Unfortunately, this is perhaps the one area that most office automation systems have ignored or handled poorly. In the personal computer world, there has been much emphasis recently on integration of packages such as spreadsheet, database management, word processing, graphics, time management, and communications. This same philosophy of integration has been applied to a few office automation systems, but these are generally vendor-specific and do not allow for a mixture of foreign subsystems. During the next few years, it is likely that a few vendors will emerge as dominant in this integrated office automation field and will stress simplicity and flexibility as major components.

  11. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  12. Modeling the Auto-Ignition of Biodiesel Blends with a Multi-Step Model

    SciTech Connect

    Toulson, Dr. Elisa; Allen, Casey M; Miller, Dennis J; McFarlane, Joanna; Schock, Harold; Lee, Tonghun

    2011-01-01

    There is growing interest in using biodiesel in place of or in blends with petrodiesel in diesel engines; however, biodiesel oxidation chemistry is complicated to directly model and existing surrogate kinetic models are very large, making them computationally expensive. The present study describes a method for predicting the ignition behavior of blends of n-heptane and methyl butanoate, fuels whose blends have been used in the past as a surrogate for biodiesel. The autoignition is predicted using a multistep (8-step) model in order to reduce computational time and make this a viable tool for implementation into engine simulation codes. A detailed reaction mechanism for n-heptane-methyl butanoate blends was used as a basis for validating the multistep model results. The ignition delay trends predicted by the multistep model for the n-heptane-methyl butanoate blends matched well with that of the detailed CHEMKIN model for the majority of conditions tested.

  13. Direct observation of multistep energy transfer in LHCII with fifth-order 3D electronic spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhang, Zhengyang; Lambrev, Petar H.; Wells, Kym L.; Garab, Győző; Tan, Howe-Siang

    2015-07-01

    During photosynthesis, sunlight is efficiently captured by light-harvesting complexes, and the excitation energy is then funneled towards the reaction centre. These photosynthetic excitation energy transfer (EET) pathways are complex and proceed in a multistep fashion. Ultrafast two-dimensional electronic spectroscopy (2DES) is an important tool to study EET processes in photosynthetic complexes. However, the multistep EET processes can only be indirectly inferred by correlating different cross peaks from a series of 2DES spectra. Here we directly observe multistep EET processes in LHCII using ultrafast fifth-order three-dimensional electronic spectroscopy (3DES). We measure cross peaks in 3DES spectra of LHCII that directly indicate energy transfer from excitons in the chlorophyll b (Chl b) manifold to the low-energy level chlorophyll a (Chl a) via mid-level Chl a energy states. This new spectroscopic technique allows scientists to move a step towards mapping the complete complex EET processes in photosynthetic systems.

  14. Direct observation of multistep energy transfer in LHCII with fifth-order 3D electronic spectroscopy

    PubMed Central

    Zhang, Zhengyang; Lambrev, Petar H.; Wells, Kym L.; Garab, Győző; Tan, Howe-Siang

    2015-01-01

    During photosynthesis, sunlight is efficiently captured by light-harvesting complexes, and the excitation energy is then funneled towards the reaction centre. These photosynthetic excitation energy transfer (EET) pathways are complex and proceed in a multistep fashion. Ultrafast two-dimensional electronic spectroscopy (2DES) is an important tool to study EET processes in photosynthetic complexes. However, the multistep EET processes can only be indirectly inferred by correlating different cross peaks from a series of 2DES spectra. Here we directly observe multistep EET processes in LHCII using ultrafast fifth-order three-dimensional electronic spectroscopy (3DES). We measure cross peaks in 3DES spectra of LHCII that directly indicate energy transfer from excitons in the chlorophyll b (Chl b) manifold to the low-energy level chlorophyll a (Chl a) via mid-level Chl a energy states. This new spectroscopic technique allows scientists to move a step towards mapping the complete complex EET processes in photosynthetic systems. PMID:26228055

  15. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  16. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  17. On the origin of multi-step spin transition behaviour in 1D nanoparticles

    NASA Astrophysics Data System (ADS)

    Chiruta, Daniel; Jureschi, Catalin-Maricel; Linares, Jorge; Dahoo, Pierre Richard; Garcia, Yann; Rotaru, Aurelian

    2015-09-01

    To investigate the spin state switching mechanism in spin crossover (SCO) nanoparticles, special attention is given to three-step thermally induced SCO behavior in 1D chains. An additional term is included in the standard Ising-like Hamiltonian to account for the border interaction between SCO molecules and their local environment. It is shown that this additional interaction, together with the short-range interaction, drives the multi-step thermal hysteretic behavior in 1D SCO systems. The relation between a polymeric matrix and this particular multi-step SCO phenomenon is discussed accordingly. Finally, the influence of the environment as a function of the SCO system's size is analyzed as well.
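
    The Ising-like Hamiltonian with an added border term is commonly written in the SCO literature in a form like the following (a sketch of the standard model; the paper's exact coefficients and the precise form of the border term are assumptions here):

```latex
H = -J \sum_{i} \sigma_i \sigma_{i+1}
    + \frac{\Delta - k_B T \ln g}{2} \sum_{i} \sigma_i
    - L \left( \sigma_1 + \sigma_N \right)
```

    where \sigma_i = \pm 1 is a fictitious spin labelling the high-spin/low-spin states, \Delta is the ligand-field energy gap, g is the degeneracy ratio of the two states, J is the short-range coupling between neighbouring molecules, and L couples the edge molecules of the 1D chain to the local environment (e.g., the polymeric matrix).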

  18. A Multistep Synthesis Featuring Classic Carbonyl Chemistry for the Advanced Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Duff, David B.; Abbe, Tyler G.; Goess, Brian C.

    2012-01-01

    A multistep synthesis of 5-isopropyl-1,3-cyclohexanedione is carried out from three commodity chemicals. The sequence involves an aldol condensation, Dieckmann-type annulation, ester hydrolysis, and decarboxylation. No purification is required until after the final step, at which point gravity column chromatography provides the desired product in…

  19. Attention and Multistep Problem Solving in 24-Month-Old Children

    ERIC Educational Resources Information Center

    Carrico, Renee L.

    2013-01-01

    The current study examined the role of increased attentional load in 24-month-old children's multistep problem-solving behavior. Children solved an object-based nonspatial working-memory search task, to which a motor component of varying difficulty was added. Significant disruptions in search performance were observed with the introduction of the…

  20. A Bias and Variance Analysis for Multistep-Ahead Time Series Forecasting.

    PubMed

    Ben Taieb, Souhaib; Atiya, Amir F

    2016-01-01

    Multistep-ahead forecasts can either be produced recursively by iterating a one-step-ahead time series model or directly by estimating a separate model for each forecast horizon. In addition, there are other strategies; some of them combine aspects of both aforementioned concepts. In this paper, we present a comprehensive investigation into the bias and variance behavior of multistep-ahead forecasting strategies. We provide a detailed review of the different multistep-ahead strategies. Subsequently, we perform a theoretical study that derives the bias and variance for a number of forecasting strategies. Finally, we conduct a Monte Carlo experimental study that compares and evaluates the bias and variance performance of the different strategies. From the theoretical and the simulation studies, we analyze the effect of different factors, such as the forecast horizon and the time series length, on the bias and variance components, and on the different multistep-ahead strategies. Several lessons are learned, and recommendations are given concerning the advantages, disadvantages, and best conditions of use of each strategy.
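
    The two baseline strategies compared in the paper can be sketched in a few lines (an illustration on a simple AR(1) model, not the authors' experimental setup): the recursive strategy fits one one-step model and iterates it h times, while the direct strategy fits a separate model mapping y[t] straight to y[t+h].

```python
import numpy as np

def fit_ar1(series):
    """Least-squares fit of the one-step model y[t+1] ~ a*y[t] + b."""
    a, b = np.polyfit(series[:-1], series[1:], 1)
    return a, b

def recursive_forecast(series, horizon):
    """Iterate the one-step model out to the horizon."""
    a, b = fit_ar1(series)
    y = series[-1]
    for _ in range(horizon):
        y = a * y + b
    return y

def direct_forecast(series, horizon):
    """Fit a separate model mapping y[t] directly to y[t+horizon]."""
    a, b = np.polyfit(series[:-horizon], series[horizon:], 1)
    return a * series[-1] + b

rng = np.random.default_rng(0)
y = [0.0]                       # sample path of y[t+1] = 0.8*y[t] + noise
for _ in range(500):
    y.append(0.8 * y[-1] + rng.normal(scale=0.1))
y = np.array(y)

h = 5
fc_rec = recursive_forecast(y, h)
fc_dir = direct_forecast(y, h)
print(fc_rec, fc_dir)   # close to each other for a well-specified AR(1)
```

    For a correctly specified model the two strategies nearly coincide, as here; the bias-variance trade-offs analyzed in the paper appear once the one-step model is misspecified or the series is short.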

  1. Multi-step routes of capuchin monkeys in a laser pointer traveling salesman task.

    PubMed

    Howard, Allison M; Fragaszy, Dorothy M

    2014-09-01

    Prior studies have claimed that nonhuman primates plan their routes multiple steps in advance. However, a recent reexamination of multi-step route planning in nonhuman primates indicated that there is no evidence for planning more than one step ahead. We tested multi-step route planning in capuchin monkeys using a pointing device to "travel" to distal targets while stationary. This device enabled us to determine whether capuchins distinguish the spatial relationship between goals and themselves and spatial relationships between goals and the laser dot, allocentrically. In Experiment 1, two subjects were presented with identical food items in Near-Far (one item nearer to subject) and Equidistant (both items equidistant from subject) conditions with a laser dot visible between the items. Subjects moved the laser dot to the items using a joystick. In the Near-Far condition, one subject demonstrated a bias for items closest to self but the other subject chose efficiently. In the second experiment, subjects retrieved three food items in similar Near-Far and Equidistant arrangements. Both subjects preferred food items nearest the laser dot and showed no evidence of multi-step route planning. We conclude that these capuchins do not make choices on the basis of multi-step look ahead strategies.

  2. Use of Chiral Oxazolidinones for a Multi-Step Synthetic Laboratory Module

    ERIC Educational Resources Information Center

    Betush, Matthew P.; Murphree, S. Shaun

    2009-01-01

    Chiral oxazolidinone chemistry is used as a framework for an advanced multi-step synthesis lab. The cost-effective and robust preparation of chiral starting materials is presented, as well as the use of chiral auxiliaries in a synthesis scheme that is appropriate for students currently in the second semester of the organic sequence. (Contains 1…

  3. Multistep Synthesis of a Terphenyl Derivative Showcasing the Diels-Alder Reaction

    ERIC Educational Resources Information Center

    Davie, Elizabeth A. Colby

    2015-01-01

    An adaptable multistep synthesis project designed for the culmination of a second-year organic chemistry laboratory course is described. The target compound is a terphenyl derivative that is an intermediate in the synthesis of compounds used in organic light-emitting devices. Students react a conjugated diene with dimethylacetylene dicarboxylate…

  4. Controlled growth of silica-titania hybrid functional nanoparticles through a multistep microfluidic approach.

    PubMed

    Shiba, K; Sugiyama, T; Takei, T; Yoshikawa, G

    2015-11-11

    Silica/titania-based functional nanoparticles were prepared through controlled nucleation of titania and subsequent encapsulation by silica through a multistep microfluidic approach, which was successfully applied to obtaining aminopropyl-functionalized silica/titania nanoparticles for a highly sensitive humidity sensor.

  5. A Multistep Synthesis Incorporating a Green Bromination of an Aromatic Ring

    ERIC Educational Resources Information Center

    Cardinal, Pascal; Greer, Brandon; Luong, Horace; Tyagunova, Yevgeniya

    2012-01-01

    Electrophilic aromatic substitution is a fundamental topic taught in the undergraduate organic chemistry curriculum. A multistep synthesis that includes a safer and greener method for the bromination of an aromatic ring than traditional bromination methods is described. This experiment is multifaceted and can be used to teach students about…

  6. Two-dimensional Paper Networks: programmable fluidic disconnects for multi-step processes in shaped paper

    PubMed Central

    Trinh, Philip; Ball, Cameron; Fu, Elain; Yager, Paul

    2016-01-01

    Most laboratory assays take advantage of multi-step protocols to achieve high performance, but conventional paper-based tests (e.g., lateral flow tests) are generally limited to assays that can be carried out in a single fluidic step. We have developed two-dimensional paper networks (2DPNs) that use materials from lateral flow tests but reconfigure them to enable programming of multi-step reagent delivery sequences. The 2DPN uses multiple converging fluid inlets to control the arrival time of each fluid to a detection zone or reaction zone, and it requires a method to disconnect each fluid source in a corresponding timed sequence. Here, we present a method that allows programmed disconnection of fluid sources required for multi-step delivery. A 2DPN with legs of different lengths is inserted into a shared buffer well, and the dropping fluid surface disconnects each leg in a programmed sequence. This approach could enable multi-step laboratory assays to be converted into simple point-of-care devices that have high performance yet remain easy to use. PMID:22037591

  7. Biocatalyzed Regioselective Synthesis in Undergraduate Organic Laboratories: Multistep Synthesis of 2-Arachidonoylglycerol

    ERIC Educational Resources Information Center

    Johnston, Meghan R.; Makriyannis, Alexandros; Whitten, Kyle M.; Drew, Olivia C.; Best, Fiona A.

    2016-01-01

    In order to introduce the concepts of biocatalysis and its utility in synthesis to organic chemistry students, a multistep synthesis of endogenous cannabinergic ligand 2-arachidonoylglycerol (2-AG) was tailored for use as a laboratory exercise. Over four weeks, students successfully produced 2-AG, purifying and characterizing products at each…

  8. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

    This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. Evolution of the mechanical properties as well as the microstructural features, such as twins and the textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor, leading to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of the austenite. For the SIM, an Olson-Cohen-type equation was identified that analytically predicts its formation during the multi-step forming process. The generated SIM was textured, and this texture weakened with increasing deformation.
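    The Olson-Cohen model referenced above relates the strain-induced martensite fraction to plastic strain via f = 1 - exp{-beta [1 - exp(-alpha * eps)]^n}. A minimal sketch of that relation follows; the parameter values alpha, beta, and n are illustrative assumptions, not values fitted to the SS 321 data of this paper.

```python
import math

def olson_cohen(strain, alpha=6.0, beta=1.5, n=4.5):
    """Olson-Cohen strain-induced martensite fraction.

    f = 1 - exp(-beta * (1 - exp(-alpha * strain))**n)
    alpha: shear-band formation rate, beta: probability that a
    shear-band intersection nucleates martensite, n: fitting exponent.
    All three values here are hypothetical placeholders.
    """
    shear_band_fraction = 1.0 - math.exp(-alpha * strain)
    return 1.0 - math.exp(-beta * shear_band_fraction ** n)

# martensite fraction at 0%, 10%, ..., 50% true strain
fractions = [olson_cohen(e / 100) for e in range(0, 60, 10)]
```

    The sigmoidal shape (slow start, rapid rise, saturation below 1) is what allows the equation to track SIM formation across sequential deformation steps.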

  9. Design Automation in Synthetic Biology.

    PubMed

    Appleton, Evan; Madsen, Curtis; Roehner, Nicholas; Densmore, Douglas

    2017-04-03

    Design automation refers to a category of software tools that work together in a workflow for designing, building, testing, and analyzing systems with a target behavior. In synthetic biology, these tools are called bio-design automation (BDA) tools. In this review, we discuss the BDA tool areas (specify, design, build, test, and learn) and introduce the existing software tools designed to solve problems in these areas. We then detail the functionality of some of these tools and show how they can be used together to create the desired behavior of two types of modern synthetic genetic regulatory networks.

  10. The discrepancies in multistep damage evolution of yttria-stabilized zirconia irradiated with different ions

    SciTech Connect

    Yang, Tengfei; Taylor, Caitlin A.; Kong, Shuyan; Wang, Chenxu; Zhang, Yanwen; Huang, Xuejun; Xue, Jianming; Yan, Sha; Wang, Yugang

    2013-01-01

    This paper reports a comprehensive investigation of structural damage in yttria-stabilized zirconia irradiated with different ions over a wide fluence range. A similar multistep damage accumulation exists for irradiation with different ions, but the critical dose for the occurrence of the second damage step, characterized by a faster increase in damage fraction, and the maximum elastic strain at the first damage step vary with ion mass. For irradiation with heavier ions, the second damage step occurs at a higher dose with a lower critical elastic strain. Furthermore, larger extended defects were observed at the second damage step under heavy-ion irradiation. Combined with other experimental results and a multistep damage accumulation model, the distinct discrepancies in the damage buildup under irradiation with different ions are interpreted in terms of the effects of electronic excitation, the energy of the primary knock-on atom, and the chemical contribution of the deposited ions.

  11. Using motor milestones as a multistep process to screen preterm infants for cerebral palsy.

    PubMed

    Allen, M C; Alexander, G R

    1997-01-01

    Pediatricians often informally use motor milestones to screen infant motor development, and one advantage is that they can be used during sequential office visits, as a multistep screening process. In this study we evaluated six motor milestones (roll prone to supine, roll supine to prone, sit with support, sit without support, crawl and cruise) as a multistep process in screening for cerebral palsy in 173 high-risk preterm infants (<33 weeks gestational age) who had been followed with sequential developmental assessments for at least 18 months. At the 18 to 24 month evaluation, 31 (18%) had cerebral palsy. We found that using the motor milestones as serial screening tests for cerebral palsy was more effective in terms of positive predictive value than any individual milestone alone. Limited community resources can be more efficiently used if preterm infants with delays in more than four motor milestones are referred for further evaluation and early intervention services.
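    The gain in positive predictive value from serial screening described above can be illustrated with a chained Bayes update; the prevalence, sensitivity, and specificity figures below are hypothetical (only the 18% cerebral palsy rate echoes the abstract), and the sketch assumes conditionally independent tests, which real milestone assessments are not.

```python
def serial_ppv(prevalence, tests):
    """PPV after applying screening tests in series.

    tests: list of (sensitivity, specificity) pairs. Only subjects
    positive on one test proceed to the next, so the diseased fraction
    shrinks by each sensitivity and the non-diseased fraction by each
    false-positive rate (assumes conditional independence).
    """
    p_disease = prevalence
    p_no_disease = 1.0 - prevalence
    for sens, spec in tests:
        p_disease *= sens            # true positives carried forward
        p_no_disease *= (1.0 - spec)  # false positives carried forward
    return p_disease / (p_disease + p_no_disease)
```

    With two moderately accurate tests in series, the PPV is substantially higher than with either alone, which mirrors why delays across several milestones predict better than any single milestone.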

  12. Multistep integration formulas for the numerical integration of the satellite problem

    NASA Technical Reports Server (NTRS)

    Lundberg, J. B.; Tapley, B. D.

    1981-01-01

    The use of two Class 2/fixed mesh/fixed order/multistep integration packages of the PECE type for the numerical integration of the second-order, nonlinear, ordinary differential equation of the satellite orbit problem is examined. These two methods are referred to as the general and second sum formulations. The derivation of the basic equations that characterize each formulation and the role of these equations in the PECE algorithm are discussed. Possible starting procedures that may be used to supply the initial set of values required by the fixed mesh/multistep integrators are examined. The results of the general and second sum integrators are compared to those of various fixed-step and variable-step integrators.

  13. Color-Tunable Resonant Photoluminescence and Cavity-Mediated Multistep Energy Transfer Cascade.

    PubMed

    Okada, Daichi; Nakamura, Takashi; Braam, Daniel; Dao, Thang Duy; Ishii, Satoshi; Nagao, Tadaaki; Lorke, Axel; Nabeshima, Tatsuya; Yamamoto, Yohei

    2016-07-26

    Color-tunable resonant photoluminescence (PL) was attained from polystyrene microspheres doped with a single polymorphic fluorescent dye, boron-dipyrrin (BODIPY) 1. The color of the resonant PL depends on the assembly morphology of 1 in the microspheres, which can be selectively controlled from green to red by the initial concentration of 1 during preparation of the microspheres. Studies of intersphere PL propagation in multi-coupled microspheres, prepared by a micromanipulation technique, revealed that multistep photon transfer takes place through the microspheres, accompanied by an energy transfer cascade with a stepwise PL color change. The intersphere energy transfer cascade is direction selective: only the donor-to-acceptor down-conversion direction is allowed. Such cavity-mediated long-distance and multistep energy transfer will be advantageous for polymer photonic device applications.

  14. Teaching multi-step math skills to adults with disabilities via video prompting.

    PubMed

    Kellems, Ryan O; Frandsen, Kaitlyn; Hansen, Blake; Gabrielsen, Terisa; Clarke, Brynn; Simons, Kalee; Clements, Kyle

    2016-11-01

    The purpose of this study was to evaluate the effectiveness of teaching multi-step math skills to nine adults with disabilities in an 18-21 post-high school transition program using a video prompting intervention package. The dependent variable was the percentage of steps completed correctly. The independent variable was the video prompting intervention, which covered several multi-step math calculation skills: (a) calculating a tip (15%), (b) calculating item unit prices, and (c) adjusting a recipe for more or fewer people. Results indicated a functional relationship between the video prompting intervention package and the percentage of steps completed correctly. Eight of the nine adults showed significant gains immediately after receiving the video prompting intervention.

  15. Multistep impregnation method for incorporation of high amount of titania into SBA-15

    SciTech Connect

    Wang, Wei; Song, Mo

    2006-02-02

    A multistep impregnation method was employed to incorporate a high amount of titania into mesoporous SBA-15 silica. No damage to the SBA-15 mesostructure was caused by the titania loading in any cycle. The existence of small titania nanodomains was confirmed by Raman spectra and UV-vis DRS measurements, and their high dispersion was verified by low-angle X-ray powder diffraction (XRD), transmission electron microscopy (TEM) and N{sub 2} sorption measurements. Importantly, no blockage of the mesostructure was observed at titania contents up to 24.4 wt.%. In comparison, the commonly used one-step impregnation method led to serious blockage of the mesopores as a result of the formation of bulk titania particles in the mesochannels. A photo-activity test for the removal of oestrogen showed the superiority of the materials synthesized by the multistep impregnation method over those prepared by the one-step method.

  16. Region-based multi-step optic disk and cup segmentation from color fundus image

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Lock, Jane; Manresa, Javier Moreno; Vignarajan, Janardhan; Tay-Kearney, Mei-Ling; Kanagasingam, Yogesan

    2013-02-01

    Retinal optic cup-to-disk ratio (CDR) is one of the important indicators of glaucomatous neuropathy. In this paper, we propose a novel multi-step 4-quadrant thresholding method for optic disk segmentation and a multi-step temporal-nasal segmenting method for optic cup segmentation based on blood-vessel-inpainted HSL lightness images and green images. The performance of the proposed methods was evaluated on a group of color fundus images and compared with manual outlining results from two experts. Dice scores of the detected disk and cup regions between the automatic and manual results were computed and compared. Vertical CDRs were also compared among the three results. This preliminary experiment demonstrated the robustness of the method for automatic optic disk and cup segmentation and its potential value for clinical application.

  17. Multi-Step Deep Reactive Ion Etching Fabrication Process for Silicon-Based Terahertz Components

    NASA Technical Reports Server (NTRS)

    Jung-Kubiak, Cecile (Inventor); Reck, Theodore (Inventor); Chattopadhyay, Goutam (Inventor); Perez, Jose Vicente Siles (Inventor); Lin, Robert H. (Inventor); Mehdi, Imran (Inventor); Lee, Choonsup (Inventor); Cooper, Ken B. (Inventor); Peralta, Alejandro (Inventor)

    2016-01-01

    A multi-step silicon etching process has been developed to fabricate silicon-based terahertz (THz) waveguide components. This technique provides precise dimensional control across multiple etch depths with batch processing capabilities. Nonlinear and passive components such as mixers, multipliers, waveguides, hybrids, OMTs and twists have been fabricated and integrated into a small silicon package. This fabrication technique enables a wafer-stacking architecture to provide ultra-compact multi-pixel receiver front-ends in the THz range.

  18. Intrinsic Micromechanism of Multi-step Structural Transformation in MnNi Shape Memory Alloys

    NASA Astrophysics Data System (ADS)

    Cui, Shushan; Wan, Jianfeng; Rong, Yonghua; Zhang, Jihua

    2017-03-01

    Simulation of the multi-step transformation cubic matrix → multi-variant tetragonal domain → orthorhombic domain was realized by the phase-field method, and the intrinsic micromechanism of the second-step transformation in MnNi alloys was studied. It was found that the orthorhombic variant originated from the tetragonal variant with similar orientation, and that the bar-shaped orthorhombic phase first appeared around the interfaces of the twinning bands. The second-step transformation resulted in localized variation of the internal stress.

  19. Multistep-Ahead Air Passengers Traffic Prediction with Hybrid ARIMA-SVMs Models

    PubMed Central

    Ming, Wei; Xiong, Tao

    2014-01-01

    Hybrid ARIMA-SVMs prediction models, established recently, take advantage of the respective strengths of ARIMA and SVMs models in linear and nonlinear modeling. Building on such hybrid models, this study extends them to multistep-ahead prediction of air passenger traffic using the two most commonly used multistep-ahead prediction strategies, the iterated strategy and the direct strategy. Additionally, the effectiveness of data preprocessing approaches, such as deseasonalization and detrending, is investigated and verified for both strategies. Real data sets comprising monthly series from four selected airlines were collected to validate the proposed approach. Empirical results demonstrate that the direct strategy performs better for long-term prediction, while the iterated strategy performs better for short-term prediction. Furthermore, both deseasonalization and detrending significantly improve prediction accuracy for both strategies, indicating the necessity of data preprocessing. As such, this study serves as a reference for planners in the air transportation industry on how to tackle multistep-ahead prediction tasks with either strategy. PMID:24723814
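    The two strategies contrasted above can be sketched with a toy autoregressive model: the iterated strategy fits one one-step model and feeds its own predictions back, while the direct strategy fits a separate model per horizon. A plain AR(1) least-squares fit stands in here for the paper's ARIMA-SVMs hybrid.

```python
def fit_ar1(series):
    # ordinary least squares for x_t ~ phi * x_{t-1} (no intercept)
    num = sum(a * b for a, b in zip(series[1:], series[:-1]))
    den = sum(a * a for a in series[:-1])
    return num / den

def iterated_forecast(series, horizon):
    # one one-step model; predictions are recycled as inputs
    phi = fit_ar1(series)
    last, preds = series[-1], []
    for _ in range(horizon):
        last = phi * last
        preds.append(last)
    return preds

def direct_forecast(series, horizon):
    # one dedicated model per horizon h: x_{t+h} ~ phi_h * x_t
    preds = []
    for h in range(1, horizon + 1):
        num = sum(a * b for a, b in zip(series[h:], series[:-h]))
        den = sum(a * a for a in series[:-h])
        preds.append(num / den * series[-1])
    return preds
```

    On a noiseless AR(1) series the two strategies agree exactly; their behavior diverges on noisy data, where the iterated strategy compounds one-step errors (hurting long horizons) and the direct strategy avoids that at the cost of fitting each horizon with fewer effective samples.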

  20. Direct sampling of multiple single-molecular rupture dominant pathways involving a multistep transition.

    PubMed

    Jiang, Huijun; Ding, Huai; Hou, Zhonghuai

    2014-12-14

    We report a novel single-molecular rupture mechanism revealed by direct sampling of the dominant pathway using a self-optimized path sampling method. Multiple dominant pathways involving multistep transitions are identified. The rupture may take place via direct unfolding from the native state to the unfolded state, or through a two-step pathway passing through a distinct intermediate metastable state (IMS). This scenario leads us to propose a three-state kinetic model, which can reproduce a nonlinear dependence of the rupture time on pulling force similar to those reported in the literature. In particular, molecular conformations in the IMS maintain an elongated tail at one terminus, so external pulling enhances the relative stability of the IMS. Consequently, even though the overall transition rate of the multistep pathway is relatively small, the molecule still has to rupture via the multistep pathway rather than the direct pathway. Thus, our work demonstrates an IMS-trapping-induced rupture mechanism involving an abnormal switch from a fast dominant pathway to a slow one.

  1. Theory of Kinetics of Multistep Ligand-Receptor Assembly in Dissipating and Fluctuating Environments

    NASA Astrophysics Data System (ADS)

    Teslenko, Victor I.; Kapitanchuk, Oleksiy L.

    2013-09-01

    Multistep kinetic processes play a key role in physics (excitation transfer, energy degradation), chemistry (ligand-receptor assembly, radical reactions) and biology (signal perception, molecular recognition). While a phenomenological thermodynamic approach to modeling the elementary transitions that maintain a system's stationary and equilibrium states is now well established, a more satisfying microscopic description, based on a consistent understanding of the dissipation and fluctuation processes accompanying multistep relaxation, remains elusive. In this paper, a microscopic theory of the kinetics of a few-state system exhibiting energy fluctuations and coupled to a condensed medium is developed. The theory is formulated for the example case of irreversible multistep ligand-receptor assembly in a dissipating environment. We first derive a general expression for the probability of transitions between the system states, valid on the whole timescale, and then reduce this expression to effectively slow times by averaging over both the steady-state fluctuations of the system's energies and the equilibrium vibrations of the environment. Further, we calculate the populations of states for the sequence of three-to-two-to-single-step assembly cases as functions of temperature, viscosity and ligand concentration. Finally, we discuss the results with reference to the case of "negative" cooperativity emerging by virtue of the irreversibility of the last kinetic step.

  2. Multistep greedy algorithm identifies community structure in real-world and computer-generated networks

    NASA Astrophysics Data System (ADS)

    Schuetz, Philipp; Caflisch, Amedeo

    2008-08-01

    We have recently introduced a multistep extension of the greedy algorithm for modularity optimization. The extension is based on the idea that merging l pairs of communities (l>1) at each iteration prevents premature condensation into few large communities. Here, an empirical formula is presented for the choice of the step width l that generates partitions with (close to) optimal modularity for 17 real-world and 1100 computer-generated networks. Furthermore, an in-depth analysis of the communities of two real-world networks (the metabolic network of the bacterium E. coli and the graph of coappearing words in the titles of papers coauthored by Martin Karplus) provides evidence that the partition obtained by the multistep greedy algorithm is superior to the one generated by the original greedy algorithm not only with respect to modularity, but also according to objective criteria. In other words, the multistep extension of the greedy algorithm reduces the danger of getting trapped in local optima of modularity and generates more reasonable partitions.
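    The multistep idea above (merge the l best disjoint community pairs per iteration instead of one) can be sketched compactly. This is a naive re-evaluation of modularity for every candidate merge, not the authors' optimized incremental ΔQ bookkeeping, so it is only suitable for small graphs.

```python
from itertools import combinations

def modularity(comms, edges, m):
    # Q = sum over communities of (intra_edges/m - (degree/(2m))^2)
    node_comm = {u: i for i, c in enumerate(comms) for u in c}
    intra = [0] * len(comms)
    deg = [0] * len(comms)
    for u, v in edges:
        deg[node_comm[u]] += 1
        deg[node_comm[v]] += 1
        if node_comm[u] == node_comm[v]:
            intra[node_comm[u]] += 1
    return sum(ic / m - (d / (2 * m)) ** 2 for ic, d in zip(intra, deg))

def multistep_greedy(nodes, edges, l=2):
    m = len(edges)
    comms = [{u} for u in nodes]
    while len(comms) > 1:
        q0 = modularity(comms, edges, m)
        gains = []
        for i, j in combinations(range(len(comms)), 2):
            trial = [c for k, c in enumerate(comms) if k not in (i, j)]
            trial.append(comms[i] | comms[j])
            gains.append((modularity(trial, edges, m) - q0, i, j))
        gains.sort(reverse=True)
        # multistep: apply up to l disjoint merges with positive gain
        used, merges = set(), []
        for g, i, j in gains:
            if g <= 0 or len(merges) == l:
                break
            if i not in used and j not in used:
                merges.append((i, j))
                used.update((i, j))
        if not merges:
            break
        comms = [comms[i] | comms[j] for i, j in merges] + \
                [c for k, c in enumerate(comms) if k not in used]
    return comms
```

    Applying several disjoint merges per round is what keeps any single community from absorbing its neighbors too early, which is the premature-condensation effect the abstract describes.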

  3. Photon Production through Multi-step Processes Important in Nuclear Fluorescence Experiments

    SciTech Connect

    Hagmann, C; Pruet, J

    2006-10-26

    The authors present calculations describing the production of photons through multi-step processes occurring when a beam of gamma rays interacts with a macroscopic material. These processes involve the creation of energetic electrons through Compton scattering, photo-absorption and pair production, the subsequent scattering of these electrons, and the creation of energetic photons occurring as these electrons are slowed through Bremsstrahlung emission. Unlike single Compton collisions, during which an energetic photon that is scattered through a large angle loses most of its energy, these multi-step processes result in a sizable flux of energetic photons traveling at large angles relative to an incident photon beam. These multi-step processes are also a key background in experiments that measure nuclear resonance fluorescence by shining photons on a thin foil and observing the spectrum of back-scattered photons. Effective cross sections describing the production of backscattered photons are presented in a tabular form that allows simple estimates of backgrounds expected in a variety of experiments. Incident photons with energies between 0.5 MeV and 8 MeV are considered. These calculations of effective cross sections may be useful for those designing NRF experiments or systems that detect specific isotopes in well-shielded environments through observation of resonance fluorescence.

  4. Protein fabrication automation

    PubMed Central

    Cox, J. Colin; Lape, Janel; Sayed, Mahmood A.; Hellinga, Homme W.

    2007-01-01

    Facile “writing” of DNA fragments that encode entire gene sequences potentially has widespread applications in biological analysis and engineering. Rapid writing of open reading frames (ORFs) for expressed proteins could transform protein engineering and production for protein design, synthetic biology, and structural analysis. Here we present a process, protein fabrication automation (PFA), which facilitates the rapid de novo construction of any desired ORF from oligonucleotides with low effort, high speed, and little human interaction. PFA comprises software for sequence design, data management, and the generation of instruction sets for liquid-handling robotics, a liquid-handling robot, a robust PCR scheme for gene assembly from synthetic oligonucleotides, and a genetic selection system to enrich correctly assembled full-length synthetic ORFs. The process is robust and scalable. PMID:17242375

  5. Multistep cascade annihilations of dark matter and the Galactic Center excess

    DOE PAGES

    Elor, Gilly; Rodd, Nicholas L.; Slatyer, Tracy R.

    2015-05-26

    If dark matter is embedded in a non-trivial dark sector, it may annihilate and decay to lighter dark-sector states which subsequently decay to the Standard Model. Such scenarios - with annihilation followed by cascading dark-sector decays - can explain the apparent excess GeV gamma-rays identified in the central Milky Way, while evading bounds from dark matter direct detection experiments. Each 'step' in the cascade will modify the observable signatures of dark matter annihilation and decay, shifting the resulting photons and other final state particles to lower energies and broadening their spectra. We explore, in a model-independent way, the effect of multi-step dark-sector cascades on the preferred regions of parameter space to explain the GeV excess. We find that the broadening effects of multi-step cascades can admit final states dominated by particles that would usually produce too sharply peaked photon spectra; in general, if the cascades are hierarchical (each particle decays to substantially lighter particles), the preferred mass range for the dark matter is in all cases 20-150 GeV. Decay chains that have nearly-degenerate steps, where the products are close to half the mass of the progenitor, can admit much higher DM masses. We map out the region of mass/cross-section parameter space where cascades (degenerate, hierarchical or a combination) can fit the signal, for a range of final states. In the current paper, we study multi-step cascades in the context of explaining the GeV excess, but many aspects of our results are general and can be extended to other applications.

  7. Epigenetic genes and emotional reactivity to daily life events: a multi-step gene-environment interaction study.

    PubMed

    Pishva, Ehsan; Drukker, Marjan; Viechtbauer, Wolfgang; Decoster, Jeroen; Collip, Dina; van Winkel, Ruud; Wichers, Marieke; Jacobs, Nele; Thiery, Evert; Derom, Catherine; Geschwind, Nicole; van den Hove, Daniel; Lataster, Tineke; Myin-Germeys, Inez; van Os, Jim; Rutten, Bart P F; Kenis, Gunter

    2014-01-01

    Recent human and animal studies suggest that epigenetic mechanisms mediate the impact of environment on development of mental disorders. Therefore, we hypothesized that polymorphisms in epigenetic-regulatory genes impact stress-induced emotional changes. A multi-step, multi-sample gene-environment interaction analysis was conducted to test whether 31 single nucleotide polymorphisms (SNPs) in epigenetic-regulatory genes, i.e. three DNA methyltransferase genes DNMT1, DNMT3A, DNMT3B, and methylenetetrahydrofolate reductase (MTHFR), moderate emotional responses to stressful and pleasant stimuli in daily life as measured by Experience Sampling Methodology (ESM). In the first step, main and interactive effects were tested in a sample of 112 healthy individuals. Significant associations in this discovery sample were then investigated in a population-based sample of 434 individuals for replication. SNPs showing significant effects in both the discovery and replication samples were subsequently tested in three other samples: (i) 85 unaffected siblings of patients with psychosis, (ii) 110 patients with psychotic disorders, and (iii) 126 patients with a history of major depressive disorder. Multilevel linear regression analyses showed no significant association between SNPs and negative affect or positive affect. No SNPs moderated the effect of pleasant stimuli on positive affect. Three SNPs of DNMT3A (rs11683424, rs1465764, rs1465825) and 1 SNP of MTHFR (rs1801131) moderated the effect of stressful events on negative affect. Only rs11683424 of DNMT3A showed consistent directions of effect in the majority of the 5 samples. These data provide the first evidence that emotional responses to daily life stressors may be moderated by genetic variation in the genes involved in the epigenetic machinery.

  9. Expression Patterns of Cancer Stem Cell Markers During Specific Celecoxib Therapy in Multistep Rat Colon Carcinogenesis Bioassays.

    PubMed

    Salim, Elsayed I; Hegazi, Mona M; Kang, Jin Seok; Helmy, Hager M

    2016-01-01

    The purpose of this study was to investigate the role of colon cancer stem cells (CSCs) during chemically induced, multi-step rat colon carcinogenesis with or without treatment with a specific cyclooxygenase-2 inhibitor drug (celecoxib). Two experiments were performed: the first was a short-term, 12-week colon carcinogenesis bioassay in which only surrogate markers for colon cancer, aberrant crypt foci (ACF), were formed; the other was a medium-term colon cancer rat assay in which tumors developed after 32 weeks. Treatment with celecoxib lowered the numbers of ACF, as well as the tumor volumes and multiplicities, after 32 weeks. Immunohistochemical proliferating cell nuclear antigen (PCNA) labeling indices (LI, %) were downregulated by celecoxib treatment. Different cell surface antigens known to associate with CSCs, such as the epithelial cell adhesion molecule (EpCAM), CD44 and CD133, were also compared between the two experiments and showed differential expression patterns depending on the stage of carcinogenesis and treatment with celecoxib. Flow cytometric analysis demonstrated that the numbers of CD133 cells in the colonic epithelium were increased after 12 weeks, while those of CD44 but not CD133 cells were increased after 32 weeks. Moreover, aldehyde dehydrogenase-1 activity levels in the colonic epithelium (a known CSC marker), detected by ELISA assay, were down-regulated after 12 weeks but up-regulated after 32 weeks. The data also show that the protective effect of celecoxib on these specific markers and populations of CSCs, and on other molecular processes targeted by this drug such as apoptosis, may vary depending on the genetic and phenotypic stage of carcinogenesis. Therefore, uncovering these distinct roles of CSCs during different phases of carcinogenesis and during specific treatment could be useful for targeted therapy.

  10. Automated Assay of Telomere Length Measurement and Informatics for 100,000 Subjects in the Genetic Epidemiology Research on Adult Health and Aging (GERA) Cohort.

    PubMed

    Lapham, Kyle; Kvale, Mark N; Lin, Jue; Connell, Sheryl; Croen, Lisa A; Dispensa, Brad P; Fang, Lynn; Hesselson, Stephanie; Hoffmann, Thomas J; Iribarren, Carlos; Jorgenson, Eric; Kushi, Lawrence H; Ludwig, Dana; Matsuguchi, Tetsuya; McGuire, William B; Miles, Sunita; Quesenberry, Charles P; Rowell, Sarah; Sadler, Marianne; Sakoda, Lori C; Smethurst, David; Somkin, Carol P; Van Den Eeden, Stephen K; Walter, Lawrence; Whitmer, Rachel A; Kwok, Pui-Yan; Risch, Neil; Schaefer, Catherine; Blackburn, Elizabeth H

    2015-08-01

    The Kaiser Permanente Research Program on Genes, Environment, and Health (RPGEH) Genetic Epidemiology Research on Adult Health and Aging (GERA) cohort includes DNA specimens extracted from saliva samples of 110,266 individuals. Because of its relationship to aging, telomere length measurement was considered an important biomarker to develop on these subjects. To assay relative telomere length (TL) on this large cohort over a short time period, we created a novel high throughput robotic system for TL analysis and informatics. Samples were run in triplicate, along with control samples, in a randomized design. As part of quality control, we determined the within-sample variability and employed thresholds for the elimination of outlying measurements. Of 106,902 samples assayed, 105,539 (98.7%) passed all quality control (QC) measures. As expected, TL in general showed a decline with age and a sex difference. While telomeres showed a negative correlation with age up to 75 years, in those older than 75 years, age positively correlated with longer telomeres, indicative of an association of longer telomeres with more years of survival in those older than 75. Furthermore, while females in general had longer telomeres than males, this difference was significant only for those older than age 50. An additional novel finding was that the variance of TL between individuals increased with age. This study establishes reliable assay and analysis methodologies for measurement of TL in large, population-based human studies. The GERA cohort represents the largest currently available such resource, linked to comprehensive electronic health and genotype data for analysis.
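    The triplicate quality control described above (within-sample variability thresholds, elimination of outlying measurements) can be mimicked with a simple median-deviation filter. The 15% threshold and the fail-if-fewer-than-two-survive rule below are hypothetical illustrations, not the GERA pipeline's actual cutoffs.

```python
from statistics import median

def qc_triplicates(values, max_rel_dev=0.15):
    """Average replicate TL measurements after outlier removal.

    Drops any replicate deviating from the replicate median by more
    than max_rel_dev (hypothetical threshold); returns None (QC fail)
    if fewer than two replicates survive.
    """
    med = median(values)
    kept = [v for v in values if abs(v - med) / med <= max_rel_dev]
    if len(kept) < 2:
        return None
    return sum(kept) / len(kept)
```

    Running each sample in triplicate is what makes such a filter possible: with a single measurement there is no within-sample variability to threshold on.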

  11. Analysis of intrinsic coupling loss in multi-step index optical fibres.

    PubMed

    Aldabaldetreku, Gotzon; Durana, Gaizka; Zubia, Joseba; Arrue, Jon; Jiménez, Felipe; Mateo, Javier

    2005-05-02

    The main goal of the present paper is to provide a comprehensive analysis of the intrinsic coupling loss for multi-step index (MSI) fibres and to compare it with that obtained for step- and graded-index fibres. We investigate the effects of tolerances in each waveguide parameter typical of standard manufacturing processes by carrying out several simulations using the ray-tracing method. The results identify the most critical waveguide variations, to which fibre manufacturers will have to pay closest attention in order to achieve lower coupling losses.

  12. PRE-ADAMO: a multi-step approach for the identification of life on Mars

    NASA Astrophysics Data System (ADS)

    Brucato, J. R.; Vázquez, L.; Rotundi, A.; Cataldo, F.; Palomba, E.; Saladino, R.; di Mauro, E.; Baratta, G.; Barbier, B.; Battaglia, R.; Colangeli, L.; Costanzo, G.; Crestini, C.; della Corte, V.; Mazzotta Epifani, E.; Esposito, F.; Ferrini, G.; Gómez Elvira, J.; Isola, M.; Keheyan, Y.; Leto, G.; Martinez Frias, J.; Mennella, V.; Negri, R.; Palumbo, M. E.; Palumbo, P.; Strazzulla, G.; Falciani, P.; Adami, G.; Guizzo, G. P.; Campiotti, S.

    2004-03-01

    It is of paramount importance to detect traces of life on the Martian surface. Organic molecules are highly polar and, if present on Mars, must be extracted from the dust sample, separated, concentrated, processed and analysed by an appropriate apparatus. PRE-ADAMO (PRebiotic Experiment - Activity of Dust And bioMolecules Observation) is a multi-step approach for the identification of possible polar substances present on Mars. It was proposed as an instrument of the Pasteur payload for the ESA (European Space Agency) ExoMars rover mission. The main scientific objectives and experimental approach of PRE-ADAMO are presented here.

  13. Synergy between chemo- and bio-catalysts in multi-step transformations.

    PubMed

    Caiazzo, Aldo; Garcia, Paula M L; Wever, Ron; van Hest, Jan C M; Rowan, Alan E; Reek, Joost N H

    2009-07-21

    Cascade synthetic pathways, which allow multi-step conversions to take place in one reaction vessel, are crucial for the development of biomimetic, highly efficient new methods of chemical synthesis. Theoretically, the complexity introduced by combining processes could lead to an improvement of the overall process; however, it is the current general belief that it is more efficient to run processes separately. Inspired by natural cascade procedures we successfully combined a lipase catalyzed amidation with palladium catalyzed coupling reactions, simultaneously carried out on the same molecule. Unexpectedly, the bio- and chemo-catalyzed processes show synergistic behaviour, highlighting the complexity of multi-catalyst systems.

  14. Autonomy and Automation

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that, in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoid the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  15. Automated tetraploid genotype calling by hierarchical clustering

    Technology Transfer Automated Retrieval System (TEKTRAN)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
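
    The dosage-calling idea can be sketched in one dimension: for a single marker, samples are grouped by splitting their sorted signal ratios at the largest gaps, a simple stand-in for the hierarchical clustering the abstract refers to. All values below are invented.

```python
def call_dosage_clusters(ratios, k=5):
    """Group 1-D signal ratios into at most k clusters (tetraploid allele
    dosages 0..4) by cutting the sorted values at the k-1 largest gaps.
    A simple stand-in for agglomerative clustering on one marker."""
    order = sorted(range(len(ratios)), key=lambda i: ratios[i])
    xs = [ratios[i] for i in order]
    gaps = sorted(range(1, len(xs)), key=lambda j: xs[j] - xs[j - 1],
                  reverse=True)[:k - 1]
    cuts = set(gaps)
    labels, cluster = [0] * len(xs), 0
    for pos in range(1, len(xs)):
        if pos in cuts:
            cluster += 1
        labels[pos] = cluster
    # map back to original sample order; cluster index = inferred dosage
    out = [0] * len(ratios)
    for rank, i in enumerate(order):
        out[i] = labels[rank]
    return out

# Three well-separated intensity-ratio groups -> dosages 0, 1, 2
print(call_dosage_clusters([0.02, 0.01, 0.48, 0.51, 0.97, 0.99], k=3))
```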

  16. Multistep Model of Cervical Cancer: Participation of miRNAs and Coding Genes

    PubMed Central

    López, Angelica Judith Granados; López, Jesús Adrián

    2014-01-01

    Aberrant miRNA expression is well recognized as an important step in the development of cancer. Close to 70 microRNAs (miRNAs) have been implicated in cervical cancer to date; nevertheless, it is unknown whether aberrant miRNA expression causes the onset of cervical cancer. One of the best ways to address this issue is through a multistep model of carcinogenesis. In the progression of cervical cancer there are three well-established steps to reach cancer, which we use in the model proposed here. The first step of the model comprises the gene changes that occur in normal cells to be transformed into immortal cells (CIN 1), the second comprises immortal cell changes to tumorigenic cells (CIN 2), the third step includes cell changes to increase tumorigenic capacity (CIN 3), and the final step covers tumorigenic changes to carcinogenic cells. Altered miRNAs and their target genes are located in each of the four steps of the multistep model of carcinogenesis. miRNA expression has shown discrepancies between different studies; therefore, in this model we include only miRNAs with similar results recorded in at least two studies. The present model offers useful insight for studying potential prognostic, diagnostic, and therapeutic miRNAs. PMID:25192291

  17. Multistep model of cervical cancer: participation of miRNAs and coding genes.

    PubMed

    Granados López, Angelica Judith; López, Jesús Adrián

    2014-09-04

    Aberrant miRNA expression is well recognized as an important step in the development of cancer. Close to 70 microRNAs (miRNAs) have been implicated in cervical cancer to date; nevertheless, it is unknown whether aberrant miRNA expression causes the onset of cervical cancer. One of the best ways to address this issue is through a multistep model of carcinogenesis. In the progression of cervical cancer there are three well-established steps to reach cancer, which we use in the model proposed here. The first step of the model comprises the gene changes that occur in normal cells to be transformed into immortal cells (CIN 1), the second comprises immortal cell changes to tumorigenic cells (CIN 2), the third step includes cell changes to increase tumorigenic capacity (CIN 3), and the final step covers tumorigenic changes to carcinogenic cells. Altered miRNAs and their target genes are located in each of the four steps of the multistep model of carcinogenesis. miRNA expression has shown discrepancies between different studies; therefore, in this model we include only miRNAs with similar results recorded in at least two studies. The present model offers useful insight for studying potential prognostic, diagnostic, and therapeutic miRNAs.

  18. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    PubMed

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and it remains under active research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages over the two currently dominant strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode with varying prediction horizons. Rather than deriving crisp divides with equal-size prediction horizons as in the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate the corresponding sub-models, providing considerable flexibility in model construction. The approach has been validated with simulated and real datasets.
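
    The core MISMO idea of partitioning the prediction horizon among multiple-output sub-models can be sketched with equal-size chunks and ordinary least-squares sub-models; the PSO-driven variable-size partitioning and the neural sub-models of the paper are omitted.

```python
import numpy as np

def mismo_forecast(series, horizon=6, n_sub=2, lags=4):
    """MISMO-style forecaster: partition the horizon into n_sub chunks
    and fit one multi-output linear model per chunk (ordinary least
    squares on the last `lags` values). Chunks are equal-size here;
    the PSO variant in the paper adapts the chunk sizes instead.
    Assumes horizon is divisible by n_sub."""
    x = np.asarray(series, float)
    chunk = horizon // n_sub
    preds = []
    for s in range(n_sub):
        lo, hi = s * chunk, (s + 1) * chunk      # this sub-model's steps
        X, Y = [], []
        for t in range(lags, len(x) - hi + 1):
            X.append(x[t - lags:t])              # inputs: recent lags
            Y.append(x[t + lo:t + hi])           # outputs: chunk of horizon
        coef, *_ = np.linalg.lstsq(np.array(X), np.array(Y), rcond=None)
        preds.extend(x[-lags:] @ coef)
    return np.array(preds)

# A noiseless linear trend is forecast exactly by each linear sub-model.
trend = np.arange(30.0)
print(mismo_forecast(trend, horizon=6, n_sub=2, lags=4))
```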

  19. A new multistep Ca2+-induced cold gelation process for beta-lactoglobulin.

    PubMed

    Veerman, Cecile; Baptist, Harry; Sagis, Leonard M C; van der Linden, Erik

    2003-06-18

    The objective of this study was to obtain β-lactoglobulin (β-lg) gels at very low protein concentrations using a new multistep Ca²⁺-induced cold gelation process. In the conventional cold gelation process, salt-free β-lg solutions were heated at neutral pH, cooled, and cross-linked by adding salts. In our new process, first, long linear β-lg fibrils were formed at pH 2. Solutions of these fibrils were cooled, and subsequently, the pH was adjusted to 7 or 8. Transmission electron microscopy studies showed that the long linear fibrils formed at pH 2 were stable when the pH was adjusted to 7 or 8. In the final step, the fibrils were cross-linked using CaCl₂. Using rheological measurements, the critical percolation concentration was determined. In the new multistep cold gelation process, the critical percolation concentration was an order of magnitude lower than in the conventional cold gelation method.

  20. Electrochemical Biosensor: Multistep functionalization of thiolated ssDNA on gold-coated microcantilever

    NASA Astrophysics Data System (ADS)

    Dulanto Carbajal, Jorge

    Biochemical sensors are an emerging and vibrant area of research. The use of micromechanical cantilevers as biomechanical recognition detectors is relatively new. Reactions on a gold-coated and chemically functionalized surface produce a mechanical deflection of the cantilever, which is used as the input signal of the detector. Within the area of biosensors, DNA sensors have a wide range of applications, such as DNA hybridization detection, DNA mismatch sequence detection and protein detection. We designed and built a microcantilever sensor system which allows for control and characterization of surface conditions, including controlled functionalization, which can be a dominant factor in signal generation and reproducibility in these systems. Additionally, we developed a multistep functionalization protocol consisting of a sequence of short incubations and characterizations of thiolated ssDNA on a gold-coated cantilever. Multistep functionalization is a new protocol used to control the ssDNA surface density on a gold-coated cantilever. Repeatable responses and feasible biosensors are obtained using this protocol.

  1. Chiral transformation in protonated and deprotonated adipic acids through multistep internal proton transfer.

    PubMed

    Min, Seung Kyu; Park, Mina; Singh, N Jiten; Lee, Han Myoung; Lee, Eun Cheol; Kim, Kwang S; Lagutschenkov, Anita; Niedner-Schatteburg, Gereon

    2010-09-10

    Protonated and deprotonated adipic acids (PAA: HOOC-(CH₂)₄-COOH₂⁺ and DAA: HOOC-(CH₂)₄-COO⁻) have a charged hydrogen bond under the influence of steric constraint due to the molecular skeleton of a circular ring. Despite the similarity between PAA and DAA, it is surprising that the lowest energy structure of PAA is predicted to have (H₂O···H···OH₂)⁺ Zundel-like symmetric hydrogen bonding, whereas that of DAA has H₃O⁺ Eigen-like asymmetric hydrogen bonding. The energy profiles show that direct proton transfer between mirror image structures is unfavorable. Instead, the chiral transformation is possible by subsequent backbone twistings through stepwise proton transfer along multistep intermediate structures, which are Zundel-like ions for PAA and Eigen-like ions for DAA. This type of chiral transformation by multistep intramolecular proton transfers is unprecedented. Several prominent OH···O short hydrogen-bond stretching peaks are predicted in the range of 1000-1700 cm⁻¹ in the Car-Parrinello molecular dynamics (CPMD) simulations, which show distinctive signatures different from ordinary hydrogen-bond peaks. The O-H-O stretching peaks in the range of 1800-2700 cm⁻¹ become insignificant above around 150 K and are almost washed out at about 300 K.

  2. Adaptive multi-step Full Waveform Inversion based on Waveform Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Hu, Yong; Han, Liguo; Xu, Zhuo; Zhang, Fengjiao; Zeng, Jingwen

    2017-04-01

    Full Waveform Inversion (FWI) can be used to build high-resolution velocity models, but many challenges remain in seismic field data processing. The most difficult problem is how to recover the long-wavelength components of subsurface velocity models when the seismic data lack low-frequency information and long offsets. To solve this problem, we propose to use the Waveform Mode Decomposition (WMD) method to reconstruct low-frequency information for FWI and obtain a smooth model, so that the initial-model dependence of FWI can be reduced. In this paper, we use the adjoint-state method to calculate the gradient for Waveform Mode Decomposition Full Waveform Inversion (WMDFWI). Through illustrative numerical examples, we show that the low-frequency information reconstructed by the WMD method is reliable. WMDFWI, in combination with the adaptive multi-step inversion strategy, obtains more faithful and accurate final inversion results. Numerical examples show that even if the initial velocity model is far from the true model and lacks low-frequency information, we can still obtain good inversion results with the WMD method. Anti-noise tests show that the adaptive multi-step inversion strategy for WMDFWI is strongly resistant to Gaussian noise. The WMD method is a promising candidate for land seismic FWI, because it reconstructs the low-frequency information, lowers the dominant frequency in the adjoint source, and has a strong ability to resist noise.

  3. Forming Analysis of AZ31 Magnesium Alloy Sheets by Means of a Multistep Inverse Approach

    SciTech Connect

    Nguyen, Ba Nghiep; Bapanapalli, Satish K.

    2009-04-01

    This paper applies a multi-step inverse approach to predict the forming of AZ31 magnesium alloy sheets. An in-house finite element code named INAPH, which implements the inverse approach formulation by Guo et al. (Int. J. Numer. Methods Eng., 30, 1385-1401), has been used for the forming analysis. This inverse approach uses the deformation theory of plasticity and assumes that the deformation is independent of the loading history. Failure during forming is predicted by a stress-based criterion or a forming limit diagram-based criterion. The INAPH predictions have been compared with the experimental results of Takuda et al. (Journal of Materials Processing Technology, 89-90, 135-140) and with incremental analysis using ABAQUS. The multi-step inverse analysis has been shown to predict, quickly and fairly accurately, the stress, plastic strain, thickness distributions and failure locations on deeply drawn parts made of AZ31 magnesium alloy. The capability of INAPH to predict the formability of magnesium alloys has also been demonstrated at various temperatures. Since magnesium alloys possess very limited formability at room temperature and become more formable at higher temperatures (> 100 °C), the inverse analysis constitutes an efficient and valuable tool to predict the forming of magnesium alloy parts as a function of temperature. In addition, other processing and design parameters, such as the initial dimensions, final desired shape, blank holder forces, and friction, can be quickly adjusted to assess forming feasibility.

  4. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  5. Multistep generalized transformation method applied to solving equations of discrete and continuous time-fractional enzyme kinetics

    NASA Astrophysics Data System (ADS)

    Vosika, Z.; Mitić, V. V.; Vasić, A.; Lazović, G.; Matija, L.; Kocić, Lj. M.

    2017-03-01

    In this paper, a Caputo-based Michaelis-Menten kinetic model built on Time Scale Calculus (TSC) is proposed. The main reason for its consideration is the study of tumor cell population growth dynamics. In the particular case of discrete-continuous time kinetics, the Michaelis-Menten model is treated numerically using a new algorithm proposed by the authors, called the multistep generalized difference transformation method (MSGDETM). In addition, numerical simulations are performed, and it is shown that the method upgrades the multi-step variant of the generalized differential transformation method (MSGDTM). Possible conditions for its further development are discussed and a possible experimental verification is described.

  6. A sandwich-type DNA electrochemical biosensor for hairpin-stem-loop structure based on multistep temperature-controlling method

    PubMed Central

    Hong, Guolin; Liu, Yinhuan; Chen, Wei; Weng, Shaohuang; Liu, Qicai; Liu, Ailin; Zheng, Daoxin; Lin, Xinhua

    2012-01-01

    A highly sensitive and selective method for amplified electrochemical detection of hairpin-stem-loop structured target sequences was developed, based on the temperature regulation of DNA hybrids on a sandwich-type electrochemical DNA sensor. Multistep hybridization was applied to promote the hybridization efficiency of each section of the sandwich structure. The results showed that the multistep and temperature-controlling hybridization techniques were both specifically suited to fabricating a sensor for target gene sequences with a tendency toward internal hybridization. This strategy provides significantly enhanced hybridization efficiency and sequence specificity in electrochemical detection. PMID:23028223

  7. Multi-Step Bidirectional NDR Characteristics in Si/Si₁₋ₓGeₓ/Si DHBTs and Their Temperature Dependence

    NASA Astrophysics Data System (ADS)

    Xu, D. X.; Shen, G. D.; Willander, M.; Hansson, G. V.

    1988-11-01

    Novel bidirectional negative differential resistance (NDR) phenomena have been observed at room temperature in strained-base n-Si/p-Si₁₋ₓGeₓ/n-Si double heterojunction bipolar transistors (DHBTs). A strong and symmetric bidirectional NDR modulated by the base bias, together with a multi-step characteristic in the collector current I_C vs emitter-collector bias voltage V_CE, was obtained in devices with a very thin base. The temperature dependence of the NDR and the multi-step I_C-V_CE characteristics has been measured to identify the possible transport mechanism. The physical origins of these phenomena are discussed.

  8. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  9. Shoe-String Automation

    SciTech Connect

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  10. Dynamical genetic programming in XCSF.

    PubMed

    Preen, Richard J; Bull, Larry

    2013-01-01

    A number of representation schemes have been presented for use within learning classifier systems, ranging from binary encodings to artificial neural networks. This paper presents results from an investigation into using a temporally dynamic symbolic representation within the XCSF learning classifier system. In particular, dynamical arithmetic networks are used to represent the traditional condition-action production system rules to solve continuous-valued reinforcement learning problems and to perform symbolic regression, finding competitive performance with traditional genetic programming on a number of composite polynomial tasks. In addition, the network outputs are later repeatedly sampled at varying temporal intervals to perform multistep-ahead predictions of a financial time series.

  11. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.

  12. A multistep approach to single nucleotide polymorphism-set analysis: an evaluation of power and type I error of gene-based tests of association after pathway-based association tests.

    PubMed

    Valcarcel, Alessandra; Grinde, Kelsey; Cook, Kaitlyn; Green, Alden; Tintle, Nathan

    2016-01-01

    The aggregation of functionally associated variants given a priori biological information can aid in the discovery of rare variants associated with complex diseases. Many methods exist that aggregate rare variants into a set and compute a single p value summarizing association between the set of rare variants and a phenotype of interest. These methods are often called gene-based, rare variant tests of association because the variants in the set are often all contained within the same gene. A reasonable extension of these approaches involves aggregating variants across an even larger set of variants (e.g., all variants contained in genes within a pathway). Testing sets of variants such as pathways for association with a disease phenotype reduces multiple testing penalties, may increase power, and allows for straightforward biological interpretation. However, a significant variant-set association test does not indicate precisely which variants contained within that set are causal. Because pathways often contain many variants, it may be helpful to follow up significant pathway tests by conducting gene-based tests on each gene in that pathway to narrow in on the region of causal variants. In this paper, we propose such a multistep approach for variant-set analysis that can also account for covariates and complex pedigree structure. We demonstrate this approach on simulated phenotypes from Genetic Analysis Workshop 19. We find generally better power for the multistep approach when compared to a more conventional, single-step approach that simply runs gene-based tests of association on each gene across the genome. Further work is necessary to evaluate the multistep approach on different data sets with different characteristics.
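
    A minimal sketch of the two-step procedure, with Stouffer's Z combination standing in for the pathway- and gene-based burden tests and Bonferroni corrections at each step; the pathway names, genes and p-values below are invented.

```python
import math
from statistics import NormalDist

_N = NormalDist()

def combine_p(pvals):
    """Stouffer's Z combination of independent p-values (a stand-in
    for the pathway-level variant-set test in the paper)."""
    z = sum(_N.inv_cdf(1.0 - p) for p in pvals) / math.sqrt(len(pvals))
    return 1.0 - _N.cdf(z)

def multistep_test(pathways, alpha=0.05):
    """Step 1: test each pathway (Bonferroni over pathways).
    Step 2: gene-based tests only inside significant pathways
    (Bonferroni over that pathway's genes).
    `pathways` maps pathway name -> {gene: p_value}. Returns hit genes."""
    hits = []
    a1 = alpha / len(pathways)
    for name, genes in pathways.items():
        if combine_p(list(genes.values())) <= a1:            # step 1
            a2 = alpha / len(genes)
            hits += [g for g, p in genes.items() if p <= a2]  # step 2
    return hits

pathways = {
    "WNT":  {"G1": 1e-5, "G2": 0.40, "G3": 0.30},  # one strong gene
    "MAPK": {"G4": 0.60, "G5": 0.70, "G6": 0.50},  # null pathway
}
print(multistep_test(pathways))
```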

  13. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  14. Variation of nanopore diameter along porous anodic alumina channels by multi-step anodization.

    PubMed

    Lee, Kwang Hong; Lim, Xin Yuan; Wai, Kah Wing; Romanato, Filippo; Wong, Chee Cheong

    2011-02-01

    In order to form tapered nanocapillaries, we investigated a method to vary the nanopore diameter along porous anodic alumina (PAA) channels using multi-step anodization. By anodizing the aluminum in either a single acid (H₃PO₄) or multiple acids (H₂SO₄, oxalic acid and H₃PO₄) with increasing or decreasing voltage, the diameter of the nanopore along the PAA channel can be varied systematically according to the applied voltages. The pore size along the channel can be enlarged or shrunk in the range of 20 nm to 200 nm. Structural engineering of the template along the film-growth direction can be achieved by deliberately designing a suitable voltage and electrolyte together with the anodization time.
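
    The voltage-to-diameter mapping can be sketched under the rough rule that PAA pore diameter grows linearly with anodization voltage; the ~1.3 nm/V constant used here is an assumed round number, not a value from this paper, and the true constant depends on the electrolyte.

```python
def pore_profile(voltage_steps, nm_per_volt=1.3):
    """Estimate pore diameter along the channel from an anodization
    voltage schedule, assuming diameter scales linearly with voltage.
    Each step is (voltage_V, segment_depth_nm); returns a list of
    (cumulative_depth_nm, estimated_diameter_nm) pairs."""
    profile, depth = [], 0.0
    for volts, segment_nm in voltage_steps:
        depth += segment_nm
        profile.append((depth, volts * nm_per_volt))
    return profile

# Ramping 40 V -> 160 V tapers the pore from ~52 nm to ~208 nm with depth.
schedule = [(40, 500), (80, 500), (120, 500), (160, 500)]
for depth, diameter in pore_profile(schedule):
    print(f"depth {depth:5.0f} nm: diameter ~ {diameter:.0f} nm")
```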

  15. The solution of Parrondo’s games with multi-step jumps

    NASA Astrophysics Data System (ADS)

    Saakian, David B.

    2016-04-01

    We consider the general case of Parrondo’s games, when there is a finite probability to stay in the current state as well as multi-step jumps. We introduce a modification of the model: the transition probabilities between different games depend on the choice of the game in the previous round. We calculate the rate of capital growth as well as the variance of the distribution, following large deviation theory. The modified model allows higher capital growth rates than in standard Parrondo games for the range of parameters considered in the key articles about these games, and positive capital growth is possible for a much wider regime of parameters of the model.
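
    For background, the classic capital-dependent Parrondo pair can be simulated to estimate capital growth rates. This sketch uses single-step ±1 jumps and the standard textbook game probabilities only; the stay probabilities, multi-step jumps, and history-dependent game switching analyzed in the paper are not modeled.

```python
import random

def play_parrondo(strategy, rounds=500_000, eps=0.005, seed=1):
    """Simulate the classic capital-dependent Parrondo pair.
    Game A: win with prob 1/2 - eps.
    Game B: win with prob 1/10 - eps if capital % 3 == 0, else 3/4 - eps.
    `strategy(t, capital)` returns 'A' or 'B' each round.
    Returns the empirical capital growth rate per round."""
    rng = random.Random(seed)
    capital = 0
    for t in range(rounds):
        if strategy(t, capital) == 'A':
            p = 0.5 - eps
        else:
            p = (0.1 - eps) if capital % 3 == 0 else (0.75 - eps)
        capital += 1 if rng.random() < p else -1
    return capital / rounds

mixer = random.Random(7)
rate_a = play_parrondo(lambda t, c: 'A')  # slightly losing on its own
rate_b = play_parrondo(lambda t, c: 'B')  # slightly losing on its own
rate_mix = play_parrondo(                 # random mixing typically wins
    lambda t, c: 'A' if mixer.random() < 0.5 else 'B')
print(f"A: {rate_a:+.4f}  B: {rate_b:+.4f}  random A/B: {rate_mix:+.4f}")
```

Here the paradox appears as a positive growth rate for the randomly mixed strategy even though each game alone drifts downward.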

  16. Multistep divergent synthesis of benzimidazole linked benzoxazole/benzothiazole via copper catalyzed domino annulation.

    PubMed

    Liao, Jen-Yu; Selvaraju, Manikandan; Chen, Chih-Hau; Sun, Chung-Ming

    2013-04-21

    An efficient, facile synthesis of structurally diverse benzimidazole-integrated benzoxazoles and benzothiazoles has been developed. In a multi-step synthetic sequence, 4-fluoro-3-nitrobenzoic acid was converted into benzimidazole bis-heterocycles via the intermediacy of benzimidazole-linked ortho-chloro amines. The amphiphilic reactivity of this intermediate was exploited to obtain the title compounds by reaction with various acid chlorides and isothiocyanates in a single step, through the in situ formation of ortho-chloro anilides and thioureas under microwave irradiation. A versatile one-pot domino annulation reaction was developed involving the reaction of benzimidazole-linked ortho-chloro amines with acid chlorides and isothiocyanates. The initial acylation and urea formation, followed by copper-catalyzed intramolecular C-O and C-S cross-coupling reactions, furnished the angularly oriented bis-heterocycles, which bear a close resemblance to the Streptomyces antibiotic UK-1.

  17. A multistep photothermic-driven drug release system using wire-framed Au nanobundles.

    PubMed

    Bang, Doyeon; Lee, Taeksu; Choi, Jihye; Park, Yeonji; Kim, Eunkyoung; Huh, Yong-Min; Haam, Seungjoo

    2015-01-28

    Here, wire-framed Au nanobundles (WNBs), which consist of randomly oriented and mutually connected Au wires forming a bundle shape, are synthesized. In contrast to conventional nanoparticles (spheres, rods, cubes, and stars), which exhibit nanostructure only at the surface, cross-sectional images show that WNBs have nanostructure throughout their whole volume. Using this specific property of WNBs, an externally controllable multistep photothermic-driven drug release (PDR) system is demonstrated for in vivo cancer treatment. In contrast to conventional nanoparticles, which encapsulate a drug on their surface, WNBs carry the drug payload in the overall inner volume, providing a drug loading capacity sufficient for cancer therapy. An improved in vivo therapeutic efficacy of PDR therapy is also demonstrated by delivering a sufficient amount of drug to the target tumor region.

  18. A multistep game of kind between two economic systems under complete information

    SciTech Connect

    Malyukov, V.P.; Linder, N.V.

    1995-03-01

    We consider the problem of conflict interaction between two economic systems described by Leontief-type single-commodity dynamic models. The problem is solved in the framework of a positional multistep game of kind under complete information, assuming several terminal sets. The solution constructs the optimality sets of the players that control the dynamic models and finds the winning strategies. As in the analysis of related games, the solution of the problem is found to depend on the relationships among the parameters that define the conflict interaction. Unlike related work, our study does not assume nonnegativity of some of the game parameters. The solution is obtained for all possible relationships among the game parameters; examples of conflict interaction in dynamic economic models are given.

  19. Conjugate symplecticity of second-order linear multi-step methods

    NASA Astrophysics Data System (ADS)

    Feng, Quan-Dong; Jiao, Yan-Dong; Tang, Yi-Fa

    2007-06-01

    We review the two different approaches to the symplecticity of linear multi-step methods (LMSM), by Eirola and Sanz-Serna, Ge and Feng, and by Feng and Tang, Hairer and Leone, respectively, and give a numerical example comparing these two approaches. We prove that, in a conjugate relation between two LMSMs, if one method is symplectic, then the B-series error expansions of the conjugate pair are equal to those of the trapezoid, mid-point and Euler forward schemes up to a parameter θ (completely the same when θ = 1), respectively; this also partially solves a problem due to Hairer. In particular, we indicate that the second-order symmetric leap-frog scheme Z₂ = Z₀ + 2τJ⁻¹∇H(Z₁) cannot be conjugate-symplectic via another LMSM.
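
    The leap-frog scheme quoted above, Z₂ = Z₀ + 2τJ⁻¹∇H(Z₁), can be exercised on the harmonic oscillator H = (p² + q²)/2, for which J⁻¹∇H(q, p) = (p, -q). The bounded energy error seen below is a standard numerical illustration of this symmetric two-step method, not a substitute for the paper's B-series argument.

```python
def leapfrog_orbit(q0, p0, tau=0.05, steps=2000):
    """Two-step leap-frog z_{n+1} = z_{n-1} + 2*tau*f(z_n) for the
    harmonic oscillator H = (p^2 + q^2)/2, with f(q, p) = (p, -q).
    The first step is bootstrapped with a forward Euler step. Returns
    the maximum deviation of H from its initial value along the orbit."""
    h0 = 0.5 * (q0 ** 2 + p0 ** 2)
    q_prev, p_prev = q0, p0
    q, p = q0 + tau * p0, p0 - tau * q0           # Euler bootstrap
    max_drift = 0.0
    for _ in range(steps):
        q_next = q_prev + 2 * tau * p             # z_{n+1} = z_{n-1} ...
        p_next = p_prev - 2 * tau * q             # ... + 2 tau f(z_n)
        q_prev, p_prev, q, p = q, p, q_next, p_next
        max_drift = max(max_drift, abs(0.5 * (q * q + p * p) - h0))
    return max_drift

# Energy error stays small and bounded over many oscillation periods.
print(f"max |H - H0| = {leapfrog_orbit(1.0, 0.0):.2e}")
```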

  20. Controlled growth of SnO(2) hierarchical nanostructures by a multistep thermal vapor deposition process.

    PubMed

    Sun, Shuhui; Meng, Guowen; Zhang, Gaixia; Masse, Jean-Philippe; Zhang, Lide

    2007-01-01

    Branched and sub-branched SnO(2) hierarchical architectures in which numerous aligned nanowires grew on the surface of nanobelt substrates have been obtained by a multistep thermal vapor deposition route. Branch size and morphology can be controlled by adjusting the temperature and duration of growth. The same approach was used to grow branched ZnO-SnO(2) heterojunction nanostructures. In addition, the third level of SnO(2) nanostructures was obtained by repeating the vapor deposition growth process. This technique provides a general, facile, and convenient approach for preparing even more complex nanoarchitectures, and should open up new opportunities for both fundamental research and applications, such as nanobelt-based three-dimensional nanodevices.

  1. Numerical multistep methods for the efficient solution of quantum mechanics and related problems

    NASA Astrophysics Data System (ADS)

    Anastassi, Z. A.; Simos, T. E.

    2009-10-01

    In this paper we present recent developments in the numerical integration of the Schrödinger equation and of related systems of ordinary differential equations with oscillatory solutions, such as the N-body problem. We examine several types of multistep methods (explicit, implicit, predictor-corrector, hybrid) and several of their properties (P-stability, trigonometric fitting of various orders, phase fitting, high phase-lag order, algebraic order). We analyze the local truncation error and the stability of the methods. The error for the Schrödinger equation is also presented, which reveals the relation of the error to the energy. The efficiency of the methods is evaluated through the integration of five problems; figures are presented and analyzed, and some general conclusions are drawn. Maple code for the development of all methods analyzed in this paper is given, and the Matlab subroutines used to integrate the methods are also presented.
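As a minimal illustration of a linear multistep method applied to an oscillatory problem, here is a generic explicit two-step Adams-Bashforth scheme on y″ = −y (this is a textbook method, not one of the specially fitted methods surveyed in the record above):

```python
import numpy as np

def adams_bashforth2(f, y0, y1, h, n_steps):
    """Explicit two-step Adams-Bashforth:
    y_{k+1} = y_k + h * (3/2 * f(y_k) - 1/2 * f(y_{k-1}))."""
    ys = [np.asarray(y0, float), np.asarray(y1, float)]
    for _ in range(n_steps):
        ys.append(ys[-1] + h * (1.5 * f(ys[-1]) - 0.5 * f(ys[-2])))
    return np.array(ys)

# Oscillatory test problem y'' = -y, written as a first-order system (q, p).
f = lambda z: np.array([z[1], -z[0]])

h = 0.001
y0 = [1.0, 0.0]
y1 = [np.cos(h), -np.sin(h)]   # exact solution at t = h starts the two-step method
ys = adams_bashforth2(f, y0, y1, h, 2000)
err = abs(ys[-1, 0] - np.cos(2001 * h))  # global error at t = 2.001
print(err)
```

The phase-fitted and trigonometrically fitted methods discussed in the abstract refine exactly this kind of scheme so that the dominant oscillation is integrated with much smaller phase error per step.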

  2. A multistep ac electrodeposition method to prepare Co nanowires with high coercivity

    NASA Astrophysics Data System (ADS)

    Wang, Pangpang; Gao, Lumei; Qiu, Zhiyong; Song, Xiaoping; Wang, Liqun; Yang, Sen; Murakami, Ri-ichi

    2008-09-01

    It is known that ac electrodeposition with low current density can grow compact metal nanowires, but such nanowires are very short. In contrast, ac electrodeposition with high current density can grow long metal nanowires; however, these long nanowires are not compact and contain many defects. In this paper, we describe a multistep ac electrodeposition method to fabricate long, compact metal nanowires uniformly filled into a porous anodic aluminum oxide template. Using this method, Co nanowires with high coercivity (Hc ∥) and remanence ratio (Mr/Ms) have been prepared at relatively low deposition current density. The Co nanowires exhibited pronounced magnetic anisotropy with the easy axis along the axial direction of the nanowires. The maximal Hc ∥ (2900 Oe) and Mr/Ms (0.95) make these nanowires well suited for perpendicular magnetic recording media. The magnetic microstructure of the Co nanowires is also discussed in this paper.

  3. Multistep modeling of protein structure: application towards refinement of tyr-tRNA synthetase

    NASA Technical Reports Server (NTRS)

    Srinivasan, S.; Shibata, M.; Roychoudhury, M.; Rein, R.

    1987-01-01

    The scope of multistep modeling (MSM) is expanded by adding a least-squares minimization step to the procedure, fitting a backbone reconstruction consistent with a set of C-alpha coordinates. The analytical solution for the Phi and Psi angles that fits the C-alpha X-ray coordinates is used for tyr-tRNA synthetase. For the regions where this method fails, Phi and Psi angles are obtained by minimizing, in a least-squares sense, the differences between the C-alpha distances of the computed model and those of the crystal structure. We present a stepwise application of this part of MSM to the determination of the complete backbone geometry of the 321 N-terminal residues of tyrosine tRNA synthetase to a root mean square deviation of 0.47 angstroms from the crystallographic C-alpha coordinates.
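The least-squares idea (minimizing differences of C-alpha distances between a model and a reference structure) can be sketched generically. This toy example uses random synthetic "C-alpha" positions and plain gradient descent; it is an illustration of the principle, not the authors' procedure or data:

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.normal(size=(5, 3))      # hypothetical "crystal" C-alpha positions

def pair_dists(coords):
    diff = coords[:, None, :] - coords[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

target_d = pair_dists(target)

def objective_grad(coords):
    """Value and gradient of F = 1/2 * sum over i != j of (d_ij - t_ij)^2."""
    diff = coords[:, None, :] - coords[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(d, 1.0)          # avoid division by zero on the diagonal
    err = d - target_d
    np.fill_diagonal(err, 0.0)
    value = 0.5 * (err ** 2).sum()
    grad = 2.0 * ((err / d)[:, :, None] * diff).sum(axis=1)
    return value, grad

# Refine a perturbed model so its C-alpha distances match the target's.
coords = target + 0.3 * rng.normal(size=target.shape)
val0, _ = objective_grad(coords)
for _ in range(10000):
    _, g = objective_grad(coords)
    coords -= 0.05 * g
val, _ = objective_grad(coords)
print(val0, val)   # the distance mismatch should drop substantially
```

Because pairwise distances are invariant under rigid motions, the refined model matches the target's internal geometry without being forced into the same absolute frame, which mirrors why the abstract's method minimizes C-alpha distance differences rather than coordinates directly.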

  4. A Multi-Step Assessment Scheme for Seismic Network Site Selection in Densely Populated Areas

    NASA Astrophysics Data System (ADS)

    Plenkers, Katrin; Husen, Stephan; Kraft, Toni

    2015-10-01

    We developed a multi-step assessment scheme for improved site selection during seismic network installation in densely populated areas. Site selection is a complex process where different aspects (seismic background noise, geology, and financing) have to be taken into account. In order to improve this process, we developed a step-wise approach that allows quantifying the quality of a site by using, in addition to expert judgement and test measurements, two weighting functions as well as reference stations. Our approach ensures that the recording quality aimed for is reached and makes different sites quantitatively comparable to each other. Last but not least, it is an easy way to document the decision process, because all relevant parameters are listed, quantified, and weighted.
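A weighted combination of site criteria of the kind described above can be sketched as follows. The individual sub-scores, scales, and weights here are purely illustrative assumptions, not the paper's actual weighting functions:

```python
def site_score(noise_db, geology, cost_keur,
               w_noise=0.5, w_geology=0.3, w_cost=0.2):
    """Toy site-quality score in [0, 1]: lower noise and cost and better
    geology give a higher score. All scales and weights are hypothetical."""
    # Map seismic background noise (dB above a reference level) to [0, 1]:
    # at or below the reference -> 1, 20 dB above it -> 0.
    s_noise = min(max(1.0 - noise_db / 20.0, 0.0), 1.0)
    # Map installation cost (thousands of EUR) to [0, 1] on a 0-100 scale.
    s_cost = min(max(1.0 - cost_keur / 100.0, 0.0), 1.0)
    return w_noise * s_noise + w_geology * geology + w_cost * s_cost

# Compare two hypothetical candidate sites.
quiet_rural = site_score(noise_db=2.0, geology=0.9, cost_keur=60.0)
noisy_urban = site_score(noise_db=15.0, geology=0.6, cost_keur=30.0)
print(quiet_rural, noisy_urban)
```

A scheme like this makes different candidate sites quantitatively comparable, which is the role the paper's weighting functions play alongside expert judgement and test measurements.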

  5. Progressive Enrichment of Stemness Features and Tumor Stromal Alterations in Multistep Hepatocarcinogenesis

    PubMed Central

    Rhee, Hyungjin; Kim, Haeryoung; Ahn, Ei Yong; Choi, Jin Sub; Roncalli, Massimo; Park, Young Nyun

    2017-01-01

    Cancer stem cells (CSCs), a subset of tumor cells, contribute to an aggressive biological behavior, which is also affected by the tumor stroma. Despite the role of CSCs and the tumor stroma in hepatocellular carcinoma (HCC), features of stemness have not yet been studied in relation to tumor stromal alterations in multistep hepatocarcinogenesis. We investigated the expression status of stemness markers and tumor stromal changes in B viral carcinogenesis, which is the main etiology of HCC in Asia. Stemness features of tumoral hepatocytes (EpCAM, K19, Oct3/4, c-KIT, c-MET, and CD133), and tumor stromal cells expressing α-smooth muscle actin (α-SMA), CD68, CD163, and IL-6 were analyzed in 36 low grade dysplastic nodules (DNs), 48 high grade DNs, 30 early HCCs (eHCCs), and 51 progressed HCCs (pHCCs) by immunohistochemistry or real-time PCR. Stemness features (EpCAM and K19 in particular) were progressively acquired during hepatocarcinogenesis in combination with enrichment of stromal cells (CAFs, TAMs, IL-6+ cells). Stemness features appeared sporadically in DNs, more consistently in eHCCs, and peaked in pHCCs. Likewise, stromal cells were discernible in DNs, reached consistent cell densities in eHCCs, and peaked in pHCCs. The stemness features and tumor stromal alterations also peaked in less differentiated or larger HCCs. In conclusion, progression of B viral multistep hepatocarcinogenesis is characterized by an enrichment of stemness features of neoplastic hepatocytes and a parallel alteration of the tumor stroma. The modulation of neoplastic hepatocytes and stromal cells was at low levels in precancerous lesions (DNs), consistently increased in incipient cancer (eHCCs) and peaked in pHCCs. Thus, in B viral hepatocarcinogenesis, interactions between CSCs and the tumor stroma, although starting early, seem to play a major role in tumor progression. PMID:28114366

  6. Progressive Enrichment of Stemness Features and Tumor Stromal Alterations in Multistep Hepatocarcinogenesis.

    PubMed

    Yoo, Jeong Eun; Kim, Young-Joo; Rhee, Hyungjin; Kim, Haeryoung; Ahn, Ei Yong; Choi, Jin Sub; Roncalli, Massimo; Park, Young Nyun

    2017-01-01

    Cancer stem cells (CSCs), a subset of tumor cells, contribute to an aggressive biological behavior, which is also affected by the tumor stroma. Despite the role of CSCs and the tumor stroma in hepatocellular carcinoma (HCC), features of stemness have not yet been studied in relation to tumor stromal alterations in multistep hepatocarcinogenesis. We investigated the expression status of stemness markers and tumor stromal changes in B viral carcinogenesis, which is the main etiology of HCC in Asia. Stemness features of tumoral hepatocytes (EpCAM, K19, Oct3/4, c-KIT, c-MET, and CD133), and tumor stromal cells expressing α-smooth muscle actin (α-SMA), CD68, CD163, and IL-6 were analyzed in 36 low grade dysplastic nodules (DNs), 48 high grade DNs, 30 early HCCs (eHCCs), and 51 progressed HCCs (pHCCs) by immunohistochemistry or real-time PCR. Stemness features (EpCAM and K19 in particular) were progressively acquired during hepatocarcinogenesis in combination with enrichment of stromal cells (CAFs, TAMs, IL-6+ cells). Stemness features appeared sporadically in DNs, more consistently in eHCCs, and peaked in pHCCs. Likewise, stromal cells were discernible in DNs, reached consistent cell densities in eHCCs, and peaked in pHCCs. The stemness features and tumor stromal alterations also peaked in less differentiated or larger HCCs. In conclusion, progression of B viral multistep hepatocarcinogenesis is characterized by an enrichment of stemness features of neoplastic hepatocytes and a parallel alteration of the tumor stroma. The modulation of neoplastic hepatocytes and stromal cells was at low levels in precancerous lesions (DNs), consistently increased in incipient cancer (eHCCs) and peaked in pHCCs. Thus, in B viral hepatocarcinogenesis, interactions between CSCs and the tumor stroma, although starting early, seem to play a major role in tumor progression.

  7. Star sub-pixel centroid calculation based on multi-step minimum energy difference method

    NASA Astrophysics Data System (ADS)

    Wang, Duo; Han, YanLi; Sun, Tengfei

    2013-09-01

    The star centroid plays a vital role in celestial navigation. Star images acquired during daytime have a low SNR because of the strong sky background, and the star targets are nearly submerged in the background, which makes centroid localization difficult. Traditional methods such as the moment method and weighted centroid calculation are simple but have large errors, especially at low SNR; the Gaussian method has high positioning accuracy but is computationally complex. Based on an analysis of the energy distribution in star images, a localization method for star target centroids based on a multi-step minimum energy difference is proposed. The method uses linear superposition to narrow down the centroid area and, within that narrowed area, interpolates a number of pixels for pixel segmentation. It then exploits the symmetry of the stellar energy distribution: each candidate pixel is tentatively assumed to be the centroid, and the difference between the energy sums in symmetric directions (here, the transverse and longitudinal directions) over an equal step length (chosen according to conditions; 9 in this paper) is computed; the centroid position along a direction is taken where this difference is minimal. Validation on simulated star images and comparison with several traditional methods show that the positioning accuracy of the method reaches 0.001 pixel, with good performance at low SNR. The method was also applied to a star map acquired at a fixed observation site during daytime in the near-infrared band; comparison with the known positions of the stars shows that the multi-step minimum energy difference method achieves better results.
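The two-step idea described above (a coarse weighted centroid followed by a symmetric energy-difference refinement) can be sketched on a synthetic Gaussian star. This is an illustrative reconstruction, not the authors' code; the grid sizes and search window are assumptions:

```python
import numpy as np

def star_image(cx, cy, size=15, sigma=1.5):
    """Synthetic star: a symmetric Gaussian blob centered at (cx, cy)."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def refine_axis(profile, x_coarse, half_width=1.0, n_grid=2001):
    """Scan candidate split points near the coarse estimate and keep the one
    where the energy on the two sides of the split is most nearly equal."""
    xs = np.arange(len(profile))
    best_x, best_d = x_coarse, np.inf
    for c in np.linspace(x_coarse - half_width, x_coarse + half_width, n_grid):
        frac = np.clip(c - xs + 0.5, 0.0, 1.0)   # fraction of each pixel left of c
        left = (profile * frac).sum()
        diff = abs(2.0 * left - profile.sum())   # |left energy - right energy|
        if diff < best_d:
            best_d, best_x = diff, c
    return best_x

img = star_image(7.3, 6.8)
# Step 1: coarse weighted centroid.
y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
cx0 = (img * x).sum() / img.sum()
cy0 = (img * y).sum() / img.sum()
# Step 2: refine each axis with the energy-difference criterion.
cx = refine_axis(img.sum(axis=0), cx0)
cy = refine_axis(img.sum(axis=1), cy0)
print(cx, cy)
```

For a symmetric point-spread function, the point that equalizes the energy on both sides coincides with the centroid, which is why the minimum-energy-difference criterion recovers subpixel positions.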

  8. Continuous multistep synthesis of perillic acid from limonene by catalytic biofilms under segmented flow.

    PubMed

    Willrodt, Christian; Halan, Babu; Karthaus, Lisa; Rehdorf, Jessica; Julsing, Mattijs K; Buehler, Katja; Schmid, Andreas

    2017-02-01

    The efficiency of biocatalytic reactions involving industrially interesting reactants is often constrained by toxicity toward the applied biocatalyst. Here, we evaluated the combination of biologically and technologically inspired strategies to overcome toxicity-related issues during the multistep oxyfunctionalization of (R)-(+)-limonene to (R)-(+)-perillic acid. Pseudomonas putida GS1, catalyzing selective limonene oxidation via the p-cymene degradation pathway, and recombinant Pseudomonas taiwanensis VLB120 were evaluated for continuous perillic acid production. A tubular segmented-flow biofilm reactor was used in order to relieve oxygen limitations and to enable membrane-mediated substrate supply as well as efficient in situ product removal. Both P. putida GS1 and P. taiwanensis VLB120 developed a catalytic biofilm in this system. The productivity of wild-type P. putida GS1, encoding the enzymes for limonene bioconversion, was highly dependent on the carbon source and reached 34 g Ltube(-1)  day(-1) when glycerol was supplied. More than 10-fold lower productivities were reached, irrespective of the applied carbon source, when recombinant P. taiwanensis VLB120 harboring p-cymene monooxygenase and p-cumic alcohol dehydrogenase was used as the biocatalyst. The technical applicability of the system for preparative perillic acid synthesis was verified by purifying perillic acid from the outlet stream using an anion-exchange resin. This concept enabled the multistep production of perillic acid and might be transferred to other reactions involving volatile reactants and toxic end-products. Biotechnol. Bioeng. 2017;114: 281-290. © 2016 Wiley Periodicals, Inc.

  9. Multistep soft chemistry method for valence reduction in transition metal oxides with triangular (CdI2-type) layers.

    PubMed

    Blakely, Colin K; Bruno, Shaun R; Poltavets, Viktor V

    2014-03-14

    Transition metal (M) oxides with MO2 triangular layers demonstrate a variety of physical properties depending on the metal oxidation states. In the known compounds, metal oxidation states are limited to either 3+ or mixed-valent 3+/4+. A multistep soft chemistry synthetic route for novel phases with M(2+/3+)O2 triangular layers is reported.

  10. Novel concept microarray enabling PCR and multistep reactions through pipette-free aperture-to-aperture parallel transfer

    PubMed Central

    2010-01-01

    Background The microarray has contributed to the development of omic analysis. However, as it depends basically on surface reactions, it is hard to perform bulk reactions and sequential multistep reactions. On the other hand, the popular microplate technology, which has the great merit of being able to perform parallel multistep reactions, has reached its limit in increasing the number of wells (currently up to 9600) and in reducing working volumes, owing to the difficulty of handling. Results Here, we report a novel microarray technology, termed microarray-with-manageable volumes (MMV), which enables us to explore advanced applications. Its technical essence is pipette-free direct parallel transfer from well to well, performed by centrifugation, which avoids evaporation and adsorption losses during handling. By developing the MMV plate and the accompanying devices and techniques, the generation of multiple conditions (256 kinds) and the performance of parallel multistep reactions, including PCR and in vitro translation reactions, have been made possible. These were demonstrated by applying the MMV technology to searching for lysozyme-crystallizing conditions and to selecting peptides aimed at Aβ-binding or cathepsin E-inhibition. Conclusions With the introduction of the novel-concept microarray (MMV) technology, parallel and multistep reactions at sub-μL scale have become possible. PMID:20923572

  11. Synthesis of Frontalin, the Aggregation Pheromone of the Southern Pine Beetle: A Multistep Organic Synthesis for Undergraduate Students.

    ERIC Educational Resources Information Center

    Bartlett, Paul A.; And Others

    1984-01-01

    Background information and experimental procedures are provided for the multistep synthesis of frontalin. The experiment exposes students to a range of practical laboratory problems and important synthetic reactions and provides experiences in working on a medium-size, as well as a relatively small-size scale. (JN)

  12. A Multistep Organocatalysis Experiment for the Undergraduate Organic Laboratory: An Enantioselective Aldol Reaction Catalyzed by Methyl Prolinamide

    ERIC Educational Resources Information Center

    Wade, Edmir O.; Walsh, Kenneth E.

    2011-01-01

    In recent years, there has been an explosion of research concerning the area of organocatalysis. A multistep capstone laboratory project that combines traditional reactions frequently found in organic laboratory curriculums with this new field of research is described. In this experiment, the students synthesize a prolinamide-based organocatalyst…

  13. Synthesis of 10-Ethyl Flavin: A Multistep Synthesis Organic Chemistry Laboratory Experiment for Upper-Division Undergraduate Students

    ERIC Educational Resources Information Center

    Sichula, Vincent A.

    2015-01-01

    A multistep synthesis of 10-ethyl flavin was developed as an organic chemistry laboratory experiment for upper-division undergraduate students. Students synthesize 10-ethyl flavin as a bright yellow solid via a five-step sequence. The experiment introduces students to various hands-on experimental organic synthetic techniques, such as column…

  14. Synthesis of Two Local Anesthetics from Toluene: An Organic Multistep Synthesis in a Project-Oriented Laboratory Course

    ERIC Educational Resources Information Center

    Demare, Patricia; Regla, Ignacio

    2012-01-01

    This article describes one of the projects in the advanced undergraduate organic chemistry laboratory course concerning the synthesis of two local anesthetic drugs, prilocaine and benzocaine, with a common three-step sequence starting from toluene. Students undertake, in a several-week independent project, the multistep synthesis of a…

  15. Management Planning for Workplace Automation.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  16. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time, while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although total laboratory automation has not been achievable, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory.

  17. Automated DNA extraction from pollen in honey.

    PubMed

    Guertler, Patrick; Eicheldinger, Adelina; Muschler, Paul; Goerlich, Ottmar; Busch, Ulrich

    2014-04-15

    In recent years, honey has become a subject of DNA analysis due to potential risks evoked by microorganisms, allergens or genetically modified organisms. However, so far only a few DNA extraction procedures are available, most of them time-consuming and laborious. We therefore developed an automated method for DNA extraction from pollen in honey, based on a CTAB-buffer DNA extraction using the Maxwell 16 instrument and the Maxwell 16 FFS Nucleic Acid Extraction System, Custom-Kit. We altered several components and extraction parameters and compared the optimised method with a manual CTAB-buffer-based DNA isolation method. The automated DNA extraction was faster and resulted in higher DNA yield and sufficient DNA purity. Real-time PCR results obtained after automated DNA extraction are comparable to those after manual DNA extraction, and no PCR inhibition was observed. The applicability of this method was further confirmed by successful analysis of different routine honey samples.

  18. Automating checks of plan check automation.

    PubMed

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, have managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  19. WANTED: Fully Automated Indexing.

    ERIC Educational Resources Information Center

    Purcell, Royal

    1991-01-01

    Discussion of indexing focuses on the possibilities of fully automated indexing. Topics discussed include controlled indexing languages such as subject heading lists and thesauri, free indexing languages, natural indexing languages, computer-aided indexing, expert systems, and the need for greater creativity to further advance automated indexing.…

  20. The Automated Office.

    ERIC Educational Resources Information Center

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  1. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Sherron, Gene T.

    1982-01-01

    The steps taken toward office automation by the University of Maryland are described. Office automation is defined and some types of word processing systems are described. Policies developed in the writing of a campus plan are listed, followed by a section on procedures adopted to implement the plan. (Author/MLW)

  2. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  3. Order Division Automated System.

    ERIC Educational Resources Information Center

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  4. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  5. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and the archiving of results, are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  6. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone, or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures, and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations, and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following in the NRU reactor inspection at Chalk River.

  7. A multi-step transversal linearization (MTL) method in non-linear structural dynamics

    NASA Astrophysics Data System (ADS)

    Roy, D.; Kumar, Rajesh

    2005-10-01

    An implicit family of multi-step transversal linearization (MTL) methods is proposed for efficient and numerically stable integration of nonlinear oscillators of interest in structural dynamics. The presently developed method is a multi-step extension and further generalization of the locally transversal linearization (LTL) method proposed earlier by Roy (Proceedings of the Royal Society of London A 457 (2001) 539-566), Roy and Ramachandra (Journal of Sound and Vibration 41 (2001a) 653-679; International Journal for Numerical Methods in Engineering 51 (2001b) 203-224) and Roy (International Journal for Numerical Methods in Engineering 61 (2004) 764). The MTL-based linearization is achieved through a non-unique replacement of the nonlinear part of the vector field by a conditionally linear interpolating expansion of known accuracy, whose coefficients contain the discretized state variables defined at a set of grid points. In the process, the nonlinear part of the vector field becomes a conditionally determinable equivalent forcing function, and the MTL-based linearized differential equations thus become explicitly integrable. Based on the linearized solution, a set of algebraic constraint equations is formed so that transversal intersections of the linearized and nonlinearized solution manifolds occur at the multiple grid points; the discretized state vectors are then found as the zeros of the constraint equations. Simple error estimates for the displacement and velocity vectors are provided and, in particular, it is shown that the formal accuracy of the MTL methods as a function of the time step-size depends only on the error of replacement of the nonlinear part of the vector field. Presently, two different polynomial-based interpolation schemes are employed for transversal linearization, viz. Taylor-like interpolation and Lagrangian interpolation, with the Taylor-like interpolation leading to numerical ill-conditioning as the order of interpolation increases.

  8. Comparison of microbial community shifts in two parallel multi-step drinking water treatment processes.

    PubMed

    Xu, Jiajiong; Tang, Wei; Ma, Jun; Wang, Hong

    2017-04-11

    Drinking water treatment processes remove undesirable chemicals and microorganisms from source water, which is vital to public health protection. The purpose of this study was to investigate the effects of treatment processes and configuration on the microbiome by comparing microbial community shifts in two series of different treatment processes operated in parallel within a full-scale drinking water treatment plant (DWTP) in Southeast China. Illumina sequencing of 16S rRNA genes of water samples demonstrated little effect of coagulation/sedimentation and pre-oxidation steps on bacterial communities, in contrast to dramatic and concurrent microbial community shifts during ozonation, granular activated carbon treatment, sand filtration, and disinfection for both series. A large number of unique operational taxonomic units (OTUs) at these four treatment steps further illustrated their strong shaping power towards the drinking water microbial communities. Interestingly, multidimensional scaling analysis revealed tight clustering of biofilm samples collected from different treatment steps, with Nitrospira, the nitrite-oxidizing bacteria, noted at higher relative abundances in biofilm compared to water samples. Overall, this study provides a snapshot of step-to-step microbial evolvement in multi-step drinking water treatment systems, and the results provide insight to control and manipulation of the drinking water microbiome via optimization of DWTP design and operation.

  9. Potential Lung Nodules Identification for Characterization by Variable Multistep Threshold and Shape Indices from CT Images

    PubMed Central

    Iqbal, Saleem; Iqbal, Khalid; Shaukat, Arslan; Khanum, Aasia

    2014-01-01

    Computed tomography (CT) is an important imaging modality. Physicians, surgeons, and oncologists prefer CT scans for the diagnosis of lung cancer; however, some nodules are missed in CT scans. Computer-aided diagnosis methods help radiologists detect these nodules and diagnose lung cancer early, and early detection of a malignant nodule is helpful for treatment. Computer-aided diagnosis of lung cancer involves lung segmentation, potential nodule identification, feature extraction from the potential nodules, and classification of the nodules. In this paper, we present an automatic method for the detection and segmentation of lung nodules from CT scans for subsequent feature extraction and classification. The contribution of the work is the detection and segmentation, in one go, of small nodules, low- and high-contrast nodules, nodules attached to vasculature, nodules attached to the pleura membrane, and nodules in close vicinity of the diaphragm and lung wall. The method's particular techniques are a variable multistep threshold for nodule detection and a shape-index threshold for false-positive reduction. We used 60 CT scans from the “Lung Image Database Consortium-Image Database Resource Initiative”, taken with a GE Medical Systems LightSpeed16 scanner, as the dataset and correctly detected 92% of the nodules. The results are reproducible. PMID:25506388
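The multistep-threshold idea (scanning several thresholds so that both low- and high-contrast nodules produce candidates) can be sketched on a synthetic image. This illustrates the principle only; the paper's lung segmentation and shape-index stage are omitted:

```python
import numpy as np

def detect_at_thresholds(image, thresholds):
    """Variable multistep thresholding (illustrative): collect a candidate mask
    at each threshold step so low- and high-contrast blobs are both caught."""
    return [image >= t for t in thresholds]

# Synthetic "CT slice": one bright and one faint Gaussian blob on a dark background.
yy, xx = np.mgrid[0:64, 0:64]
img = 1.0 * np.exp(-((xx - 20) ** 2 + (yy - 20) ** 2) / 8.0)   # high-contrast nodule
img += 0.3 * np.exp(-((xx - 45) ** 2 + (yy - 45) ** 2) / 8.0)  # low-contrast nodule

masks = detect_at_thresholds(img, thresholds=[0.5, 0.15])
bright_only, both = masks
print(bright_only[45, 45], both[45, 45])  # the lower step recovers the faint nodule
```

A single fixed threshold must choose between missing faint nodules and flooding the image with false positives; stepping the threshold and filtering candidates afterwards (in the paper, by shape index) resolves that trade-off.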

  10. Monte Carlo simulation of electrodeposition of copper: a multistep free energy calculation.

    PubMed

    Harinipriya, S; Subramanian, Venkat R

    2008-04-03

    Electrodeposition of copper (Cu) involves length scales of a micrometer or even less. Several theoretical techniques, such as continuum Monte Carlo, kinetic Monte Carlo (KMC), and molecular dynamics, have been used to simulate this problem. However, the multiphenomena character of the problem poses a challenge for an efficient simulation algorithm. Traditional KMC methods are slow, especially when modeling surface diffusion with a large number of particles and frequent particle jumps, and parameter estimation involving thousands of KMC runs is very time-consuming. Thus, a novel and less time-consuming multistep continuum Monte Carlo simulation is carried out to evaluate the stepwise free energy change in the process of electrochemical copper deposition. The procedure involves separate Monte Carlo codes employing different random number criteria (using hydrated radii, bare radii, hydration number of the species, redox potentials, etc.) to obtain the number of species (CuCl(2), CuSO(4) or Cu, as the case may be) and in turn the free energy. The effects of electrolyte concentration, electric field and the presence of chloride ions on the free energy change are studied. The rate-determining step for the electrodeposition of copper from CuCl(2) and CuSO(4) is also determined.

  11. Link community detection by non-negative matrix factorization with multi-step similarities

    NASA Astrophysics Data System (ADS)

    Tang, Xianchao; Yang, Guoqing; Xu, Tao; Feng, Xia; Wang, Xiao; Li, Qiannan; Liu, Yanbei

    2016-11-01

    Uncovering community structures is a fundamental and important problem in analyzing complex networks. While most methods focus on identifying node communities, recent works have shown the intuition behind, and advantages of, detecting link communities in networks. In this paper, we propose a non-negative matrix factorization (NMF) based method to detect link community structures. Traditional NMF-based methods mainly use the adjacency matrix as the representation of network topology, but the adjacency matrix only captures relationships between immediate neighbors and ignores relationships between non-neighbor nodes. This may greatly reduce the information contained in the network topology and thus lead to unsatisfactory results. Here, we address this by introducing multi-step similarities via a graph random walk, so that similarities between non-neighbor nodes can be captured. Meanwhile, to reduce the impact of self-similarities (similarities between nodes and themselves) and increase the weight of similarities between distinct nodes, we add a penalty term to our objective function. An efficient optimization scheme for the objective function is then derived. Finally, we test the proposed method on both synthetic and real networks. Experimental results demonstrate the effectiveness of the proposed approach.
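    The multi-step similarity idea can be sketched numerically. The toy below builds a random-walk similarity matrix from weighted powers of the transition matrix and factorizes it with plain multiplicative-update NMF; it recovers node-level (rather than link-level) communities on a two-clique graph. The decay weighting, the update rule, and the omission of the paper's penalty term are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def multistep_similarity(A, steps=3, decay=0.5):
    """Random-walk similarity: weighted sum of k-step transition matrices.

    A is a symmetric 0/1 adjacency matrix; rows of P = D^-1 A give
    one-step transition probabilities.  (Illustrative weighting only.)
    """
    P = A / A.sum(axis=1, keepdims=True)
    S = np.zeros_like(P)
    Pk = np.eye(len(A))
    for k in range(1, steps + 1):
        Pk = Pk @ P
        S += (decay ** k) * Pk
    return S

def nmf(S, rank, iters=500, seed=0):
    """Plain multiplicative-update NMF minimising ||S - W H||_F."""
    rng = np.random.default_rng(seed)
    n = S.shape[0]
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, n)) + 0.1
    eps = 1e-12
    for _ in range(iters):
        H *= (W.T @ S) / (W.T @ W @ H + eps)
        W *= (S @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Two 4-node cliques joined by a single bridge edge (3-4).
A = np.zeros((8, 8))
for block in ([0, 1, 2, 3], [4, 5, 6, 7]):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1
A[3, 4] = A[4, 3] = 1

S = multistep_similarity(A)
W, _ = nmf(S, rank=2)
communities = W.argmax(axis=1)   # per-node community assignment
```

Because the multi-step similarity also connects non-adjacent clique members, the factorization separates the two cliques even though each node has only a few direct neighbors.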

  12. Multistep continuous-flow synthesis of (R)- and (S)-rolipram using heterogeneous catalysts.

    PubMed

    Tsubogo, Tetsu; Oyamada, Hidekazu; Kobayashi, Shū

    2015-04-16

    Chemical manufacturing is conducted using either batch systems or continuous-flow systems. Flow systems have several advantages over batch systems, particularly in terms of productivity, heat and mixing efficiency, safety, and reproducibility. However, for over half a century, pharmaceutical manufacturing has used batch systems because the synthesis of complex molecules such as drugs has been difficult to achieve with continuous-flow systems. Here we describe the continuous-flow synthesis of drugs using only columns packed with heterogeneous catalysts. Commercially available starting materials were successively passed through four columns containing achiral and chiral heterogeneous catalysts to produce (R)-rolipram, an anti-inflammatory drug and one of the family of γ-aminobutyric acid (GABA) derivatives. In addition, simply by replacing a column packed with a chiral heterogeneous catalyst with another column packed with the opposing enantiomer, we obtained the antipode, (S)-rolipram. Similarly, we also synthesized (R)-phenibut, another drug belonging to the GABA family. These flow systems are simple and stable with no leaching of metal catalysts. Our results demonstrate that multistep (eight steps in this case) chemical transformations for drug synthesis can proceed smoothly under flow conditions using only heterogeneous catalysts, without the isolation of any intermediates and without the separation of any catalysts, co-products, by-products, and excess reagents. We anticipate that such syntheses will be useful in pharmaceutical manufacturing.

  13. Multistep continuous-flow synthesis of (R)- and (S)-rolipram using heterogeneous catalysts

    NASA Astrophysics Data System (ADS)

    Tsubogo, Tetsu; Oyamada, Hidekazu; Kobayashi, Shū

    2015-04-01

    Chemical manufacturing is conducted using either batch systems or continuous-flow systems. Flow systems have several advantages over batch systems, particularly in terms of productivity, heat and mixing efficiency, safety, and reproducibility. However, for over half a century, pharmaceutical manufacturing has used batch systems because the synthesis of complex molecules such as drugs has been difficult to achieve with continuous-flow systems. Here we describe the continuous-flow synthesis of drugs using only columns packed with heterogeneous catalysts. Commercially available starting materials were successively passed through four columns containing achiral and chiral heterogeneous catalysts to produce (R)-rolipram, an anti-inflammatory drug and one of the family of γ-aminobutyric acid (GABA) derivatives. In addition, simply by replacing a column packed with a chiral heterogeneous catalyst with another column packed with the opposing enantiomer, we obtained the antipode, (S)-rolipram. Similarly, we also synthesized (R)-phenibut, another drug belonging to the GABA family. These flow systems are simple and stable with no leaching of metal catalysts. Our results demonstrate that multistep (eight steps in this case) chemical transformations for drug synthesis can proceed smoothly under flow conditions using only heterogeneous catalysts, without the isolation of any intermediates and without the separation of any catalysts, co-products, by-products, and excess reagents. We anticipate that such syntheses will be useful in pharmaceutical manufacturing.

  14. Exact free vibration of multi-step Timoshenko beam system with several attachments

    NASA Astrophysics Data System (ADS)

    Farghaly, S. H.; El-Sayed, T. A.

    2016-05-01

    This paper deals with the analysis of the natural frequencies and mode shapes of an axially loaded multi-step Timoshenko beam combined system carrying several attachments. The influence of system design and of the proposed non-dimensional sub-system parameters on the combined system characteristics is the major part of this investigation. The effects of material properties, rotary inertia and shear deformation of the beam system for each span are included. The end masses are elastically supported against rotation and translation at an offset point from the point of attachment. A sub-system having two degrees of freedom is located at the beam ends and at any of the intermediate stations and acts as a support and/or a suspension. The boundary conditions of the ordinary differential equation governing the lateral deflections and slope due to bending of the beam system, including the shear force term due to the sub-system, have been formulated. Exact global coefficient matrices for the combined modal frequencies, the modal shapes and the discrete sub-system have been derived. Based on these formulae, detailed parametric studies of the combined system are carried out. The applied mathematical model is valid for a wide range of applications, especially in mechanical, naval and structural engineering.

  15. Multi-step sequential batch two-phase anaerobic composting of food waste.

    PubMed

    Shin, H S; Han, S K; Song, Y C; Lee, C Y

    2001-03-01

    This study was conducted to evaluate a newly devised process called MUlti-step Sequential batch Two-phase Anaerobic Composting (MUSTAC). The MUSTAC process consisted of several leaching beds for hydrolysis, acidification and post-treatment, and a UASB reactor for methane recovery. This process for treating food waste was developed as a high-rate anaerobic composting technique based on the rate-limiting step approach. Rumen microorganisms were inoculated to improve the low efficiency of acidogenic fermentation. Both two-phase anaerobic digestion and sequential batch operation were used to control environmental constraints on anaerobic degradation. The MUSTAC process demonstrated excellent performance: a large reduction in volatile solids (VS) (84.7%) and high methane conversion efficiency (84.4%) at a high organic loading rate (10.8 kg VS m(-3) d(-1)) and a short solids retention time (SRT; 10 days). The methane yield was 0.27 m3 kg(-1) VS, while the methane production rate was 2.27 m3 m(-3) d(-1). The output from the post-treatment could be used as a soil amendment, produced in the same acidogenic fermenter without additional material handling. The main advantages of the MUSTAC process are simple operation and high efficiency. The process proved stable, reliable and effective in resource recovery as well as waste stabilization.

  16. Potential lung nodules identification for characterization by variable multistep threshold and shape indices from CT images.

    PubMed

    Iqbal, Saleem; Iqbal, Khalid; Arif, Fahim; Shaukat, Arslan; Khanum, Aasia

    2014-01-01

    Computed tomography (CT) is an important imaging modality. Physicians, surgeons, and oncologists prefer CT scans for diagnosis of lung cancer. However, some nodules are missed in CT scans. Computer-aided diagnosis methods help radiologists detect these nodules and diagnose lung cancer early. Early detection of a malignant nodule is helpful for treatment. Computer-aided diagnosis of lung cancer involves lung segmentation, potential nodule identification, feature extraction from the potential nodules, and classification of the nodules. In this paper, we present an automatic method for detection and segmentation of lung nodules from CT scans for subsequent feature extraction and classification. The contribution of this work is the detection and segmentation, in one pass, of small nodules, low- and high-contrast nodules, nodules attached to vasculature, nodules attached to the pleural membrane, and nodules in close vicinity of the diaphragm and lung wall. The particular techniques of the method are a variable multistep threshold for nodule detection and a shape-index threshold for false-positive reduction. We used 60 CT scans from the "Lung Image Database Consortium-Image Database Resource Initiative", acquired with a GE Medical Systems LightSpeed16 scanner, as the dataset and correctly detected 92% of the nodules. The results are reproducible.
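    A minimal sketch of the multistep-threshold candidate pass: sweep a set of intensity thresholds over a slice, label connected components at each threshold, and keep blobs whose pixel count falls in a nodule-plausible range. The threshold values and size bounds below are hypothetical, and the paper's variable threshold selection and shape-index false-positive filter are not reproduced.

```python
import numpy as np

def connected_components(mask):
    """4-connected component labelling via flood fill (pure NumPy/Python)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue
        current += 1
        stack = [(i, j)]
        while stack:
            y, x = stack.pop()
            if labels[y, x] or not mask[y, x]:
                continue
            labels[y, x] = current
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]:
                    stack.append((ny, nx))
    return labels, current

def candidate_nodules(slice_img, thresholds, min_px=4, max_px=60):
    """Collect blobs that appear at any threshold with a plausible size."""
    found = []
    for t in thresholds:
        labels, n = connected_components(slice_img >= t)
        for lab in range(1, n + 1):
            size = int((labels == lab).sum())
            if min_px <= size <= max_px:
                found.append((t, lab, size))
    return found

# Toy slice: a small bright "nodule" plus a large bright chest-wall-like region.
img = np.zeros((20, 20))
img[2:5, 2:5] = 200        # 9-pixel candidate blob
img[10:20, 10:20] = 200    # 100-pixel region, too large to qualify
hits = candidate_nodules(img, thresholds=[100, 150])
```

Running several thresholds lets low-contrast blobs that only emerge at lower cut-offs be caught without flooding the candidate list with large attached structures, which the size gate discards.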

  17. Constrained Broyden Dimer Method with Bias Potential for Exploring Potential Energy Surface of Multistep Reaction Process.

    PubMed

    Shang, Cheng; Liu, Zhi-Pan

    2012-07-10

    Predicting the chemical activity of new matter is an ultimate goal in chemistry. The identification of reaction pathways using modern quantum mechanics calculations, however, often demands high computational power and good chemical intuition about the reaction. Here, a new reaction path searching method is developed by combining our recently developed transition state (TS) location method, the constrained Broyden dimer method, with a basin-filling method via bias potentials, which allows the system to walk out of energy traps along a given reaction direction. In the new method, the reaction path search starts from an initial state, without the need to guess a TS-like or final-state structure, and proceeds iteratively to the final state by locating all related TSs and intermediates. In each elementary reaction step, a reaction direction, such as a bond breaking, needs to be specified, the information of which is refined and preserved as a normal mode through biased dimer rotation. The method is tested successfully on the Baker reaction system (50 elementary reactions) with good efficiency and stability, and is also applied to the potential energy surface exploration of multistep reaction processes in the gas phase and on surfaces. The new method can be applied to the computational screening of new catalytic materials with a minimum requirement of chemical intuition.
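    The basin-filling step can be illustrated on a one-dimensional double well: deposit a Gaussian bias at the current minimum, nudge the walker along a chosen "reaction direction", and reminimize until it escapes over the barrier. The potential, bias heights and widths, and step sizes are arbitrary illustrative choices; this is the generic bias-potential idea, not the constrained Broyden dimer algorithm itself.

```python
import math

def V(x):                 # asymmetric double well (illustrative PES)
    return (x * x - 1) ** 2 + 0.2 * x

def grad_V(x):
    return 4 * x * (x * x - 1) + 0.2

def minimize(x, biases, lr=0.01, steps=2000):
    """Gradient descent on V plus the accumulated Gaussian bias potentials."""
    for _ in range(steps):
        g = grad_V(x)
        for x0, h, w in biases:
            # derivative of h * exp(-(x - x0)^2 / (2 w^2))
            g += h * math.exp(-(x - x0) ** 2 / (2 * w * w)) * (-(x - x0) / (w * w))
        x -= lr * g
    return x

biases = []
x = minimize(-1.0, biases)            # relax into the left basin (x ~ -1.02)
for _ in range(50):
    biases.append((x, 0.3, 0.3))      # fill the current trap with a Gaussian
    x = minimize(x + 0.05, biases)    # nudge along the chosen reaction direction
    if x > 0.5:                       # escaped over the barrier to the right basin
        break
```

The small nudge matters: a freshly deposited Gaussian is flat at its own center, so an unperturbed walker would sit at a stationary point of the biased surface; specifying a direction, as the paper requires for each elementary step, breaks that tie.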

  18. Quantum-dynamical picture of a multistep enzymatic process: reaction catalyzed by phospholipase A(2).

    PubMed Central

    Bała, P; Grochowski, P; Nowiński, K; Lesyng, B; McCammon, J A

    2000-01-01

    A quantum-classical molecular dynamics model (QCMD), applying explicit integration of the time-dependent Schrödinger equation (QD) and Newtonian equations of motion (MD), is presented. The model is capable of describing quantum dynamical processes in complex biomolecular systems. It has been applied in simulations of a multistep catalytic process carried out by phospholipase A(2) in its active site. The process includes quantum-dynamical proton transfer from a water molecule to histidine localized in the active site, followed by a nucleophilic attack of the resulting OH(-) group on a carbonyl carbon atom of a phospholipid substrate, leading to cleavage of an adjacent ester bond. The process has been simulated using a parallel version of the QCMD code. The potential energy function for the active site is computed using an approximate valence bond (AVB) method. The dynamics of the key proton is described either by QD or classical MD. The coupling between the quantum proton and the classical atoms is accomplished via Hellmann-Feynman forces, as well as the time dependence of the potential energy function in the Schrödinger equation (QCMD/AVB model). Analysis of the simulation results with an Advanced Visualization System revealed a correlated rather than a stepwise picture of the enzymatic process. It is shown that an sp(2) → sp(3) configurational change at the substrate carbonyl carbon is mostly responsible for triggering the activation process. PMID:10968989

  19. Complex network analysis of brain functional connectivity under a multi-step cognitive task

    NASA Astrophysics Data System (ADS)

    Cai, Shi-Min; Chen, Wei; Liu, Dong-Bai; Tang, Ming; Chen, Xun

    2017-01-01

    Functional brain networks have been widely studied to understand the relationship between brain organization and behavior. In this paper, we explore the functional connectivity of the brain network under a multi-step cognitive task involving consecutive behaviors, to further understand the effect of behaviors on brain organization. The functional brain networks are constructed from a high spatial and temporal resolution fMRI dataset and analyzed via a complex-network approach. We find that at the voxel level the functional brain network shows robust small-worldness and scale-free characteristics, while its assortativity and rich-club organization depend only slightly on the order in which the behaviors are performed. More interestingly, the functional connectivity of the brain network in activated ROIs correlates strongly with behaviors and depends clearly on the order in which the behaviors are performed. These empirical results suggest that brain organization has generic small-world and scale-free properties, and that the diverse functional connectivity emerging from activated ROIs is strongly driven by behavioral activities via the plasticity of the brain.

  20. Stochastic diffusion model of multistep activation in a voltage-dependent K channel

    NASA Astrophysics Data System (ADS)

    Vaccaro, S. R.

    2010-04-01

    The energy barrier to the activated state for the S4 voltage sensor of a K channel is dependent on the electrostatic force between positively charged S4 residues and negatively charged groups on neighboring segments, the potential difference across the membrane, and the dielectric boundary force on the charged residues near the interface between the solvent and the low dielectric region of the membrane gating pore. The variation of the potential function with transverse displacement and rotation of the S4 sensor across the membrane may be derived from a solution of Poisson's equation for the electrostatic potential. By approximating the energy of an S4 sensor along a path between stationary states by a piecewise linear function of the transverse displacement, the dynamics of slow activation, in the millisecond range, may be described by the lowest frequency component of an analytical solution of interacting diffusion equations of Fokker-Planck type for resting and barrier regions. The solution of the Smoluchowski equations for an S4 sensor in an energy landscape with several barriers is in accord with an empirical master equation for multistep activation in a voltage-dependent K channel.
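    The closing sentence's master-equation picture of multistep activation can be sketched as a linear chain of states with forward and backward hopping rates over successive barriers. The rates below are arbitrary illustrative values (a depolarized membrane would tilt them forward, kf > kb), not fitted S4-sensor parameters.

```python
import numpy as np

def evolve(p0, kf, kb, dt=1e-3, steps=5000):
    """Forward-Euler integration of a linear-chain master equation.

    States 0..n-1 model successive sensor positions; kf[i]/kb[i] are
    forward/backward hopping rates over barrier i.
    """
    n = len(p0)
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        dp = np.zeros(n)
        for i in range(n - 1):
            flux = kf[i] * p[i] - kb[i] * p[i + 1]
            dp[i] -= flux
            dp[i + 1] += flux
        p += dt * dp
    return p

# Four sequential states, all probability initially in the resting state.
p = evolve([1.0, 0.0, 0.0, 0.0], kf=[5.0, 4.0, 3.0], kb=[1.0, 1.0, 1.0])
```

Because each internal flux enters one state exactly as it leaves another, total probability is conserved, and with forward-biased rates the occupancy accumulates in the final (activated) state, approaching the detailed-balance distribution p[i+1]/p[i] = kf[i]/kb[i].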

  1. Michaelis-Menten kinetics in shear flow: Similarity solutions for multi-step reactions.

    PubMed

    Ristenpart, W D; Stone, H A

    2012-03-01

    Models for chemical reaction kinetics typically assume well-mixed conditions, in which chemical compositions change in time but are uniform in space. In contrast, many biological and microfluidic systems of interest involve non-uniform flows where gradients in flow velocity dynamically alter the effective reaction volume. Here, we present a theoretical framework for characterizing multi-step reactions that occur when an enzyme or enzymatic substrate is released from a flat solid surface into a linear shear flow. Similarity solutions are developed for situations where the reactions are sufficiently slow compared to a convective time scale, allowing a regular perturbation approach to be employed. For the specific case of Michaelis-Menten reactions, we establish that the transversally averaged concentration of product scales with the distance x downstream as x^(5/3). We generalize the analysis to n-step reactions, and we discuss the implications for designing new microfluidic kinetic assays to probe the effect of flow on biochemical processes.
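    For reference, the well-mixed limit that the shear-flow analysis generalizes is the classical Michaelis-Menten rate law dS/dt = -Vmax S/(Km + S). The sketch below integrates it with forward Euler and checks the result against the standard implicitly defined closed-form solution (S0 - S) + Km ln(S0/S) = Vmax t; the parameter values are arbitrary.

```python
import math

def michaelis_menten(S0, Vmax, Km, t_end, dt=1e-4):
    """Forward-Euler integration of dS/dt = -Vmax*S/(Km + S), well-mixed batch."""
    S, t = S0, 0.0
    while t < t_end:
        S -= dt * Vmax * S / (Km + S)
        t += dt
    return S

S0, Vmax, Km, T = 1.0, 1.0, 0.5, 1.0
S_final = michaelis_menten(S0, Vmax, Km, T)
# Implicit closed form: (S0 - S) + Km*ln(S0/S) = Vmax*t
residual = (S0 - S_final) + Km * math.log(S0 / S_final) - Vmax * T
```

In the paper's setting the concentration is no longer uniform, so this single ODE is replaced by a convection-diffusion-reaction balance whose similarity solution yields the x^(5/3) downstream scaling of the averaged product.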

  2. Investigation of the reactions of acrylamide during in vitro multistep enzymatic digestion of thermally processed foods.

    PubMed

    Hamzalıoğlu, Aytül; Gökmen, Vural

    2015-01-01

    This study investigated the fate of acrylamide in thermally processed foods after ingestion. An in vitro multistep enzymatic digestion system simulating gastric, duodenal and colon phases was used to understand the fate of acrylamide in bakery and fried potato products. Acrylamide levels gradually decreased through gastric, duodenal and colon phases during in vitro digestion of biscuits. At the end of digestion, acrylamide reduction was between 49.2% and 73.4% in biscuits. Binary model systems composed of acrylamide and amino acids were used to understand the mechanism of acrylamide reduction. High-resolution mass spectrometry analyses confirmed Michael addition of amino acids to acrylamide during digestion. In contrast to bakery products, acrylamide levels increased significantly during gastric digestion of fried potatoes. The Schiff base formed between reducing sugars and asparagine disappeared rapidly, whereas the acrylamide level increased during the gastric phase. This suggests that intermediates like the Schiff base that accumulate in potatoes during frying are potential precursors of acrylamide under gastric conditions.

  3. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    SciTech Connect

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; Grove, Robert E.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.
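    The variance-reduction principle behind CADIS-style methods can be illustrated with generic one-dimensional importance sampling: biasing samples toward the region that matters for the tally and reweighting keeps the estimate unbiased while shrinking its variance. This is only the underlying idea; MS-CADIS itself derives its importance function from an adjoint transport solution, which this toy rare-event estimate does not attempt.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20_000
true_p = np.exp(-4)                      # P(X > 4) for X ~ Exp(1)

# Analog Monte Carlo: sample Exp(1) directly and score the indicator.
x = rng.exponential(1.0, N)
analog = (x > 4).astype(float)

# Importance sampling: heavier-tailed Exp(scale=4) proposal, reweight by p/q.
y = rng.exponential(4.0, N)              # q(y) = 0.25 * exp(-0.25 * y)
w = np.exp(-y) / (0.25 * np.exp(-0.25 * y))
weighted = (y > 4) * w

est_analog, est_is = analog.mean(), weighted.mean()
var_analog, var_is = analog.var(), weighted.var()
```

Both estimators target the same probability, but the biased proposal sends far more samples past the threshold, so the reweighted estimator's per-sample variance is roughly an order of magnitude smaller here, the same effect the adjoint-derived importance map buys for the shutdown-dose neutron tally.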

  4. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    DOE PAGES

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; ...

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.

  5. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, John L.

    1990-01-01

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed of.

  6. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, J.L.

    1990-07-10

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed of. 7 figs.

  7. Carbon and hydrogen isotope fractionation of benzene and toluene during hydrophobic sorption in multistep batch experiments.

    PubMed

    Imfeld, G; Kopinke, F-D; Fischer, A; Richnow, H-H

    2014-07-01

    The application of compound-specific stable isotope analysis (CSIA) for evaluating degradation of organic pollutants in the field implies that other processes affecting pollutant concentration are minor with respect to isotope fractionation. Sorption is associated with minor isotope fractionation and pollutants may undergo successive sorption-desorption steps during their migration in aquifers. However, little is known about isotope fractionation of BTEX compounds after consecutive sorption steps. Here, we show that partitioning of benzene and toluene between water and organic sorbents (i.e. 1-octanol, dichloromethane, cyclohexane, hexanoic acid and Amberlite XAD-2) generally exhibits very small carbon and hydrogen isotope effects in multistep batch experiments. However, carbon and hydrogen isotope fractionation was observed for the benzene-octanol pair after several sorption steps (Δδ(13)C=1.6 ± 0.3‰ and Δδ(2)H=88 ± 3‰), yielding isotope fractionation factors of αC=1.0030 ± 0.0005 and αH=1.195 ± 0.026. Our results indicate that the cumulative effect of successive hydrophobic partitioning steps in an aquifer generally results in insignificant isotope fractionation for benzene and toluene. However, significant carbon and hydrogen isotope fractionation cannot be excluded for specific sorbate-sorbent pairs, such as sorbates with π-electrons and sorbents with OH-groups. Consequently, functional groups of sedimentary organic matter (SOM) may specifically interact with BTEX compounds migrating in an aquifer, thereby resulting in potentially relevant isotope fractionation.
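    The cumulative effect of successive fractionation steps can be expressed in standard delta notation, compounding a per-step fractionation factor α over n steps. The sketch below uses a simplified complete-transfer picture with a hypothetical α and step count, not the paper's batch partitioning between water and the organic sorbent.

```python
def delta_after_steps(delta0, alpha, n):
    """Delta value (per mil) after n sequential fractionation steps,
    each applying the same per-step fractionation factor alpha:
    delta' = alpha * (delta + 1000) - 1000."""
    d = delta0
    for _ in range(n):
        d = alpha * (d + 1000.0) - 1000.0
    return d

# Hypothetical per-step alpha of 1.0003 over 5 steps:
shift = delta_after_steps(0.0, 1.0003, 5)
```

For small per-step enrichments the shift is roughly n times the per-step epsilon (here 5 x 0.3 per mil, about 1.5 per mil), which is why only repeated steps with a specifically interacting sorbate-sorbent pair would accumulate a fractionation large enough to matter for field CSIA.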

  8. Optimization of a Multi-Step Procedure for Isolation of Chicken Bone Collagen

    PubMed Central

    2015-01-01

    Chicken bone is not adequately utilized despite its high nutritional value and protein content. Although not a common raw material, chicken bone can be used in many different ways besides the manufacture of collagen products. In this study, a multi-step procedure was optimized to isolate chicken bone collagen with higher yield and quality for the manufacture of collagen products. The chemical composition of chicken bone was 2.9% nitrogen, corresponding to about 15.6% protein, 9.5% fat, 14.7% mineral and 57.5% moisture. The aim was to minimize protein loss while separating the largest possible amount of visible impurities, non-collagen proteins, minerals and fats. Treatments under optimum conditions removed 57.1% of fats and 87.5% of minerals with respect to their initial concentrations, while 18.6% of protein and 14.9% of hydroxyproline were lost, suggesting that a selective separation of non-collagen components and isolation of collagen were achieved. A significant part of the impurities was selectively removed and over 80% of the original collagen was preserved during the treatments. PMID:26761863

  9. Transformation of quiescent adult oligodendrocyte precursor cells into malignant glioma through a multistep reactivation process.

    PubMed

    Galvao, Rui Pedro; Kasina, Anita; McNeill, Robert S; Harbin, Jordan E; Foreman, Oded; Verhaak, Roel G W; Nishiyama, Akiko; Miller, C Ryan; Zong, Hui

    2014-10-07

    How malignant gliomas arise in a mature brain remains a mystery, hindering the development of preventive and therapeutic interventions. We previously showed that oligodendrocyte precursor cells (OPCs) can be transformed into glioma when mutations are introduced perinatally. However, adult OPCs rarely proliferate compared with their perinatal counterparts. Whether these relatively quiescent cells have the potential to transform is unknown, which is a critical question considering the late onset of human glioma. Additionally, the premalignant events taking place between initial mutation and a fully developed tumor mass are particularly poorly understood in glioma. Here we used a temporally controllable Cre transgene to delete p53 and NF1 specifically in adult OPCs and demonstrated that these cells consistently give rise to malignant gliomas. To investigate the transforming process of quiescent adult OPCs, we then tracked these cells throughout the premalignant phase, which revealed a dynamic multistep transformation, starting with rapid but transient hyperproliferative reactivation, followed by a long period of dormancy, and then final malignant transformation. Using pharmacological approaches, we discovered that mammalian target of rapamycin signaling is critical for both the initial OPC reactivation step and late-stage tumor cell proliferation and thus might be a potential target for both glioma prevention and treatment. In summary, our results firmly establish the transforming potential of adult OPCs and reveal an actionable multiphasic reactivation process that turns slowly dividing OPCs into malignant gliomas.

  10. Multistep Aggregation Pathway of Human Interleukin-1 Receptor Antagonist: Kinetic, Structural, and Morphological Characterization

    PubMed Central

    Krishnan, Sampathkumar; Raibekas, Andrei A.

    2009-01-01

    The complex, multistep aggregation kinetic and structural behavior of human recombinant interleukin-1 receptor antagonist (IL-1ra) was revealed and characterized by spectral probes and techniques. At a certain range of protein concentration (12–27 mg/mL) and temperature (44–48°C), two sequential aggregation kinetic transitions emerge, where the second transition is preceded by a lag phase and is associated with the main portion of the aggregated protein. Each kinetic transition is linked to a different type of aggregate population, referred to as type I and type II. The aggregate populations, isolated at a series of time points and analyzed by Fourier-transform infrared spectroscopy, show consecutive protein structural changes, from intramolecular (type I) to intermolecular (type II) β-sheet formation. The early type I protein spectral change resembles that seen for IL-1ra in the crystalline state. Moreover, Fourier-transform infrared data demonstrate that type I protein assembly alone can undergo a structural rearrangement and, consequently, convert to the type II aggregate. The aggregated protein structural changes are accompanied by the aggregate morphological changes, leading to a well-defined population of interacting spheres, as detected by scanning electron microscopy. A nucleation-driven IL-1ra aggregation pathway is proposed, and assumes two major activation energy barriers, where the second barrier is associated with the type I → type II aggregate structural rearrangement that, in turn, serves as a pseudonucleus triggering the second kinetic event. PMID:19134476

  11. Method to Improve Indium Bump Bonding via Indium Oxide Removal Using a Multi-Step Plasma Process

    NASA Technical Reports Server (NTRS)

    Greer, H. Frank (Inventor); Jones, Todd J. (Inventor); Vasquez, Richard P. (Inventor); Hoenk, Michael E. (Inventor); Dickie, Matthew R. (Inventor); Nikzad, Shouleh (Inventor)

    2012-01-01

    A process for removing indium oxide from indium bumps in a flip-chip structure to reduce contact resistance, by a multi-step plasma treatment. A first plasma treatment of the indium bumps with an argon, methane and hydrogen plasma reduces indium oxide, and a second plasma treatment with an argon and hydrogen plasma removes residual organics. The multi-step plasma process for removing indium oxide from the indium bumps is more effective in reducing the oxide, and yet does not require the use of halogens, does not change the bump morphology, does not attack the bond pad material or under-bump metallization layers, and creates no new mechanisms for open circuits.

  12. Surface Modified Particles By Multi-Step Michael-Type Addition And Process For The Preparation Thereof

    SciTech Connect

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2005-05-03

    A new class of surface modified particles and a multi-step Michael-type addition surface modification process for the preparation of the same is provided. The multi-step Michael-type addition surface modification process involves two or more reactions to compatibilize particles with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through reactive organic linking groups. Specifically, these reactive groups are activated carbon-carbon pi bonds and carbon and non-carbon nucleophiles that react via Michael or Michael-type additions.

  13. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.

  14. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  15. Fabrication of low-loss, single-mode-channel waveguide with DNA-CTMA biopolymer by multistep processing technology.

    PubMed

    Zhou, Jun; Wang, Zhen Yong; Yang, Xin; Wong, C-Y; Pun, Edwin Y B

    2010-05-15

    A multistep processing and reactive ion etching technique has been developed to fabricate optical channel waveguides based on deoxyribonucleic acid-cetyltrimethylammonium biopolymer material. The channel waveguides exhibit excellent single-mode output and high confinement of light because of the sharp waveguide profile with very smooth surfaces and vertical sidewalls. The measurement results show that these channel waveguides have low propagation losses and small polarization dependent losses at 633, 1310, and 1550 nm wavelengths.

  16. A Family of Symmetric Linear Multistep Methods for the Numerical Solution of the Schroedinger Equation and Related Problems

    SciTech Connect

    Anastassi, Z. A.; Simos, T. E.

    2010-09-30

    We develop a new family of explicit symmetric linear multistep methods for the efficient numerical solution of the Schroedinger equation and related problems with oscillatory solution. The new methods are trigonometrically fitted and have improved intervals of periodicity as compared to the corresponding classical method with constant coefficients and other methods from the literature. We also apply the methods along with other known methods to real periodic problems, in order to measure their efficiency.
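    As background for the family of schemes described above, a minimal sketch of the classical symmetric two-step method with constant coefficients (the Numerov scheme) for y''(x) = g(x) y(x) is shown below. This is the fixed-coefficient baseline that trigonometrically fitted variants improve upon, not the authors' method; the function names and the test problem are our own illustrative choices.

```python
import numpy as np

def numerov(g, y0, y1, x0, h, n):
    """Classical symmetric two-step (Numerov) method for y''(x) = g(x) y(x):
    (1 - c*g[i+1]) y[i+1] = 2 (1 + 5c*g[i]) y[i] - (1 - c*g[i-1]) y[i-1],
    with c = h^2 / 12, given the two starting values y0, y1."""
    x = x0 + h * np.arange(n + 1)
    y = np.empty(n + 1)
    y[0], y[1] = y0, y1
    c = h * h / 12.0
    for i in range(1, n):
        y[i + 1] = (2.0 * (1.0 + 5.0 * c * g(x[i])) * y[i]
                    - (1.0 - c * g(x[i - 1])) * y[i - 1]) / (1.0 - c * g(x[i + 1]))
    return x, y

# Oscillatory test problem y'' = -y with exact solution sin(x).
h = 0.01
x, y = numerov(lambda x: -1.0, 0.0, np.sin(h), 0.0, h, 1000)
err = np.max(np.abs(y - np.sin(x)))   # global error of the O(h^4) scheme
```

    On this oscillatory problem the fixed-coefficient scheme already performs well; trigonometric fitting tunes the coefficients to the problem's frequency to enlarge the interval of periodicity further.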

  17. Multistep modeling (MSM) of biomolecular structure application to the A-G mispair in the B-DNA environment

    NASA Technical Reports Server (NTRS)

    Srinivasan, S.; Raghunathan, G.; Shibata, M.; Rein, R.

    1986-01-01

    A multistep modeling procedure has been evolved to study the structural changes introduced by lesions in DNA. We report here the change in the structure of regular B-DNA geometry due to the incorporation of a G(anti)-A(anti) mispair in place of a regular G-C pair, preserving the helix continuity. The energetics of the structure so obtained is compared with the G(anti)-A(syn) configuration under similar constrained conditions. We present the methodology adopted and discuss the results.

  18. Hippocampal-prefrontal theta phase synchrony in planning of multi-step actions based on memory retrieval.

    PubMed

    Ishino, Seiya; Takahashi, Susumu; Ogawa, Masaaki; Sakurai, Yoshio

    2017-02-23

    Planning of multi-step actions based on the retrieval of acquired information is essential for efficient foraging. The hippocampus (HPC) and prefrontal cortex (PFC) may play critical roles in this process. However, in rodents, many studies investigating such roles utilized T-maze tasks that only require one-step actions (i.e., selection of one of two alternatives), in which memory retrieval and selection of an action based on the retrieval cannot be clearly differentiated. In monkeys, PFC has been suggested to be involved in planning of multi-step actions; however, the synchrony between HPC and PFC has not been evaluated. To address the combined role of the regions in planning of multi-step actions, we introduced a task in rats that required three successive nose-poke responses to three sequentially illuminated nose-poke holes. During the task, local field potentials (LFP) and spikes from hippocampal CA1 and medial PFC (mPFC) were simultaneously recorded. The position of the first hole indicated whether the following two holes would be presented in a predictable sequence or not. During the first nose-poke period, phase synchrony of LFPs in the theta range (4-10 Hz) between the regions was not different between predictable and unpredictable trials. However, only in trials of predictable sequences, the magnitude of theta phase synchrony during the first nose-poke period was negatively correlated with latency of the two-step ahead nose-poke response. Our findings point to the HPC-mPFC theta phase synchrony as a key mechanism underlying planning of multi-step actions based on memory retrieval rather than the retrieval itself.

  19. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation need are discussed. PMID:18925018

  20. Multistep conversion of para-substituted phenols by phenol hydroxylase and 2,3-dihydroxybiphenyl 1,2-dioxygenase.

    PubMed

    Qu, Yuanyuan; Shi, Shengnan; Ma, Qiao; Kong, Chunlei; Zhou, Hao; Zhang, Xuwang; Zhou, Jiti

    2013-04-01

    A multistep conversion system of para-substituted phenols by recombinant phenol hydroxylase (PH(IND)) and 2,3-dihydroxybiphenyl 1,2-dioxygenase (BphC(LA-4)) was constructed in this study. Docking studies with different para-substituted phenols and corresponding catechols inside the active sites of PH(IND) and BphC(LA-4) predicted that all the substrates should be transformed. High-performance liquid chromatography-mass spectrometry analysis showed that the products of the multistep conversion were the corresponding para-substituted catechols and semialdehydes. For the first conversion step, the formation rate of 4-fluorocatechol (0.39 μM/min/mg dry weight) by strain PH(IND) hydroxylation was 1.15-, 6.50-, 3.00-, and 1.18-fold higher than the formation of 4-chlorocatechol, 4-bromocatechol, 4-nitrocatechol, and 4-methylcatechol, respectively. For the second conversion step, the formation rates of semialdehydes by strain BphC(LA-4) were as follows: 5-fluoro-HODA > 5-chloro-HODA > 2-hydroxy-5-nitro-ODA > 5-bromo-HODA > 2-hydroxy-5-methyl-ODA. The present study suggests that multistep conversion by both a ring hydroxylase and a ring-cleavage dioxygenase has potential for the synthesis of industrial precursors and provides a novel avenue for wastewater recycling treatment.

  1. Rapid on-chip multi-step (bio)chemical procedures in continuous flow--manoeuvring particles through co-laminar reagent streams.

    PubMed

    Peyman, Sally A; Iles, Alexander; Pamme, Nicole

    2008-03-14

    We introduce a novel and extremely versatile microfluidic platform in which tedious multi-step biochemical processes can be performed in continuous flow within a fraction of the time required for conventional methods.

  2. DE-FG02-05ER64001 Overcoming the hurdles of multi-step targeting (MST) for effective radioimmunotherapy of solid tumors

    SciTech Connect

    Larson, Steven M. (PI); Cheung, Nai-Kong (Co-PI)

    2009-09-21

    The 4 specific aims of this project are: (1) Optimization of MST to increase tumor uptake; (2) Antigen heterogeneity; (3) Characterization and reduction of renal uptake; and (4) Validation in vivo of optimized MST targeted therapy. This proposal focussed upon optimizing multistep immune targeting strategies for the treatment of cancer. Two multi-step targeting constructs were explored during this funding period: (1) anti-Tag-72 and (2) anti-GD2.

  3. Automated decision stations

    NASA Technical Reports Server (NTRS)

    Tischendorf, Mark

    1990-01-01

    This paper discusses the combination of software robots and expert systems to automate everyday business tasks: tasks which require people to repetitively interact with multiple system screens as well as multiple systems.

  4. Automating the Media Center.

    ERIC Educational Resources Information Center

    Holloway, Mary A.

    1988-01-01

    Discusses the need to develop more efficient information retrieval skills by the use of new technology. Lists four stages used in automating the media center. Describes North Carolina's pilot programs. Proposes benefits and looks at the media center's future. (MVL)

  5. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  6. Xenon International Automated Control

    SciTech Connect

    2016-08-05

    The Xenon International Automated Control software monitors, displays status, and allows for manual operator control as well as fully automatic control of multiple commercial and PNNL designed hardware components to generate and transmit atmospheric radioxenon concentration measurements every six hours.

  7. Automated Cyber Red Teaming

    DTIC Science & Technology

    2015-04-01

    …possible attack paths for CRT. This report surveys the current state-of-the-art planning techniques, tools and frameworks, their performance at … automated planning to CRT problems. Finally, we recommend several state-of-the-art planning tools for trial and, more generally, when it is suitable to use …

  8. Automating Index Preparation

    DTIC Science & Technology

    1987-03-23

    Automating Index Preparation. Pehong Chen, Michael A. Harrison. Computer Science Division, University of California, Berkeley, CA 94720. March 23, 1987. Abstract: Index preparation is a tedious and time-consuming task. In this paper we indicate how the indexing process can be automated in a way which … identified and analyzed. Specifically, we describe a framework for placing index commands in the document and a general purpose index processor which …

  9. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low-cost automated systems can provide air traffic and aviation weather advisory information at high-density uncontrolled airports. The system was designed to enhance the "see and be seen" rule of flight, and pilots who used the system preferred it over the self-announcement system presently used at uncontrolled airports.

  10. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  11. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  12. Automated Groundwater Screening

    SciTech Connect

    Taylor, Glenn A.; Collard, Leonard, B.

    2005-10-31

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.

  13. Metrology automation reliability

    NASA Astrophysics Data System (ADS)

    Chain, Elizabeth E.

    1996-09-01

    At Motorola's MOS-12 facility, automated measurements on 200-mm diameter wafers proceed in a hands-off 'load-and-go' mode requiring only wafer loading, measurement recipe loading, and a 'run' command for processing. Upon completion of all sample measurements, the data is uploaded to the factory's data collection software system via a SECS II interface, eliminating the requirement of manual data entry. The scope of in-line measurement automation has been extended to the entire metrology scheme, from job file generation to measurement and data collection. Data analysis and comparison to part specification limits is also carried out automatically. Successful integration of automated metrology into the factory measurement system requires that automated functions, such as autofocus and pattern recognition algorithms, display a high degree of reliability. In the 24-hour factory, reliability data can be collected automatically on every part measured. This reliability data is then uploaded to the factory data collection software system at the same time as the measurement data. Analysis of the metrology reliability data permits improvements to be made as needed, and provides an accurate accounting of automation reliability. This reliability data has so far been collected for the CD-SEM (critical dimension scanning electron microscope) metrology tool, and examples are presented. This analysis method can be applied to such automated in-line measurements as CD, overlay, particle, and film thickness measurements.

  14. Elements of EAF automation processes

    NASA Astrophysics Data System (ADS)

    Ioana, A.; Constantin, N.; Dragna, E. C.

    2017-01-01

    Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the scheme of the electrical EAF automation system and the scheme of the thermal EAF automation system. The results of applying these automation schemes are: a significant reduction in the specific consumption of electrical energy by the Electric Arc Furnace, increased productivity of the Electric Arc Furnace, improved quality of the steel produced, and increased durability of the structural elements of the Electric Arc Furnace.

  15. Two- and multi-step annealing of cereal starches in relation to gelatinization.

    PubMed

    Shi, Yong-Cheng

    2008-02-13

    Two- and multi-step annealing experiments were designed to determine how much the gelatinization temperature of waxy rice, waxy barley, and wheat starches could be increased without causing a decrease in gelatinization enthalpy or a decline in X-ray crystallinity. A mixture of starch and excess water was heated in a differential scanning calorimeter (DSC) pan to a specific temperature and maintained there for 0.5-48 h. The experimental approach was first to anneal a starch at a low temperature so that the gelatinization temperature of the starch was increased without causing a decrease in gelatinization enthalpy. The annealing temperature was then raised, but still was kept below the onset gelatinization temperature of the previously annealed starch. When a second- or third-step annealing temperature was high enough, it caused a decrease in crystallinity, even though the holding temperature remained below the onset gelatinization temperature of the previously annealed starch. These results support the view that gelatinization is a nonequilibrium process and that dissociation of double helices is driven by the swelling of amorphous regions. Small-scale starch slurry annealing was also performed and confirmed the annealing results conducted in DSC pans. A three-phase model of a starch granule (a mobile amorphous phase, a rigid amorphous phase, and a crystalline phase) was used to interpret the annealing results. Annealing seems to be an interplay between a more efficient packing of crystallites in starch granules and swelling of plasticized amorphous regions. There is always a temperature ceiling that can be used to anneal a starch without causing a decrease in crystallinity. That temperature ceiling is starch-specific, dependent on the structure of a starch, and is lower than the original onset gelatinization temperature of a starch.

  16. Kinetic analysis of overlapping multistep thermal decomposition comprising exothermic and endothermic processes: thermolysis of ammonium dinitramide.

    PubMed

    Muravyev, Nikita V; Koga, Nobuyoshi; Meerov, Dmitry B; Pivkina, Alla N

    2017-01-25

    This study focused on kinetic modeling of a specific type of multistep heterogeneous reaction comprising exothermic and endothermic reaction steps, as exemplified by the practical kinetic analysis of the experimental kinetic curves for the thermal decomposition of molten ammonium dinitramide (ADN). It is known that the thermal decomposition of ADN occurs as a consecutive two step mass-loss process comprising the decomposition of ADN and subsequent evaporation/decomposition of in situ generated ammonium nitrate. These reaction steps provide exothermic and endothermic contributions, respectively, to the overall thermal effect. The overall reaction process was deconvoluted into two reaction steps using simultaneously recorded thermogravimetry and differential scanning calorimetry (TG-DSC) curves by considering the different physical meanings of the kinetic data derived from TG and DSC by P value analysis. The kinetic data thus separated into exothermic and endothermic reaction steps were kinetically characterized using kinetic computation methods including isoconversional method, combined kinetic analysis, and master plot method. The overall kinetic behavior was reproduced as the sum of the kinetic equations for each reaction step considering the contributions to the rate data derived from TG and DSC. During reproduction of the kinetic behavior, the kinetic parameters and contributions of each reaction step were optimized using kinetic deconvolution analysis. As a result, the thermal decomposition of ADN was successfully modeled as partially overlapping exothermic and endothermic reaction steps. The logic of the kinetic modeling was critically examined, and the practical usefulness of phenomenological modeling for the thermal decomposition of ADN was illustrated to demonstrate the validity of the methodology and its applicability to similar complex reaction processes.
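    The core deconvolution idea above, separating one overall measured signal into overlapping per-step contributions by optimizing the parameters of each step, can be sketched in a simplified form. The sketch below uses Gaussian peak shapes as stand-ins for physically meaningful kinetic model functions, and all names and parameter values are illustrative assumptions, not the authors' procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak(t, A, t0, w):
    """Single symmetric peak; a Gaussian stands in for a kinetic model term."""
    return A * np.exp(-((t - t0) / w) ** 2)

def overall(t, A1, t01, w1, A2, t02, w2):
    # Exothermic (positive) plus endothermic (negative) contributions.
    return peak(t, A1, t01, w1) + peak(t, A2, t02, w2)

t = np.linspace(0.0, 10.0, 500)
true_params = (2.0, 4.0, 0.8, -1.0, 5.5, 1.2)   # illustrative values
signal = overall(t, *true_params)                # synthetic overlapping signal

p0 = (1.5, 3.5, 1.0, -0.5, 6.0, 1.0)             # rough initial guess
popt, _ = curve_fit(overall, t, signal, p0=p0)   # deconvolve the two steps
residual = np.max(np.abs(overall(t, *popt) - signal))
```

    In the paper's setting the two components are instead constrained by kinetic model functions and by the separately derived TG and DSC contributions, which is what makes the deconvolution physically meaningful rather than a curve-shape exercise.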

  17. Detection of Heterogeneous Small Inclusions by a Multi-Step MUSIC Method

    NASA Astrophysics Data System (ADS)

    Solimene, Raffaele; Dell'Aversano, Angela; Leone, Giovanni

    2014-05-01

    In this contribution the problem of detecting and localizing scatterers with small (in terms of wavelength) cross sections by collecting their scattered field is addressed. The problem is dealt with for a two-dimensional and scalar configuration where the background is given as a two-layered cylindrical medium. In more detail, while scattered field data are taken in the outermost layer, inclusions are embedded within the inner layer. Moreover, the case of heterogeneous inclusions (i.e., having different scattering coefficients) is addressed. As a pertinent applicative context we identify the problem of diagnosing concrete pillars in order to detect and locate rebars, ducts and other small inhomogeneities that can populate the interior of the pillar. The nature of the inclusions influences the scattering coefficients; for example, the field scattered by rebars is stronger than the one due to ducts. Accordingly, it is expected that the more weakly scattering inclusions can be difficult to detect, as their scattered fields tend to be overwhelmed by those of strong scatterers. In order to circumvent this problem, in this contribution a multi-step MUltiple SIgnal Classification (MUSIC) detection algorithm is adopted [1]. In particular, the first stage aims at detecting rebars. Once rebars have been detected, their positions are exploited to update the Green's function and to subtract the scattered field due to their presence. The procedure is repeated until all the inclusions are detected. The analysis is conducted by numerical experiments for a multi-view/multi-static single-frequency configuration and the synthetic data are generated by a FDTD forward solver. Acknowledgement: This work benefited from networking activities carried out within the EU funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar." [1] R. Solimene, A. Dell'Aversano and G. Leone, "MUSIC algorithms for rebar detection," J. of Geophysics and Engineering, vol. 10, pp. 1…
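    For background, the single-step ingredient of such a scheme, the classical narrowband MUSIC pseudospectrum built from the noise subspace of the data covariance, can be sketched as follows for a uniform linear array with one source. This toy setup (array geometry, noise level, search grid) is our own assumption and not the multi-step GPR configuration of the paper.

```python
import numpy as np

# Toy narrowband MUSIC for direction-of-arrival on a uniform linear array.
M, d = 8, 0.5                      # sensors, spacing in wavelengths (assumed)
theta_true = 20.0                  # source angle in degrees (assumed)

def steer(th_deg):
    """Steering vector of the M-element ULA for angle th_deg."""
    return np.exp(2j * np.pi * d * np.arange(M) * np.sin(np.radians(th_deg)))

rng = np.random.default_rng(0)
N = 200                            # snapshots
s = rng.standard_normal(N) + 1j * rng.standard_normal(N)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = np.outer(steer(theta_true), s) + noise
R = X @ X.conj().T / N             # sample covariance matrix

w, V = np.linalg.eigh(R)           # eigenvalues in ascending order
En = V[:, :-1]                     # noise subspace (one source assumed)
grid = np.linspace(-90.0, 90.0, 721)
P = np.array([1.0 / np.linalg.norm(En.conj().T @ steer(t)) ** 2 for t in grid])
est = grid[np.argmax(P)]           # pseudospectrum peak = estimated angle
```

    The multi-step variant in the paper repeats this kind of subspace localization, subtracting the field of the strong scatterers found in each pass so that weaker ones become visible in the next.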

  18. On the Analysis of Multistep-Out-of-Grid Method for Celestial Mechanics Tasks

    NASA Astrophysics Data System (ADS)

    Olifer, L.; Choliy, V.

    2016-09-01

    Occasionally, there is a need for highly accurate prediction of a celestial body's trajectory. The most common way to do that is to solve Kepler's equation analytically or to use Runge-Kutta or Adams integrators to solve the equation of motion numerically. For low-orbit satellites, there is a critical need to account for the geopotential and other forces which influence the motion. As a result, the right-hand side of the equation of motion becomes much more expensive to evaluate, and classical integrators will not be quite as effective. On the other hand, there is the multistep-out-of-grid (MOG) method, which combines the Runge-Kutta and Adams methods. The MOG method is based on using m on-grid values of the solution and n × m off-grid derivative estimations. Such a method can provide stable integrators of the maximum possible order, O(h^(m+mn+n-1)). The main subject of this research was to implement and analyze the MOG method for solving the satellite equation of motion, taking into account an Earth geopotential model (e.g., EGM2008 (Pavlis et al., 2008)) and with the possibility to add other perturbations such as atmospheric drag or solar radiation pressure. Simulations were made for satellites in low orbit and with various eccentricities (from 0.1 to 0.9). Results of the MOG integrator were compared with results of Runge-Kutta and Adams integrators. It was shown that the MOG method has better accuracy than classical ones of the same order and requires fewer right-hand-side evaluations when working at high orders. That gives it some advantage over "classical" methods.
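    As a point of reference for the Adams family that the MOG method generalizes, here is a minimal sketch of the classical two-step Adams-Bashforth integrator applied to a toy equation of motion (a harmonic oscillator standing in for orbital dynamics). The Euler bootstrap and the step size are our own illustrative choices, not part of the MOG method.

```python
import numpy as np

def ab2(f, y0, h, n):
    """Two-step Adams-Bashforth: y[k+1] = y[k] + h*(3*f(y[k]) - f(y[k-1]))/2.
    One explicit Euler step bootstraps the second starting value."""
    y = [np.asarray(y0, dtype=float)]
    y.append(y[0] + h * f(y[0]))                       # startup step (Euler)
    for k in range(1, n):
        y.append(y[k] + h * (3.0 * f(y[k]) - f(y[k - 1])) / 2.0)
    return np.array(y)

# Toy "equation of motion": harmonic oscillator u'' = -u as a first-order system.
f = lambda s: np.array([s[1], -s[0]])
h, T = 1e-3, 2.0 * np.pi
ys = ab2(f, [1.0, 0.0], h, int(T / h))                 # one full period
```

    Because Adams methods reuse previously computed derivative values, each step needs only one new right-hand-side evaluation; the cost argument in the abstract is about reducing such evaluations further while keeping high order.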

  19. Mouse Embryonic Stem Cells Inhibit Murine Cytomegalovirus Infection through a Multi-Step Process

    PubMed Central

    Kawasaki, Hideya; Kosugi, Isao; Arai, Yoshifumi; Iwashita, Toshihide; Tsutsui, Yoshihiro

    2011-01-01

    In humans, cytomegalovirus (CMV) is the most significant infectious cause of intrauterine infections that cause congenital anomalies of the central nervous system. Currently, it is not known how this process is affected by the timing of infection and the susceptibility of early-gestational-period cells. Embryonic stem (ES) cells are more resistant to CMV than most other cell types, although the mechanism responsible for this resistance is not well understood. Using a plaque assay and evaluation of immediate-early 1 mRNA and protein expression, we found that mouse ES cells were resistant to murine CMV (MCMV) at the point of transcription. In ES cells infected with MCMV, treatment with forskolin and trichostatin A did not confer full permissiveness to MCMV. In ES cultures infected with elongation factor-1α (EF-1α) promoter-green fluorescent protein (GFP) recombinant MCMV at a multiplicity of infection of 10, less than 5% of cells were GFP-positive, despite the fact that ES cells have relatively high EF-1α promoter activity. Quantitative PCR analysis of the MCMV genome showed that ES cells allow approximately 20-fold less MCMV DNA to enter the nucleus than mouse embryonic fibroblasts (MEFs) do, and that this inhibition occurs in a multi-step manner. In situ hybridization revealed that ES cell nuclei have significantly less MCMV DNA than MEF nuclei. This appears to be facilitated by the fact that ES cells express less heparan sulfate, β1 integrin, and vimentin, and have fewer nuclear pores, than MEF. This may reduce the ability of MCMV to attach to and enter through the cellular membrane, translocate to the nucleus, and cross the nuclear membrane in pluripotent stem cells (ES/induced pluripotent stem cells). The results presented here provide perspective on the relationship between CMV susceptibility and cell differentiation. PMID:21407806

  20. ALG: automated genotype calling of Luminex assays.

    PubMed

    Bourgey, Mathieu; Lariviere, Mathieu; Richer, Chantal; Sinnett, Daniel

    2011-05-06

    Single nucleotide polymorphisms (SNPs) are the most commonly used polymorphic markers in genetics studies. Among the different platforms for SNP genotyping, Luminex is one of the least exploited, mainly due to the lack of a robust (semi-automated and replicable) freely available genotype calling software. Here we describe a clustering algorithm that provides automated SNP calls for Luminex genotyping assays. We genotyped 3 SNPs in a cohort of 330 childhood leukemia patients, 200 parents of patients, and 325 healthy individuals, and used the Automated Luminex Genotyping (ALG) algorithm for SNP calling. ALG genotypes were called twice to test for reproducibility and were compared to sequencing data to test for accuracy. Globally, this analysis demonstrates the accuracy (99.6%) of the method, its reproducibility (99.8%), and the low level of no-calls (3.4%). The high efficiency of the method proves that ALG is a suitable alternative to the current commercial software. ALG is semi-automated and provides numerical measures of confidence for each SNP called, as well as an effective graphical plot. Moreover, ALG can be used either through a graphical user interface, requiring no specific informatics knowledge, or through the command line with access to the open source code. The ALG software has been implemented in R and is freely available for non-commercial use either at http://alg.sourceforge.net or by request to mathieu.bourgey@umontreal.ca.
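    The underlying idea, clustering two-channel allele intensities into three genotype groups, can be sketched with a toy one-dimensional k-means on the signal-contrast angle. This is an illustrative stand-in only, not the published ALG algorithm; all function names, initial centroids, and synthetic values are our assumptions.

```python
import numpy as np

def call_genotypes(x, y, iters=25):
    """Toy three-cluster genotype caller: cluster the allele-signal
    contrast angle theta = arctan2(y, x) with a tiny 1-D k-means.
    Illustrative only -- not the published ALG algorithm."""
    theta = np.arctan2(y, x)
    cent = np.quantile(theta, [0.1, 0.5, 0.9])     # spread initial centroids
    for _ in range(iters):
        lab = np.argmin(np.abs(theta[:, None] - cent[None, :]), axis=1)
        for k in range(3):
            if np.any(lab == k):
                cent[k] = theta[lab == k].mean()
    return lab, cent

# Synthetic two-channel intensities for three genotype groups (AA / AB / BB).
rng = np.random.default_rng(1)
pts = 60
xs = np.concatenate([rng.normal(1.0, 0.05, pts),    # AA: high x, low y
                     rng.normal(0.7, 0.05, pts),    # AB: balanced channels
                     rng.normal(0.1, 0.03, pts)])   # BB: low x, high y
ys_ = np.concatenate([rng.normal(0.1, 0.03, pts),
                      rng.normal(0.7, 0.05, pts),
                      rng.normal(1.0, 0.05, pts)])
labels, centroids = call_genotypes(xs, ys_)
```

    A real caller such as ALG additionally attaches a per-call confidence measure and a no-call rule for points that sit between clusters, which is where the reported 3.4% no-call rate comes from.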

  1. Disposable and removable nucleic acid extraction and purification cartridges for automated flow-through systems

    DOEpatents

    Regan, John Frederick

    2014-09-09

    Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size-exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.

  2. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned between the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, the need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology readiness date for the space station.

  3. Fully automated protein purification

    PubMed Central

    Camper, DeMarco V.; Viola, Ronald E.

    2009-01-01

    Obtaining highly purified proteins is essential to begin investigating their functional and structural properties. The steps that are typically involved in purifying proteins can include an initial capture, intermediate purification, and a final polishing step. Completing these steps can take several days and require frequent attention to ensure success. Our goal was to design automated protocols that will allow the purification of proteins with minimal operator intervention. Separate methods have been produced and tested that automate the sample loading, column washing, sample elution and peak collection steps for ion-exchange, metal affinity, hydrophobic interaction and gel filtration chromatography. These individual methods are designed to be coupled and run sequentially in any order to achieve a flexible and fully automated protein purification protocol. PMID:19595984

  4. Automated telescope scheduling

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    With the ever increasing level of automation of astronomical telescopes the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on these systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling groundbased telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  5. Automated segmentation and reconstruction of patient-specific cardiac anatomy and pathology from in vivo MRI*

    NASA Astrophysics Data System (ADS)

    Ringenberg, Jordan; Deo, Makarand; Devabhaktuni, Vijay; Filgueiras-Rama, David; Pizarro, Gonzalo; Ibañez, Borja; Berenfeld, Omer; Boyers, Pamela; Gold, Jeffrey

    2012-12-01

    This paper presents an automated method to segment left ventricle (LV) tissues from functional and delayed-enhancement (DE) cardiac magnetic resonance imaging (MRI) scans using a sequential multi-step approach. First, a region of interest (ROI) is computed to create a subvolume around the LV using morphological operations and image arithmetic. From the subvolume, the myocardial contours are automatically delineated using difference of Gaussians (DoG) filters and GSV snakes. These contours are used as a mask to identify pathological tissues, such as fibrosis or scar, within the DE-MRI. The presented automated technique is able to accurately delineate the myocardium and identify the pathological tissue in patient sets. The results were validated by two expert cardiologists, and in one set the automated results are quantitatively and qualitatively compared with expert manual delineation. Furthermore, the method is patient-specific, performed on an entire patient MRI series. Thus, in addition to providing a quick analysis of individual MRI scans, the fully automated segmentation method is used for effectively tagging regions in order to reconstruct computerized patient-specific 3D cardiac models. These models can then be used in electrophysiological studies and surgical strategy planning.
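    The difference-of-Gaussians step acts as a band-pass filter that responds strongly at intensity edges such as myocardial boundaries while suppressing flat regions. A minimal 1-D sketch of the technique (the paper's filters are 2-D, and the sigma values below are assumed, not taken from the paper):

```python
import math

def gaussian_kernel(sigma, radius=None):
    """Discrete 1-D Gaussian kernel, normalized to sum to 1."""
    radius = radius or int(3 * sigma)
    k = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """Same-size convolution with zero padding at the boundaries."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, kv in enumerate(kernel):
            idx = i + j - r
            if 0 <= idx < len(signal):
                acc += signal[idx] * kv
        out.append(acc)
    return out

def difference_of_gaussians(signal, sigma1=1.0, sigma2=2.0):
    """Band-pass: subtract a broadly smoothed copy from a narrowly smoothed one."""
    g1 = convolve(signal, gaussian_kernel(sigma1))
    g2 = convolve(signal, gaussian_kernel(sigma2))
    return [a - b for a, b in zip(g1, g2)]

# A step edge (e.g. blood pool vs. myocardium intensity) produces a
# response that straddles the edge at index 10 and is zero far from it.
edge = difference_of_gaussians([0.0] * 10 + [1.0] * 10)
```

    In 2-D the same subtraction of two Gaussian blurs highlights the contour that a snake is then attracted to.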

  6. Analysis of amyotrophic lateral sclerosis as a multistep process: a population-based modelling study

    PubMed Central

    Al-Chalabi, Ammar; Calvo, Andrea; Chio, Adriano; Colville, Shuna; Ellis, Cathy M; Hardiman, Orla; Heverin, Mark; Howard, Robin S; Huisman, Mark H B; Keren, Noa; Leigh, P Nigel; Mazzini, Letizia; Mora, Gabriele; Orrell, Richard W; Rooney, James; Scott, Kirsten M; Scotton, William J; Seelen, Meinie; Shaw, Christopher E; Sidle, Katie S; Swingler, Robert; Tsuda, Miho; Veldink, Jan H; Visser, Anne E; van den Berg, Leonard H; Pearce, Neil

    2014-01-01

    Background: Amyotrophic lateral sclerosis shares characteristics with some cancers, such as onset being more common in later life, progression usually being rapid, the disease affecting a particular cell type, and showing complex inheritance. We used a model originally applied to cancer epidemiology to investigate the hypothesis that amyotrophic lateral sclerosis is a multistep process. Methods: We generated incidence data by age and sex from amyotrophic lateral sclerosis population registers in Ireland (registration dates 1995–2012), the Netherlands (2006–12), Italy (1995–2004), Scotland (1989–98), and England (2002–09), and calculated age- and sex-adjusted incidences for each register. We regressed the log of age-specific incidence against the log of age with least squares regression. We did the analyses within each register, and also did a combined analysis, adjusting for register. Findings: We identified 6274 cases of amyotrophic lateral sclerosis from a catchment population of about 34 million people. We noted a linear relationship between log incidence and log age in all five registers: England r2=0·95, Ireland r2=0·99, Italy r2=0·95, the Netherlands r2=0·99, and Scotland r2=0·97; overall r2=0·99. All five registers gave similar estimates of the linear slope ranging from 4·5 to 5·1, with overlapping confidence intervals. The combination of all five registers gave an overall slope of 4·8 (95% CI 4·5–5·0), with similar estimates for men (4·6, 4·3–4·9) and women (5·0, 4·5–5·5). Interpretation: A linear relationship between the log incidence and log age of onset of amyotrophic lateral sclerosis is consistent with a multistage model of disease. The slope estimate suggests that amyotrophic lateral sclerosis is a six-step process. Identification of these steps could lead to preventive and therapeutic avenues. Funding: UK Medical Research Council; UK Economic and Social Research Council; Ireland Health Research Board; The
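    Under the multistage (Armitage–Doll) model, incidence of an n-step process is proportional to age^(n−1), so the slope of log incidence against log age estimates n−1; that is why the observed slope of about 4·8 points to a six-step process. A minimal sketch of the regression on synthetic data drawn from an exact six-step power law (function names and the constant are illustrative):

```python
import math

def loglog_slope(ages, incidences):
    """Ordinary least-squares slope of log(incidence) against log(age)."""
    xs = [math.log(a) for a in ages]
    ys = [math.log(i) for i in incidences]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic incidence following an exact six-step model: incidence ∝ age^(6-1).
ages = list(range(40, 85, 5))
incidences = [1e-9 * a ** 5 for a in ages]
slope = loglog_slope(ages, incidences)   # recovers the exponent, n - 1 = 5
```

    Real register data scatter around the line, which is what the per-register r² values quantify.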

  7. Achieving effective terminal exciton delivery in quantum dot antenna-sensitized multistep DNA photonic wires.

    PubMed

    Spillmann, Christopher M; Ancona, Mario G; Buckhout-White, Susan; Algar, W Russ; Stewart, Michael H; Susumu, Kimihiro; Huston, Alan L; Goldman, Ellen R; Medintz, Igor L

    2013-08-27

    exciton transfer efficiencies approaching 100% are seen when the dye spacings are 0.5 × R0. However, as additional dyes are included in each wire, strong nonidealities appear that are suspected to arise predominantly from the poor photophysical performance of the last two acceptor dyes (Cy5 and Cy5.5). The results are discussed in the context of improving exciton transfer efficiency along photonic wires and the contributions these architectures can make to understanding multistep FRET processes.
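    The near-unity single-step transfer at a dye spacing of 0.5 × R0 follows directly from the Förster relation E = 1/(1 + (r/R0)^6). A minimal sketch; the homogeneous-wire model below is the idealized case and deliberately ignores the photophysical nonidealities the paper reports for the terminal dyes:

```python
def fret_efficiency(r, r0):
    """Single-step Förster transfer efficiency for donor-acceptor distance r."""
    return 1.0 / (1.0 + (r / r0) ** 6)

def wire_efficiency(n_steps, r, r0):
    """Ideal end-to-end efficiency of a homogeneous n-step FRET wire."""
    return fret_efficiency(r, r0) ** n_steps

# At half the Foerster radius each hop is ~98.5% efficient, so even a
# four-step ideal wire retains most of the exciton population.
single = fret_efficiency(0.5, 1.0)
four_step = wire_efficiency(4, 0.5, 1.0)
```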

  8. Transarterial chemoembolization using iodized oil for unresectable hepatocellular carcinoma: perspective from multistep hepatocarcinogenesis.

    PubMed

    Yoshimitsu, Kengo

    2014-01-01

    Transarterial chemoembolization (TACE) using iodized oil (Lipiodol(®)) (Lp-TACE) as a carrier of chemotherapeutic agents has been routinely performed to control hepatocellular carcinomas (HCC) in Japan, and its use has yielded fairly beneficial therapeutic results. Lipiodol is thought to pass through the tumor sinusoids of HCC and reach the outflow drainage areas, namely, the portal venous side of the tumor. By doing this, Lipiodol blocks not only the tumor's arterial inflow but also its portal venous outflow, providing sufficient ischemic effects. It is known that the inflow blood system, tumor sinusoids, and outflow blood system change drastically during the process of multistep hepatocarcinogenesis; thus, it is reasonable to postulate that the distribution of Lipiodol and the subsequent therapeutic effect of Lp-TACE may also change during that process. Arterial inflow to HCC is highest for moderately differentiated HCC (mHCC) and is relatively low in well or poorly differentiated HCC (wHCC and pHCC, respectively). It has been suggested that the metabolic state of wHCC and mHCC is aerobic, while that of pHCC is anaerobic. The tumor sinusoids in wHCC and mHCC are small in size and large in number, while those in pHCC are large in size and small in number. This finding results in a greater chance of tumor cell exposure to chemotherapeutic agents in the former and a lesser chance in the latter. The outflow tract, namely, the drainage system via the residual portal venous branches within the pseudocapsule, is more complete in mHCC and pHCC and less so in wHCC. Considering all of these components of HCC of different histological grades, Lp-TACE should have the greatest effect on mHCC and a relatively low effect on wHCC and pHCC. To achieve consistently high therapeutic results, it is important to consider these components, which affect the sensitivity of HCC to Lp-TACE, to maximize both the chemotherapeutic and ischemic effects of this therapy.

  9. Multi-steps infrared spectroscopic characterization of the effect of flowering on medicinal value of Cistanche tubulosa

    NASA Astrophysics Data System (ADS)

    Lai, Zuliang; Xu, Peng; Wu, Peiyi

    2009-01-01

    Multi-steps infrared spectroscopic methods, including conventional Fourier transform infrared spectroscopy (FT-IR), second-derivative spectroscopy, and two-dimensional infrared (2D-IR) correlation spectroscopy, have proved to be effective for examining complicated mixture systems such as Chinese herbal medicine. The focus of this paper is an investigation of the effect of flowering on the pharmaceutical components of Cistanche tubulosa using the multi-steps infrared spectroscopic method. Power-spectrum analysis is applied to improve the resolution of 2D-IR contour maps, and many more details of overlapped peaks are detected. According to the results of FT-IR and second-derivative spectra, the peak at 1732 cm⁻¹ assigned to C=O is stronger before flowering than after flowering in the stem, while more C=O groups are found in the top after flowering. The spectra of the root change considerably during flowering, with many peaks shifting or disappearing after flowering. Seven peaks in the spectra of the stem, assigned to different kinds of glycoside components, are distinguished by power spectra in the range of 900–1200 cm⁻¹. The results provide a scientific explanation for the traditional experience that flowering consumes the pharmaceutical components in the stem and that the seeds absorb some nutrients from the stem after flowering. In conclusion, the multi-steps infrared spectroscopic method combined with power spectra is a promising way to investigate the flowering process of C. tubulosa and discriminate among various parts of the herbal medicine.

  10. Automating the CMS DAQ

    SciTech Connect

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  11. Automated Library System Specifications.

    DTIC Science & Technology

    1986-06-01

    AD-A78 95: Automated Library System Specifications (U). Prepared by Mary B. Bonnett, Army Library Management Office, Office of the Assistant Chief of Staff for Information Management, Alexandria, VA. June 1986. Unclassified.

  12. Automated gas chromatography

    DOEpatents

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  13. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is in the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on object-oriented language (Flavors).

  14. Investigation and comparison of analytical, numerical, and experimentally measured coupling losses for multi-step index optical fibers.

    PubMed

    Aldabaldetreku, Gotzon; Durana, Gaizka; Zubia, Joseba; Arrue, Jon; Poisel, Hans; Losada, María

    2005-05-30

    The aim of the present paper is to provide a comprehensive analysis of the coupling losses in multi-step index (MSI) fibres. Their light power acceptance properties are investigated to obtain the corresponding analytical expressions taking into account longitudinal, transverse, and angular misalignments. For this purpose, a uniform power distribution is assumed. In addition, we perform several experimental measurements and computer simulations in order to calculate the coupling losses for two different MSI polymer optical fibres (MSI-POFs). These results serve us to validate the theoretical expressions we have obtained.

  15. Multi-Step Ka/Ka Dichroic Plate with Rounded Corners for NASA's 34m Beam Waveguide Antenna

    NASA Technical Reports Server (NTRS)

    Veruttipong, Watt; Khayatian, Behrouz; Hoppe, Daniel; Long, Ezra

    2013-01-01

    A multi-step Ka/Ka dichroic plate Frequency Selective Surface (FSS) structure is designed, manufactured, and tested for use in NASA's Deep Space Network (DSN) 34m Beam Waveguide (BWG) antennas. The proposed design allows ease of manufacturing and the ability to handle the increased transmit power (reflected off the FSS) of the DSN BWG antennas, from 20 kW to 100 kW. The dichroic is designed using HFSS, and the results agree well with measured data, considering the manufacturing tolerances that could be achieved on the dichroic.

  16. Altering users' acceptance of automation through prior automation exposure.

    PubMed

    Bekier, Marek; Molesworth, Brett R C

    2016-08-22

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  17. Automating Small Libraries.

    ERIC Educational Resources Information Center

    Swan, James

    1996-01-01

    Presents a four-phase plan for small libraries strategizing for automation: inventory and weeding, data conversion, implementation, and enhancements. Other topics include selecting a system, MARC records, compatibility, ease of use, industry standards, searching capabilities, support services, system security, screen displays, circulation modules,…

  18. Automated conflict resolution issues

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  19. Automated Accounting. Instructor Guide.

    ERIC Educational Resources Information Center

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  20. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  1. Personnel Department Automation.

    ERIC Educational Resources Information Center

    Wilkinson, David

    In 1989, the Austin Independent School District's Office of Research and Evaluation was directed to monitor the automation of personnel information and processes in the district's Department of Personnel. Earlier, a study committee appointed by the Superintendent during the 1988-89 school year identified issues related to Personnel Department…

  2. Validating Automated Speaking Tests

    ERIC Educational Resources Information Center

    Bernstein, Jared; Van Moere, Alistair; Cheng, Jian

    2010-01-01

    This paper presents evidence that supports the valid use of scores from fully automatic tests of spoken language ability to indicate a person's effectiveness in spoken communication. The paper reviews the constructs, scoring, and the concurrent validity evidence of "facility-in-L2" tests, a family of automated spoken language tests in Spanish,…

  3. Automated Essay Scoring

    ERIC Educational Resources Information Center

    Dikli, Semire

    2006-01-01

    The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e., word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…

  4. Automated Microbial Genome Annotation

    SciTech Connect

    Land, Miriam

    2009-05-29

    Miriam Land of the DOE Joint Genome Institute at Oak Ridge National Laboratory gives a talk on the current state and future challenges of moving toward automated microbial genome annotation at the "Sequencing, Finishing, Analysis in the Future" meeting in Santa Fe, NM.

  5. Automated Administrative Data Bases

    NASA Technical Reports Server (NTRS)

    Marrie, M. D.; Jarrett, J. R.; Reising, S. A.; Hodge, J. E.

    1984-01-01

    Improved productivity and more effective response to information requirements for internal management, NASA Centers, and Headquarters resulted from using automated techniques. Modules developed to provide information on manpower, RTOPS, full time equivalency, and physical space reduced duplication, increased communication, and saved time. There is potential for greater savings by sharing and integrating with those who have the same requirements.

  6. Automated Management Of Documents

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1995-01-01

    Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.

  7. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  8. Guide to Library Automation.

    ERIC Educational Resources Information Center

    Toohill, Barbara G.

    Directed toward librarians and library administrators who wish to procure automated systems or services for their libraries, this guide offers practical suggestions, advice, and methods for determining requirements, estimating costs and benefits, writing specifications, procuring systems, negotiating contracts, and installing systems. The advice…

  9. Microcontroller for automation application

    NASA Technical Reports Server (NTRS)

    Cooper, H. W.

    1975-01-01

    The description of a microcontroller currently being developed for automation application was given. It is basically an 8-bit microcomputer with a 40K byte random access memory/read only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  10. Automated EEG acquisition

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Hillman, C. E., Jr.

    1977-01-01

    Automated self-contained portable device can be used by technicians with minimal training. Data acquired from patient at remote site are transmitted to centralized interpretation center using conventional telephone equipment. There, diagnostic information is analyzed, and results are relayed back to remote site.

  11. Automated Inadvertent Intruder Application

    SciTech Connect

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-15

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell, and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
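    The pathway dose calculations being automated are, as the abstract notes, straightforward algebraic sums over radionuclides. A minimal sketch of that shape; the nuclides, intake rate, and dose conversion factors below are illustrative placeholders, not actual SRS performance-assessment parameters.

```python
# Hypothetical ingestion dose conversion factors, Sv per Bq (assumed values
# for illustration only -- a real assessment uses a controlled parameter source).
INGESTION_DCF = {"Tc-99": 6.4e-10, "I-129": 1.1e-7}

def pathway_dose(concentrations_bq_per_kg, intake_kg_per_yr, dcf=INGESTION_DCF):
    """Annual dose (Sv/yr) for one exposure pathway:
    sum over nuclides of concentration x annual intake x dose conversion factor."""
    return sum(conc * intake_kg_per_yr * dcf[nuc]
               for nuc, conc in concentrations_bq_per_kg.items())

# One hypothetical pathway: crops grown in contaminated soil.
dose = pathway_dose({"Tc-99": 50.0, "I-129": 0.2}, intake_kg_per_yr=90.0)
```

    The full application sums such pathway doses per scenario and inverts the calculation to derive disposal limits.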

  12. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    PubMed

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  13. Automating spectral measurements

    NASA Astrophysics Data System (ADS)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  14. Abundance and composition of indigenous bacterial communities in a multi-step biofiltration-based drinking water treatment plant.

    PubMed

    Lautenschlager, Karin; Hwang, Chiachi; Ling, Fangqiong; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Egli, Thomas; Hammes, Frederik

    2014-10-01

    Indigenous bacterial communities are essential for biofiltration processes in drinking water treatment systems. In this study, we examined the microbial community composition and abundance of three different biofilter types (rapid sand, granular activated carbon, and slow sand filters) and their respective effluents in a full-scale, multi-step treatment plant (Zürich, CH). Detailed analysis of organic carbon degradation underpinned biodegradation as the primary function of the biofilter biomass. The biomass was present in concentrations ranging between 2–5 × 10¹⁵ cells/m³ in all filters but was phylogenetically, enzymatically and metabolically diverse. Based on 16S rRNA gene-based 454 pyrosequencing analysis for microbial community composition, similar microbial taxa (predominantly Proteobacteria, Planctomycetes, Acidobacteria, Bacteroidetes, Nitrospira and Chloroflexi) were present in all biofilters and in their respective effluents, but the ratio of microbial taxa was different in each filter type. This change was also reflected in the cluster analysis, which revealed a change of 50–60% in microbial community composition between the different filter types. This study documents the direct influence of the filter biomass on the microbial community composition of the final drinking water, particularly when the water is distributed without post-disinfection. The results provide new insights on the complexity of indigenous bacteria colonizing drinking water systems, especially in different biofilters of a multi-step treatment plant.

  15. Comparison of GenomEra C. difficile and Xpert C. difficile as confirmatory tests in a multistep algorithm for diagnosis of Clostridium difficile infection.

    PubMed

    Alcalá, Luis; Reigadas, Elena; Marín, Mercedes; Fernández-Chico, Antonia; Catalán, Pilar; Bouza, Emilio

    2015-01-01

    We compared two multistep diagnostic algorithms based on C. Diff Quik Chek Complete and, as confirmatory tests, GenomEra C. difficile and Xpert C. difficile. The sensitivity, specificity, positive predictive value, and negative predictive value were 87.2%, 99.7%, 97.1%, and 98.3%, respectively, for the GenomEra-based algorithm and 89.7%, 99.4%, 95.5%, and 98.6%, respectively, for the Xpert-based algorithm. GenomEra represents an alternative to Xpert as a confirmatory test of a multistep algorithm for Clostridium difficile infection (CDI) diagnosis.
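
The four figures of merit quoted above all derive from a 2x2 confusion matrix of test results against a reference standard. A minimal sketch with illustrative counts (not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among the diseased
        "specificity": tn / (tn + fp),  # true negatives among the healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts only
m = diagnostic_metrics(tp=90, fp=5, fn=10, tn=895)
```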

  16. Genetic and Developmental Basis of Cardiovascular Malformations

    PubMed Central

    Azhar, Mohamad; Ware, Stephanie M.

    2015-01-01

    Cardiovascular malformations (CVMs) are the most common birth defect, occurring in 1–5% of all live births. Genetic, epigenetic, and environmental factors all influence the development of CVMs, and an improved understanding of causation of CVMs is a prerequisite for prevention. Cardiac development is a complex, multi-step process of morphogenesis that is under genetic regulation. Multiple developmental pathways act independently or in combination to effect proper cardiac lineage specification, differentiation, and structure. Because of this complexity, there are numerous potential mechanisms by which genetic variation can impact both fetal cardiac development and latent cardiac disease. Although the genetic contribution to CVMs is well recognized, the genetic causes of human CVMs are still identified relatively infrequently. Mouse models are important tools to investigate the molecular mechanisms underpinning cardiac development as well as the complex genetics that characterize human CVMs. In this review we provide an overview of the key genetic concepts characterizing human CVMs, review their developmental basis, and provide examples to illustrate the critical developmental and genetic concepts underlying the pathogenesis of CVMs. PMID:26876120

  17. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquisition of content and images, composition of the materials, compliance with the sponsoring enterprise's brand standards, and production and fulfillment, through to evaluation of results, is currently performed by experienced, highly trained staff. We present a solution that not only brings together technologies that automate each process, but also automates the entire flow, so that a novice user can easily run a successful campaign from the desktop. This paper presents the technologies, structure, and process flows used to bring this system together. We highlight how the complexity of running a targeted campaign is hidden from the user through these technologies, all while providing the benefits of a professionally managed campaign.

  18. Automated macromolecular crystallization screening

    DOEpatents

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.
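
The two-round scheme in the abstract — a random first-round screen, image scoring, then a second-round design biased toward components of successful conditions — can be sketched as follows. The reagent names, the scoring stub, and the refinement rule are all illustrative assumptions, not the patented method:

```python
import random

random.seed(0)  # deterministic for illustration
REAGENTS = ["PEG-4000", "NaCl", "(NH4)2SO4", "MgCl2", "Tris pH 8.0", "HEPES pH 7.0"]

def random_screen(n, k=3):
    """Round 1: each condition is a random selection of k reagent components."""
    return [tuple(sorted(random.sample(REAGENTS, k))) for _ in range(n)]

def image_score(mix):
    """Stand-in for automated image analysis of the crystallization drops."""
    return sum(hash(r) % 7 for r in mix)  # placeholder score only

def refine(screen, top=3, k=3):
    """Round 2: a new design drawn from components of the best conditions."""
    best = sorted(screen, key=image_score, reverse=True)[:top]
    pool = sorted({r for mix in best for r in mix})
    return [tuple(sorted(random.sample(pool, min(k, len(pool)))))
            for _ in range(len(screen))]

round1 = random_screen(24)
round2 = refine(round1)
```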

  19. Terminal automation system maintenance

    SciTech Connect

    Coffelt, D.; Hewitt, J.

    1997-01-01

    Nothing has improved petroleum product loading in recent years more than terminal automation systems. The presence of terminal automation systems (TAS) at loading racks has increased operational efficiency and safety and enhanced accounting and management capabilities. However, like all finite systems, they occasionally malfunction or fail. Proper servicing and maintenance can minimize this, and in the unlikely event a TAS breakdown does occur, prompt and effective troubleshooting can reduce its impact on terminal productivity. To accommodate around-the-clock loading at racks increasingly unattended by terminal personnel, TAS maintenance, servicing and troubleshooting has become increasingly demanding. It has also become increasingly important. After 15 years of trial and error at petroleum and petrochemical storage and transfer terminals, a number of successful troubleshooting programs have been developed. These include 24-hour "help hotlines," internal (terminal company) and external (supplier) support staff, and "layered" support. These programs are described.

  20. Automated Chromosome Breakage Assessment

    NASA Technical Reports Server (NTRS)

    Castleman, Kenneth

    1985-01-01

    An automated karyotyping machine was built at JPL in 1972. It does computerized karyotyping, but it has some hardware limitations: the image processing hardware available at a reasonable price in 1972 was marginal, at best, for this job. In the meantime, NASA has developed an interest in longer-term spaceflights and in using chromosome breakage studies as a dosimeter for radiation or perhaps other damage that might occur to the tissues. This uses circulating lymphocytes as a physiological dosimeter, looking for chromosome breakage on long-term spaceflights. For that reason, we have reactivated the automated karyotyping work at JPL. An update on that work and a description of where it appears to be headed are presented.

  1. The automation of science.

    PubMed

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-03

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge.

  2. Automated gas chromatography

    DOEpatents

    Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.

    1999-07-13

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.

  3. Automated assembly in space

    NASA Technical Reports Server (NTRS)

    Srivastava, Sandanand; Dwivedi, Suren N.; Soon, Toh Teck; Bandi, Reddy; Banerjee, Soumen; Hughes, Cecilia

    1989-01-01

    The installation of robots and their use for assembly in space will create an exciting and promising future for the U.S. Space Program. Assembly in space is complicated and error prone, and it is not possible unless the various parts and modules are suitably designed for automation. Guidelines are developed for part design and for easy, precise assembly. Major design problems associated with automated assembly are considered, and solutions to these problems are evaluated in the guidelines format. Methods for gripping and for part feeding are developed with regard to the absence of gravity in space. Guidelines for part orientation, adjustments, compliances and various assembly constructions are discussed. Design modifications of various fasteners and fastening methods are also investigated.

  4. Automated Assembly Center (AAC)

    NASA Technical Reports Server (NTRS)

    Stauffer, Robert J.

    1993-01-01

    The objectives of this project are as follows: to integrate advanced assembly and assembly support technology under a comprehensive architecture; to implement automated assembly technologies in the production of high-visibility DOD weapon systems; and to document the improved cost, quality, and lead time. This will enhance the production of DOD weapon systems by utilizing the latest commercially available technologies combined into a flexible system that will be able to readily incorporate new technologies as they emerge. Automated assembly encompasses the following areas: product data, process planning, information management policies and framework, three schema architecture, open systems communications, intelligent robots, flexible multi-ability end effectors, knowledge-based/expert systems, intelligent workstations, intelligent sensor systems, and PDES/PDDI data standards.

  5. The automated command transmission

    NASA Astrophysics Data System (ADS)

    Inoue, Y.; Satoh, S.

    A technique for automated command transmission (ACT) to geostationary satellites is presented. The system is intended to ease the command center workload. The ACT system determines the relation of commands to on-board units, connects the telemetry with on-board units, defines the control path on the spacecraft, identifies the correspondence of back-up units to primary units, and ascertains sunlight or eclipse conditions. The system also holds the addresses of the satellite and command decoders, the ID and content of the mission command sequence, group and inhibit codes, and a listing of all available commands, and restricts the data to a command sequence. Telemetry supplies data for automated problem correction. All other mission operations are terminated during system recovery data processing after a crash. The ACT system is intended for use with the GMS spacecraft.

  6. Genetic Mapping

    MedlinePlus

    En Español: Mapeo Genético. What is genetic mapping? How do researchers ... genetic map? What are genetic markers? Among the main goals of the Human ...

  7. Genetic Counseling

    MedlinePlus

    ... a genetic counselor in your area. What is genetic counseling? Genetic counseling helps you understand how genes ...

  8. Automated RSO Stability Analysis

    NASA Astrophysics Data System (ADS)

    Johnson, T.

    2016-09-01

    A methodology for assessing the attitude stability of a Resident Space Object (RSO) using visual magnitude data is presented and then scaled to run in an automated fashion across the entire satellite catalog. Results obtained by applying the methodology to the Commercial Space Operations Center (COMSpOC) catalog are presented and summarized, identifying objects that have changed stability. We also examine the timeline for detecting the transition from stable to unstable attitude.

  9. Automation in Photogrammetry,

    DTIC Science & Technology

    1980-07-25

    ... Allam, 1978), and the OM-Bendix AS-11B-X (Scarano and Bruma, 1976). The UNAMACE and GPM-2 employ analog (electronic) correlation technology. However ... Survey (USGS) and the Surveys and Mapping Branch (Canada) have formed integrated systems based on the Gestalt GPM-2 (Brunson and Olson, 1978; Allam, 1978) ... ten years off, and the full automation of planimetric extraction may be more than 20 years in the future. REFERENCES: Allam, M. M., 1978. The Role of ...

  10. Automated Nitrocellulose Analysis

    DTIC Science & Technology

    1978-12-01

    ... is acceptable. (4) As would be expected from the theory of osmosis, a high saline content in the dialysis recipient stream (countersolution) is of ... Key words: analysis; automated analysis; dialysis; glyceryl ... Technicon AutoAnalyzer, involves aspiration of a stirred nitrocellulose suspension, dialysis against 9 percent saline, and hydrolysis with 5N sodium ...

  11. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  12. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.

    1984-01-01

    The purpose of phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert-system software was investigated in this study. This effort resulted in the demonstration of prototype expert-system software for managing one aspect of a simulated space station power subsystem.

  13. Automated Microfluidics for Genomics

    DTIC Science & Technology

    2007-11-02

    ... the automation of it, see [4]. In the Genomation Laboratory at the Univ. of Washington (http://rcs.ee.washington.edu/GNL/genomation.html) and with Orca ... reproducible biology without contamination. The high-throughput capability is competitive with large-scale robotic batch processing. III. INSTRUMENTATION ... essentially arbitrary low volume, and without any contact that might cause contamination. A. ACAPELLA-5K Core Processor. The ACAPELLA-5K was designed with ...

  14. Automated RTOP Management System

    NASA Technical Reports Server (NTRS)

    Hayes, P.

    1984-01-01

    The structure of NASA's Office of Aeronautics and Space Technology electronic information system network from 1983 to 1985 is illustrated. The RTOP automated system takes advantage of existing hardware, software, and expertise, and provides: (1) computerized cover sheet and resources forms; (2) electronic signature and transmission; (3) a data-based information system; (4) graphics; (5) intercenter communications; (6) management information; and (7) text editing. The system is coordinated with Headquarters efforts in codes R, E, and T.

  15. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.

  16. Cavendish Balance Automation

    NASA Technical Reports Server (NTRS)

    Thompson, Bryan

    2000-01-01

    This is the final report for a project carried out to modify a manual commercial Cavendish balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf, manually operated Cavendish balance to allow automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in studying the effects of superconducting materials on the local gravitational field strength, to determine whether the strength of gravitational fields can be reduced. A Cavendish balance was chosen because it is a fairly simple piece of equipment for measuring the gravitational constant, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) all the components necessary to hold and automate the Cavendish balance in a cryostat were designed; engineering drawings were made of custom parts to be fabricated, and other off-the-shelf parts were procured; (2) software was written in LabVIEW to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing; (3) software was written to take the data collected from the Cavendish balance and reduce it to give a value for the gravitational constant; (4) the components of the system were assembled and fitted to a cryostat, along with the LabVIEW hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling; and (5) the system was operated for a number of periods, and the data were collected and reduced to give an average value for the gravitational constant.
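
The data-reduction step, converting balance readings to a value for G, typically uses the small-oscillation torsion-balance relation. A sketch under simplifying assumptions (point masses, no counter-mass correction); the input values are hypothetical, chosen only to land near the accepted G ≈ 6.67 × 10^-11 m^3 kg^-1 s^-2:

```python
import math

def cavendish_G(L, r, M, T, theta):
    """Estimate G from torsion-balance measurements (small oscillations).

    L:     separation of the two small beam masses [m]
    r:     centre-to-centre large-to-small mass distance [m]
    M:     large attracting mass [kg]
    T:     torsion oscillation period [s]
    theta: equilibrium angular deflection [rad]

    Torque balance kappa*theta = (G*m*M/r**2)*L, with the torsion
    constant kappa = 4*pi**2*I/T**2 and beam inertia I = m*L**2/2,
    lets the small mass m cancel:
        G = 2*pi**2 * L * r**2 * theta / (M * T**2)
    """
    return 2 * math.pi**2 * L * r**2 * theta / (M * T**2)

# Hypothetical readings
G = cavendish_G(L=0.1, r=0.05, M=1.5, T=600.0, theta=7.3e-3)
```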

  17. Algorithms for automated DNA assembly

    PubMed Central

    Densmore, Douglas; Hsiau, Timothy H.-C.; Kittleson, Joshua T.; DeLoache, Will; Batten, Christopher; Anderson, J. Christopher

    2010-01-01

    Generating a defined set of genetic constructs within a large combinatorial space provides a powerful method for engineering novel biological functions. However, the process of assembling more than a few specific DNA sequences can be costly, time consuming and error prone. Even if a correct theoretical construction scheme is developed manually, it is likely to be suboptimal by any number of cost metrics. Modular, robust and formal approaches are needed for exploring these vast design spaces. By automating the design of DNA fabrication schemes using computational algorithms, we can eliminate human error while reducing redundant operations, thus minimizing the time and cost required for conducting biological engineering experiments. Here, we provide algorithms that optimize the simultaneous assembly of a collection of related DNA sequences. We compare our algorithms to an exhaustive search on a small synthetic dataset and our results show that our algorithms can quickly find an optimal solution. Comparison with random search approaches on two real-world datasets show that our algorithms can also quickly find lower-cost solutions for large datasets. PMID:20335162
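
The core saving described above comes from reusing intermediates shared among related constructs. A toy sketch of that idea (a fixed balanced-split assembly scheme with memoized subassemblies — not the paper's algorithms, which search over assembly schemes):

```python
def assembly_steps(targets):
    """Pairwise-assembly reactions needed to build all target constructs,
    reusing any subassembly that has already been built once."""
    built = set()

    def build(parts):
        if len(parts) == 1 or parts in built:
            return 0  # a raw part, or an intermediate already on hand
        mid = len(parts) // 2
        steps = build(parts[:mid]) + build(parts[mid:]) + 1
        built.add(parts)
        return steps

    return sum(build(tuple(t)) for t in targets)

# Two hypothetical constructs differing only in their promoter
targets = [("prom1", "rbs", "gfp", "term"),
           ("prom2", "rbs", "gfp", "term")]
shared = assembly_steps(targets)  # the gfp+term intermediate is built once
```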

  18. Automation model of sewerage rehabilitation planning.

    PubMed

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, and is tedious and time-consuming. This paper proposes an automation model that plans optimal sewerage rehabilitation strategies for a sewer system by integrating image processing, clustering technology, optimization, and visualization. Firstly, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Secondly, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction or even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information about the sewer system and the sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Taiwan.
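
The genetic-algorithm step — choosing a rehabilitation method for each pipe section so that total cost is minimized while every section's condition grade is adequately treated — can be sketched as follows. The grades, methods, costs, and penalty scheme are invented for illustration, not taken from the paper:

```python
import random

random.seed(1)
# Hypothetical pipe sections: condition grade (1 = sound ... 5 = collapse risk)
GRADES = [2, 5, 3, 1, 4]
# Candidate methods: (name, relative cost, worst grade the method can handle)
METHODS = [("spot patch", 1, 2),
           ("cured-in-place liner", 4, 4),
           ("excavate and replace", 10, 5)]

def plan_cost(plan):
    total = 0
    for grade, m in zip(GRADES, plan):
        _, c, max_grade = METHODS[m]
        total += c if grade <= max_grade else 1000  # heavy penalty: method too weak
    return total

def evolve(pop_size=30, generations=60):
    pop = [[random.randrange(len(METHODS)) for _ in GRADES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=plan_cost)
        survivors = pop[:pop_size // 2]  # truncation selection
        children = []
        for parent in survivors:
            child = parent[:]
            child[random.randrange(len(child))] = random.randrange(len(METHODS))  # mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=plan_cost)

best = evolve()
```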

  19. Autonomy, Automation, and Systems

    NASA Astrophysics Data System (ADS)

    Turner, Philip R.

    1987-02-01

    Aerospace industry interest in autonomy and automation, given fresh impetus by the national goal of establishing a Space Station, is becoming a major item of research and technology development. The promise of new technology arising from research in Artificial Intelligence (AI) has focused much attention on its potential in autonomy and automation. These technologies can improve performance in autonomous control functions that involve planning, scheduling, and fault diagnosis of complex systems. There are, however, many aspects of system and subsystem design in an autonomous system that impact AI applications, but do not directly involve AI technology. Development of a system control architecture, establishment of an operating system within the design, providing command and sensory data collection features appropriate to automated operation, and the use of design analysis tools to support system engineering are specific examples of major design issues. Aspects such as these must also receive attention and technology development support if we are to implement complex autonomous systems within the realistic limitations of mass, power, cost, and available flight-qualified technology that are all-important to a flight project.

  20. Automating existing stations

    SciTech Connect

    Little, J.E.

    1986-09-01

    The task was to automate 20 major compressor stations along ANR Pipeline Co.'s Southeastern and Southwestern pipelines in as many months. Meeting this schedule required standardized hardware and software design. Working with Bristol Babcock Co., ANR came up with an off-the-shelf station automation package suitable for a variety of compressor stations. The project involved 148 engines totaling 488,880 hp in the 20 stations. ANR Pipeline developed software for these engines and compressors, including horsepower prediction and efficiency. The system places processor "intelligence" at each station and engine to monitor and control operations. The station processor receives commands from the company's gas dispatch center at Detroit and informs dispatchers of alarms, conditions, and decisions it makes. The automation system is controlled by the Detroit center through a central communications network. Operating orders from the center are sent to the station processor, which obeys orders using the most efficient means of operation at the station's disposal. In a malfunction, a control and communications backup system takes over. Commands and information are directly transmitted between the center and the individual compressor stations. Stations receive their orders based on throughput, with suction and discharge pressure overrides. Additionally, a discharge temperature override protects pipeline coatings.

  1. Automation in biological crystallization.

    PubMed

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  2. Automation in biological crystallization

    PubMed Central

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  3. Automation of optical tweezers

    NASA Astrophysics Data System (ADS)

    Hsieh, Tseng-Ming; Chang, Bo-Jui; Hsu, Long

    2000-07-01

    Optical tweezers are a newly developed instrument that makes possible the manipulation of microscopic particles under a microscope. In this paper, we present the automation of an optical tweezers system consisting of a modified optical tweezers, equipped with two motorized actuators to deflect a 1 W argon laser beam, and a computer control system including a joystick. The trapping of a single bead and of a group of Lactobacillus acidophilus is shown. With the aid of the joystick and two auxiliary cursors superimposed on the real-time image of a trapped bead, we demonstrated the simple and convenient operation of the automated optical tweezers: by steering the joystick and then pressing a button on it, we assign a new location for the trapped bead to move to. The increment of motion, 0.04 μm for a 20X objective, is negligible. With a fast computer for image processing, the manipulation of the trapped bead is smooth and accurate. The automation of the optical tweezers is also programmable. This technique may be applied to accelerate DNA hybridization in a gene chip. The combination of the modified optical tweezers with the computer control system provides a tool for precise manipulation of microparticles in many scientific fields.

  4. Effects of Stroke on Ipsilesional End-Effector Kinematics in a Multi-Step Activity of Daily Living

    PubMed Central

    Gulde, Philipp; Hughes, Charmayne Mary Lee; Hermsdörfer, Joachim

    2017-01-01

    Background: Stroke frequently impairs activities of daily living (ADL) and deteriorates the function of the contralesional as well as the ipsilesional limbs. In order to analyze alterations of higher motor control unaffected by paresis or sensory loss, the kinematics of ipsilesional upper limb movements in patients with stroke has previously been analyzed during prehensile movements and simple tool use actions. By contrast, motion recording of multi-step ADL is rare, and patient-control comparisons for movement kinematics are largely lacking. Especially in clinical research, objective quantification of complex, externally valid tasks can improve the assessment of neurological impairments. Methods: In this preliminary study we employed three-dimensional motion recording and applied kinematic analysis to a multi-step ADL (tea-making). The trials were examined with respect to errors and sub-action structure, durations, path lengths (PLs), peak velocities, relative activity (RA) and smoothness. In order to check for specific burdens, the sub-actions of the task were extracted and compared. To examine the feasibility of the approach, we determined the behavioral and kinematic metrics of the (ipsilesional) unimanual performance of seven chronic stroke patients (64 ± 11 years, 3 with right/4 with left brain damage (LBD), 2 with signs of apraxia, variable severity of paresis) and compared the results with data from 14 neurologically healthy age-matched control participants (70 ± 7 years). Results: T-tests revealed that while the quantity and structure of the task's sub-actions were similar between groups, the analysis of end-effector kinematics detected clear group differences in the associated parameters. Specifically, trial duration (TD) was increased (Cohen's d = 1.77) and the RA (Cohen's d = 1.72) and peak velocities (Cohen's d = 1.49/1.97) were decreased in the patient group. Analysis of the task's sub-actions by repeated-measures analysis of variance (rmANOVA) revealed ...
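
The effect sizes reported above are Cohen's d values. A minimal sketch of the pooled-standard-deviation form with hypothetical trial-duration statistics (the group sizes match the study; the means and SDs are invented):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Invented trial-duration statistics [s]: 7 patients vs. 14 controls
d = cohens_d(95.0, 20.0, 7, 65.0, 14.0, 14)
```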

  5. Predictions for rapid methods and automation in food microbiology.

    PubMed

    Fung, Daniel Y C

    2002-01-01

    A discussion is presented on the present status of rapid methods and automation in microbiology. Predictions are also presented for development in the following areas: viable cell counts; real-time monitoring of hygiene; polymerase chain reaction, ribotyping, and genetic tests in food laboratories; automated enzyme-linked immunosorbent assay and immunotests; rapid dipstick technology; biosensors for Hazard Analysis Critical Control Point programs; instant detection of target pathogens by computer-generated matrix; effective separation and concentration for rapid identification of target cells; microbiological alert systems in food packages; and rapid alert kits for detecting pathogens at home.

  6. Automated Proactive Fault Isolation: A Key to Automated Commissioning

    SciTech Connect

    Katipamula, Srinivas; Brambley, Michael R.

    2007-07-31

    In this paper, we present a generic model for automated continuous commissioning and then delve in detail into one of its processes, proactive testing for fault isolation, which is key to automating commissioning. The automated commissioning process uses passive observation-based fault detection and diagnostic techniques, followed by automated proactive testing for fault isolation, automated fault evaluation, and automated reconfiguration of controls, together keeping equipment continuously controlled and running as intended. Only when hard failures occur or a physical replacement is required does the process require human intervention, and then sufficient information is provided by the automated commissioning system to target manual maintenance where it is needed. We then focus on fault isolation by presenting detailed logic that can be used to automatically isolate faults in valves, a common component in HVAC systems, as an example of how automated proactive fault isolation can be accomplished. We conclude the paper with a discussion of how this approach to isolating faults can be applied to other common HVAC components and their automated commissioning, and a summary of the paper's key conclusions.

  7. Automated genotyping of dinucleotide repeat markers

    SciTech Connect

    Perlin, M.W.; Hoffman, E.P.

    1994-09-01

    The dinucleotide repeats (i.e., microsatellites) such as CA-repeats are a highly polymorphic, highly abundant class of PCR-amplifiable markers that have greatly streamlined genetic mapping experimentation. It is expected that over 30,000 such markers (including tri- and tetranucleotide repeats) will be characterized for routine use in the next few years. Since only size determination, and not sequencing, is required to determine alleles, in principle, dinucleotide repeat genotyping is easily performed on electrophoretic gels, and can be automated using DNA sequencers. Unfortunately, PCR stuttering with these markers generates not one band for each allele, but a pattern of bands. Since closely spaced alleles must be disambiguated by human scoring, this poses a key obstacle to full automation. We have developed methods that overcome this obstacle. Our model is that the observed data is generated by arithmetic superposition (i.e., convolution) of multiple allele patterns. By quantitatively measuring the size of each component band, and exploiting the unique stutter pattern associated with each marker, closely spaced alleles can be deconvolved; this unambiguously reconstructs the "true" allele bands, with stutter artifact removed. We used this approach in a system for automated diagnosis of (X-linked) Duchenne muscular dystrophy; four multiplexed CA-repeats within the dystrophin gene were assayed on a DNA sequencer. Our method accurately detected small variations in gel migration that shifted the allele size estimate. In 167 nonmutated alleles, 89% (149/167) showed no size variation, 9% (15/167) showed 1 bp variation, and 2% (3/167) showed 2 bp variation. We are currently developing a library of dinucleotide repeat patterns; together with our deconvolution methods, this library will enable fully automated genotyping of dinucleotide repeats from sizing data.
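The superposition model described above can be sketched as a linear deconvolution: each allele contributes a shifted copy of the marker's stutter pattern, and solving the resulting linear system recovers the allele weights. The stutter pattern and bin layout below are hypothetical placeholders, not the paper's measured patterns:

```python
import numpy as np

# Hypothetical stutter pattern: main band plus two weaker stutter bands
stutter = np.array([1.0, 0.5, 0.25])

def deconvolve(observed, stutter):
    """Recover allele weights from an observed lane profile, modelling the
    profile as a superposition of shifted stutter patterns (one per allele)."""
    n = len(observed)
    A = np.zeros((n, n))
    for j in range(n):                   # column j = allele with main band at bin j
        for k, w in enumerate(stutter):
            if j + k < n:
                A[j + k, j] = w
    weights, *_ = np.linalg.lstsq(A, observed, rcond=None)
    return np.where(weights > 1e-6, weights, 0.0)

# Two closely spaced alleles (bins 2 and 3) whose stutter patterns overlap
true_alleles = np.zeros(8)
true_alleles[2] = 1.0
true_alleles[3] = 1.0
observed = np.zeros(8)
for j, amt in enumerate(true_alleles):
    for k, w in enumerate(stutter):
        if amt and j + k < 8:
            observed[j + k] += amt * w

recovered = deconvolve(observed, stutter)   # weights concentrate at bins 2 and 3
```

Because the system matrix is triangular with unit diagonal, the least-squares solve here is exact; real gel data would add noise and marker-specific patterns.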

  8. PaR-PaR laboratory automation platform.

    PubMed

    Linshiz, Gregory; Stawski, Nina; Poust, Sean; Bi, Changhao; Keasling, Jay D; Hillson, Nathan J

    2013-05-17

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.
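The core idea, one readable biology-level verb expanding into several low-level robot actions, can be sketched as a toy compiler. This is not actual PaR-PaR syntax; the verb, step names, and well identifiers are invented for illustration:

```python
# Toy high-level liquid-handling layer in the spirit of PaR-PaR (NOT its
# real syntax): a single "transfer" verb compiles into low-level actions.

def transfer(src, dst, vol_ul):
    """Compile one high-level transfer into low-level robot steps."""
    return [
        ("move_to", src),
        ("aspirate", vol_ul),
        ("move_to", dst),
        ("dispense", vol_ul),
    ]

def compile_protocol(steps):
    """Expand a list of high-level verbs into a flat robot program."""
    program = []
    for verb, *args in steps:
        if verb == "transfer":
            program.extend(transfer(*args))
        else:
            raise ValueError(f"unknown verb: {verb}")
    return program

protocol = [("transfer", "plate1:A1", "plate2:B2", 50)]
low_level = compile_protocol(protocol)
```

A real cross-platform language would additionally abstract over deck layout and emit vendor-specific commands, but the compile-from-biology-friendly-verbs pattern is the same.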

  9. PaR-PaR Laboratory Automation Platform

    SciTech Connect

    Linshiz, G; Stawski, N; Poust, S; Bi, CH; Keasling, JD; Hillson, NJ

    2013-05-01

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  10. Teaching multi-step requesting and social communication to two children with autism spectrum disorders with three AAC options.

    PubMed

    van der Meer, Larah; Kagohara, Debora; Roche, Laura; Sutherland, Dean; Balandin, Susan; Green, Vanessa A; O'Reilly, Mark F; Lancioni, Giulio E; Marschik, Peter B; Sigafoos, Jeff

    2013-09-01

    The present study involved comparing the acquisition of multi-step requesting and social communication across three AAC options: manual signing (MS), picture exchange (PE), and speech-generating devices (SGDs). Preference for each option was also assessed. The participants were two children with autism spectrum disorders (ASD) who had previously been taught to use each option to request preferred items. Intervention was implemented in an alternating-treatments design. During baseline, participants demonstrated low levels of correct communicative responding. With intervention, both participants learned the target responses (two- and three-step requesting responses, greetings, answering questions, and social etiquette responses) to varying levels of proficiency with each communication option. One participant demonstrated a preference for using the SGD and the other preferred PE. The importance of examining preferences for using one AAC option over others is discussed.

  11. Strong textured SmCo5 nanoflakes with ultrahigh coercivity prepared by multistep (three steps) surfactant-assisted ball milling.

    PubMed

    Zuo, Wen-Liang; Zhao, Xin; Xiong, Jie-Fu; Zhang, Ming; Zhao, Tong-Yun; Hu, Feng-Xia; Sun, Ji-Rong; Shen, Bao-Gen

    2015-08-14

    A high coercivity of 26.2 kOe is obtained for SmCo5 nanoflakes by multistep (three-step) surfactant-assisted ball milling. The magnetic properties, phase structure and morphology are studied by VSM, XRD and SEM, respectively. The results demonstrate that three-step ball milling preserves more complete crystallinity (relatively fewer defects) during milling compared with one-step high-energy ball milling, which enhances the texture degree and coercivity. In addition, the mechanism of coercivity is also studied via the temperature dependence of demagnetization curves for an aligned SmCo5 nanoflakes/resin composite; the result indicates that magnetization reversal could be controlled by co-existing pinning and nucleation mechanisms.

  12. Texture and anisotropy of yield strength in multistep isothermally forged Mg-5.8Zn-0.65Zr alloy

    NASA Astrophysics Data System (ADS)

    Nugmanov, D. R.; Sitdikov, O. Sh; Markushev, M. V.

    2015-04-01

    The effect of multistep isothermal forging (MIF) on the microstructure, texture and anisotropy of ambient-temperature yield strength of hot-pressed rod of the MA14 (Mg-5.8Zn-0.65Zr, wt%) magnesium alloy was analyzed. It has been found that the initial axial texture is quite stable during the first MIF step up to a strain of e∼4.2 at 400°C and is gradually transformed into a much weaker single-peak texture upon further processing at 300 and 200°C to a total strain of 10.2. These texture changes were accompanied by strong grain refinement, along with a significant reduction of the alloy's strength anisotropy.

  13. Discovery of novel, non-acidic mPGES-1 inhibitors by virtual screening with a multistep protocol

    PubMed Central

    Noha, Stefan M.; Fischer, Katrin; Koeberle, Andreas; Garscha, Ulrike; Werz, Oliver; Schuster, Daniela

    2015-01-01

    Microsomal prostaglandin E2 synthase-1 (mPGES-1) inhibitors are considered as potential therapeutic agents for the treatment of inflammatory pain and certain types of cancer. So far, several series of acidic as well as non-acidic inhibitors of mPGES-1 have been discovered. Acidic inhibitors, however, may have issues, such as loss of potency in human whole blood and in vivo, stressing the importance of the design and identification of novel, non-acidic chemical scaffolds of mPGES-1 inhibitors. Using a multistep virtual screening protocol, the Vitas-M compound library (∼1.3 million entries) was filtered and 16 predicted compounds were experimentally evaluated in a biological assay in vitro. This approach yielded two molecules active in the low micromolar range (IC50 values: 4.5 and 3.8 μM, respectively). PMID:26088337
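A multistep virtual-screening protocol of the kind described is, at its core, a funnel of successive filters that narrows a large library to a handful of candidates for assay. A minimal sketch; the descriptors, cutoffs, and molecule entries are invented placeholders, not the study's actual filters:

```python
# Screening-funnel sketch: each step discards candidates failing its
# predicate, mimicking how a ~1.3 M-entry library is narrowed to a few
# molecules. Descriptor names and cutoffs are hypothetical.

def funnel(candidates, steps):
    """Apply named filter steps in sequence, returning the survivors."""
    for name, keep in steps:
        candidates = [c for c in candidates if keep(c)]
    return candidates

library = [
    {"id": "m1", "mw": 320, "logp": 2.1, "pharmacophore_hit": True},
    {"id": "m2", "mw": 650, "logp": 1.0, "pharmacophore_hit": True},
    {"id": "m3", "mw": 410, "logp": 6.5, "pharmacophore_hit": True},
    {"id": "m4", "mw": 380, "logp": 3.0, "pharmacophore_hit": False},
]
steps = [
    ("mw <= 500",     lambda c: c["mw"] <= 500),
    ("logp <= 5",     lambda c: c["logp"] <= 5),
    ("pharmacophore", lambda c: c["pharmacophore_hit"]),
]
hits = funnel(library, steps)   # only m1 survives all three steps
```

In a real protocol the predicates would be property filters, pharmacophore matches, and docking-score thresholds, each step cheaper than the next is selective.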

  14. Multistep continuous-flow synthesis in medicinal chemistry: discovery and preliminary structure-activity relationships of CCR8 ligands.

    PubMed

    Petersen, Trine P; Mirsharghi, Sahar; Rummel, Pia C; Thiele, Stefanie; Rosenkilde, Mette M; Ritzén, Andreas; Ulven, Trond

    2013-07-08

    A three-step continuous-flow synthesis system and its application to the assembly of a new series of chemokine receptor ligands directly from commercial building blocks is reported. No scavenger columns or solvent switches are necessary to recover the desired test compounds, which were obtained in overall yields of 49-94%. The system is modular and flexible, and the individual steps of the sequence can be interchanged with similar outcome, extending the scope of the chemistry. Biological evaluation confirmed activity on the chemokine CCR8 receptor and provided initial structure-activity-relationship (SAR) information for this new ligand series, with the most potent member displaying full agonist activity with single-digit nanomolar potency. To the best of our knowledge, this represents the first published example of efficient use of multistep flow synthesis combined with biological testing and SAR studies in medicinal chemistry.

  15. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  16. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices

    PubMed Central

    Toley, Bhushan J.; Wang, Jessica A.; Gupta, Mayuri; Buser, Joshua R.; Lafleur, Lisa K.; Lutz, Barry R.; Fu, Elain; Yager, Paul

    2015-01-01

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically a) after a certain period of time, or b) after the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate, and the time required to switch from one state to another ranges from ~5 s for short wicks to ~50 s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods; both achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device. PMID:25606810
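Tuning actuation delay by wick length can be sketched numerically if one assumes Lucas-Washburn wicking, where time grows with the square of the wicked length (t = k·L²); a single calibration point then fixes k and lets one size a wick for a target delay. The calibration numbers below are invented, not from the paper:

```python
import math

def wick_length_for_delay(t_target_s, l_cal_mm, t_cal_s):
    """Size a timing wick for a target delay, assuming Washburn-type
    wicking t = k * L**2 calibrated from one (length, time) measurement."""
    k = t_cal_s / l_cal_mm ** 2          # s/mm^2 from the calibration point
    return math.sqrt(t_target_s / k)     # required wick length in mm

# Calibration: a 10 mm wick actuates in 5 s; size a wick for a 45 s delay
needed_mm = wick_length_for_delay(45.0, l_cal_mm=10.0, t_cal_s=5.0)
```

Real paper wicks deviate from the ideal square law (evaporation, variable cross-section), which is why the paper reports empirical tuning accuracy rather than a formula.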

  17. Facilitating Students' Review of the Chemistry of Nitrogen-Containing Heterocyclic Compounds and Their Characterization through Multistep Synthesis of Thieno[2,3-b]Pyridine Derivatives

    ERIC Educational Resources Information Center

    Liu, Hanlin; Zaplishnyy, Vladimir; Mikhaylichenko, Lana

    2016-01-01

    A multistep synthesis of thieno[2,3-b]pyridine derivatives is described that is suitable for the upper-level undergraduate organic laboratory. This experiment exposes students to various hands-on experimental techniques as well as methods of product characterization such as IR and ¹H NMR spectroscopy, and…

  18. Coping Strategies Applied to Comprehend Multistep Arithmetic Word Problems by Students with Above-Average Numeracy Skills and Below-Average Reading Skills

    ERIC Educational Resources Information Center

    Nortvedt, Guri A.

    2011-01-01

    This article discusses how 13-year-old students with above-average numeracy skills and below-average reading skills cope with comprehending word problems. Compared to other students who are proficient in numeracy and are skilled readers, these students are more disadvantaged when solving single-step and multistep arithmetic word problems. The…

  19. Continuous Video Modeling to Assist with Completion of Multi-Step Home Living Tasks by Young Adults with Moderate Intellectual Disability

    ERIC Educational Resources Information Center

    Mechling, Linda C.; Ayres, Kevin M.; Bryant, Kathryn J.; Foster, Ashley L.

    2014-01-01

    The current study evaluated a relatively new video-based procedure, continuous video modeling (CVM), to teach multi-step cleaning tasks to high school students with moderate intellectual disability. CVM in contrast to video modeling and video prompting allows repetition of the video model (looping) as many times as needed while the user completes…

  20. [Genetics and genetic counseling].

    PubMed

    Izzi, Claudia; Liut, Francesca; Dallera, Nadia; Mazza, Cinzia; Magistroni, Riccardo; Savoldi, Gianfranco; Scolari, Francesco

    2016-01-01

    Autosomal Dominant Polycystic Kidney Disease (ADPKD) is the most frequent genetic kidney disease, characterized by progressive development of bilateral renal cysts. Two causative genes have been identified: PKD1 and PKD2. The ADPKD phenotype is highly variable. Typically, ADPKD is an adult-onset disease; occasionally, however, it manifests as a very-early-onset disease. The phenotypic variability of ADPKD can be explained at three genetic levels: genic, allelic and gene-modifier effects. Recent advances in molecular screening for PKD gene mutations and the introduction of new next-generation sequencing (NGS)-based genotyping approaches have considerably improved our knowledge of the genetic basis of ADPKD. The purpose of this article is to provide a comprehensive review of the genetics of ADPKD, focusing on new insights into genotype-phenotype correlation and exploring novel clinical approaches to genetic testing. Evaluation of this new genetic information requires a multidisciplinary approach involving a nephrologist and a clinical geneticist.

  1. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Automated Commercial Environment (ACE) Simplified Entry: Modification of Participant Selection Criteria and... (NCAP) test concerning the simplified entry functionality in the Automated Commercial Environment (ACE...) National Customs Automation Program (NCAP) test concerning Automated Commercial Environment...

  2. World-wide distribution automation systems

    SciTech Connect

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems.

  3. Automating CPM-GOMS

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling.
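The PERT-chart scheduling at the heart of CPM-GOMS amounts to a critical-path computation over a DAG of operators, each with a duration; the predicted task time is the longest dependency chain. A minimal sketch, with illustrative operator names and durations rather than calibrated GOMS values:

```python
# Critical-path sketch of CPM-GOMS-style scheduling: operators form a
# dependency DAG with durations (ms); predicted time = longest path.

def critical_path(durations, deps):
    """Return the length of the longest dependency chain (total ms)."""
    finish = {}
    def ef(op):                      # earliest finish time, memoised
        if op not in finish:
            start = max((ef(d) for d in deps.get(op, [])), default=0)
            finish[op] = start + durations[op]
        return finish[op]
    return max(ef(op) for op in durations)

# Hypothetical operator chain for a single point-and-click
durations = {"perceive": 100, "cognate": 50, "move_mouse": 300, "click": 100}
deps = {"cognate": ["perceive"], "move_mouse": ["cognate"], "click": ["move_mouse"]}
total_ms = critical_path(durations, deps)   # 100 + 50 + 300 + 100
```

With parallel branches (e.g. eye movement overlapping a mouse move), the same computation automatically credits the overlap, which is what makes interleaved CPM-GOMS predictions shorter than serial GOMS ones.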

  4. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    2000-01-01

    An automated propellant blending apparatus and method that uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation is discussed. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.
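The empirical precipitation-versus-ratio profile described above can drive the control algorithm via simple interpolation: find the countersolvent:solvent ratio at which precipitation reaches a cloud-point threshold, then meter additions around it. A sketch with invented profile data and an invented 5% threshold:

```python
# Sketch of locating the cloud point on an empirical precipitation profile.
# The profile pairs (ratio, fraction precipitated) and the 5% threshold are
# hypothetical, not measured values from the patent.

def ratio_at_threshold(profile, threshold):
    """Linearly interpolate the countersolvent:solvent ratio at which the
    precipitated fraction reaches `threshold`. profile: sorted (ratio,
    fraction) pairs with monotonically increasing fraction."""
    for (r0, f0), (r1, f1) in zip(profile, profile[1:]):
        if f0 <= threshold <= f1:
            return r0 + (threshold - f0) * (r1 - r0) / (f1 - f0)
    raise ValueError("threshold outside measured profile")

profile = [(0.0, 0.0), (0.4, 0.02), (0.6, 0.10), (0.8, 0.55), (1.0, 0.95)]
cloud_point_ratio = ratio_at_threshold(profile, 0.05)
```

The controller would then slow countersolvent addition as the live ratio approaches `cloud_point_ratio`, which is the "addition parameters near the cloud point" idea in the abstract.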

  5. Automated fiber pigtailing machine

    DOEpatents

    Strand, Oliver T.; Lowry, Mark E.

    1999-01-01

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems.

  6. Automated fiber pigtailing machine

    DOEpatents

    Strand, O.T.; Lowry, M.E.

    1999-01-05

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems. 26 figs.

  7. Automated wire preparation system

    NASA Astrophysics Data System (ADS)

    McCulley, Deborah J.

    The first step toward an automated wire harness facility for the aerospace industry has been taken by implementing the Wire Vektor 2000 into the wire harness preparation area. An overview of the Wire Vektor 2000 is given, including the facilities for wire cutting, marking, and transporting, for wire end processing, and for system control. Production integration in the Wire Vektor 2000 system is addressed, considering the hardware/software debug system and the system throughput. The manufacturing changes that have to be made in implementing the Wire Vektor 2000 are discussed.

  8. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    1999-01-01

    An automated propellant blending apparatus and method uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.

  9. Evaluation of glycodendron and synthetically-modified dextran clearing agents for multi-step targeting of radioisotopes for molecular imaging and radioimmunotherapy

    PubMed Central

    Cheal, Sarah M.; Yoo, Barney; Boughdad, Sarah; Punzalan, Blesida; Yang, Guangbin; Dilhas, Anna; Torchon, Geralda; Pu, Jun; Axworthy, Don B.; Zanzonico, Pat; Ouerfelli, Ouathek; Larson, Steven M.

    2014-01-01

    A series of N-acetylgalactosamine-dendrons (NAG-dendrons) and dextrans bearing biotin moieties were compared for their ability to complex with and sequester circulating bispecific anti-tumor antibody (scFv4) streptavidin (SA) fusion protein (scFv4-SA) in vivo, to improve tumor to normal tissue concentration ratios for targeted radioimmunotherapy and diagnosis. Specifically, a total of five NAG-dendrons employing a common synthetic scaffold structure containing 4, 8, 16, or 32 carbohydrate residues and a single biotin moiety were prepared (NAGB), and for comparative purposes, a biotinylated-dextran with average molecular weight (MW) of 500 kD was synthesized from amino-dextran (DEXB). One of the NAGB compounds, CA16, has been investigated in humans; our aim was to determine if other NAGB analogs (e.g. CA8 or CA4) were bioequivalent to CA16 and/or better suited as MST reagents. In vivo studies included dynamic positron-emission tomography (PET) imaging of 124I-labelled-scFv4-SA clearance and dual-label biodistribution studies following multi-step targeting (MST) directed at subcutaneous (s.c.) human colon adenocarcinoma xenografts in mice. The MST protocol consists of three injections: first, a bispecific antibody specific for an anti-tumor associated glycoprotein (TAG-72) single chain genetically-fused with SA (scFv4-SA); second, CA16 or other clearing agent; and third, radiolabeled biotin. We observed using PET imaging of 124I-labelled-scFv4-SA clearance that the spatial arrangement of ligands conjugated to NAG (i.e. biotin) can impact the binding to antibody in circulation and subsequent liver uptake of the NAG-antibody complex. Also, NAGB CA32-LC or CA16-LC can be utilized during MST to achieve comparable tumor-to-blood ratios and absolute tumor uptake seen previously with CA16. Finally, DEXB was equally effective as NAGB CA32-LC at lowering scFv4-SA in circulation, but at the expense of reducing absolute tumor uptake of radiolabeled biotin. PMID:24219178

  10. Automated System Marketplace 1995: The Changing Face of Automation.

    ERIC Educational Resources Information Center

    Barry, Jeff; And Others

    1995-01-01

    Discusses trends in the automated system marketplace with specific attention to online vendors and their customers: academic, public, school, and special libraries. Presents vendor profiles; tables and charts on computer systems and sales; and sidebars that include a vendor source list and the differing views on procuring an automated library…

  11. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The usage of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. Using AA technology for the augmentation of system management functions requires a development model which consists of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.

  12. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer now considered to be an environmental hazard. Manual BP is also subject to measurement error on the part of the physician/nurse and patient-related anxiety which can result in poor quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension.
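The AOBP measurement logic above (multiple automated readings, averaging, and the <135/85 mm Hg cut point) is simple to express in code. A sketch; discarding the first reading before averaging is an assumption of this sketch modeled on common BpTRU-style practice, and the readings are made up:

```python
def aobp(readings, discard_first=True):
    """Average automated office BP readings and apply the <135/85 mm Hg
    normal cut point. readings: list of (systolic, diastolic) in mm Hg.
    Discarding the first reading is an assumption of this sketch."""
    used = readings[1:] if discard_first and len(readings) > 1 else readings
    sys_avg = round(sum(r[0] for r in used) / len(used))
    dia_avg = round(sum(r[1] for r in used) / len(used))
    return sys_avg, dia_avg, (sys_avg < 135 and dia_avg < 85)

# Hypothetical session: first reading elevated (settling-in), then stable
readings = [(148, 92), (134, 84), (132, 82), (130, 80), (133, 83), (131, 81)]
sys_avg, dia_avg, normal = aobp(readings)
```

Note how discarding the initial reading removes most of the white-coat-like elevation, which is the rationale for leaving the patient resting alone during the sequence.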

  13. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has process-tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver- related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  14. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the next years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. But the integration of various equipment in a production plant calls for unified handling of data flow and interfaces. Only agile vision systems can meet these conflicting demands: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  15. Automating quantum experiment control

    NASA Astrophysics Data System (ADS)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
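The run-time routing described above — moving an ion from its current zone to a target zone through the trap geometry — can be sketched as a shortest-path search over a zone graph. The zone names and adjacency below are invented for illustration, not an actual trap layout:

```python
# Hedged sketch of run-time ion routing: breadth-first search for a
# shortest zone-to-zone route, as a compiler or runtime router might do.
from collections import deque

def route(graph, start, goal):
    """Return a shortest route of zones from start to goal, or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

zones = {  # a small multi-zone trap as an adjacency list (illustrative)
    "load": ["junction"],
    "junction": ["load", "gate1", "gate2"],
    "gate1": ["junction"],
    "gate2": ["junction"],
}
path = route(zones, "load", "gate2")
```

Separating the graph (hardware geometry) from the search (algorithm) mirrors the chapter's point that routing can be regenerated automatically after qubit loss or re-placement.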

  16. New Genetics

    MedlinePlus


  17. Genetic Disorders

    MedlinePlus


  18. Genetics Home Reference: tyrosinemia

    MedlinePlus

Tyrosinemia results from disruptions in the multistep process that breaks down the amino acid tyrosine, a building block of most proteins.

  19. Classification of Automated Search Traffic

    NASA Astrophysics Data System (ADS)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
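The two feature classes in the abstract — physical models of human interaction and behavioral patterns of automation — can be illustrated with a minimal sketch. The specific features and thresholds here are assumptions for the demo, not the paper's actual classifiers:

```python
# Minimal sketch (assumed features, not the paper's): humans type slowly
# and vary their queries; bots issue queries quickly and repetitively.

def features(session):
    qs, times = session["queries"], session["times"]
    gaps = [b - a for a, b in zip(times, times[1:])] or [0.0]
    return {
        "mean_gap": sum(gaps) / len(gaps),          # physical: typing/think time (s)
        "distinct_ratio": len(set(qs)) / len(qs),   # behavioral: query repetition
    }

def classify(session, min_gap=2.0, min_distinct=0.5):
    f = features(session)
    if f["mean_gap"] < min_gap or f["distinct_ratio"] < min_distinct:
        return "automated"
    return "human"

human = {"queries": ["cats", "cat food", "vets near me"],
         "times": [0.0, 8.0, 21.0]}
bot = {"queries": ["rank acme"] * 50,
       "times": [i * 0.1 for i in range(50)]}
```

A real system would replace the hand-set thresholds with trained binary classifiers, as the paper describes, but the feature split is the same.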

  20. Translation: Aids, Robots, and Automation.

    ERIC Educational Resources Information Center

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  1. Robotics/Automated Systems Technicians.

    ERIC Educational Resources Information Center

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  2. Opening up Library Automation Software

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  3. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry; Riedesel, Joel; Myers, Chris; Miller, William; Jones, Ellen F.; Freeman, Kenneth; Walsh, Richard; Walls, Bryan K.; Weeks, David J.; Bechtel, Robert T.

    1992-01-01

    Autonomous power-distribution system includes power-control equipment and automation equipment. System automatically schedules connection of power to loads and reconfigures itself when it detects fault. Potential terrestrial applications include optimization of consumption of power in homes, power supplies for autonomous land vehicles and vessels, and power supplies for automated industrial processes.

  4. Medical genetics

    SciTech Connect

    Nora, J.J.; Fraser, F.C.

    1989-01-01

This book presents a discussion of medical genetics for the practitioner treating or counseling patients with genetic disease. It includes a discussion of the relationship of heredity and diseases, the chromosomal basis for heredity, gene frequencies, and the genetics of development and maldevelopment. The authors also focus on teratology, somatic cell genetics, genetics and cancer, and the genetics of behavior.

  5. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  6. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  7. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectroscopy (ESI/MS), capillary electrophoresis (CE), and biological assays where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and a volume of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by flowing sequentially 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is then loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a
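The resin-conditioning and loading sequence described above can be encoded as a list of pump steps for an automated controller to execute. The volumes come from the text; the step names and the controller itself are invented for illustration:

```python
# Illustrative encoding of the ASPS-style protocol as pump steps.
# Volumes are from the description above; step names are invented.

PROTOCOL = [
    ("rinse", "2N NaOH",          1.0),   # volumes in mL
    ("rinse", "2N HCl",           1.0),
    ("rinse", "deionized water",  0.5),
    ("load",  "extracted sample", 2.3),
    ("elute", "oxalic acid",      1.0),
]

def run_protocol(protocol):
    """Execute each pump step in order; return total volume and a log."""
    total = 0.0
    log = []
    for action, reagent, volume_ml in protocol:
        total += volume_ml
        log.append(f"{action}: {volume_ml} mL {reagent}")
    return total, log

total_ml, log_lines = run_protocol(PROTOCOL)
```

Expressing the protocol as data rather than hard-coded steps is what lets such a system be retargeted to other analytes later.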

  8. Automating the analytical laboratory via the Chemical Analysis Automation paradigm

    SciTech Connect

    Hollen, R.; Rzeszutko, C.

    1997-10-01

To address the need for standardization within the analytical chemistry laboratories of the nation, the Chemical Analysis Automation (CAA) program within the US Department of Energy, Office of Science and Technology's Robotic Technology Development Program is developing laboratory sample analysis systems that will automate the environmental chemical laboratories. The current laboratory automation paradigm consists of islands-of-automation that do not integrate into a system architecture. Thus, today the chemist must perform most aspects of environmental analysis manually using instrumentation that generally cannot communicate with other devices in the laboratory. CAA is working towards a standardized and modular approach to laboratory automation based upon the Standard Analysis Method (SAM) architecture. Each SAM system automates a complete chemical method. The building block of a SAM is known as the Standard Laboratory Module (SLM). The SLM, either hardware or software, automates a subprotocol of an analysis method and can operate as a standalone or as a unit within a SAM. The CAA concept allows the chemist to easily assemble an automated analysis system, from sample extraction through data interpretation, using standardized SLMs without the worry of hardware or software incompatibility or the necessity of generating complicated control programs. A Task Sequence Controller (TSC) software program schedules and monitors the individual tasks to be performed by each SLM configured within a SAM. The chemist interfaces with the operation of the TSC through the Human Computer Interface (HCI), a logical, icon-driven graphical user interface. The CAA paradigm has successfully been applied in automating EPA SW-846 Methods 3541/3620/8081 for the analysis of PCBs in a soil matrix utilizing commercially available equipment in tandem with SLMs constructed by CAA.
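The SAM/SLM/TSC decomposition described above can be sketched in a few lines: a SAM is an ordered set of SLM subprotocols that a controller runs and monitors. The module names and sample payloads below are invented for illustration, not the actual CAA software:

```python
# Hedged sketch of a TSC-like controller: each SLM automates one
# subprotocol; the controller sequences them and records status.

class SLM:
    """A Standard Laboratory Module wrapping one subprotocol."""
    def __init__(self, name, work):
        self.name, self.work = name, work

    def run(self, sample):
        return self.work(sample)

class TaskSequenceController:
    """Schedules and monitors the SLMs configured within a SAM."""
    def __init__(self, slms):
        self.slms = slms
        self.status = []

    def run_sam(self, sample):
        for slm in self.slms:
            sample = slm.run(sample)
            self.status.append((slm.name, "done"))
        return sample

sam = TaskSequenceController([
    SLM("extract", lambda s: s + ["extracted"]),
    SLM("cleanup", lambda s: s + ["cleaned"]),
    SLM("analyze", lambda s: s + ["PCB result"]),
])
out = sam.run_sam(["soil sample"])
```

Because each SLM exposes the same interface, swapping one subprotocol (say, a different extraction module) never touches the controller — the modularity the CAA paradigm is after.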

  9. Automated Electrostatics Environmental Chamber

    NASA Technical Reports Server (NTRS)

    Calle, Carlos; Lewis, Dean C.; Buchanan, Randy K.; Buchanan, Aubri

    2005-01-01

    The Mars Electrostatics Chamber (MEC) is an environmental chamber designed primarily to create atmospheric conditions like those at the surface of Mars to support experiments on electrostatic effects in the Martian environment. The chamber is equipped with a vacuum system, a cryogenic cooling system, an atmospheric-gas replenishing and analysis system, and a computerized control system that can be programmed by the user and that provides both automation and options for manual control. The control system can be set to maintain steady Mars-like conditions or to impose temperature and pressure variations of a Mars diurnal cycle at any given season and latitude. In addition, the MEC can be used in other areas of research because it can create steady or varying atmospheric conditions anywhere within the wide temperature, pressure, and composition ranges between the extremes of Mars-like and Earth-like conditions.
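The programmable diurnal cycle mentioned above can be sketched as a sinusoidal setpoint schedule. The mean, swing, and the use of a plain sinusoid are illustrative assumptions, not the MEC's actual control law (the sol length of about 24.6 hours is Mars's real rotation period):

```python
import math

# Illustrative diurnal-cycle setpoint: temperature oscillates around a
# seasonal mean over one Martian sol. Numbers are made up for the demo.

def mars_setpoint(hour, mean_k=210.0, swing_k=50.0, sol_hours=24.6):
    """Temperature setpoint (kelvin) at a given hour into the sol."""
    phase = 2 * math.pi * (hour / sol_hours)
    return mean_k + swing_k * math.sin(phase)
```

A controller would evaluate this schedule each control tick and drive the cryogenic cooling system toward the returned setpoint; holding `swing_k = 0` recovers the steady Mars-like mode.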

  10. Robust automated knowledge capture.

    SciTech Connect

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications and, in particular, the individual characteristics that underlie adaptive thinking.

  11. Automating Frame Analysis

    SciTech Connect

    Sanfilippo, Antonio P.; Franklin, Lyndsey; Tratz, Stephen C.; Danielson, Gary R.; Mileson, Nicholas D.; Riensche, Roderick M.; McGrath, Liam

    2008-04-01

Frame Analysis has come to play an increasingly prominent role in the study of social movements in Sociology and Political Science. While significant steps have been made in providing a theory of frames and framing, a systematic characterization of the frame concept is still largely lacking and there are no recognized criteria and methods that can be used to identify and marshal frame evidence reliably and in a time and cost effective manner. Consequently, current Frame Analysis work is still too reliant on manual annotation and subjective interpretation. The goal of this paper is to present an approach to the representation, acquisition and analysis of frame evidence which leverages Content Analysis, Information Extraction and Semantic Search methods to provide a systematic treatment of Frame Analysis and automate frame annotation.

  12. Automated mapping system patented

    NASA Astrophysics Data System (ADS)

A patent on a satellite system dubbed Mapsat, which would be able to map the earth from space and would thereby reduce the time and cost of mapping on a smaller scale, has been issued to the U.S. Geological Survey. The Mapsat concept, invented by Alden F. Colvocoresses, a research cartographer at the USGS National Center, is based on Landsat technology but uses sensors that acquire higher-resolution image data in either a stereo or monoscopic mode. Stereo data can be processed relatively simply with automation to produce images for interpretation or to produce maps. Monoscopic and multispectral data can be processed in a computer to derive information on earth resources. Ground control, one of the most expensive phases of mapping, could be kept to a minimum.

  13. Automating the multiprocessing environment

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.

    1989-01-01

An approach to automate the programming and operation of tree-structured networks of multiprocessor systems is discussed. A conceptual, knowledge-based operating environment is presented, and requirements for two major technology elements are identified as follows: (1) An intelligent information translator is proposed for implementing information transfer between dissimilar hardware and software, thereby enabling independent and modular development of future systems and promoting language-independence of code and information; (2) A resident system activity manager, which recognizes the systems' capabilities and monitors the status of all systems within the environment, is proposed for integrating dissimilar systems into effective parallel processing resources to optimally meet user needs. Finally, key computational capabilities which must be provided before the environment can be realized are identified.

  14. [From automation to robotics].

    PubMed

    1985-01-01

The introduction of automation into the laboratory of biology seems to be unavoidable. But at what cost, if it is necessary to purchase a new machine for every new application? Fortunately the same image processing techniques, belonging to a theoretic framework called Mathematical Morphology, may be used in visual inspection tasks, both in the car industry and in the biology lab. Since the market for industrial robotics applications is much larger than the market for biomedical applications, the price of image processing devices drops, and sometimes becomes less than the price of a complete microscope setup. The power of the image processing methods of Mathematical Morphology will be illustrated by various examples, such as automatic silver grain counting in autoradiography, determination of HLA genotype, electrophoretic gel analysis, automatic screening of cervical smears... Thus several heterogeneous applications may share the same image processing device, provided there is a separate, dedicated workstation for each of them.

  15. Berkeley automated supernova search

    SciTech Connect

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.
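The reference-comparison step described above can be illustrated with a toy image-differencing sketch: subtract the stored reference from the new exposure and flag pixels that brighten past a threshold. The tiny images and the threshold are invented; a real pipeline also aligns frames, matches flux scales, and handles point-spread functions:

```python
# Toy sketch of the real-time comparison: flag pixels in tonight's image
# that are brighter than the stored reference by more than a threshold.

def find_candidates(new, ref, threshold):
    """Return (row, col) positions where new - ref exceeds threshold."""
    hits = []
    for r, (new_row, ref_row) in enumerate(zip(new, ref)):
        for c, (n, v) in enumerate(zip(new_row, ref_row)):
            if n - v > threshold:
                hits.append((r, c))
    return hits

reference = [[10, 10, 10], [10, 50, 10], [10, 10, 10]]   # galaxy core at centre
tonight   = [[10, 11, 10], [10, 52, 10], [10, 10, 95]]   # new point source
candidates = find_candidates(tonight, reference, threshold=20)
```

The threshold absorbs photon noise and small calibration drifts, so only a genuinely new source like the corner pixel here is flagged.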

  16. Automated Analysis Workstation

    NASA Technical Reports Server (NTRS)

    1997-01-01

Information from NASA Tech Briefs of work done at Langley Research Center and the Jet Propulsion Laboratory assisted DiaSys Corporation in manufacturing their first product, the R/S 2000. Since then, the R/S 2000 and R/S 2003 have followed. Recently, DiaSys released their fourth workstation, the FE-2, which automates the process of making and manipulating wet-mount preparations of fecal concentrates. The time needed to read the sample is decreased, permitting technologists to rapidly spot parasites, ova and cysts, sometimes carried in the lower intestinal tract of humans and animals. Use of the FE-2 is non-invasive, can be performed on an out-patient basis, and quickly provides confirmatory results.

  17. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
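The conversion-and-buffering scheme described in the patent abstract can be sketched directly: each terminal converts decimal key presses to a binary form, a per-terminal buffer stores them digit by digit, and a multiplexer drains the buffers in turn for the computer. The class and function names below are invented for illustration:

```python
# Minimal sketch of the patent's data path: decimal keys -> binary-coded
# digits -> per-terminal buffer -> multiplexer -> computer.

def encode_digit(d):
    """Binary-coded form of one decimal digit (4 bits, as a string)."""
    return format(d, "04b")

class DataBuffer:
    """Temporary storage for one terminal, filled digit by digit."""
    def __init__(self):
        self.digits = []

    def push(self, d):
        self.digits.append(encode_digit(d))

def multiplex(buffers):
    """Collect each terminal's buffered digits in turn."""
    return [b.digits for b in buffers]

term1, term2 = DataBuffer(), DataBuffer()
for d in (4, 2):        # employee 42 badges in at terminal 1
    term1.push(d)
term2.push(7)           # function digit at terminal 2
frames = multiplex([term1, term2])
```

The buffer layer is what lets many slow keyboards share one fast computer: the multiplexer visits each buffer on its own schedule rather than the keyboards' schedules.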

  18. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  19. Multi-step usage of in vivo models during rational drug design and discovery.

    PubMed

    Williams, Charles H; Hong, Charles C

    2011-01-01

In this article we propose a systematic development method for rational drug design while reviewing paradigms in industry and emerging techniques and technologies in the field. Although the process of drug development today has been accelerated by the emergence of computational methodologies, it is a herculean challenge requiring exorbitant resources, and it often fails to yield clinically viable results. The current paradigm of target-based drug design is often misguided and tends to yield compounds that have poor absorption, distribution, metabolism, excretion, and toxicology (ADMET) properties. Therefore, an in vivo organism-based approach allowing for a multidisciplinary inquiry into potent and selective molecules is an excellent place to begin rational drug design. We will review how organisms like the zebrafish and Caenorhabditis elegans can not only be starting points, but can be used at various steps of the drug development process from target identification to pre-clinical trial models. This systems-biology-based approach, paired with the power of computational biology, genetics and developmental biology, provides a methodological framework to avoid the pitfalls of traditional target-based drug design.

  20. Comparative genomics reveals multistep pathogenesis of E2A-PBX1 acute lymphoblastic leukemia

    PubMed Central

    Duque-Afonso, Jesús; Feng, Jue; Scherer, Florian; Lin, Chiou-Hong; Wong, Stephen H.K.; Wang, Zhong; Iwasaki, Masayuki; Cleary, Michael L.

    2015-01-01

    Acute lymphoblastic leukemia (ALL) is the most common childhood cancer; however, its genetic diversity limits investigation into the molecular pathogenesis of disease and development of therapeutic strategies. Here, we engineered mice that conditionally express the E2A-PBX1 fusion oncogene, which results from chromosomal translocation t(1;19) and is present in 5% to 7% of pediatric ALL cases. The incidence of leukemia in these mice varied from 5% to 50%, dependent on the Cre-driving promoter (Cd19, Mb1, or Mx1) used to induce E2A-PBX1 expression. Two distinct but highly similar subtypes of B cell precursor ALLs that differed by their pre–B cell receptor (pre-BCR) status were induced and displayed maturation arrest at the pro-B/large pre–B II stages of differentiation, similar to human E2A-PBX1 ALL. Somatic activation of E2A-PBX1 in B cell progenitors enhanced self-renewal and led to acquisition of multiple secondary genomic aberrations, including prominent spontaneous loss of Pax5. In preleukemic mice, conditional Pax5 deletion cooperated with E2A-PBX1 to expand progenitor B cell subpopulations, increasing penetrance and shortening leukemia latency. Recurrent secondary activating mutations were detected in key signaling pathways, most notably JAK/STAT, that leukemia cells require for proliferation. These data support conditional E2A-PBX1 mice as a model of human ALL and suggest targeting pre-BCR signaling and JAK kinases as potential therapeutic strategies. PMID:26301816

  1. An open source multistep model to predict mutagenicity from statistical analysis and relevant structural alerts

    PubMed Central

    2010-01-01

Background Mutagenicity is the capability of a substance to cause genetic mutations. This property is of high public concern because it has a close relationship with carcinogenicity and potentially with reproductive toxicity. Experimentally, mutagenicity can be assessed by the Ames test on Salmonella with an estimated experimental reproducibility of 85%; this intrinsic limitation of the in vitro test, along with the need for faster and cheaper alternatives, opens the road to other types of assessment methods, such as in silico structure-activity prediction models. A widely used method checks for the presence of known structural alerts for mutagenicity. However the presence of such alerts alone is not a definitive method to prove the mutagenicity of a compound towards Salmonella, since other parts of the molecule can influence and potentially change the classification. Hence statistically based methods will be proposed, with the final objective to obtain a cascade of modeling steps with custom-made properties, such as the reduction of false negatives. Results A cascade model has been developed and validated on a large public set of molecular structures and their associated Salmonella mutagenicity outcomes. The first step consists in the derivation of a statistical model and mutagenicity prediction, followed by further checks for specific structural alerts in the "safe" subset of the prediction outcome space. In terms of accuracy (i.e., overall correct predictions of both negatives and positives), the obtained model approached the 85% reproducibility of the experimental mutagenicity Ames test. Conclusions The model and the documentation for regulatory purposes are freely available on the CAESAR website. The input is simply a file of molecular structures and the output is the classification result.
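The cascade idea in the abstract — a statistical model first, then a structural-alert re-check restricted to the compounds the model calls "safe" — can be sketched in a few lines. The scores, the alert list, and the SMILES strings below are all invented for illustration and are not taken from the CAESAR model:

```python
# Hedged sketch of a cascade classifier: a statistical score screens
# compounds first; "safe" calls are re-checked against structural alerts
# so fewer mutagens slip through as false negatives.

ALERTS = ["N=N"]  # e.g. an azo-group alert (illustrative substring match)

def statistical_score(smiles):
    """Stand-in for the trained statistical model: canned demo scores."""
    return {"CCO": 0.05, "C1=CC=CC=C1N=NC2=CC=CC=C2": 0.2}.get(smiles, 0.8)

def cascade_classify(smiles, cutoff=0.5):
    if statistical_score(smiles) >= cutoff:
        return "mutagenic"
    # "Safe" branch: the alert check catches likely false negatives.
    if any(alert in smiles for alert in ALERTS):
        return "mutagenic"
    return "non-mutagenic"
```

The azo compound here is exactly the case the cascade targets: the statistical stage under-scores it, but the alert stage still flags it, trading some false positives for fewer false negatives. (Real alert matching uses substructure search, not substring search.)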

  2. Programmable Automated Welding System (PAWS)

    NASA Technical Reports Server (NTRS)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  3. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  4. Medical genetics

    SciTech Connect

    Jorde, L.B.; Carey, J.C.; White, R.L.

    1995-10-01

    This book on the subject of medical genetics is a textbook aimed at a very broad audience: principally medical students, nursing students, and graduate and undergraduate students. The book is actually a primer of general genetics as applied to humans and provides a well-balanced introduction to the scientific and clinical basis of human genetics. The twelve chapters include: Introduction; Basic Cell Biology; Genetic Variation; Autosomal Dominant and Recessive Inheritance; Sex-linked and Mitochondrial Inheritance; Clinical Cytogenetics; Gene Mapping; Immunogenetics; Cancer Genetics; Multifactorial Inheritance and Common Disease; Genetic Screening, Genetic Diagnosis and Gene Therapy; and Clinical Genetics and Genetic Counseling.

  5. Multi-step formation of a hemifusion diaphragm for vesicle fusion revealed by all-atom molecular dynamics simulations.

    PubMed

    Tsai, Hui-Hsu Gavin; Chang, Che-Ming; Lee, Jian-Bin

    2014-06-01

    Membrane fusion is essential for intracellular trafficking and virus infection, but the molecular mechanisms underlying the fusion process remain poorly understood. In this study, we employed all-atom molecular dynamics simulations to investigate the membrane fusion mechanism using vesicle models which were pre-bound by inter-vesicle Ca(2+)-lipid clusters to approximate Ca(2+)-catalyzed fusion. Our results show that the formation of the hemifusion diaphragm for vesicle fusion is a multi-step event. This result contrasts with the assumptions made in most continuum models. The neighboring hemifused states are separated by an energy barrier on the energy landscape. The hemifusion diaphragm is much thinner than the planar lipid bilayers. The thinning of the hemifusion diaphragm during its formation results in the opening of a fusion pore for vesicle fusion. This work provides new insights into the formation of the hemifusion diaphragm and thus increases understanding of the molecular mechanism of membrane fusion. This article is part of a Special Issue entitled: Membrane Structure and Function: Relevance in the Cell's Physiology, Pathology and Therapy.

  6. Gadolinium trace determination in biomedical samples by diode-laser-based multi-step resonance ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    Geppert, Ch.; Blaum, K.; Diel, S.; Müller, P.; Schreiber, W. G.; Wendt, K.

    2001-08-01

    Diode-laser-based multi-step resonance ionization mass spectrometry (RIMS), which was developed primarily for ultra-trace analysis of long-lived radioactive isotopes, has been adapted for application to elements within the sequence of the rare earths. First investigations concern Gd isotopes, for which the high suppression of isobars provided by RIMS is mandatory. Using a three-step resonant excitation scheme into an autoionizing state, which had been the subject of preparatory spectroscopic investigations, a high efficiency of >1×10⁻⁶ and a good isobaric selectivity of >10⁷ were realized. Additionally, the linearity of the method has been demonstrated over six orders of magnitude. Avoiding contaminations from the titanium carrier foil resulted in a suppression of background of more than one order of magnitude and a correspondingly low detection limit of 4×10⁹ atoms, equivalent to 1 pg of Gd. The technique has been applied to trace determination of the Gd content in animal tissue. Biomedical micro samples were analyzed shortly after injection of a Gd chelate, which is used as the primary contrast medium for magnetic resonance imaging (MRI) in biomedical investigations. Correlated in-vivo magnetic resonance images were taken. The RIMS measurements show high reproducibility as well as good precision, and contribute new insight into the distribution and kinetics of Gd within different healthy and cancerous tissues.
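
    The stated equivalence of 4×10⁹ atoms to about 1 pg of Gd can be verified from Avogadro's number and the standard atomic weight of gadolinium (157.25 g/mol); this is purely an arithmetic sanity check, not part of the original work:

```python
# Sanity check: convert the quoted detection limit of 4e9 Gd atoms to mass.
AVOGADRO = 6.02214076e23   # atoms per mole
GD_MOLAR_MASS = 157.25     # g/mol, standard atomic weight of gadolinium

atoms = 4e9
mass_g = atoms * GD_MOLAR_MASS / AVOGADRO
print(f"{mass_g:.2e} g")   # → 1.04e-12 g, i.e. roughly 1 pg
```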

  7. Multiwavelength Observations of a Slow-rise, Multistep X1.6 Flare and the Associated Eruption

    NASA Astrophysics Data System (ADS)

    Yurchyshyn, V.; Kumar, P.; Cho, K.-S.; Lim, E.-K.; Abramenko, V. I.

    2015-10-01

    Using multiwavelength observations, we studied a slow-rise, multistep X1.6 flare that began on 2014 November 7 as a localized eruption of core fields inside a δ-sunspot and later engulfed the entire active region (AR). This flare event was associated with formation of two systems of post-eruption arcades (PEAs) and several J-shaped flare ribbons showing extremely fine details, irreversible changes in the photospheric magnetic fields, and it was accompanied by a fast and wide coronal mass ejection. Data from the Solar Dynamics Observatory and IRIS spacecraft, along with the ground-based data from the New Solar Telescope, present evidence that (i) the flare and the eruption were directly triggered by a flux emergence that occurred inside a δ-sunspot at the boundary between two umbrae; (ii) this event represented an example of the formation of an unstable flux rope observed only in hot AIA channels (131 and 94 Å) and LASCO C2 coronagraph images; (iii) the global PEA spanned the entire AR and was due to global-scale reconnection occurring at heights of about one solar radius, indicating the global spatial and temporal scale of the eruption.

  8. Multiwavelength Observations of a Slow-Rise, Multi-Step X1.6 Flare and the Associated Eruption

    NASA Astrophysics Data System (ADS)

    Yurchyshyn, V.

    2015-12-01

    Using multi-wavelength observations we studied a slow-rise, multi-step X1.6 flare that began on November 7, 2014 as a localized eruption of core fields inside a δ-sunspot and later engulfed the entire active region. This flare event was associated with formation of two systems of post-eruption arcades (PEAs) and several J-shaped flare ribbons showing extremely fine details, irreversible changes in the photospheric magnetic fields, and it was accompanied by a fast and wide coronal mass ejection. Data from the Solar Dynamics Observatory and IRIS spacecraft, along with the ground-based data from the New Solar Telescope (NST), present evidence that i) the flare and the eruption were directly triggered by a flux emergence that occurred inside a δ-sunspot at the boundary between two umbrae; ii) this event represented an example of the in-situ formation of an unstable flux rope observed only in hot AIA channels (131 and 94 Å) and LASCO C2 coronagraph images; iii) the global PEA system spanned the entire AR and was due to global-scale reconnection occurring at heights of about one solar radius, indicating the global spatial and temporal scale of the eruption.

  9. In-cell aggregation of a polyglutamine-containing chimera is a multistep process initiated by the flanking sequence.

    PubMed

    Ignatova, Zoya; Thakur, Ashwani K; Wetzel, Ronald; Gierasch, Lila M

    2007-12-14

    Toxicity in amyloid diseases is intimately linked to the nature of aggregates, with early oligomeric species believed to be more cytotoxic than later fibrillar aggregates. Yet mechanistic understanding of how aggregating species evolve with time is currently lacking. We have explored the aggregation process of a chimera composed of a globular protein (cellular retinoic acid-binding protein, CRABP) and huntingtin exon 1 with polyglutamine tracts either above (Q53) or below (Q20) the pathological threshold using Escherichia coli cells as a model intracellular environment. Previously we showed that fusion of the huntingtin exon 1 sequence with >40Q led to structural perturbation and decreased stability of CRABP (Ignatova, Z., and Gierasch, L. M. (2006) J. Biol. Chem. 281, 12959-12967). Here we report that the Q53 chimera aggregates in cells via a multistep process: early stage aggregates are spherical and detergent-soluble, characteristics of prefibrillar aggregates, and appear to be dominated structurally by CRABP, in that they can promote aggregation of a CRABP variant but not oligoglutamine aggregation, and the CRABP domain is relatively sequestered based on its protection from proteolysis. Late stage aggregates appear to be dominated by polyGln; they are fibrillar, detergent-resistant, capable of seeding aggregation of oligoglutamine but not the CRABP variant, and show relative protection of the polyglutamine-exon1 domain from proteolysis. These results point to an evolution of the dominant sequences in intracellular aggregates and may provide molecular insight into origins of toxic prefibrillar aggregates.

  10. Thrombospondin-2 overexpression in the skin of transgenic mice reduces the susceptibility to chemically-induced multistep skin carcinogenesis

    PubMed Central

    Kunstfeld, Rainer; Hawighorst, Thomas; Streit, Michael; Hong, Young-Kwon; Nguyen, Lynh; Brown, Lawrence F.; Detmar, Michael

    2014-01-01

    Background We have previously reported stromal upregulation of the endogenous angiogenesis inhibitor thrombospondin-2 (TSP-2) during multistep carcinogenesis, and we found accelerated and enhanced skin angiogenesis and carcinogenesis in TSP-2-deficient mice. Goals To investigate whether enhanced levels of TSP-2 might protect from skin cancer development. Methods We established transgenic mice with targeted overexpression of TSP-2 in the skin and subjected hemizygous TSP-2 transgenic mice and their wild-type littermates to a chemical skin carcinogenesis regimen. Results TSP-2 transgenic mice showed a significantly delayed onset of tumor formation compared to wild-type mice, whereas the ratio of malignant conversion to squamous cell carcinomas was comparable in both genotypes. Computer-assisted morphometric analysis of blood vessels revealed pronounced tumor angiogenesis already in the early stages of carcinogenesis in wild-type mice. TSP-2 overexpression significantly reduced tumor blood vessel density in transgenic mice but had no overt effect on LYVE-1-positive lymphatic vessels. The percentage of desmin-surrounded, mature tumor-associated blood vessels and the degree of epithelial differentiation remained unaffected. The antiangiogenic effect of transgenic TSP-2 was accompanied by a significantly increased number of apoptotic tumor cells in transgenic mice. Conclusion Our results demonstrate that enhanced levels of TSP-2 in the skin result in reduced susceptibility to chemically induced skin carcinogenesis and identify TSP-2 as a new target for the prevention of skin cancer. PMID:24507936

  11. Altered expression of CKs 14/20 is an early event in a rat model of multistep bladder carcinogenesis.

    PubMed

    Gil da Costa, Rui M; Oliveira, Paula A; Vasconcelos-Nóbrega, Carmen; Arantes-Rodrigues, Regina; Pinto-Leite, Rosário; Colaço, Aura A; de la Cruz, Luis F; Lopes, Carlos

    2015-10-01

    Cytokeratins (CKs) 14 and 20 are promising markers for diagnosing urothelial lesions and for studying their prognosis and histogenesis. This work aimed to study the immunohistochemical staining patterns of CK14/20 during multistep carcinogenesis leading to papillary bladder cancer in a rat model. Thirty female Fischer 344 rats were divided into three groups: group 1 (control); group 2, which received N-butyl-N-(4-hydroxybutyl)nitrosamine (BBN) for 20 weeks plus 1 week without treatment; and group 3, which received BBN for 20 weeks plus 8 weeks without treatment. Bladder lesions were classified histologically. CK14 and CK20 immunostaining was assessed according to its distribution and intensity. In control animals, 0-25% of basal cells and umbrella cells stained positive for CK14 and CK20, respectively. In groups 2 and 3, nodular hyperplastic lesions showed normal CK20 and moderately increased CK14 staining (26-50% of cells). Dysplasia, squamous metaplasia, papilloma, papillary tumours of low malignant potential and low- and high-grade papillary carcinomas showed increased CK14 and CK20 immunostaining in all epithelial layers. Altered CK14 and CK20 expression is an early event in urothelial carcinogenesis and is present in a wide spectrum of superficial neoplastic and preneoplastic urothelial lesions.

  12. MULTIWAVELENGTH OBSERVATIONS OF A SLOW-RISE, MULTISTEP X1.6 FLARE AND THE ASSOCIATED ERUPTION

    SciTech Connect

    Yurchyshyn, V.; Kumar, P.; Cho, K.-S.; Lim, E.-K.; Abramenko, V. I.

    2015-10-20

    Using multiwavelength observations, we studied a slow-rise, multistep X1.6 flare that began on 2014 November 7 as a localized eruption of core fields inside a δ-sunspot and later engulfed the entire active region (AR). This flare event was associated with formation of two systems of post-eruption arcades (PEAs) and several J-shaped flare ribbons showing extremely fine details, irreversible changes in the photospheric magnetic fields, and it was accompanied by a fast and wide coronal mass ejection. Data from the Solar Dynamics Observatory and IRIS spacecraft, along with the ground-based data from the New Solar Telescope, present evidence that (i) the flare and the eruption were directly triggered by a flux emergence that occurred inside a δ-sunspot at the boundary between two umbrae; (ii) this event represented an example of the formation of an unstable flux rope observed only in hot AIA channels (131 and 94 Å) and LASCO C2 coronagraph images; (iii) the global PEA spanned the entire AR and was due to global-scale reconnection occurring at heights of about one solar radius, indicating the global spatial and temporal scale of the eruption.

  13. Segmenting the Femoral Head and Acetabulum in the Hip Joint Automatically Using a Multi-Step Scheme

    NASA Astrophysics Data System (ADS)

    Wang, Ji; Cheng, Yuanzhi; Fu, Yili; Zhou, Shengjun; Tamura, Shinichi

    We describe a multi-step approach for automatic segmentation of the femoral head and the acetabulum in the hip joint from three-dimensional (3D) CT images. Our segmentation method consists of the following steps: 1) construction of the valley-emphasized image by subtracting valleys from the original images; 2) initial segmentation of the bone regions by using conventional techniques including the initial threshold and binary morphological operations from the valley-emphasized image; 3) further segmentation of the bone regions by using the iterative adaptive classification with the initial segmentation result; 4) detection of the rough bone boundaries based on the segmented bone regions; 5) 3D reconstruction of the bone surface using the rough bone boundaries obtained in step 4) by a network of triangles; 6) correction of all vertices of the 3D bone surface based on the normal direction of vertices; 7) adjustment of the bone surface based on the corrected vertices. We evaluated our approach on 35 CT patient data sets. Our experimental results show that our segmentation algorithm is more accurate and robust against noise than other conventional approaches for automatic segmentation of the femoral head and the acetabulum. Average root-mean-square (RMS) distance from manual reference segmentations created by experienced users was approximately 0.68 mm (in-plane resolution of the CT data).
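
    Step 1 of the scheme above (subtracting valleys from the original image) can be illustrated on a one-dimensional intensity profile. A sliding-window grayscale closing is a common way to construct the valley image, but the window size and data here are illustrative assumptions rather than the authors' exact operators:

```python
# Sketch of valley emphasis on a 1-D intensity profile: a grayscale
# closing (dilation then erosion with a sliding window) fills narrow
# valleys; (closing - original) isolates them, and subtracting the
# valleys from the original deepens them, sharpening the gap between
# adjacent bright structures such as neighboring bones.

def dilate(signal, w=1):
    return [max(signal[max(0, i - w):i + w + 1]) for i in range(len(signal))]

def erode(signal, w=1):
    return [min(signal[max(0, i - w):i + w + 1]) for i in range(len(signal))]

def valley_emphasize(signal, w=1):
    closed = erode(dilate(signal, w), w)             # grayscale closing
    valleys = [c - s for c, s in zip(closed, signal)]
    return [s - v for s, v in zip(signal, valleys)]  # original - valleys

profile = [9, 9, 8, 3, 8, 9, 9]   # narrow valley between two "bones"
print(valley_emphasize(profile))  # → [9, 9, 8, -2, 8, 9, 9]
```

    The narrow valley between the two bright plateaus is deepened (3 becomes -2), which makes the subsequent thresholding in step 2 more likely to separate the femoral head from the acetabulum.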

  14. Retropath: automated pipeline for embedded metabolic circuits.

    PubMed

    Carbonell, Pablo; Parutto, Pierre; Baudier, Claire; Junot, Christophe; Faulon, Jean-Loup

    2014-08-15

    Metabolic circuits are a promising alternative to other conventional genetic circuits as modular parts implementing functionalities required for synthetic biology applications. To date, metabolic design has mainly focused on production circuits. Emergent applications such as smart therapeutics, however, require circuits that enable sensing and regulation. Here, we present RetroPath, an automated pipeline for embedded metabolic circuits that explores the circuit design space from a given set of specifications and selects the best circuits to implement based on desired constraints. Synthetic biology circuits embedded in a chassis organism that are capable of controlling the production, processing, sensing, and release of specific molecules were enumerated in the metabolic space through a standard procedure. In that way, the design and implementation of applications such as therapeutic circuits that autonomously diagnose and treat disease are enabled, and their optimization is streamlined.
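
    The select-the-best-circuits-under-constraints step described above can be caricatured as enumerate, filter, rank; the circuit records, the constraint, and the scoring field below are invented placeholders, not RetroPath's actual data model:

```python
# Toy sketch of constraint-based circuit selection: enumerate candidate
# metabolic circuits, drop those violating a design constraint, and rank
# the remainder. All fields and values are illustrative assumptions.

candidates = [
    {"name": "circuitA", "steps": 3, "predicted_yield": 0.8},
    {"name": "circuitB", "steps": 6, "predicted_yield": 0.9},
    {"name": "circuitC", "steps": 2, "predicted_yield": 0.5},
]

def select_best(circuits, max_steps=4, top=2):
    """Keep circuits within the step budget, ranked by predicted yield."""
    feasible = [c for c in circuits if c["steps"] <= max_steps]
    return sorted(feasible, key=lambda c: c["predicted_yield"], reverse=True)[:top]

for c in select_best(candidates):
    print(c["name"])   # → circuitA, then circuitC (circuitB exceeds the budget)
```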

  15. Genetic algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem-solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
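
    As a concrete illustration of the concepts the abstract introduces, here is a minimal genetic algorithm maximizing the number of 1-bits in a string (the classic "one-max" toy problem). The population size, operator choices, and rates are illustrative, not taken from the report:

```python
import random

def fitness(bits):
    """One-max fitness: the number of 1-bits (optimum = len(bits))."""
    return sum(bits)

def evolve(n_bits=16, pop_size=20, generations=40, p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        nxt = pop[:2]                                  # elitism: keep best two
        while len(nxt) < pop_size:
            # tournament selection: the fitter of a small random sample breeds
            mom = max(rng.sample(pop, 3), key=fitness)
            dad = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)             # single-point crossover
            child = mom[:cut] + dad[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), best)
```

    Elitism makes the best fitness non-decreasing across generations, so the run converges toward the all-ones string for this toy fitness function.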

  16. Fuzzy Control/Space Station automation

    NASA Technical Reports Server (NTRS)

    Gersh, Mark

    1990-01-01

    Viewgraphs on fuzzy control/space station automation are presented. Topics covered include: Space Station Freedom (SSF); SSF evolution; factors pointing to automation & robotics (A&R); astronaut office inputs concerning A&R; flight system automation and ground operations applications; transition definition program; and advanced automation software tools.

  17. Automation and Human Resource Management.

    ERIC Educational Resources Information Center

    Taft, Michael

    1988-01-01

    Discussion of the automation of personnel administration in libraries covers (1) new developments in human resource management systems; (2) system requirements; (3) software evaluation; (4) vendor evaluation; (5) selection of a system; (6) training and support; and (7) benefits. (MES)

  18. Office Automation Boosts University's Productivity.

    ERIC Educational Resources Information Center

    School Business Affairs, 1986

    1986-01-01

    The University of Pittsburgh has a 2-year agreement designating the Xerox Corporation as the primary supplier of word processing and related office automation equipment in order to increase productivity and make more efficient use of campus resources. (MLF)

  19. Office Automation at Memphis State.

    ERIC Educational Resources Information Center

    Smith, R. Eugene; And Others

    1986-01-01

    The development of a university-wide office automation plan, beginning with a short-range pilot project and a five-year plan for the entire organization with the potential for modular implementation, is described. (MSE)

  20. Automation of antimicrobial activity screening.

    PubMed

    Forry, Samuel P; Madonna, Megan C; López-Pérez, Daneli; Lin, Nancy J; Pasco, Madeleine D

    2016-03-01

    Manual and automated methods were compared for routine screening of compounds for antimicrobial activity. Automation generally accelerated assays and required less user intervention while producing comparable results. Automated protocols were validated for planktonic, biofilm, and agar cultures of the oral microbe Streptococcus mutans, which is commonly associated with tooth decay. Toxicity assays for the known antimicrobial compound cetylpyridinium chloride (CPC) were validated against planktonic, biofilm-forming, and 24 h biofilm culture conditions, and several commonly reported toxicity/antimicrobial activity measures were evaluated: the 50% inhibitory concentration (IC50), the minimum inhibitory concentration (MIC), and the minimum bactericidal concentration (MBC). Using automated methods, three halide salts of cetylpyridinium (CPC, CPB, CPI) were rapidly screened, with no detectable effect of the counter ion on antimicrobial activity.
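
    A MIC readout like the one the abstract mentions is conventionally the lowest concentration in a dilution series that fully suppresses visible growth. A minimal sketch, with the concentrations and growth readings as made-up example data:

```python
def mic(results):
    """Minimum inhibitory concentration from (concentration, grew) pairs.

    Returns the lowest tested concentration at which no growth was
    observed, or None if growth occurred at every concentration.
    """
    inhibitory = [conc for conc, grew in results if not grew]
    return min(inhibitory) if inhibitory else None

# Hypothetical two-fold dilution series (µg/mL) with growth observations.
series = [(16, False), (8, False), (4, False), (2, True), (1, True)]
print(mic(series))  # → 4
```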

  1. Real Automation in the Field

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Mayero, Micaela; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    We provide a package of strategies for automation of non-linear arithmetic in PVS. In particular, we describe a simplification procedure for the field of real numbers and a strategy for cancellation of common terms.

  2. Technology modernization assessment flexible automation

    SciTech Connect

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits of and barriers to automation and concludes that, while significant benefits do exist, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  3. Multifunction automated crawling system

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph (Inventor); Joffe, Benjamin (Inventor); Backes, Paul Gregory (Inventor)

    1999-01-01

    The present invention is an automated crawling robot system including a platform, a first leg assembly, a second leg assembly, first and second rails attached to the platform, and an onboard electronic computer controller. The first leg assembly has an intermittent coupling device and the second leg assembly has an intermittent coupling device for intermittently coupling the respective first and second leg assemblies to a particular object. The first and second leg assemblies are slidably coupled to the rail assembly and are slidably driven by motors to thereby allow linear movement. In addition, the first leg assembly is rotary driven by a rotary motor to thereby provide rotary motion relative to the platform. To effectuate motion, the intermittent coupling devices of the first and second leg assemblies alternately couple the respective first and second leg assemblies to an object. This motion is done while simultaneously moving one of the leg assemblies linearly in the desired direction and preparing the next step. This arrangement allows the crawler of the present invention to traverse an object in a range of motion covering 360 degrees.

  4. Automated Supernova Discovery (Abstract)

    NASA Astrophysics Data System (ADS)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloudless night, we must deal with varying atmospheric conditions and high background illumination from the Moon. Software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  5. Automated document analysis system

    NASA Astrophysics Data System (ADS)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence-gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and processed with optical character recognition (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If a document is not in English, it is machine-translated to English. The documents are searched for keywords and key features in either the native language or the translated English. The user can quickly review the document to determine whether it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.

  6. Automated Gas Distribution System

    NASA Astrophysics Data System (ADS)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few and prized cyclotrons in the country. Behind the scenes of the cyclotron is a confusing and dangerous setup of the ion sources that supply the cyclotron with particles for acceleration. Operating this machine involves a time-consuming and even wasteful step-by-step process of switching gases, purging, and other important procedures that must be performed manually to keep the system functioning properly while maintaining the safety of the working environment. Developing a new gas distribution system for the ion source prevents many of the problems generated by the older manual process. The new system can be controlled manually more easily than before but, like most of the technology and machines in the cyclotron now, is mainly operated through software developed in the graphical programming environment LabVIEW. The automated gas distribution system provides multiple ports for a selection of different gases, to decrease the amount of gas wasted when switching gases, and a port for the vacuum, to decrease the time spent purging the manifold. The LabVIEW software makes the operation of the cyclotron and ion sources easier and safer for anyone to use.

  7. Automated call tracking systems

    SciTech Connect

    Hardesty, C.

    1993-03-01

    User Services groups are on the front line with user support. We are the first to hear about problems. The speed, accuracy, and intelligence with which we respond determine the user's perception of our effectiveness and our commitment to quality and service. To keep pace with the complex changes at our sites, we must have tools to help build a knowledge base of solutions, a history base of our users, and a record of every problem encountered. Recently, I completed a survey of twenty sites similar to the National Energy Research Supercomputer Center (NERSC). This informal survey reveals that 27% of the sites use a paper system to log calls, 60% employ homegrown automated call tracking systems, and 13% use a vendor-supplied system. Fifty-four percent of those using homegrown systems are exploring the merits of switching to a vendor-supplied system. The purpose of this paper is to provide guidelines for evaluating a call tracking system. In addition, insights are provided to assist User Services groups in selecting a system that fits their needs.

  8. Towards automated traceability maintenance.

    PubMed

    Mäder, Patrick; Gotel, Orlena

    2012-10-01

    Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided.
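
    The rule-matching idea in the abstract (recognized development activities trigger updates of impacted traceability relations) can be sketched roughly as follows; the event names, rule table, and update actions are invented for illustration and do not reflect the authors' actual rule catalog:

```python
# Sketch: match captured model-editing events against predefined rules
# and choose an update action for the impacted traceability relations.
# Event and action names are illustrative assumptions.

RULES = {
    # (event pattern recognized as an activity) -> action on links
    ("delete_element",): "remove_links",
    ("rename_element",): "keep_links",
    ("delete_element", "create_element"): "retarget_links",  # replacement
}

def recognize(events):
    """Return the update action for the longest matching rule, if any."""
    for pattern in sorted(RULES, key=len, reverse=True):
        if tuple(events[:len(pattern)]) == pattern:
            return RULES[pattern]
    return "flag_for_review"   # no rule matched: defer to the developer

print(recognize(["delete_element", "create_element"]))  # → retarget_links
print(recognize(["edit_attribute"]))                    # → flag_for_review
```

    Matching longer patterns first lets a compound activity (delete followed by create, i.e. a replacement) override the single-event interpretation, which is the essence of recognizing activities rather than reacting to isolated events.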

  9. Automated ISS Flight Utilities

    NASA Technical Reports Server (NTRS)

    Offermann, Jan Tuzlic

    2016-01-01

    During my internship at NASA Johnson Space Center, I worked in the Space Radiation Analysis Group (SRAG), where I was tasked with a number of projects focused on the automation of tasks and activities related to the operation of the International Space Station (ISS). As I worked on a number of projects, I have written short sections below to give a description of each, followed by more general remarks on the internship experience. My first project is titled "General Exposure Representation EVADOSE", also known as "GEnEVADOSE". This project involved the design and development of a C++/ROOT framework focused on radiation exposure for extravehicular activity (EVA) planning for the ISS. The utility helps mission managers plan EVAs by displaying information on the cumulative radiation doses that crew will receive during an EVA as a function of the egress time and duration of the activity. SRAG uses a utility called EVADOSE, employing a model of the space radiation environment in low Earth orbit to predict these doses, since astronauts outside the ISS have less shielding from charged particles such as electrons and protons. However, EVADOSE output is cumbersome to work with, and prior to GEnEVADOSE, querying data and producing graphs of ISS trajectories and cumulative doses versus egress time required manual work in Microsoft Excel. GEnEVADOSE automates all this work, reading in EVADOSE output file(s) along with a plain-text file of input parameters supplied by the user. GEnEVADOSE outputs a text file containing all the necessary dosimetry for each proposed EVA egress time, for each specified EVADOSE file. It also plots cumulative dose versus egress time and the ISS trajectory, and displays all of this information in an auto-generated presentation made in LaTeX.
New features have also been added, such as best-case scenarios (egress times corresponding to the least dose), interpolated curves for trajectories, and the ability to query any time in the
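    The best-case-scenario and interpolation features might look roughly like the following in outline; the dose-table layout is a hypothetical stand-in, since the actual EVADOSE output format is not described here.

```python
# Sketch of the "best-case scenario" feature: given cumulative-dose
# predictions per candidate egress time (hypothetical data layout),
# pick the egress time with the least dose, and interpolate the dose
# curve between tabulated points.

def best_egress(dose_by_egress):
    """Return (egress_time, dose) with the minimum cumulative dose."""
    return min(dose_by_egress.items(), key=lambda kv: kv[1])

def interp_dose(dose_by_egress, t):
    """Linearly interpolate the dose at time t between tabulated egress times."""
    pts = sorted(dose_by_egress.items())
    for (t0, d0), (t1, d1) in zip(pts, pts[1:]):
        if t0 <= t <= t1:
            return d0 + (d1 - d0) * (t - t0) / (t1 - t0)
    raise ValueError("t outside tabulated range")
```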

  10. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The Automated Microbial Metabolism Laboratory (AMML) 1971-1972 program involved the investigation of three separate life detection schemes. The first was continued development of the labeled release experiment. The possibility of chamber reuse without in-between sterilization, to provide comparative biochemical information, was tested. Findings show that individual substrates or concentrations of antimetabolites may be sequentially added to a single test chamber. The second detection system investigated for possible inclusion in the AMML package of assays was nitrogen fixation, as detected by acetylene reduction. Thirdly, a series of preliminary steps was taken to investigate the feasibility of detecting biopolymers in soil. A strategy for the safe return to Earth of a Mars sample prior to manned landings on Mars is outlined. The program assumes that the probability of indigenous life on Mars is unity and then broadly presents the procedures for acquisition and analysis of the Mars sample in a manner that satisfies the scientific community and the public that adequate safeguards are being taken.

  11. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New ROBOSIM features, such as collision detection and new kinematics simulation methods, are also discussed. Based on the experience of the work on ROBOSIM, a new graphics structural modeling environment is suggested, intended to be part of a new knowledge-based multiple-aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First, a geometrical structural model of the station, developed using the ROBOSIM package, is presented. Next, the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these application areas is the modeling of the Environmental Control and Life Support System (ECLSS), one of the most complex subsystems of the station. Using the multiple-aspect modeling methodology, a fault propagation model of this system is being built and is described.

  12. Automated leak test systems

    SciTech Connect

    Cordaro, J.V.; Thompson, W.D.; Reeves, G.

    1997-09-15

    An automated leak test system for tritium shipping containers has been developed at Westinghouse Savannah River Co. (WSRC). The leak detection system employs a computer-controlled helium detector which prompts an operator to enter key information. The control software and the test equipment were both designed and manufactured at the Savannah River Technology Center within WSRC. Recertification Test: Every twelve months, the pressure vessel portion of the shipping container must undergo a rigorous recertification leak test. After an empty pressure vessel (shipping container) is assembled, it is placed into one of six stainless steel belljars for helium leak testing. The belljars are arranged in a row, much like an assembly line. Post-load Test: A post-load leak test is performed on reservoirs that have been filled with tritium and placed inside the shipping containers mentioned above. These leak tests are performed by a rate-of-rise method in which the area around the shipping container seals is evacuated, valved off from the vacuum pump, and the vacuum pressure is then monitored over a two-minute period. The Post-Load Leak Test is a quality verification test to ensure that the shipping container has been correctly assembled. 2 figs.
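    The rate-of-rise measurement described above admits a simple calculation: isolate the evacuated volume, record the pressure rise over the two-minute hold, and convert it to a leak rate. The test volume and acceptance limit below are illustrative assumptions; the article gives neither.

```python
def rate_of_rise_leak(p_start_torr, p_end_torr, hold_s, volume_l):
    """Leak rate in Torr*L/s from the pressure rise in an isolated volume."""
    return (p_end_torr - p_start_torr) * volume_l / hold_s

def passes(p_start_torr, p_end_torr, hold_s=120.0, volume_l=0.5,
           limit=1e-4):
    """Hypothetical acceptance check; the real limit is not stated in the text."""
    return rate_of_rise_leak(p_start_torr, p_end_torr, hold_s, volume_l) <= limit
```

    A pressure rise of 6 mTorr over the two-minute hold in a 0.5 L volume, for instance, corresponds to a leak rate of 2.5e-5 Torr*L/s.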

  13. Baculovirus expression system and method for high throughput expression of genetic material

    DOEpatents

    Clark, Robin; Davies, Anthony

    2001-01-01

    The present invention provides novel recombinant baculovirus expression systems for expressing foreign genetic material in a host cell. Such expression systems are readily adapted to an automated method for expressing foreign genetic material in a high-throughput manner. In other aspects, the present invention features a novel automated method for determining the function of foreign genetic material by transfecting it into a host by way of the recombinant baculovirus expression systems according to the present invention.

  14. Automated detection of retinal nerve fiber layer defects on fundus images: false positive reduction based on vessel likelihood

    NASA Astrophysics Data System (ADS)

    Muramatsu, Chisako; Ishida, Kyoko; Sawada, Akira; Hatanaka, Yuji; Yamamoto, Tetsuya; Fujita, Hiroshi

    2016-03-01

    Early detection of glaucoma is important to slow down or cease progression of the disease and for preventing total blindness. We have previously proposed an automated scheme for detection of retinal nerve fiber layer defect (NFLD), which is one of the early signs of glaucoma observed on retinal fundus images. In this study, a new multi-step detection scheme was included to improve detection of subtle and narrow NFLDs. In addition, new features were added to distinguish between NFLDs and blood vessels, which are frequent sites of false positives (FPs). The result was evaluated on a new test dataset consisting of 261 cases, including 130 cases with NFLDs. Using the proposed method, the initial detection rate was improved from 82% to 98%. At a sensitivity of 80%, the number of FPs per image was reduced from 4.25 to 1.36. The result indicates the potential usefulness of the proposed method for early detection of glaucoma.
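    The operating-point metric quoted above (FPs per image at 80% sensitivity) can be computed from scored candidate detections by sweeping a score threshold; the detection scores in the test below are invented for illustration.

```python
# Sketch of the evaluation metric: sweep a threshold over candidate
# detections and report false positives per image at the lowest
# threshold that reaches the target sensitivity.

def fp_per_image_at_sensitivity(dets, n_lesions, n_images, target_sens):
    """dets: list of (score, is_true_lesion). Returns FPs/image at the
    lowest threshold reaching target_sens, or None if never reached."""
    for score, _ in sorted(dets, reverse=True):
        kept = [d for d in dets if d[0] >= score]
        tp = sum(1 for _, ok in kept if ok)
        fp = len(kept) - tp
        if tp / n_lesions >= target_sens:
            return fp / n_images
    return None
```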

  15. Prototyping a genetics deductive database

    SciTech Connect

    Hearne, C.; Cui, Zhan; Parsons, S.; Hajnal, S.

    1994-12-31

    We are developing a laboratory notebook system known as the Genetics Deductive Database. Currently our prototype provides storage for biological facts and rules with flexible access via an interactive graphical display. We have introduced a formal basis for the representation and reasoning necessary to order genome map data and handle the uncertainty inherent in biological data. We aim to support laboratory activities by introducing an experiment planner into our prototype. The Genetics Deductive Database is built using new database technology which provides an object-oriented conceptual model, a declarative rule language, and a procedural update language. This combination of features allows the implementation of consistency maintenance, automated reasoning, and data verification.

  16. Automated ship image acquisition

    NASA Astrophysics Data System (ADS)

    Hammond, T. R.

    2008-04-01

    The experimental Automated Ship Image Acquisition System (ASIA) collects high-resolution ship photographs at a shore-based laboratory, with minimal human intervention. The system uses Automatic Identification System (AIS) data to direct a high-resolution SLR digital camera to ship targets and to identify the ships in the resulting photographs. The photo database is then searchable using the rich data fields from AIS, which include the name, type, call sign and various vessel identification numbers. The high-resolution images from ASIA are intended to provide information that can corroborate AIS reports (e.g., extract identification from the name on the hull) or provide information that has been omitted from the AIS reports (e.g., missing or incorrect hull dimensions, cargo, etc). Once assembled into a searchable image database, the images can be used for a wide variety of marine safety and security applications. This paper documents the author's experience with the practicality of composing photographs based on AIS reports alone, describing a number of ways in which this can go wrong, from errors in the AIS reports, to fixed and mobile obstructions and multiple ships in the shot. The frequency with which various errors occurred in automatically composed photographs collected in Halifax harbour in wintertime was determined by manual examination of the images. 45% of the images examined were considered of sufficient quality to read identification markings, numbers and text off the entire ship. One of the main technical challenges for ASIA lies in automatically differentiating good and bad photographs, so that few bad ones would be shown to human users. Initial attempts at automatic photo rating showed 75% agreement with manual assessments.

  17. Genetic control of Drosophila nerve cord development

    NASA Technical Reports Server (NTRS)

    Skeath, James B.; Thor, Stefan

    2003-01-01

    The Drosophila ventral nerve cord has been a central model system for studying the molecular genetic mechanisms that control CNS development. Studies show that the generation of neural diversity is a multistep process initiated by the patterning and segmentation of the neuroectoderm. These events act together with the process of lateral inhibition to generate precursor cells (neuroblasts) with specific identities, distinguished by the expression of unique combinations of regulatory genes. The expression of these genes in a given neuroblast restricts the fate of its progeny, by activating specific combinations of downstream genes. These genes in turn specify the identity of any given postmitotic cell, which is evident by its cellular morphology and choice of neurotransmitter.

  18. Toward fully automated genotyping: genotyping microsatellite markers by deconvolution.

    PubMed Central

    Perlin, M W; Lancia, G; Ng, S K

    1995-01-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely composed of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)n repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to determine the correct alleles accurately; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove the PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. PMID:7485172
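    The deconvolution idea can be illustrated with a toy model: assume each true allele leaks a known stutter fraction into shorter fragment lengths, and recover the allele weights by peeling from the longest band down. The stutter pattern and the peeling scheme here are illustrative assumptions, not the paper's actual algorithm.

```python
# Toy stutter deconvolution: each true allele contributes a known
# fraction of signal at its own length and at shorter "stutter" lengths.
# The pattern [1.0, 0.3, 0.1] is an illustrative assumption.

STUTTER = [1.0, 0.3, 0.1]   # contribution at length L, L-1, L-2

def convolve(alleles, n):
    """Forward model: band intensities from allele weights {length: weight}."""
    obs = [0.0] * n
    for length, w in alleles.items():
        for k, s in enumerate(STUTTER):
            if 0 <= length - k < n:
                obs[length - k] += w * s
    return obs

def deconvolve(obs):
    """Peel stutter from the longest band down; returns {length: weight}."""
    obs = list(obs)
    alleles = {}
    for length in range(len(obs) - 1, -1, -1):
        w = obs[length]
        if w > 1e-9:
            alleles[length] = w
            for k, s in enumerate(STUTTER):
                if length - k >= 0:
                    obs[length - k] -= w * s
    return alleles
```

    Peeling works because the longest observed band can only come from the longest true allele; subtracting that allele's stutter then exposes the next one, which is how overlapping stutter from closely spaced alleles gets resolved.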

  19. Multistep mass spectrometry methodology for direct characterization of polar lipids in green microalgae using paper spray ionization.

    PubMed

    Oradu, Sheran A; Cooks, R Graham

    2012-12-18

    Paper spray ionization, an ambient ionization method, has been applied for the identification of polar lipids in green microalgae with no sample preparation. A multistep experimental protocol was employed to characterize the lipid species of two microalgae strains, Kyo-Chlorella in tablet form and Nannochloropsis in paste form by mass spectrometry (MS). Tandem mass spectrometry (MS/MS) experiments using collision induced dissociation (CID) were employed for initial characterization of the detected lipid species, which were dominated by polar glycolipids and phospholipids. Product ion scan experiments were performed to determine the lipid head groups and fatty acid composition. Precursor ion scan experiments using fragment ions such as m/z 184, which is characteristic of the phosphocholine headgroup, were then used to confirm the lipid classification. Lipid elemental compositions were determined by exact mass measurements using high resolution mass spectrometry. Finally, the position of unsaturation was determined using reactive paper spray ionization experiments with ozone used as a reagent to cleave double bonds. Ozone was produced in situ using dielectric barrier discharge from a low temperature plasma, and it reacted in ambient air with the spray of ions produced by paper spray ionization. Using the precursor ion scan experiment, the resulting ozone cleavage product ions were used to determine the position of unsaturation for some of these species. By applying this experimental protocol, the molecular formulas and key aspects of the structures of glycerophosphocholines (PCs) such as 9Z-16:1/9Z,12Z-16:2 PC and 6Z,9Z-18:2/6Z,9Z,12Z-18:3PC and monogalactosyldiacylglycerols (MGDGs) such as 18:3/16:3MGDG were identified in the positive ion mode, while glycerophosphoglycerols (PGs) such as 18:3/16:0 PG and sulfoquinovosyldiacylglycerols (SQDGs) such as 18:3/16:0 SQDG were identified in the negative ion mode.

  20. A Novel Molten Salt Reactor Concept to Implement the Multi-Step Time-Scheduled Transmutation Strategy

    SciTech Connect

    Csom, Gyula; Feher, Sandor; Szieberthj, Mate

    2002-07-01

    Nowadays the molten salt reactor (MSR) concept seems to be reviving as one of the most promising systems for the realization of transmutation. In molten salt reactors and subcritical systems, the fuel and the material to be transmuted circulate dissolved in a molten salt. The main advantage of this reactor type is the possibility of continuous feed and reprocessing of the fuel. In the present paper a novel molten salt reactor concept is introduced and its transmutation capabilities are studied. The goal is the development of a transmutation technique, along with a device implementing it, which yields higher transmutation efficiencies than the known procedures and thus results in radioactive waste whose load on the environment is reduced both in magnitude and in time length. The procedure is multi-step time-scheduled transmutation, in which transformation is done in several consecutive steps of different neutron flux and spectrum. In the new MSR concept, named 'multi-region' MSR (MRMSR), the primary circuit is made up of a few separate loops, in which salt-fuel mixtures of different compositions are circulated. The loop sections constituting the core region are only neutronically and thermally coupled. This new concept makes possible the utilization of the spatial dependence of spectrum as well as the advantageous features of liquid fuel, such as the possibility of continuous chemical processing. In order to compare a 'conventional' MSR and the proposed MRMSR in terms of efficiency, preliminary calculation results are shown. Further calculations, aimed at finding the optimal implementation of this new concept and highlighting its other advantageous features, are ongoing. (authors)

  1. Evaluation and optimisation of phenomenological multi-step soot model for spray combustion under diesel engine-like operating conditions

    NASA Astrophysics Data System (ADS)

    Pang, Kar Mun; Jangi, Mehdi; Bai, Xue-Song; Schramm, Jesper

    2015-05-01

    In this work, a two-dimensional computational fluid dynamics study of an n-heptane combustion event and the associated soot formation process in a constant volume combustion chamber is reported. The key interest here is to evaluate the sensitivity of the chemical kinetics and submodels of a semi-empirical soot model in predicting the associated events. Numerical computation is performed using an open-source code, and a chemistry coordinate mapping approach is used to expedite the calculation. A library consisting of various phenomenological multi-step soot models is constructed and integrated with the spray combustion solver. Prior to the soot modelling, combustion simulations are carried out. Numerical results show that the ignition delay times and lift-off lengths exhibit good agreement with the experimental measurements across a wide range of operating conditions, apart from the cases with ambient temperature lower than 850 K. The variation of the soot precursor production with respect to the change of ambient oxygen levels qualitatively agrees with that of the conceptual models when the skeletal n-heptane mechanism is integrated with a reduced pyrene chemistry. Subsequently, a comprehensive sensitivity analysis is carried out to appraise the existing soot formation and oxidation submodels. It is revealed that the soot formation is captured when the surface growth rate is calculated using a square root function of the soot specific surface area and when a pressure-dependent model constant is considered. An optimised soot model is then proposed based on the knowledge gained through this exercise. With the implementation of the optimised model, the simulated soot onset and transport phenomena before reaching a quasi-steady state agree reasonably well with the experimental observations. Also, the variation of spatial soot distribution and soot mass produced at oxygen molar fractions ranging from 10.0 to 21.0% for both low and high density conditions is reproduced.
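    The optimised surface-growth term described above (growth proportional to the square root of the soot specific surface area, with a pressure-dependent model constant) can be sketched generically; the form C(p) = c0 * p**0.5 and all constants below are placeholders, not the paper's calibrated model.

```python
import math

def surface_growth_rate(spec_surface_area, pressure_bar,
                        c0=1.0e-2, p_exp=0.5):
    """Generic semi-empirical surface growth term: rate proportional to
    sqrt(specific surface area), scaled by a pressure-dependent model
    'constant' C(p) = c0 * p**p_exp. All constants are placeholders."""
    return c0 * pressure_bar ** p_exp * math.sqrt(spec_surface_area)
```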

  2. Astrocytes derived from p53-deficient mice provide a multistep in vitro model for development of malignant gliomas.

    PubMed Central

    Yahanda, A M; Bruner, J M; Donehower, L A; Morrison, R S

    1995-01-01

    Loss or mutation of p53 is thought to be an early event in the malignant transformation of many human astrocytic tumors. To better understand the role of p53 in their growth and transformation, we developed a model employing cultured neonatal astrocytes derived from mice deficient in one (p53 +/-) or both (p53 -/-) p53 alleles, comparing them with wild-type (p53 +/+) cells. Studies of in vitro and in vivo growth and transformation were performed, and flow cytometry and karyotyping were used to correlate changes in growth with genomic instability. Early-passage (EP) p53 -/- astrocytes achieved higher saturation densities and had more rapid growth than EP p53 +/- and +/+ cells. The EP p53 -/- cells were not transformed, as they were unable to grow in serum-free medium or in nude mice. With continued passaging, p53 -/- cells exhibited a multistep progression to a transformed phenotype. Late-passage p53 -/- cells achieved saturation densities 50 times higher than those of p53 +/+ cells and formed large, well-vascularized tumors in nude mice. p53 +/- astrocytes exhibited early loss of the remaining wild-type p53 allele and then evolved in a manner phenotypically similar to p53 -/- astrocytes. In marked contrast, astrocytes retaining both wild-type p53 alleles never exhibited a transformed phenotype and usually senesced after 7 to 10 passages. Dramatic alterations in ploidy and karyotype occurred and were restricted to cells deficient in wild-type p53 following repeated passaging. The results of these studies suggest that loss of wild-type p53 function promotes genomic instability, accelerated growth, and malignant transformation in astrocytes. PMID:7623819

  3. Rapid determination and chemical change tracking of benzoyl peroxide in wheat flour by multi-step IR macro-fingerprinting

    NASA Astrophysics Data System (ADS)

    Guo, Xiao-Xi; Hu, Wei; Liu, Yuan; Sun, Su-Qin; Gu, Dong-Chen; He, Helen; Xu, Chang-Hua; Wang, Xi-Chang

    2016-02-01

    BPO is often added to wheat flour as a flour improver, but its excessive use and safety for consumption are of increasing concern. A multi-step IR macro-fingerprinting approach was employed to identify BPO in wheat flour and unveil its changes during storage. BPO contained in wheat flour (< 3.0 mg/kg) was difficult to identify directly from infrared spectra, as the correlation coefficients between wheat flour and wheat flour samples containing BPO were all close to 0.98. By applying second derivative spectroscopy, obvious differences between wheat flour and wheat flour containing BPO before and after storage were disclosed in the range of 1500-1400 cm(-1). The peak at 1450 cm(-1), which belonged to BPO, was blue-shifted to 1453 cm(-1) (1455), which belonged to benzoic acid, after one week of storage, indicating that BPO changed into benzoic acid during storage. Moreover, when two-dimensional correlation infrared spectroscopy (2DCOS-IR) was used to track changes of BPO in wheat flour (0.05 mg/g) within one week, the intensities of auto-peaks at 1781 cm(-1) and 669 cm(-1), which belonged to BPO and benzoic acid, respectively, changed inversely, indicating that BPO was decomposed into benzoic acid. Another auto-peak at 1767 cm(-1), which does not belong to benzoic acid, was also rising simultaneously. By heating perturbation treatment of BPO in wheat flour based on 2DCOS-IR and spectral subtraction analysis, it was found that BPO in wheat flour not only decomposed into benzoic acid and benzoate, but also produced other deleterious substances, e.g., benzene. This study offers a promising, time-saving method with minimal pretreatment to identify BPO in wheat flour and its chemical products during storage in a holistic manner.
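    The screening criterion above (spectra whose correlation coefficient with pure wheat flour stays near 0.98) is, on the usual reading, a plain Pearson correlation between two absorbance vectors; a minimal sketch:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length spectra."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5
```

    Values near 1.0 mean the two spectra are nearly indistinguishable, which is why second-derivative and 2DCOS-IR steps are needed to resolve the small BPO contribution.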

  5. Evaluation of BRCA1 and BRCA2 mutation prevalence, risk prediction models and a multistep testing approach in French‐Canadian families with high risk of breast and ovarian cancer

    PubMed Central

    Simard, Jacques; Dumont, Martine; Moisan, Anne‐Marie; Gaborieau, Valérie; Vézina, Hélène; Durocher, Francine; Chiquette, Jocelyne; Plante, Marie; Avard, Denise; Bessette, Paul; Brousseau, Claire; Dorval, Michel; Godard, Béatrice; Houde, Louis; Joly, Yann; Lajoie, Marie‐Andrée; Leblanc, Gilles; Lépine, Jean; Lespérance, Bernard; Malouin, Hélène; Parboosingh, Jillian; Pichette, Roxane; Provencher, Louise; Rhéaume, Josée; Sinnett, Daniel; Samson, Carolle; Simard, Jean‐Claude; Tranchant, Martine; Voyer, Patricia; BRCAs, INHERIT; Easton, Douglas; Tavtigian, Sean V; Knoppers, Bartha‐Maria; Laframboise, Rachel; Bridge, Peter; Goldgar, David

    2007-01-01

    Background and objective In clinical settings with fixed resources allocated to predictive genetic testing for high‐risk cancer predisposition genes, optimal strategies for mutation screening programmes are critically important. These depend on the mutation spectrum found in the population under consideration and the frequency of mutations detected as a function of the personal and family history of cancer, which are both affected by the presence of founder mutations and demographic characteristics of the underlying population. The results of multistep genetic testing for mutations in BRCA1 or BRCA2 in a large series of families with breast cancer in the French‐Canadian population of Quebec, Canada are reported. Methods A total of 256 high‐risk families were ascertained from regional familial cancer clinics throughout the province of Quebec. Initially, families were tested for a panel of specific mutations known to occur in this population. Families in which no mutation was identified were then comprehensively tested. Three algorithms to predict the presence of mutations were evaluated, including the prevalence tables provided by Myriad Genetics Laboratories, the Manchester Scoring System and a logistic regression approach based on the data from this study. Results 8 of the 15 distinct mutations found in 62 BRCA1/BRCA2‐positive families had never been previously reported in this population, whereas 82% carried 1 of the 4 mutations currently observed in ⩾2 families. In the subset of 191 families in which at least 1 affected individual was tested, 29% carried a mutation. Of these 27 BRCA1‐positive and 29 BRCA2‐positive families, 48 (86%) were found to harbour a mutation detected by the initial test. Among the remaining 143 inconclusive families, all 8 families found to have a mutation after complete sequencing had Manchester Scores ⩾18. The logistic regression and Manchester Scores provided equal predictive power, and both were significantly better
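    The logistic-regression predictor compared above can be sketched as follows; the features and coefficients are hypothetical placeholders rather than the values fitted in the study.

```python
import math

# Hypothetical family-history features and coefficients; the study's
# fitted model is not reproduced here.
COEFS = {"intercept": -3.0, "breast_cases_under_50": 0.9,
         "ovarian_cases": 1.1, "male_breast_cancer": 1.4}

def mutation_probability(family):
    """Predicted probability that a family carries a BRCA1/BRCA2 mutation,
    via a logistic model over counts of cancer cases in the family."""
    z = COEFS["intercept"] + sum(COEFS[k] * family.get(k, 0)
                                 for k in COEFS if k != "intercept")
    return 1.0 / (1.0 + math.exp(-z))
```

    In a fixed-resource screening programme, families above a chosen probability cutoff would proceed to comprehensive testing, mirroring the multistep strategy evaluated in the abstract.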

  6. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step, multi-day bioassay. As described in the IAEA manual, DCA can be used to assess dose quite accurately up to 4-6 weeks post-exposure, but throughput remains a major issue and automation is essential. Throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with internationally harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOPs) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper are intended to bridge the current technological gaps and enhance the potential application of DCA for dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA toward critically needed increases in throughput.

  7. Automation: Decision Aid or Decision Maker?

    NASA Technical Reports Server (NTRS)

    Skitka, Linda J.

    1998-01-01

    This study clarified that automation bias is something unique to automated decision making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the level of reliability of the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.

  8. Space power subsystem automation technology

    NASA Technical Reports Server (NTRS)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  9. Automated mapping of hammond's landforms

    USGS Publications Warehouse

    Gallant, A.L.; Brown, D.D.; Hoffer, R.M.

    2005-01-01

    We automated a method for mapping Hammond's landforms over large landscapes using digital elevation data. We compared our results against Hammond's published landform maps, derived using manual interpretation procedures. We found general agreement in landform patterns mapped by the manual and the automated approaches, and very close agreement in characterization of local topographic relief. The two approaches produced different interpretations of intermediate landforms, which relied upon quantification of the proportion of landscape having gently sloping terrain. This type of computation is applied more efficiently and consistently by a computer than by a human. Today's ready access to digital data and computerized geospatial technology provides a good foundation for mapping terrain features, but the mapping criteria guiding manual techniques in the past may not be appropriate for automated approaches. We suggest that future efforts center on the advantages offered by digital advancements in refining an approach to better characterize complex landforms. © 2005 IEEE.
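
    The "proportion of landscape having gently sloping terrain" computation mentioned above is straightforward to automate from a digital elevation model. The sketch below is illustrative only; the 8% slope threshold and 30 m cell size are assumed values, not Hammond's published criteria:

```python
import numpy as np

def gentle_slope_fraction(dem, cell_size=30.0, threshold_pct=8.0):
    """Fraction of a DEM with slope below a percent-slope threshold.

    dem: 2-D elevation array (m); cell_size: grid spacing (m).
    The 8% cutoff and 30 m spacing are illustrative assumptions,
    not Hammond's exact criteria."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)          # finite-difference slopes
    slope_pct = 100.0 * np.sqrt(dz_dx**2 + dz_dy**2)    # percent slope per cell
    return float(np.mean(slope_pct < threshold_pct))

# A plane rising 1 m per 30 m cell (about 3.3% slope) is entirely 'gentle':
plane = np.outer(np.arange(50, dtype=float), np.ones(50))
print(gentle_slope_fraction(plane))  # 1.0
```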

  10. Intelligent software for laboratory automation.

    PubMed

    Whelan, Ken E; King, Ross D

    2004-09-01

    The automation of laboratory techniques has greatly increased the number of experiments that can be carried out in the chemical and biological sciences. Until recently, this automation has focused primarily on improving hardware. Here we argue that future advances will concentrate on intelligent software to integrate physical experimentation and results analysis with hypothesis formulation and experiment planning. To illustrate our thesis, we describe the 'Robot Scientist' - the first physically implemented example of such a closed loop system. In the Robot Scientist, experimentation is performed by a laboratory robot, hypotheses concerning the results are generated by machine learning and experiments are allocated and selected by a combination of techniques derived from artificial intelligence research. The performance of the Robot Scientist has been evaluated by a rediscovery task based on yeast functional genomics. The Robot Scientist is proof that the integration of programmable laboratory hardware and intelligent software can be used to develop increasingly automated laboratories.

  11. Visual automated macromolecular model building.

    PubMed

    Langer, Gerrit G; Hazledine, Saul; Wiegels, Tim; Carolan, Ciaran; Lamzin, Victor S

    2013-04-01

    Automated model-building software aims at the objective interpretation of crystallographic diffraction data by means of the construction or completion of macromolecular models. Automated methods have rapidly gained in popularity as they are easy to use and generate reproducible and consistent results. However, the process of model building has become increasingly hidden and the user is often left to decide on how to proceed further with little feedback on what has preceded the output of the built model. Here, ArpNavigator, a molecular viewer tightly integrated into the ARP/wARP automated model-building package, is presented that directly controls model building and displays the evolving output in real time in order to make the procedure transparent to the user.

  12. Automated Approaches to RFI Flagging

    NASA Astrophysics Data System (ADS)

    Garimella, Karthik; Momjian, Emmanuel

    2017-01-01

    Radio Frequency Interference (RFI) is a major issue in centimeter-wavelength radio astronomy. Radio astronomy software packages include both manual and automated tools to excise RFI utilizing the visibilities (the uv data). Here we present results on an automated RFI flagging approach that utilizes a uv-grid, which is the intermediate product when converting uv data points to an image. Any signal that appears widespread in a given domain (e.g., the image domain) is compact in the Fourier domain (the uv-grid domain); i.e., RFI sources that appear as large-scale structures (e.g., stripes) in images can be located and flagged using the uv-grid data set. We developed several automated uv-grid based flagging algorithms to detect and excise RFI. These algorithms will be discussed, and results of applying them to measurement sets will be presented.
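
    The image-domain/Fourier-domain duality described above can be sketched in a few lines: a stripe pattern that fills the whole image collapses into a handful of bright uv-grid cells, which a robust threshold can flag. This is a toy illustration, not the authors' algorithm; the 6-sigma median/MAD threshold is an assumed parameter:

```python
import numpy as np

def flag_uv_outliers(image, nsigma=6.0):
    """Illustrative uv-grid flagger: widespread stripes in the image
    become compact bright cells in the 2-D Fourier (uv) plane, detected
    here with a robust median/MAD amplitude threshold."""
    uv = np.fft.fftshift(np.fft.fft2(image))
    amp = np.abs(uv)
    med = np.median(amp)
    mad = np.median(np.abs(amp - med)) + 1e-12   # 1.4826*MAD ~ one sigma
    flags = amp > med + nsigma * 1.4826 * mad
    return flags, uv

# Horizontal stripes (mimicking RFI) concentrate into a few uv cells:
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
stripes = np.sin(8 * x)[:, None] * np.ones(128)
rng = np.random.default_rng(0)
flags, _ = flag_uv_outliers(stripes + 0.01 * rng.standard_normal((128, 128)))
print(flags.sum())  # the stripe power is confined to a few flagged cells
```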

  13. Automated power management and control

    NASA Technical Reports Server (NTRS)

    Dolce, James L.

    1991-01-01

    A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. The initial station operation will use ground-based dispatches to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.

  14. Automated UV process analyzers/distributed control boost emission control process efficiency

    SciTech Connect

    Fabre, M.C.

    1987-10-01

    The Marathon Petroleum Company refinery in Garyville, LA, refines more than 200,000 bbl/day of crude oil. Waste process gases (H₂S and NH₃) are handled by a single system. Emission control efficiency and reliability needed to be improved in the H₂S and NH₃ acid-gas conversion process. To maintain the EPA emission maximum of only 10 ppm H₂S, the process required almost continuous manual inspection. The need for frequent optical measurements, the susceptibility to process upset due to human error or steam variances, and stream overloading problems combined to make the process unreliable. In its ongoing effort to ensure maximum emission control efficiency, Marathon retrofitted the process with an automated self-diagnostic treatment and monitoring system in 1986. The multistep treatment process controls and treats Marathon's acid-gas by-product through two existing Claus process units and SO₂-to-H₂S converters, a desuperheater, an amine scrubber and a thermal oxidizer. Critical to maintaining both the stack emission control and the efficiency of the process are a pair of automated UV-photometric analyzers. The instruments were incorporated to monitor the gas streams and to fine-tune the process equipment (through the plant's existing distributed control system) to meet variable operating conditions. Since the retrofitted treatment and monitoring system became operational, Marathon has eliminated the compliance-reporting problems that had formerly plagued the plant. Stack efficiency (measured as stream content of SO₂) has been consistently maintained at 50% or less of the allowable EPA maximum. By automating the analysis procedures, little hands-on or visual maintenance, sample testing, calibration, and report-preparation time is required, saving an estimated 60% in yearly operations and maintenance costs.

  15. BOA: Framework for automated builds

    SciTech Connect

    N. Ratnikova et al.

    2003-09-30

    Managing large-scale software products is a complex software engineering task. Automation of the software development, release and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms and a distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and the components that have been developed in CMS to facilitate software maintenance and improve software quality. The system allows users to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of existing versions.

  16. Advanced automation for space missions

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr.; Healy, T. J.; Long, J. E.

    1982-01-01

    A NASA/ASEE Summer Study conducted at the University of Santa Clara in 1980 examined the feasibility of using advanced artificial intelligence and automation technologies in future NASA space missions. Four candidate applications missions were considered: (1) An intelligent earth-sensing information system, (2) an autonomous space exploration system, (3) an automated space manufacturing facility, and (4) a self-replicating, growing lunar factory. The study assessed the various artificial intelligence and machine technologies which must be developed if such sophisticated missions are to become feasible by century's end.

  17. Automated Tools for Subject Matter Expert Evaluation of Automated Scoring

    ERIC Educational Resources Information Center

    Williamson, David M.; Bejar, Isaac I.; Sax, Anne

    2004-01-01

    As automated scoring of complex constructed-response examinations reaches operational status, the process of evaluating the quality of resultant scores, particularly in contrast to scores of expert human graders, becomes as complex as the data itself. Using a vignette from the Architectural Registration Examination (ARE), this article explores the…

  18. Automation U.S.A.: Overcoming Barriers to Automation.

    ERIC Educational Resources Information Center

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  19. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the "Autojuggie" showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. In some easy-access environments, this device could

  20. ASteCA: Automated Stellar Cluster Analysis

    NASA Astrophysics Data System (ADS)

    Perren, G. I.; Vázquez, R. A.; Piatti, A. E.

    2015-04-01

    We present the Automated Stellar Cluster Analysis package (ASteCA), a suite of tools designed to fully automate the standard tests applied to stellar clusters to determine their basic parameters. The set of functions included in the code make use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing, through a statistical estimator, its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field-star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone-fitting process is also present, based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm; this allows ASteCA to provide accurate estimates of a cluster's metallicity, age, extinction and distance, along with their uncertainties. To validate the code we applied it to a large set of over 400 synthetic MASSCLEAN clusters with varying degrees of field star contamination, as well as a smaller set of 20 observed Milky Way open clusters (Berkeley 7, Bochum 11, Czernik 26, Czernik 30, Haffner 11, Haffner 19, NGC 133, NGC 2236, NGC 2264, NGC 2324, NGC 2421, NGC 2627, NGC 6231, NGC 6383, NGC 6705, Ruprecht 1, Tombaugh 1, Trumpler 1, Trumpler 5 and Trumpler 14) studied in the literature. The results show that ASteCA is able to recover cluster parameters with acceptable precision even for clusters affected by substantial field star contamination. ASteCA is written in Python and is made available as open source code that can be downloaded, ready to use, from its official site.
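
    The genetic-algorithm fitting step described above can be illustrated with a minimal sketch. This is not ASteCA's implementation; the operators (elitist selection, averaging crossover, Gaussian mutation) and all parameter values are illustrative assumptions, here minimizing a toy two-parameter fit statistic:

```python
import random

def genetic_minimize(fitness, bounds, pop_size=40, generations=60,
                     mutation=0.05, seed=1):
    """Toy genetic algorithm in the spirit of ASteCA's isochrone fit:
    candidate parameter vectors (e.g. metallicity, log age) evolve by
    elitist selection, averaging crossover and Gaussian mutation toward
    the lowest fitness value. All details here are illustrative."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness)[: pop_size // 4]  # best quarter survives
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)                    # two elite parents
            child = [(x + y) / 2 for x, y in zip(a, b)]    # averaging crossover
            for i, (lo, hi) in enumerate(bounds):          # Gaussian mutation, clamped
                child[i] += rng.gauss(0, mutation * (hi - lo))
                child[i] = min(max(child[i], lo), hi)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Recover the minimum of a simple quadratic 'fit statistic' at (0.02, 9.5):
best = genetic_minimize(lambda p: (p[0] - 0.02)**2 + (p[1] - 9.5)**2,
                        bounds=[(0.0, 0.05), (6.0, 10.2)])
print(best)  # close to [0.02, 9.5]
```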

  1. Multi-step Monte Carlo calculations applied to nuclear reactor instrumentation - source definition and renormalization to physical values

    SciTech Connect

    Radulovic, Vladimir; Barbot, Loic; Fourmentel, Damien; Villard, Jean-Francois; Snoj, Luka; Zerovnik, Gasper; Trkov, Andrej

    2015-07-01

    Significant efforts have been made over the last few years in the French Alternative Energies and Atomic Energy Commission (CEA) to adopt multi-step Monte Carlo calculation schemes in the investigation and interpretation of the response of nuclear reactor instrumentation detectors (e.g. miniature ionization chambers - MICs and self-powered neutron or gamma detectors - SPNDs and SPGDs). The first step consists of the calculation of the primary data, i.e. evaluation of the neutron and gamma flux levels and spectra in the environment where the detector is located, using a computational model of the complete nuclear reactor core and its surroundings. These data are subsequently used to define sources for the following calculation steps, in which only a model of the detector under investigation is used. This approach enables calculations with satisfactory statistical uncertainties (of the order of a few %) within regions which are very small in size (the typical volume of which is of the order of 1 mm³). The main drawback of a calculation scheme as described above is that perturbation effects on the radiation conditions caused by the detectors themselves are not taken into account. Depending on the detector, the nuclear reactor and the irradiation position, the perturbation in the neutron flux as primary data may reach 10 to 20%. A further issue is whether the model used in the second step calculations yields physically representative results. This is generally not the case, as significant deviations may arise, depending on the source definition. In particular, as presented in the paper, the injudicious use of special options aimed at increasing the computation efficiency (e.g. reflective boundary conditions) may introduce unphysical bias in the calculated flux levels and distortions in the spectral shapes. This paper presents examples of the issues described above related to a case study on the interpretation of the signal from different types of SPNDs, which

  2. Evaluation of glycodendron and synthetically modified dextran clearing agents for multistep targeting of radioisotopes for molecular imaging and radioimmunotherapy.

    PubMed

    Cheal, Sarah M; Yoo, Barney; Boughdad, Sarah; Punzalan, Blesida; Yang, Guangbin; Dilhas, Anna; Torchon, Geralda; Pu, Jun; Axworthy, Don B; Zanzonico, Pat; Ouerfelli, Ouathek; Larson, Steven M

    2014-02-03

    A series of N-acetylgalactosamine-dendrons (NAG-dendrons) and dextrans bearing biotin moieties were compared for their ability to complex with and sequester circulating bispecific antitumor antibody streptavidin fusion protein (scFv4-SA) in vivo, to improve tumor-to-normal tissue concentration ratios for multistep targeted (MST) radioimmunotherapy and diagnosis. Specifically, a total of five NAG-dendrons employing a common synthetic scaffold structure containing 4, 8, 16, or 32 carbohydrate residues and a single biotin moiety were prepared (NAGB), and for comparative purposes, a biotinylated-dextran with an average molecular weight of 500 kD was synthesized from amino-dextran (DEXB). One of the NAGB compounds, CA16, has been investigated in humans; our aim was to determine if other NAGB analogues (e.g., CA8 or CA4) were bioequivalent to CA16 and/or better suited as MST reagents. In vivo studies included dynamic positron-emission tomography (PET) imaging of (124)I-labeled-scFv4-SA clearance and dual-label biodistribution studies following MST directed at subcutaneous (s.c.) human colon adenocarcinoma xenografts in mice. The MST protocol consists of three injections: first, a scFv4-SA specific for an antitumor-associated glycoprotein (TAG-72); second, CA16 or other clearing agent; and third, radiolabeled biotin. We observed using PET imaging of the (124)I-labeled-scFv4-SA clearance that the spatial arrangement of ligands conjugated to NAG (i.e., biotin linked with an extended spacer, referred to herein as long-chain (LC)) can impact the binding to the antibody in circulation and subsequent liver uptake of the NAG-antibody complex. Also, NAGB CA32-LC or CA16-LC can be utilized during MST to achieve comparable tumor-to-blood ratios and absolute tumor uptake seen previously with CA16. Finally, DEXB was equally effective as NAGB CA32-LC at lowering scFv4-SA in circulation, but at the expense of reducing absolute tumor uptake of radiolabeled biotin.

  3. Paxillin-dependent paxillin kinase linker and p21-activated kinase localization to focal adhesions involves a multistep activation pathway.

    PubMed

    Brown, Michael C; West, Kip A; Turner, Christopher E

    2002-05-01

    The precise temporal-spatial regulation of the p21-activated serine-threonine kinase PAK at the plasma membrane is required for proper cytoskeletal reorganization and cell motility. However, the mechanism by which PAK localizes to focal adhesions has not yet been elucidated. Indirect binding of PAK to the focal adhesion protein paxillin via the Arf-GAP protein paxillin kinase linker (PKL) and PIX/Cool suggested a mechanism. In this report, we demonstrate an essential role for a paxillin-PKL interaction in the recruitment of activated PAK to focal adhesions. Similar to PAK, expression of activated Cdc42 and Rac1, but not RhoA, stimulated the translocation of PKL from a generally diffuse localization to focal adhesions. Expression of the PAK regulatory domain (PAK1-329) or the autoinhibitory domain (AID 83-149) induced PKL, PIX, and PAK localization to focal adhesions, indicating a role for PAK scaffold activation. We show PIX, but not NCK, binding to PAK is necessary for efficient focal adhesion localization of PAK and PKL, consistent with a PAK-PIX-PKL linkage. Although PAK activation is required, it is not sufficient for localization. The PKL amino terminus, containing the PIX-binding site, but lacking paxillin-binding subdomain 2 (PBS2), was unable to localize to focal adhesions and also abrogated PAK localization. An identical result was obtained after PKLDeltaPBS2 expression. Finally, neither PAK nor PKL was capable of localizing to focal adhesions in cells overexpressing paxillinDeltaLD4, confirming a requirement for this motif in recruitment of the PAK-PIX-PKL complex to focal adhesions. These results suggest a GTP-Cdc42/GTP-Rac triggered multistep activation cascade leading to the stimulation of the adaptor function of PAK, which through interaction with PIX provokes a functional PKL PBS2-paxillin LD4 association and consequent recruitment to focal adhesions. This mechanism is probably critical for the correct subcellular positioning of PAK, thereby

  4. Automation of existing natural gas compressor stations

    SciTech Connect

    Little, J.E.

    1986-05-01

    ANR Pipeline Co., in automating 20 major compressor stations in 20 months' time, standardized on hardware and software design. In this article, the author tells how off-the-shelf automation was used and how the systems work.

  5. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  6. The automation and evaluation of nested clade phylogeographic analysis.

    PubMed

    Panchal, Mahesh; Beaumont, Mark A

    2007-06-01

    Nested clade phylogeographic analysis (NCPA) is a popular method for reconstructing the demographic history of spatially distributed populations from genetic data. Although some parts of the analysis are automated, there is no unique and widely followed algorithm for doing this in its entirety, beginning with the data, and ending with the inferences drawn from the data. This article describes a method that automates NCPA, thereby providing a framework for replicating analyses in an objective way. To do so, a number of decisions need to be made so that the automated implementation is representative of previous analyses. We review how the NCPA procedure has evolved since its inception and conclude that there is scope for some variability in the manual application of NCPA. We apply the automated software to three published datasets previously analyzed manually and replicate many details of the manual analyses, suggesting that the current algorithm is representative of how a typical user will perform NCPA. We simulate a large number of replicate datasets for geographically distributed, but entirely random-mating, populations. These are then analyzed using the automated NCPA algorithm. Results indicate that NCPA tends to give a high frequency of false positives. In our simulations we observe that 14% of the clades give a conclusive inference that a demographic event has occurred, and that 75% of the datasets have at least one clade that gives such an inference. This is mainly due to the generation of multiple statistics per clade, of which only one is required to be significant to apply the inference key. We survey the inferences that have been made in recent publications and show that the most commonly inferred processes (restricted gene flow with isolation by distance and contiguous range expansion) are those that are commonly inferred in our simulations. However, published datasets typically yield a richer set of inferences with NCPA than obtained in our random
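
    The multiple-statistics effect noted above, where only one of several statistics per clade must be significant to trigger the inference key, is a classic multiple-comparisons inflation that is easy to reproduce by simulation. The choice of four statistics per clade below is an illustrative assumption, as is the independence of the statistics:

```python
import random

def p_any_significant(n_stats, alpha=0.05, trials=100_000, seed=0):
    """Monte Carlo estimate of the per-clade false-positive rate when any
    one of n_stats independent null statistics significant at level alpha
    suffices to trigger an inference (independence is a simplification)."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < alpha for _ in range(n_stats))
        for _ in range(trials)
    )
    return hits / trials

# With, e.g., four statistics per clade, the per-clade rate is roughly
# 1 - 0.95**4, i.e. about 18.5%, far above the nominal 5%:
print(p_any_significant(4))
```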

  7. Automated RNA Extraction and Purification for Multiplexed Pathogen Detection

    SciTech Connect

    Bruzek, Amy K.; Bruckner-Lea, Cindy J.

    2005-01-01

    Pathogen detection has become an extremely important part of our nation's defense in this post-9/11 world, where the threat of bioterrorist attack is a grim reality. When a biological attack takes place, response time is critical. The faster the biothreat is assessed, the faster countermeasures can be put in place to protect the health of the general public. Today some of the most widely used methods for detecting pathogens are either time consuming or not reliable [1]. Therefore, a method that can detect multiple pathogens and that is inherently reliable, rapid, automated and field portable is needed. To that end, we are developing automated fluidics systems for the recovery, cleanup, and direct labeling of community RNA from suspect environmental samples. The advantage of using RNA for detection is that there are multiple copies of mRNA in a cell, whereas there are normally only one or two copies of DNA [2]. Because there are multiple copies of mRNA in a cell for highly expressed genes, no amplification of the genetic material may be necessary, and thus rapid and direct detection of only a few cells may be possible [3]. This report outlines the development of both manual and automated methods for the extraction and purification of mRNA. The methods were evaluated using cell lysates from Escherichia coli 25922 (nonpathogenic), Salmonella typhimurium (pathogenic), and Shigella spp. (pathogenic). Automated RNA purification was achieved using a custom sequential-injection fluidics system consisting of a syringe pump, a multi-port valve and a magnetic capture cell. mRNA was captured using silica-coated superparamagnetic beads that were trapped in the tubing by a rare earth magnet. RNA was detected by gel electrophoresis and/or by hybridization of the RNA to microarrays. The versatility of the fluidics systems and the ability to automate these systems allow for quick and easy processing of samples and eliminate the need for an experienced operator.

  8. Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Sindern, Sven; Meyer, F. Michael

    2016-09-01

    Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become

  9. Ask the experts: automation: part I.

    PubMed

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  10. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

    OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessments of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are examples of areas where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis of abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  11. Fatal sepsis caused by an unusual Klebsiella species that was misidentified by an automated identification system.

    PubMed

    Seki, Masafumi; Gotoh, Kazuyoshi; Nakamura, Shota; Akeda, Yukihiro; Yoshii, Tadashi; Miyaguchi, Shinichi; Inohara, Hidenori; Horii, Toshihiro; Oishi, Kazunori; Iida, Tetsuya; Tomono, Kazunori

    2013-05-01

    This is a description of fatal sepsis caused by infection with Klebsiella variicola, which is an isolate genetically related to Klebsiella pneumoniae. The patient's condition was incorrectly diagnosed as common sepsis caused by K. pneumoniae, which was identified using an automated identification system, but next-generation sequencing and the non-fermentation of adonitol finally identified the cause of sepsis as K. variicola.

  12. Formative Automated Computer Testing (FACT).

    ERIC Educational Resources Information Center

    Hunt, Nicoll; Hughes, Janet; Rowe, Glenn

    2002-01-01

    Describes the development of a tool, FACT (Formative Automated Computer Testing), to formatively assess information technology skills of college students in the United Kingdom. Topics include word processing competency; tests designed by tutors and delivered via a network; and results of an evaluation that showed students preferred automated…

  13. Automated calculation and simulation systems

    NASA Astrophysics Data System (ADS)

    Ohl, Thorsten

    2003-04-01

    I briefly summarize the parallel sessions on Automated Calculation and Simulation Systems for high-energy particle physics phenomenology at ACAT 2002 (Moscow State University, June 2002), present a short overview of the current status of the field, and try to identify the important trends.

  14. Automating the conflict resolution process

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are the current approach to resolving resource conflicts and the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  15. Automated Accounting. Payroll. Instructor Module.

    ERIC Educational Resources Information Center

    Moses, Duane R.

    This teacher's guide was developed to assist business instructors using Dac Easy Accounting Payroll Version 3.0 edition software in their accounting programs. The module contains assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting--payroll. Basic accounting skills are…

  16. Office Automation in Student Affairs.

    ERIC Educational Resources Information Center

    Johnson, Sharon L.; Hamrick, Florence A.

    1987-01-01

    Offers recommendations to assist in introducing or expanding computer assistance in student affairs. Describes need for automation and considers areas of choosing hardware and software, funding and competitive bidding, installation and training, and system management. Cites greater efficiency in handling tasks and data and increased levels of…

  17. Safety in the Automated Office.

    ERIC Educational Resources Information Center

    Graves, Pat R.; Greathouse, Lillian R.

    1990-01-01

    Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)

  18. Automated analysis of oxidative metabolites

    NASA Technical Reports Server (NTRS)

    Furner, R. L. (Inventor)

    1974-01-01

    An automated system for the study of drug metabolism is described. The system monitors the oxidative metabolites of aromatic amines and of compounds which produce formaldehyde on oxidative dealkylation. It includes color developing compositions suitable for detecting hydroxylated aromatic amines and formaldehyde.

  19. Automated ac galvanomagnetic measurement system

    NASA Technical Reports Server (NTRS)

    Szofran, F. R.; Espy, P. N.

    1985-01-01

    An automated, ac galvanomagnetic measurement system is described. Hall or van der Pauw measurements in the temperature range 10-300 K can be made at a preselected magnetic field without operator attendance. Procedures to validate sample installation and correct operation of other system functions, such as magnetic field and thermometry, are included. Advantages of ac measurements are discussed.

  20. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  1. Automation on the Laboratory Bench.

    ERIC Educational Resources Information Center

    Legrand, M.; Foucard, A.

    1978-01-01

    A kit is described for use in automation of routine chemical research procedures. The kit uses sensors to evaluate the state of the system, actuators which modify the adjustable parameters, and an organ of decision which uses the information from the sensors. (BB)

  2. Automated Solar-Array Assembly

    NASA Technical Reports Server (NTRS)

    Soffa, A.; Bycer, M.

    1982-01-01

    Large arrays are rapidly assembled from individual solar cells by an automated production line developed for NASA's Jet Propulsion Laboratory. The apparatus positions cells within the array, attaches interconnection tabs, applies solder flux, and solders interconnections. Cells are placed in either straight or staggered configurations and may be connected either in series or in parallel. Tabs are attached at a rate of one every 5 seconds.

  3. Automating a High School Restroom.

    ERIC Educational Resources Information Center

    Ritner-Heir, Robbin

    1999-01-01

    Discusses how one high school transformed its restrooms into cleaner and more vandal-resistant environments by automating them. Solutions discussed include installing perforated stainless steel panel ceilings, using epoxy-based paint for walls, selecting china commode fixtures instead of stainless steel, installing electronic faucets and sensors,…

  4. Automated species identification: why not?

    PubMed Central

    Gaston, Kevin J; O'Neill, Mark A

    2004-01-01

    Where possible, automation has been a common response of humankind to many activities that have to be repeated numerous times. The routine identification of specimens of previously described species has many of the characteristics of other activities that have been automated, and poses a major constraint on studies in many areas of both pure and applied biology. In this paper, we consider some of the reasons why automated species identification has not become widely employed, and whether it is a realistic option, addressing the notions that it is too difficult, too threatening, too different or too costly. Although recognizing that there are some very real technical obstacles yet to be overcome, we argue that progress in the development of automated species identification is extremely encouraging and that such an approach has the potential to make a valuable contribution to reducing the burden of routine identifications. Vision and enterprise are perhaps more limiting at present than practical constraints on what might possibly be achieved. PMID:15253351

  5. Teacherbot: Interventions in Automated Teaching

    ERIC Educational Resources Information Center

    Bayne, Sian

    2015-01-01

    Promises of "teacher-light" tuition and of enhanced "efficiency" via the automation of teaching have been with us since the early days of digital education, sometimes embraced by academics and institutions, and sometimes resisted as a set of moves which are damaging to teacher professionalism and to the humanistic values of…

  6. Automation of Space Inventory Management

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W.; Ngo, Phong; Wagner, Raymond; Barton, Richard; Gifford, Kevin

    2009-01-01

    This viewgraph presentation describes the utilization of automated space-based inventory management through handheld RFID readers and BioNet Middleware. The contents include: 1) Space-Based Inventory Management; 2) Real-Time RFID Location and Tracking; 3) Surface Acoustic Wave (SAW) RFID; and 4) BioNet Middleware.

  7. Automation; The New Industrial Revolution.

    ERIC Educational Resources Information Center

    Arnstein, George E.

    Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…

  8. Library Automation: Guidelines to Costing.

    ERIC Educational Resources Information Center

    Ford, Geoffrey

    As with all new programs, the costs associated with library automation must be carefully considered before implementation. This document suggests guidelines to be followed and areas to be considered in the costing of library procedures. An existing system model has been suggested as a standard (Appendix A) and a classification of library tasks…

  9. Office Automation, Personnel and the New Technology.

    ERIC Educational Resources Information Center

    Magnus, Margaret

    1980-01-01

    At the first annual Office Automation Conference, the consensus was that personnel involvement in the development of office automation is vital if the new technology is to be successfully deployed. This report explores the problems inherent in office automation and provides a broad overview of the subject. (CT)

  10. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  11. Library Automation in the Netherlands and Pica.

    ERIC Educational Resources Information Center

    Bossers, Anton; Van Muyen, Martin

    1984-01-01

    Describes the Pica Library Automation Network (originally the Project for Integrated Catalogue Automation), which is based on a centralized bibliographic database. Highlights include the Pica conception of library automation, online shared cataloging system, circulation control system, acquisition system, and online Dutch union catalog with…

  12. Automated Assessment in Massive Open Online Courses

    ERIC Educational Resources Information Center

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as a learning and solution validation tool. This approach allows the use of automated individual variant generation and automated solution checks without involving the course…

  13. Does Automated Feedback Improve Writing Quality?

    ERIC Educational Resources Information Center

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  14. You're a What? Automation Technician

    ERIC Educational Resources Information Center

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  15. Genetic counseling

    MedlinePlus

    ... this page: //medlineplus.gov/ency/patientinstructions/000510.htm Genetic counseling To use the sharing features on this ... cystic fibrosis or Down syndrome. Who May Want Genetic Counseling? It is up to you whether or ...

  16. Genetic Disorders

    MedlinePlus

    ... This can cause a medical condition called a genetic disorder. You can inherit a gene mutation from ... during your lifetime. There are three types of genetic disorders: Single-gene disorders, where a mutation affects ...

  17. Genetic modification and genetic determinism.

    PubMed

    Resnik, David B; Vorhaus, Daniel B

    2006-06-26

    In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound. Serious discussion of the morality of genetic modification, and the development of sound science policy, should be driven by arguments that address the actual consequences of genetic modification for individuals and society, not by ones propped up by false or misleading biological assumptions.

  18. Genetic modification and genetic determinism

    PubMed Central

    Resnik, David B; Vorhaus, Daniel B

    2006-01-01

    In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound. Serious discussion of the morality of genetic modification, and the development of sound science policy, should be driven by arguments that address the actual consequences of genetic modification for individuals and society, not by ones propped up by false or misleading biological assumptions. PMID:16800884

  19. Imaging Genetics

    ERIC Educational Resources Information Center

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  20. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    PubMed

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the
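    The normalization step described above boils down to computing, per sample, the template and diluent volumes that deliver a fixed DNA mass into a fixed reaction volume. A minimal sketch of that arithmetic (the function name, target mass, and volumes are hypothetical illustrations, not the Normalization Wizard's actual parameters):

    ```python
    def normalization_plan(concs_ng_per_ul, target_ng=1.0, final_vol_ul=25.0):
        """For each sample concentration, compute (DNA volume, diluent volume)
        that deliver `target_ng` of template in a `final_vol_ul` reaction.
        Samples too dilute to reach the target are dispensed neat."""
        plan = []
        for c in concs_ng_per_ul:
            if c <= target_ng / final_vol_ul:
                plan.append((final_vol_ul, 0.0))   # too dilute: use full volume undiluted
            else:
                dna = target_ng / c                # volume containing the target mass
                plan.append((round(dna, 2), round(final_vol_ul - dna, 2)))
        return plan
    ```

    A plate-wide method would then drive the liquid handler from this per-well plan.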

  1. Knowledge systems support for mission operations automation

    NASA Astrophysics Data System (ADS)

    Atkinson, David J.

    1990-10-01

    A knowledge system which utilizes artificial intelligence technology to automate a subset of real-time mission operations functions is described. An overview of spacecraft telecommunications operations at the Jet Propulsion Laboratory (JPL) highlights requirements for automation. The knowledge system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), developed to explore methods for automated health and status analysis, is outlined. The advantages of the system were demonstrated during the spacecraft's encounter with the planet Neptune. The design of the fault detection and diagnosis portions of SHARP is discussed. The performance of SHARP during the encounter is discussed, along with issues and benefits arising from applying knowledge systems to mission operations automation.

  2. Role of automation in new instrumentation.

    PubMed

    Johnson, C A

    1993-04-01

    In recent years there has been an unprecedented increase in the development of automated instrumentation for ophthalmic diagnostic and assessment purposes. An important part of this growth in automated clinical ophthalmic instrumentation has been directed to perimetry and visual field testing. In less than 15 years automated perimetry has advanced from a laboratory curiosity to become the standard for clinical visual field testing. This paper will provide a brief overview of the impact that automated perimetry has had on current clinical ophthalmic practice and patient management. It is presented as a general example of the influence that automated instrumentation has exerted on the clinical environment.

  3. Validation of shortened 2-day sterility testing of mesenchymal stem cell-based therapeutic preparation on an automated culture system.

    PubMed

    Lysák, Daniel; Holubová, Monika; Bergerová, Tamara; Vávrová, Monika; Cangemi, Giuseppina Cristina; Ciccocioppo, Rachele; Kruzliak, Peter; Jindra, Pavel

    2016-03-01

    Cell therapy products represent a new trend of treatment in the field of immunotherapy and regenerative medicine. Their biological nature and multistep preparation procedure require the application of complex release criteria and quality control. Microbial contamination of cell therapy products is a potential source of morbidity in recipients. The automated blood culture systems are widely used for the detection of microorganisms in cell therapy products. However the standard 2-week cultivation period is too long for some cell-based treatments and alternative methods have to be devised. We tried to verify whether a shortened cultivation of the supernatant from the mesenchymal stem cell (MSC) culture obtained 2 days before the cell harvest could sufficiently detect microbial growth and allow the release of MSC for clinical application. We compared the standard Ph. Eur. cultivation method and the automated blood culture system BACTEC (Becton Dickinson). The time to detection (TTD) and the detection limit were analyzed for three bacterial and two fungal strains. The Staphylococcus aureus and Pseudomonas aeruginosa were recognized within 24 h with both methods (detection limit ~10 CFU). The time required for the detection of Bacillus subtilis was shorter with the automated method (TTD 10.3 vs. 60 h for 10-100 CFU). The BACTEC system reached significantly shorter times to the detection of Candida albicans and Aspergillus brasiliensis growth compared to the classical method (15.5 vs. 48 and 31.5 vs. 48 h, respectively; 10-100 CFU). The positivity was demonstrated within 48 h in all bottles, regardless of the size of the inoculum. This study validated the automated cultivation system as a method able to detect all tested microorganisms within a 48-h period with a detection limit of ~10 CFU. Only in case of B. subtilis, the lowest inoculum (~10 CFU) was not recognized. The 2-day cultivation technique is then capable of confirming the microbiological safety of MSC and

  4. Specimen coordinate automated measuring machine/fiducial automated measuring machine

    SciTech Connect

    Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.

    1991-01-01

    The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for: supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.

  5. Long-term memory-based control of attention in multi-step tasks requires working memory: evidence from domain-specific interference

    PubMed Central

    Foerster, Rebecca M.; Carbone, Elena; Schneider, Werner X.

    2014-01-01

    Evidence for long-term memory (LTM)-based control of attention has been found during the execution of highly practiced multi-step tasks. However, does LTM directly control attention or are working memory (WM) processes involved? In the present study, this question was investigated with a dual-task paradigm. Participants executed either a highly practiced visuospatial sensorimotor task (speed stacking) or a verbal task (high-speed poem reciting), while maintaining visuospatial or verbal information in WM. Results revealed unidirectional and domain-specific interference. Neither speed stacking nor high-speed poem reciting was influenced by WM retention. Stacking disrupted the retention of visuospatial locations, but did not modify memory performance of verbal material (letters). Reciting reduced the retention of verbal material substantially whereas it affected the memory performance of visuospatial locations to a smaller degree. We suggest that the selection of task-relevant information from LTM for the execution of overlearned multi-step tasks recruits domain-specific WM. PMID:24847304

  6. Long-term memory-based control of attention in multi-step tasks requires working memory: evidence from domain-specific interference.

    PubMed

    Foerster, Rebecca M; Carbone, Elena; Schneider, Werner X

    2014-01-01

    Evidence for long-term memory (LTM)-based control of attention has been found during the execution of highly practiced multi-step tasks. However, does LTM directly control attention or are working memory (WM) processes involved? In the present study, this question was investigated with a dual-task paradigm. Participants executed either a highly practiced visuospatial sensorimotor task (speed stacking) or a verbal task (high-speed poem reciting), while maintaining visuospatial or verbal information in WM. Results revealed unidirectional and domain-specific interference. Neither speed stacking nor high-speed poem reciting was influenced by WM retention. Stacking disrupted the retention of visuospatial locations, but did not modify memory performance of verbal material (letters). Reciting reduced the retention of verbal material substantially whereas it affected the memory performance of visuospatial locations to a smaller degree. We suggest that the selection of task-relevant information from LTM for the execution of overlearned multi-step tasks recruits domain-specific WM.

  7. Influence of multi-step washing using Na2EDTA, oxalic acid and phosphoric acid on metal fractionation and spectroscopy characteristics from contaminated soil.

    PubMed

    Wei, Meng; Chen, Jiajun

    2016-11-01

    A multi-step soil washing test using a typical chelating agent (Na2EDTA), organic acid (oxalic acid), and inorganic weak acid (phosphoric acid) was conducted to remediate soil contaminated with heavy metals near an arsenic mining area. The aim of the test was to improve the heavy metal removal efficiency and investigate its influence on metal fractionation and the spectroscopy characteristics of contaminated soil. The results indicated that the order of the washing steps was critical for the removal efficiencies of the metal fractions, bioavailability, and potential mobility due to the different dissolution levels of mineral fractions and the inter-transformation of metal fractions by XRD and FT-IR spectral analyses. The optimal soil washing options were identified as the Na2EDTA-phosphoric-oxalic acid (EPO) and phosphoric-oxalic acid-Na2EDTA (POE) sequences because of their high removal efficiencies (approximately 45 % for arsenic and 88 % for cadmium) and the minimal harmful effects that were determined by the mobility and bioavailability of the remaining heavy metals based on the metal stability (I_R) and modified redistribution index ([Formula: see text]).
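    Because each wash acts on the metal still remaining after the previous step, the per-step removal efficiencies of a sequence combine multiplicatively on the residual fraction rather than adding. A minimal sketch of that compounding (the per-step efficiencies used below are hypothetical, not the study's measurements):

    ```python
    def overall_removal(step_efficiencies):
        """Combined removal after sequential washes: each step removes a
        fraction of the metal still remaining, so residual fractions multiply."""
        remaining = 1.0
        for e in step_efficiencies:
            remaining *= (1.0 - e)
        return 1.0 - remaining
    ```

    This is why reordering the same three washes can change the outcome only through chemistry (different dissolution at each stage), not through the arithmetic itself.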

  8. Two-dimensional paper network format that enables simple multistep assays for use in low-resource settings in the context of malaria antigen detection.

    PubMed

    Fu, Elain; Liang, Tinny; Spicar-Mihalic, Paolo; Houghtaling, Jared; Ramachandran, Sujatha; Yager, Paul

    2012-05-15

    The lateral flow test has become the standard bioassay format in low-resource settings because it is rapid, easy to use, and low in cost, uses reagents stored in dry form, and is equipment-free. However, lateral flow tests are often limited to a single chemical delivery step and not capable of the multistep processing characteristic of high performance laboratory-based assays. To address this limitation, we are developing a paper network platform that extends the conventional lateral flow test to two dimensions; this allows incorporation of multistep chemical processing, while still retaining the advantages of conventional lateral flow tests. Here, we demonstrate this format for an easy-to-use, signal-amplified sandwich format immunoassay for the malaria protein PfHRP2. The card contains reagents stored in dry form such that the user need only add sample and water. The multiple flows in the device are activated in a single user step of folding the card closed; the configuration of the paper network automatically delivers the appropriate volumes of (i) sample plus antibody conjugated to a gold particle label, (ii) a rinse buffer, and (iii) a signal amplification reagent to the capture region. These results highlight the potential of the paper network platform to enhance access to high-quality diagnostic capabilities in low-resource settings in the developed and developing worlds.

  9. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology.
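    The multivariate design-of-experiment screening described above can be illustrated with a two-level full-factorial enumeration of critical process parameters; the parameter names and levels below are hypothetical, not the study's actual factors:

    ```python
    import itertools

    def two_level_full_factorial(factors):
        """Enumerate every corner of a two-level design.
        `factors` maps parameter name -> (low, high); returns one dict per
        run, 2**k runs for k factors."""
        names = list(factors)
        levels = [factors[n] for n in names]
        return [dict(zip(names, combo)) for combo in itertools.product(*levels)]

    runs = two_level_full_factorial({"pH": (6.8, 7.2), "temp_C": (35, 37)})
    ```

    Fitted response models over such runs are what let the Design Space limits be set with interactions taken into account, rather than one factor at a time.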

  10. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

    We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the emerging shift from traditional medical treatments toward personalized genetic medicine, i.e., individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, where a turn-around time compatible with clinical decision-making is required. In this paper we have developed a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.
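    After spot extraction, a genotyping pipeline of this kind ultimately reduces each marker to allele-specific intensities and a ratio decision. A naive sketch for a biallelic marker (the thresholds and signal floor are hypothetical, not the paper's algorithm):

    ```python
    def call_genotype(allele_a, allele_b, min_signal=200.0, het_band=(0.35, 0.65)):
        """Naive biallelic call from two allele-specific spot intensities.
        Returns 'AA', 'BB', 'AB', or 'NC' (no call) when total signal is weak."""
        total = allele_a + allele_b
        if total < min_signal:
            return "NC"
        frac_a = allele_a / total
        if frac_a >= het_band[1]:
            return "AA"
        if frac_a <= het_band[0]:
            return "BB"
        return "AB"
    ```

    A production caller would replace the fixed bands with per-marker clustering, but the no-call guard against weak spots is the quality-control step that matters most for clinical use.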

  11. Automated illustration of patients' instructions.

    PubMed

    Bui, Duy; Nakamura, Carlos; Bray, Bruce E; Zeng-Treitler, Qing

    2012-01-01

    A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration.
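    An agreement figure like the reported ~66 percent can be obtained as mean pairwise percent agreement across raters; a minimal sketch, with illustrative data only (not the study's ratings):

    ```python
    from itertools import combinations

    def mean_pairwise_agreement(ratings):
        """`ratings`: list of per-item tuples, one rating per rater.
        For each item, take the fraction of rater pairs that agree exactly,
        then average that fraction over all items."""
        per_item = []
        for item in ratings:
            pairs = list(combinations(item, 2))
            agree = sum(1 for a, b in pairs if a == b)
            per_item.append(agree / len(pairs))
        return sum(per_item) / len(per_item)
    ```

    Note that raw percent agreement does not correct for chance; a chance-corrected statistic such as Fleiss' kappa would be the stricter choice.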

  12. Automated Demand Response and Commissioning

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve the electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. We refer to this as Auto-DR. The control and communications systems must be properly configured and must pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve connecting demand responsive building systems to the electric grid demand response systems.

  13. Automated Illustration of Patients' Instructions

    PubMed Central

    Bui, Duy; Nakamura, Carlos; Bray, Bruce E.; Zeng-Treitler, Qing

    2012-01-01

    A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration. PMID:23304392
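
    As an aside on the reported statistic, an average pairwise percent agreement between raters can be computed as follows. The ratings below are invented, and the paper does not state that this exact formula was used.

```python
from itertools import combinations

def average_pairwise_agreement(ratings):
    """Average percent agreement over all rater pairs.

    `ratings` is one list per rater, with one rating per item
    (here a three-level scale coded 0, 1, 2).
    """
    n_items = len(ratings[0])
    scores = []
    for a, b in combinations(ratings, 2):
        agree = sum(1 for x, y in zip(a, b) if x == y)
        scores.append(agree / n_items)
    return sum(scores) / len(scores)

# Three hypothetical raters scoring six discharge instructions.
r1 = [2, 1, 0, 2, 1, 2]
r2 = [2, 1, 1, 2, 1, 2]
r3 = [2, 0, 0, 2, 1, 1]
agreement = average_pairwise_agreement([r1, r2, r3])
```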

  14. Automated nutrient analyses in seawater

    SciTech Connect

    Whitledge, T.E.; Malloy, S.C.; Patton, C.J.; Wirick, C.D.

    1981-02-01

    This manual was assembled for use as a guide for analyzing the nutrient content of seawater samples collected in the marine coastal zone of the Northeast United States and the Bering Sea. Some modifications (changes in dilution or sample pump tube sizes) may be necessary to achieve optimum measurements in very pronounced oligotrophic, eutrophic or brackish areas. Information is presented under the following section headings: theory and mechanics of automated analysis; continuous flow system description; operation of autoanalyzer system; cookbook of current nutrient methods; automated analyzer and data analysis software; computer interfacing and hardware modifications; and trouble shooting. The three appendixes are entitled: references and additional reading; manifold components and chemicals; and software listings. (JGB)

  15. Automated labeling in document images

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2000-12-01

    The National Library of Medicine (NLM) is developing an automated system to produce bibliographic records for its MEDLINE database. This system, named Medical Article Record System (MARS), employs document image analysis and understanding techniques and optical character recognition (OCR). This paper describes a key module in MARS called the Automated Labeling (AL) module, which labels all zones of interest (title, author, affiliation, and abstract) automatically. The AL algorithm is based on 120 rules that are derived from an analysis of journal page layouts and features extracted from OCR output. Experiments carried out on more than 11,000 articles in over 1,000 biomedical journals show the accuracy of this rule-based algorithm to exceed 96%.
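
    A rule-based zone labeler of this kind can be sketched as a cascade of layout and keyword rules. The features, thresholds, and keywords below are invented for illustration and are far simpler than the 120 rules of the AL module.

```python
def label_zone(zone):
    """Assign one of title/author/affiliation/abstract to an OCR zone.

    `zone` has 'y' (top position as a fraction of page height, 0 = top),
    'font_size' (points) and 'text' (OCR output). Rules fire in order.
    """
    text = zone["text"].lower()
    if zone["y"] < 0.15 and zone["font_size"] >= 14:
        return "title"
    if zone["y"] < 0.30 and "," in zone["text"] and len(text) < 120:
        return "author"
    if any(k in text for k in ("university", "department", "institute")):
        return "affiliation"
    if text.startswith("abstract") or len(text) > 300:
        return "abstract"
    return "unlabeled"

# Hypothetical OCR zones from a journal first page.
title_zone = {"y": 0.05, "font_size": 18,
              "text": "Automated Labeling in Document Images"}
author_zone = {"y": 0.20, "font_size": 11,
               "text": "Kim, Jongwoo; Le, Daniel X."}
affil_zone = {"y": 0.35, "font_size": 10,
              "text": "Department of Radiology, Example University"}
```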

  16. Algorithms Could Automate Cancer Diagnosis

    NASA Technical Reports Server (NTRS)

    Baky, A. A.; Winkler, D. G.

    1982-01-01

    Five new algorithms are a complete statistical procedure for quantifying cell abnormalities from digitized images. Procedure could be basis for automated detection and diagnosis of cancer. Objective of procedure is to assign each cell an atypia status index (ASI), which quantifies level of abnormality. It is possible that ASI values will be accurate and economical enough to allow diagnoses to be made quickly and accurately by computer processing of laboratory specimens extracted from patients.

  17. Small Business Innovations (Automated Information)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bruce G. Jackson & Associates Document Director is an automated tool that combines word processing and database management technologies to offer the flexibility and convenience of text processing with the linking capability of database management. Originally developed for NASA, it provides a means to collect and manage information associated with requirements development. The software system was used by NASA in the design of the Assured Crew Return Vehicle, as well as by other government and commercial organizations including the Southwest Research Institute.

  18. Automated Anti-Virus Deployment

    DTIC Science & Technology

    2004-11-01

    External collaborators and visitors also need to keep in contact with their home laboratories or institutes, using the Internet to exchange e-mails or...layered defence system deployed with other components like host or network-based intrusion detection, global and personal firewalls, logical network...and provides the standard services that are requested to a modern enterprise network: office automation, e-mail, Internet access and workgroup file

  19. Automating GONG's Angle Determination Pipeline

    NASA Astrophysics Data System (ADS)

    Toner, C. G.

    2005-05-01

    Recently, GONG started recording regular noon drift-scans throughout the Network (3 per week). This is in an effort to prevent spurious "wobbling" of GONG's merged images by providing regular "reality checks" on the true orientation of the site images. Wobbling can be very detrimental to local helioseismology analyses (a.k.a. the "Washing Machine Effect"). Here we describe recent steps to automate the processing of the drift-scans once they arrive in Tucson.

  20. Home automation in the workplace.

    PubMed

    McCormack, J E; Tello, S F

    1994-01-01

    Environmental control units and home automation devices contribute to the independence and potential of individuals with disabilities, both at work and at home. Devices currently exist that can assist people with physical, cognitive, and sensory disabilities to control lighting, appliances, temperature, security, and telephone communications. This article highlights several possible applications for these technologies and discusses emerging technologies that will increase the benefits these devices offer people with disabilities.

  1. Fully automated urban traffic system

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.

    1977-01-01

    The replacement of the driver with an automatic system which could perform the functions of guiding and routing a vehicle with a human's capability of responding to changing traffic demands was discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were being developed independently of any need for fully automated urban traffic. A guidance system that would meet system requirements was not being developed but was technically feasible.

  2. Market Investigation for Automated Warehousing

    DTIC Science & Technology

    1990-06-28

    support supply units can take full advantage of available space and material handling equipment (MHE). These supplies are grouped for warehousing...provides maximum product accessibility with minimum floor space use. On-board machine controls interface with the PC end-of-aisle controllers for...enough to explore the adaptation of AGV technology to the field environment. Control

  3. Automated Test Requirement Document Generation

    DTIC Science & Technology

    1987-11-01

    DIAGNOSTICS BASED ON THE PRINCIPLES OF ARTIFICIAL INTELLIGENCE", 1984 International Test Conference, 01Oct84, (A3, 3, Cs D3, E2, G2, H2, 13, J6, K) 425...GLOSSARY OF ACRONYMS ABBREVIATION DEFINITION AFSATCOM Air Force Satellite Communication AI Artificial Intelligence ASIC Application Specific...In-Test Equipment (BITE) and AI (Artificial Intelligence) - Expert Systems - need to be fully applied before a completely automated process can be

  4. Automated Scheduling Via Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.
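
    The iterative-refinement idea can be illustrated with a toy repair loop: start from a naive draft schedule and repeatedly push conflicting tasks later until no resource conflicts remain. This is a sketch of the general approach only, not the OMP system; the task model is invented.

```python
def refine(tasks):
    """tasks: {name: duration} competing for one unit-capacity resource.
    Returns {name: start} with no two tasks overlapping."""
    start = {name: 0 for name in tasks}             # naive draft: everything at t=0

    def conflict(a, b):
        return (start[a] < start[b] + tasks[b]
                and start[b] < start[a] + tasks[a])

    changed = True
    while changed:                                  # repair until conflict-free
        changed = False
        names = sorted(tasks)
        for i, a in enumerate(names):
            for b in names[i + 1:]:
                if conflict(a, b):
                    start[b] = start[a] + tasks[a]  # push the later-named task out
                    changed = True
    return start

# Three hypothetical instrument tasks with durations in time units.
schedule = refine({"cal": 1, "obs1": 2, "obs2": 3})
```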

  5. ALFA: an automated line fitting algorithm

    NASA Astrophysics Data System (ADS)

    Wesson, R.

    2016-03-01

    I present the automated line fitting algorithm, ALFA, a new code which can fit emission line spectra of arbitrary wavelength coverage and resolution, fully automatically. In contrast to traditional emission line fitting methods which require the identification of spectral features suspected to be emission lines, ALFA instead uses a list of lines which are expected to be present to construct a synthetic spectrum. The parameters used to construct the synthetic spectrum are optimized by means of a genetic algorithm. Uncertainties are estimated using the noise structure of the residuals. An emission line spectrum containing several hundred lines can be fitted in a few seconds using a single processor of a typical contemporary desktop or laptop PC. I show that the results are in excellent agreement with those measured manually for a number of spectra. Where discrepancies exist, the manually measured fluxes are found to be less accurate than those returned by ALFA. Together with the code NEAT, ALFA provides a powerful way to rapidly extract physical information from observations, an increasingly vital function in the era of highly multiplexed spectroscopy. The two codes can deliver a reliable and comprehensive analysis of very large data sets in a few hours with little or no user interaction.
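
    The core idea, fixing the line list and letting a genetic algorithm optimise the free parameters of a synthetic spectrum, can be sketched as follows. The parameterisation (per-line fluxes with a shared width) and all GA settings here are simplifications for illustration, not ALFA's actual implementation.

```python
import math
import random

def synth(wave, lines, fluxes, sigma):
    """Synthetic spectrum: sum of Gaussians at the known line wavelengths."""
    return [sum(f * math.exp(-0.5 * ((w - c) / sigma) ** 2)
                for c, f in zip(lines, fluxes))
            for w in wave]

def fitness(obs, model):
    """Negative sum of squared residuals (higher is better)."""
    return -sum((o - m) ** 2 for o, m in zip(obs, model))

def ga_fit(wave, obs, lines, sigma, pop=40, gens=60, seed=1):
    """Evolve per-line fluxes so the synthetic spectrum matches `obs`."""
    rng = random.Random(seed)
    n = len(lines)
    popn = [[rng.uniform(0.0, 2.0) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda ind: fitness(obs, synth(wave, lines, ind, sigma)),
                  reverse=True)
        elite = popn[:pop // 4]                     # keep the best quarter
        children = []
        while len(elite) + len(children) < pop:     # blend crossover + mutation
            a, b = rng.sample(elite, 2)
            children.append([(x + y) / 2 + rng.gauss(0, 0.05)
                             for x, y in zip(a, b)])
        popn = elite + children
    popn.sort(key=lambda ind: fitness(obs, synth(wave, lines, ind, sigma)),
              reverse=True)
    return popn[0]

# Two known lines at 5.0 and 15.0; recover their fluxes from a noiseless spectrum.
wave = [0.5 * i for i in range(41)]
obs = synth(wave, [5.0, 15.0], [1.0, 0.4], 0.8)
best = ga_fit(wave, obs, [5.0, 15.0], 0.8)
```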

  6. From Crater to Graph: Manual and Automated Crater Counting Techniques

    NASA Astrophysics Data System (ADS)

    Plesko, C. S.; Werner, S. C.; Brumby, S. P.; Foing, B. H.; Asphaug, E.; Neukum, G.; Team, H.; Team, I.

    2005-12-01

    Impact craters are some of the most abundant and most interesting features on Mars. They hold a wealth of information about Martian geology, providing clues to the relative age, local composition and erosional history of the surface. A great deal of effort has been expended to count and understand the nature of planetary crater populations (Hartmann and Neukum, 2001). Highly trained experts have developed personal methods for conducting manual crater surveys. In addition, several efforts are underway to automate this process in order to keep up with the rapid increase in planetary surface image data. These efforts make use of a variety of methods, including the direct application of traditional image processing algorithms such as the Hough transform, and recent developments in genetic programming, an artificial intelligence-based technique, in which manual crater surveys are used as examples to 'grow' or 'evolve' crater counting algorithms (Plesko, C. S. et al., LPSC 2005; Kim, J. R. et al., LPSC 2001; Michael, G. G., P&SS 2003; Earl, J. et al., LPSC 2005). In this study we examine automated crater counting techniques, compare them with traditional manual techniques on MOC imagery, and demonstrate capabilities for the analysis of multi-spectral and HRSC Digital Terrain Model data as well. Techniques are compared and discussed to define and develop a robust automated crater detection strategy.
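
    One of the traditional techniques named above, the Hough transform, can be sketched for circle (crater rim) detection. The fixed search radius and the synthetic rim points below are illustrative only; real crater detectors search over many radii on edge maps extracted from imagery.

```python
import math

def hough_circle_centers(edge_points, radius, shape, threshold):
    """Vote for circle centres at a fixed radius; return cells above threshold."""
    h, w = shape
    acc = [[0] * w for _ in range(h)]
    for y, x in edge_points:
        for t in range(0, 360, 5):      # each edge pixel votes along a circle
            cy = round(y - radius * math.sin(math.radians(t)))
            cx = round(x - radius * math.cos(math.radians(t)))
            if 0 <= cy < h and 0 <= cx < w:
                acc[cy][cx] += 1
    return [(y, x) for y in range(h) for x in range(w) if acc[y][x] >= threshold]

# Synthetic crater rim: points on a circle of radius 6 centred at (15, 15).
rim = [(15 + round(6 * math.sin(math.radians(t))),
        15 + round(6 * math.cos(math.radians(t))))
       for t in range(0, 360, 10)]
peaks = hough_circle_centers(rim, 6, (30, 30), threshold=25)
```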

  7. Automated detection of masses and clustered microcalcifications on mammograms

    NASA Astrophysics Data System (ADS)

    Fujita, Hiroshi; Endo, Tokiko; Matsubara, Tomoko; Hirako, Kenichi; Hara, Takeshi; Ueda, Hitoshi; Torisu, Yasuhiro; Riyahi-Alam, Nader; Horita, Katsuhei; Kido, Choichiro; Ishigaki, Takeo

    1995-05-01

    We are developing automated-detection schemes for the masses and clustered microcalcifications on laser-digitized mammograms (0.1 mm, 10-bit resolution, 2000 X 2510) by using a conventional workstation. The purpose of this paper is to provide an overview of our recent schemes and to evaluate the current performance of the schemes. The fully automated computer system consists of several parts such as the extraction of breast region, detection of masses, detection of clustered microcalcifications, classification of the candidates, and the display of the detected results. Our schemes, tested with more than 200 cases of Japanese women, achieved a true-positive rate of about 95% (86%) with 0.61 (0.55) false-positive masses (clusters) per image. It was found that the automated method has the potential to aid physicians in screening mammograms for breast tumors. Initial results for the mammograms digitized with the pixel sizes of 25, 50, and 100 micrometers are also discussed, in which a genetic algorithm (GA) technique was applied to the detection filter for the microcalcifications. It was indicated from the experiment with a breast phantom that a 100-micrometer pixel size is not enough for the computer detection of microcalcifications, and it seems that a pixel size of at least 50 micrometers is required.

  8. Automating network meta-analysis.

    PubMed

    van Valkenhoef, Gert; Lu, Guobing; de Brock, Bert; Hillege, Hans; Ades, A E; Welton, Nicky J

    2012-12-01

    Mixed treatment comparison (MTC) (also called network meta-analysis) is an extension of traditional meta-analysis to allow the simultaneous pooling of data from clinical trials comparing more than two treatment options. Typically, MTCs are performed using general-purpose Markov chain Monte Carlo software such as WinBUGS, requiring a model and data to be specified using a specific syntax. It would be preferable if, for the most common cases, both could be derived from a well-structured data file that can be easily checked for errors. Automation is particularly valuable for simulation studies in which the large number of MTCs that have to be estimated may preclude manual model specification and analysis. Moreover, automated model generation raises issues that provide additional insight into the nature of MTC. We present a method for the automated generation of Bayesian homogeneous variance random effects consistency models, including the choice of basic parameters and trial baselines, priors, and starting values for the Markov chain(s). We validate our method against the results of five published MTCs. The method is implemented in freely available open source software. This means that performing an MTC no longer requires manually writing a statistical model. This reduces time and effort, and facilitates error checking of the dataset. Copyright © 2012 John Wiley & Sons, Ltd.
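
    One ingredient of automated model generation, choosing basic parameters as a spanning tree of the treatment comparison graph so that every other comparison follows by consistency (d_AC = d_AB + d_BC), can be sketched as follows. This mirrors the idea described above but is not the authors' implementation.

```python
from collections import deque

def basic_parameters(trials, reference):
    """Breadth-first spanning tree over the treatment comparison graph.

    `trials` is a list of treatment tuples, e.g. ("A", "B", "C") for a
    three-arm trial; returns the basic comparisons as (from, to) edges.
    """
    adj = {}
    for arms in trials:
        for i, a in enumerate(arms):
            adj.setdefault(a, set())
            for b in arms[i + 1:]:
                adj.setdefault(b, set())
                adj[a].add(b)
                adj[b].add(a)
    basic, seen, queue = [], {reference}, deque([reference])
    while queue:
        u = queue.popleft()
        for v in sorted(adj[u]):
            if v not in seen:                 # tree edge -> basic parameter
                seen.add(v)
                basic.append((u, v))
                queue.append(v)
    return basic

# Four two-arm trials; B-C is then functional: d_BC = d_AC - d_AB.
trials = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
basic = basic_parameters(trials, "A")
```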

  9. Towards a rapid automated kinematic source inversion: applications to Mw 6.5-8.0 earthquakes in Japan

    NASA Astrophysics Data System (ADS)

    Cesca, S.; Heimann, S.; Dahm, T.; Krüger, F.

    2009-04-01

    A main problem for stable and automated routines for the inversion of kinematic earthquake sources arises from the overparameterization of the rupture model, as occurs, for example, with slip-map representations. Using such an approach, it is possible to reproduce observations well, but the inversion is often unstable and solutions ambiguous, as several different source models may equally well fit the observations. To overcome this problem and implement an automated kinematic inversion, we adopt the eikonal source model to represent the extended earthquake source. This model offers a flexible and realistic description of the rupture process, which is fully described by only 13 parameters. We use a multi-step inversion strategy, which has already been successfully applied at local and regional distances, in order to retrieve both point and extended source parameters. Significant source information, including focal mechanism, source depth, magnitude, centroid location, resolution of the fault plane ambiguity, rupture size, average slip and directivity effects can be provided. We include specific applications to a set of moderate to large earthquakes that occurred in Japan, using broadband seismic data at regional and teleseismic distances. Quality and stability of inversion results are first discussed, by using full waveform information. Point source parameters are always well determined, while kinematic parameters such as the rupture extension, the average slip and the unilateral or bilateral character of the rupture can be resolved in many cases. The possibility of providing fast solutions, which are needed within early-warning systems, is further discussed.

  10. Automated saccharification assay for determination of digestibility in plant materials

    PubMed Central

    2010-01-01

    Background Cell wall resistance represents the main barrier for the production of second generation biofuels. The deconstruction of lignocellulose can provide sugars for the production of fuels or other industrial products through fermentation. Understanding the biochemical basis of the recalcitrance of cell walls to digestion will allow development of more effective and cost efficient ways to produce sugars from biomass. One approach is to identify plant genes that play a role in biomass recalcitrance, using association genetics. Such an approach requires a robust and reliable high throughput (HT) assay for biomass digestibility, which can be used to screen the large numbers of samples involved in such studies. Results We developed a HT saccharification assay based on a robotic platform that can carry out in a 96-well plate format the enzymatic digestion and quantification of the released sugars. The handling of the biomass powder for weighing and formatting into 96 wells is performed by a robotic station, where the plant material is ground, delivered to the desired well in the plates and weighed with a precision of 0.1 mg. Once the plates are loaded, an automated liquid handling platform delivers an optional mild pretreatment (< 100°C) followed by enzymatic hydrolysis of the biomass. Aliquots from the hydrolysis are then analyzed for the release of reducing sugar equivalents. The same platform can be used for the comparative evaluation of different enzymes and enzyme cocktails. The sensitivity and reliability of the platform was evaluated by measuring the saccharification of stems from lignin modified tobacco plants, and the results of automated and manual analyses compared. Conclusions The automated assay systems are sensitive, robust and reliable. The system can reliably detect differences in the saccharification of plant tissues, and is able to process large number of samples with a minimum amount of human intervention. The automated system uncovered
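
    The final quantification step, converting plate-reader absorbances into reducing-sugar equivalents via a standard curve, might look like the following sketch. The glucose standards and readings here are invented, and the platform's actual assay chemistry is not specified in the abstract.

```python
def fit_standard_curve(concs, absorbances):
    """Least-squares line A = m*c + b through the glucose standards."""
    n = len(concs)
    mc = sum(concs) / n
    ma = sum(absorbances) / n
    m = (sum((c - mc) * (a - ma) for c, a in zip(concs, absorbances))
         / sum((c - mc) ** 2 for c in concs))
    return m, ma - m * mc

def sugar_released(absorbance, m, b):
    """Invert the standard curve to get a concentration in mg/mL."""
    return (absorbance - b) / m

# Hypothetical glucose standards (mg/mL) and their absorbance readings.
m, b = fit_standard_curve([0.0, 0.5, 1.0, 2.0], [0.02, 0.27, 0.52, 1.02])
conc = sugar_released(0.52, m, b)
```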

  11. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  12. The genetic basis of addictive disorders.

    PubMed

    Ducci, Francesca; Goldman, David

    2012-06-01

    Addictions are common, chronic, and relapsing diseases that develop through a multistep process. The impact of addictions on morbidity and mortality is high worldwide. Twin studies have shown that the heritability of addictions ranges from 0.39 (hallucinogens) to 0.72 (cocaine). Twin studies indicate that genes influence each stage from initiation to addiction, although the genetic determinants may differ. Addictions are by definition the result of gene × environment interaction. These disorders, which are in part volitional, in part inborn, and in part determined by environmental experience, pose the full range of medical, genetic, policy, and moral challenges. Gene discovery is being facilitated by a variety of powerful approaches, but is in its infancy. It is not surprising that the genes discovered so far act in a variety of ways: via altered metabolism of drug (the alcohol and nicotine metabolic gene variants), via altered function of a drug receptor (the nicotinic receptor, which may alter affinity for nicotine but as discussed may also alter circuitry of reward), and via general mechanisms of addiction (genes such as monoamine oxidase A and the serotonin transporter that modulate stress response, emotion, and behavioral control). Addiction medicine today benefits from genetic studies that buttress the case for a neurobiologic origin of addictive behavior, and some general information on familially transmitted propensity that can be used to guide prevention. A few well-validated, specific predictors such as OPRM1, ADH1B, ALDH2, CHRNA5, and CYP2A6 have been identified and can provide some specific guidance, for example, to understand alcohol-related flushing and upper GI cancer risk (ADH1B and ALDH2), variation in nicotine metabolism (CYP2A6), and, potentially, naltrexone treatment response (OPRM1).
However, the genetic predictors available are few in number and account for only a small portion of the genetic variance in liability, and have not been integrated

  13. Genetic barcodes

    DOEpatents

    Weier, Heinz-Ulrich G

    2015-08-04

    Herein are described multicolor FISH probe sets termed "genetic barcodes" targeting several cancer or disease-related loci to assess gene rearrangements and copy number changes in tumor cells. Two, three or more different fluorophores are used to detect the genetic barcode sections thus permitting unique labeling and multilocus analysis in individual cell nuclei. Gene specific barcodes can be generated and combined to provide both numerical and structural genetic information for these and other pertinent disease associated genes.

  14. Arduino-based automation of a DNA extraction system.

    PubMed

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies to detect infectious diseases with the molecular genetic method. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing for the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling.
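
    The host-side G code compilation step might look like the following sketch. The command subset (G0 move, G4 dwell, M42 digital output) and the example sequence file are invented for illustration; the actual protocol between the host application and the Arduino controller is not specified in the abstract.

```python
def compile_sequence(text):
    """Turn a G-code-like sequence file into (opcode, args) steps."""
    steps = []
    for line in text.splitlines():
        line = line.split(";")[0].strip()       # drop comments and blank lines
        if not line:
            continue
        words = line.split()
        code = words[0]
        args = {w[0]: float(w[1:]) for w in words[1:]}
        if code == "G0":                        # linear actuator move
            steps.append(("move", args))
        elif code == "G4":                      # dwell, P = milliseconds
            steps.append(("dwell", args))
        elif code == "M42":                     # set a digital output pin
            steps.append(("io", args))
        else:
            raise ValueError("unsupported code: " + code)
    return steps

# Hypothetical fragment of an extraction sequence file.
sequence = """
G0 X12.5   ; push syringe 1 to mix beads and lysate
G4 P5000   ; incubate for 5 seconds
M42 P3 S1  ; energise the valve on pin 3
"""
steps = compile_sequence(sequence)
```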

  15. Operator versus computer control of adaptive automation

    NASA Technical Reports Server (NTRS)

    Hilburn, Brian; Molloy, Robert; Wong, Dick; Parasuraman, Raja

    1993-01-01

    Adaptive automation refers to real-time allocation of functions between the human operator and automated subsystems. The article reports the results of a series of experiments whose aim is to examine the effects of adaptive automation on operator performance during multi-task flight simulation, and to provide an empirical basis for evaluations of different forms of adaptive logic. The combined results of these studies suggest several things. First, it appears that either excessively long, or excessively short, adaptation cycles can limit the effectiveness of adaptive automation in enhancing operator performance of both primary flight and monitoring tasks. Second, occasional brief reversions to manual control can counter some of the monitoring inefficiency typically associated with long cycle automation, and further, that benefits of such reversions can be sustained for some time after return to automated control. Third, no evidence was found that the benefits of such reversions depend on the adaptive logic by which long-cycle adaptive switches are triggered.

  16. Human-centered aircraft automation: A concept and guidelines

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1991-01-01

    Aircraft automation and its effects on flight crews are examined. Generic guidelines are proposed for the design and use of automation in transport aircraft, in the hope of stimulating increased and more effective dialogue among designers of automated cockpits, purchasers of automated aircraft, and the pilots who must fly those aircraft in line operations. The goal is to explore the means whereby automation may be a maximally effective tool or resource for pilots without compromising human authority and with an increase in system safety. After definition of the domain of the aircraft pilot and brief discussion of the history of aircraft automation, a concept of human-centered automation is presented and discussed. Automated devices are categorized as control automation, information automation, and management automation. The environment and context of aircraft automation are then considered, followed by thoughts on the likely future of automation in each category.

  17. Automated systems for identification of microorganisms.

    PubMed Central

    Stager, C E; Davis, J R

    1992-01-01

    Automated instruments for the identification of microorganisms were introduced into clinical microbiology laboratories in the 1970s. During the past two decades, the capabilities and performance characteristics of automated identification systems have steadily progressed and improved. This article explores the development of the various automated identification systems available in the United States and reviews their performance for identification of microorganisms. Observations regarding deficiencies and suggested improvements for these systems are provided. PMID:1498768

  18. Genetic Engineering

    ERIC Educational Resources Information Center

    Phillips, John

    1973-01-01

    Presents a review of genetic engineering, in which the genotypes of plants and animals (including human genotypes) may be manipulated for the benefit of the human species. Discusses associated problems and solutions and provides an extensive bibliography of literature relating to genetic engineering. (JR)

  19. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  20. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...