Science.gov

Sample records for automated multistep genetic

  1. An Automated, Multi-Step Monte Carlo Burnup Code System.

    Energy Science and Technology Software Center (ESTSC)

    2003-07-14

    Version 02 MONTEBURNS Version 2 calculates coupled neutronic/isotopic results for nuclear systems and produces a large number of criticality and burnup results based on various material feed/removal specifications, power(s), and time intervals. MONTEBURNS is a fully automated tool that links the LANL MCNP Monte Carlo transport code with a radioactive decay and burnup code. Highlights on changes to Version 2 are listed in the transmittal letter. Along with other minor improvements in MONTEBURNS Version 2, the option was added to use CINDER90 instead of ORIGEN2 as the depletion/decay part of the system. CINDER90 is a multi-group depletion code developed at LANL and is not currently available from RSICC. This MONTEBURNS release was tested with various combinations of CCC-715/MCNPX 2.4.0, CCC-710/MCNP5, CCC-700/MCNP4C, CCC-371/ORIGEN2.2, ORIGEN2.1 and CINDER90. Perl is required software and is not included in this distribution. MCNP, ORIGEN2, and CINDER90 are not included.

  2. An Automated, Multi-Step Monte Carlo Burnup Code System.

    SciTech Connect

    TRELLUE, HOLLY R.

    2003-07-14

    Version 02 MONTEBURNS Version 2 calculates coupled neutronic/isotopic results for nuclear systems and produces a large number of criticality and burnup results based on various material feed/removal specifications, power(s), and time intervals. MONTEBURNS is a fully automated tool that links the LANL MCNP Monte Carlo transport code with a radioactive decay and burnup code. Highlights on changes to Version 2 are listed in the transmittal letter. Along with other minor improvements in MONTEBURNS Version 2, the option was added to use CINDER90 instead of ORIGEN2 as the depletion/decay part of the system. CINDER90 is a multi-group depletion code developed at LANL and is not currently available from RSICC. This MONTEBURNS release was tested with various combinations of CCC-715/MCNPX 2.4.0, CCC-710/MCNP5, CCC-700/MCNP4C, CCC-371/ORIGEN2.2, ORIGEN2.1 and CINDER90. Perl is required software and is not included in this distribution. MCNP, ORIGEN2, and CINDER90 are not included.

  3. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization. PMID:27034378
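
The gate composition that underlies such designs can be illustrated with plain Boolean logic. This is a toy sketch in Python rather than Cello's actual Verilog input, and the gate functions are generic illustrations, not Cello's repressor models:

```python
# A NOT gate and a 2-input NOR are typical repressor-based primitives in
# genetic circuit design; any logic function can be composed from NORs.
def NOT(a):
    return 1 - a

def NOR(a, b):
    return 1 - (a | b)

def AND(a, b):
    # AND composed from NORs: AND(a, b) = NOR(NOT a, NOT b)
    return NOR(NOT(a), NOT(b))

# "Simulate performance": check every output state against the truth table.
truth = {(a, b): a & b for a in (0, 1) for b in (0, 1)}
assert all(AND(a, b) == v for (a, b), v in truth.items())
print("all output states match")
```

In the biological setting, each composed gate must also be insulated from its genetic context so the same truth table holds in every circuit.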

  4. Automated multi-step purification protocol for Angiotensin-I-Converting-Enzyme (ACE).

    PubMed

    Eisele, Thomas; Stressler, Timo; Kranz, Bertolt; Fischer, Lutz

    2012-12-12

    Highly purified proteins are essential for the investigation of the functional and biochemical properties of proteins. The purification of a protein requires several steps, which are often time-consuming. In our study, the Angiotensin-I-Converting-Enzyme (ACE; EC 3.4.15.1) was solubilised from pig lung without the additional detergents that are commonly used, under mild alkaline conditions in a Tris-HCl buffer (50 mM, pH 9.0) for 48 h. The ACE purification was automated using a multi-step protocol completed in less than 8 h, resulting in a purified protein with a specific activity of 37 U mg(-1) (purification factor 308) and a yield of 23.6%. The automated ACE purification used an ordinary fast-protein-liquid-chromatography (FPLC) system equipped with two additional switching valves. These switching valves were needed for the buffer stream inversion and for the connection of the Superloop™ used for the protein parking. Automated ACE purification was performed using four combined chromatography steps, including two desalting procedures. The purification methods comprised two hydrophobic interaction chromatography steps, a Cibacron 3FG-A chromatography step and a strong anion exchange chromatography step. The purified ACE was characterised by sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) and native PAGE. The monomer size of the purified glycosylated ACE was estimated at ∼175 kDa by SDS-PAGE, with the dimeric form at ∼330 kDa as characterised by native PAGE using a novel activity staining protocol. For the activity staining, the tripeptide l-Phe-Gly-Gly was used as the substrate. The ACE cleaved off the dipeptide Gly-Gly, releasing l-Phe to be oxidised by l-amino acid oxidase. Combined with peroxidase and o-dianisidine, the generated H(2)O(2) produced a brown-coloured band. This automated purification protocol can be easily adapted to other protein purification tasks. PMID:23217308

  5. Synthetic Genetic Arrays: Automation of Yeast Genetics.

    PubMed

    Kuzmin, Elena; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2016-01-01

    Genome-sequencing efforts have led to great strides in the annotation of protein-coding genes and other genomic elements. The current challenge is to understand the functional role of each gene and how genes work together to modulate cellular processes. Genetic interactions define phenotypic relationships between genes and reveal the functional organization of a cell. Synthetic genetic array (SGA) methodology automates yeast genetics and enables large-scale and systematic mapping of genetic interaction networks in the budding yeast, Saccharomyces cerevisiae. SGA facilitates construction of an output array of double mutants from an input array of single mutants through a series of replica pinning steps. Subsequent analysis of genetic interactions from SGA-derived mutants relies on accurate quantification of colony size, which serves as a proxy for fitness. Since its development, SGA has given rise to a variety of other experimental approaches for functional profiling of the yeast genome and has been applied in a multitude of other contexts, such as genome-wide screens for synthetic dosage lethality and integration with high-content screening for systematic assessment of morphology defects. SGA-like strategies can also be implemented in a number of other cell types and organisms, including Schizosaccharomyces pombe, Escherichia coli, Caenorhabditis elegans, and human cancer cell lines. The genetic networks emerging from these studies not only generate functional wiring diagrams but may also play a key role in our understanding of the complex relationship between genotype and phenotype. PMID:27037078

  6. Power Calculation of Multi-step Combined Principal Components with Applications to Genetic Association Studies

    PubMed Central

    Li, Zhengbang; Zhang, Wei; Pan, Dongdong; Li, Qizhai

    2016-01-01

    Principal component analysis (PCA) is a useful tool to identify important linear combination of correlated variables in multivariate analysis and has been applied to detect association between genetic variants and human complex diseases of interest. How to choose adequate number of principal components (PCs) to represent the original system in an optimal way is a key issue for PCA. Note that the traditional PCA, only using a few top PCs while discarding the other PCs, might significantly lose power in genetic association studies if all the PCs contain non-ignorable signals. In order to make full use of information from all PCs, Aschard and his colleagues have proposed a multi-step combined PCs method (named mCPC) recently, which performs well especially when several traits are highly correlated. However, the power superiority of mCPC has just been illustrated by simulation, while the theoretical power performance of mCPC has not been studied yet. In this work, we attempt to investigate theoretical properties of mCPC and further propose a novel and efficient strategy to combine PCs. Extensive simulation results confirm that the proposed method is more robust than existing procedures. A real data application to detect the association between gene TRAF1-C5 and rheumatoid arthritis further shows good performance of the proposed procedure. PMID:27189724
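
The contrast between keeping only the top PCs and combining all of them can be sketched with a toy two-variable PCA. This is a simplified illustration of the underlying idea, not the mCPC statistic itself; the data and dimensions are invented:

```python
import math
import random

random.seed(0)

# Toy data: two correlated variables (e.g. two correlated traits).
n = 400
xs, ys = [], []
for _ in range(n):
    shared = random.gauss(0, 1)          # common factor inducing correlation
    xs.append(shared + 0.4 * random.gauss(0, 1))
    ys.append(shared + 0.4 * random.gauss(0, 1))

def center(v):
    m = sum(v) / len(v)
    return [u - m for u in v]

xs, ys = center(xs), center(ys)

# 2x2 sample covariance matrix and its closed-form eigenvalues,
# which are the variances carried by the two principal components.
sxx = sum(u * u for u in xs) / (n - 1)
syy = sum(u * u for u in ys) / (n - 1)
sxy = sum(u * v for u, v in zip(xs, ys)) / (n - 1)
tr, det = sxx + syy, sxx * syy - sxy * sxy
disc = math.sqrt(tr * tr / 4 - det)
lam1, lam2 = tr / 2 + disc, tr / 2 - disc   # lam1 >= lam2 >= 0

# A traditional top-PC analysis keeps only lam1's component and discards
# the rest; a combined-PC strategy (in the spirit of mCPC) also uses the
# trailing PC, which can carry non-ignorable association signal.
share_top = lam1 / (lam1 + lam2)
print(round(share_top, 2))
```

When the variables are highly correlated, the top PC dominates the variance, yet the discarded trailing PC may still be where the association signal lives, which is the scenario the paper targets.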

  7. A computational method for automated characterization of genetic components.

    PubMed

    Yordanov, Boyan; Dalchau, Neil; Grant, Paul K; Pedersen, Michael; Emmott, Stephen; Haseloff, Jim; Phillips, Andrew

    2014-08-15

    The ability to design and construct synthetic biological systems with predictable behavior could enable significant advances in medical treatment, agricultural sustainability, and bioenergy production. However, to reach a stage where such systems can be reliably designed from biological components, integrated experimental and computational techniques that enable robust component characterization are needed. In this paper we present a computational method for the automated characterization of genetic components. Our method exploits a recently developed multichannel experimental protocol and integrates bacterial growth modeling, Bayesian parameter estimation, and model selection, together with data processing steps that are amenable to automation. We implement the method within the Genetic Engineering of Cells modeling and design environment, which enables both characterization and design to be integrated within a common software framework. To demonstrate the application of the method, we quantitatively characterize a synthetic receiver device that responds to the 3-oxohexanoyl-homoserine lactone signal, across a range of experimental conditions. PMID:24628037

  8. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office Ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are trial-and-error search techniques guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high-quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time-consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
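
The master-slave scheme described above can be sketched as follows, with worker threads standing in for the slave nodes and a toy bit-counting ("OneMax") fitness standing in for the expensive circuit simulations; all parameters are illustrative:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(bits):
    # Stand-in for an expensive evaluation (e.g. a circuit simulation):
    # here, simply count the 1-bits ("OneMax").
    return sum(bits)

def evolve(length=24, pop_size=40, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    # Master-slave layout: workers score candidates, the master applies the
    # genetic operators. Threads stand in for the paper's slave nodes.
    with ThreadPoolExecutor(max_workers=4) as workers:
        for _ in range(generations):
            scores = list(workers.map(fitness, pop))  # farmed-out evaluations
            if max(scores) == length:
                break
            def pick():  # tournament selection, performed on the master
                i, j = rng.randrange(pop_size), rng.randrange(pop_size)
                return pop[i] if scores[i] >= scores[j] else pop[j]
            nxt = []
            while len(nxt) < pop_size:
                a, b = pick(), pick()
                cut = rng.randrange(1, length)        # one-point crossover
                child = a[:cut] + b[cut:]
                if rng.random() < 0.2:                # occasional mutation
                    child[rng.randrange(length)] ^= 1
                nxt.append(child)
            pop = nxt
    return max(map(fitness, pop))

print(evolve())
```

Because only candidate genomes and scalar scores cross the master-worker boundary, communication bandwidth stays low, which is why inexpensive networking suffices in the clustered setting the paper describes.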

  9. Integrating GIS and genetic algorithms for automating land partitioning

    NASA Astrophysics Data System (ADS)

    Demetriou, Demetris; See, Linda; Stillwell, John

    2014-08-01

    Land consolidation is considered to be the most effective land management planning approach for controlling land fragmentation and hence improving agricultural efficiency. Land partitioning is a basic process of land consolidation that involves the subdivision of land into smaller sub-spaces subject to a number of constraints. This paper explains the development of a module called LandParcelS (Land Parcelling System) that integrates geographical information systems and a genetic algorithm to automate the land partitioning process by designing and optimising land parcels in terms of their shape, size and value. This new module has been applied to two land blocks that are part of a larger case study area in Cyprus. Partitioning is carried out by guiding a Thiessen polygon process within ArcGIS and it is treated as a multiobjective problem. The results suggest that a step forward has been made in solving this complex spatial problem, although further research is needed to improve the algorithm. The contribution of this research extends land partitioning and space partitioning in general, since these approaches may have relevance to other spatial processes that involve single or multi-objective problems that could be solved in the future by spatial evolutionary algorithms.
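
The nearest-seed assignment at the heart of a Thiessen-polygon partition can be sketched on a discrete grid. This is a minimal illustration of the geometric step only, not the LandParcelS module; the grid, seed coordinates and area measure are invented:

```python
import math

def thiessen_partition(width, height, seeds):
    """Label each grid cell with the index of its nearest seed point,
    yielding a discrete Thiessen (Voronoi) partition of the block.
    A genetic algorithm would then move the seeds to optimise parcel
    shape, size and value."""
    return [[min(range(len(seeds)), key=lambda k: math.dist((x, y), seeds[k]))
             for x in range(width)]
            for y in range(height)]

def parcel_areas(grid, n_seeds):
    # Parcel size in cells: one ingredient of a multiobjective fitness.
    areas = [0] * n_seeds
    for row in grid:
        for k in row:
            areas[k] += 1
    return areas

grid = thiessen_partition(10, 10, [(2, 2), (7, 7)])
print(parcel_areas(grid, 2))
```

Moving a seed changes the whole tessellation at once, which is why the partition is optimised by evolving seed positions rather than cell-by-cell assignments.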

  10. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains. PMID:21609273
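
The idea of evolving the heuristic rather than the packing can be sketched in drastically simplified form. Here a candidate heuristic is just a pair of weights scoring open bins, instead of a full genetic-programming expression tree, and the instances are invented:

```python
import random

def pack(items, capacity, w_full, w_gap):
    # Online packing: put each item in the feasible open bin whose features
    # (current fullness, leftover gap) score highest under the candidate
    # heuristic; open a new bin if nothing fits.
    bins = []
    for item in items:
        best, best_score = None, None
        for i, used in enumerate(bins):
            gap = capacity - used - item
            if gap >= 0:
                score = w_full * used + w_gap * gap
                if best is None or score > best_score:
                    best, best_score = i, score
        if best is None:
            bins.append(item)
        else:
            bins[best] += item
    return len(bins)   # fewer bins = better heuristic

def evolve_heuristic(instances, capacity, gens=40, pop=24, seed=7):
    rng = random.Random(seed)
    population = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(pop)]
    cost = lambda w: sum(pack(it, capacity, *w) for it in instances)
    for _ in range(gens):
        population.sort(key=cost)              # rank heuristics by total bins
        elite = population[: pop // 2]
        population = elite + [(a + rng.gauss(0, 0.3), b + rng.gauss(0, 0.3))
                              for a, b in rng.choices(elite, k=pop - len(elite))]
    return min(population, key=cost)

instances = [[4, 8, 1, 4, 2, 1], [5, 5, 4, 3, 2, 1]]
w = evolve_heuristic(instances, capacity=10)
print([pack(it, 10, *w) for it in instances])
```

The evolved object is the scoring rule itself, so the same loop applies unchanged to new instances; the paper's contribution is doing this with expression trees general enough to cover one-, two-, and three-dimensional problems.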

  11. Programming Cells: Towards an Automated “Genetic Compiler”

    PubMed Central

    Clancy, Kevin; Voigt, Christopher A.

    2010-01-01

    The increasing scale and sophistication of genetic engineering will necessitate a new generation of computer-aided design (CAD). For large genetic programs, keeping track of the DNA on the level of nucleotides becomes tedious and error prone. To push the size of projects, it is important to abstract the designer from the process of part selection and optimization. The vision is to specify genetic programs in a higher-level language, which a genetic compiler could automatically convert into a DNA sequence. Steps towards this goal include: defining the semantics of the higher-level language, algorithms to select and assemble parts, and biophysical methods to link DNA sequence to function. These will be coupled to graphic design interfaces and simulation packages to aid in the prediction of program dynamics, optimize genes, and scan projects for errors. PMID:20702081

  12. Genetic Influences on Cognitive Function Using the Cambridge Neuropsychological Test Automated Battery

    ERIC Educational Resources Information Center

    Singer, Jamie J.; MacGregor, Alex J.; Cherkas, Lynn F.; Spector, Tim D.

    2006-01-01

    The genetic relationship between intelligence and components of cognition remains controversial. Conflicting results may be a function of the limited number of methods used in experimental evaluation. The current study is the first to use CANTAB (The Cambridge Neuropsychological Test Automated Battery). This is a battery of validated computerised…

  13. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    ERIC Educational Resources Information Center

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  14. A Droplet Microfluidic Platform for Automating Genetic Engineering.

    PubMed

    Gach, Philip C; Shih, Steve C C; Sustarich, Jess; Keasling, Jay D; Hillson, Nathan J; Adams, Paul D; Singh, Anup K

    2016-05-20

    We present a water-in-oil droplet microfluidic platform for transformation, culture and expression of recombinant proteins in multiple host organisms including bacteria, yeast and fungi. The platform consists of a hybrid digital microfluidic/channel-based droplet chip with integrated temperature control to allow complete automation and integration of plasmid addition, heat-shock transformation, addition of selection medium, culture, and protein expression. The microfluidic format permitted significant reduction in consumption (100-fold) of expensive reagents such as DNA and enzymes compared to the benchtop method. The chip contains a channel to continuously replenish oil to the culture chamber to provide a fresh supply of oxygen to the cells for long-term (∼5 days) cell culture. The flow channel also replenished oil lost to evaporation and increased the number of droplets that could be processed and cultured. The platform was validated by transforming several plasmids into Escherichia coli including plasmids containing genes for fluorescent proteins GFP, BFP and RFP; plasmids with selectable markers for ampicillin or kanamycin resistance; and a Golden Gate DNA assembly reaction. We also demonstrate the applicability of this platform for transformation in widely used eukaryotic organisms such as Saccharomyces cerevisiae and Aspergillus niger. Duration and temperatures of the microfluidic heat-shock procedures were optimized to yield transformation efficiencies comparable to those obtained by benchtop methods with a throughput up to 6 droplets/min. The proposed platform offers potential for automation of molecular biology experiments significantly reducing cost, time and variability while improving throughput. PMID:26830031

  15. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.

  16. Automating data manipulation for genetic analysis using a data base management system.

    PubMed

    Farrer, L A; Haines, J L; Yount, E A

    1985-01-01

    Inefficient coding and manipulation of pedigree data have often hindered the progress of genetic studies. In this paper we present the methodology for interfacing a data base management system (DBMS) called MEGADATS with a linkage analysis program called LIPED. Two families that segregate a dominant trait and one test marker were used in a simulated exercise to demonstrate how a DBMS can be used to automate tedious clerical steps and improve the efficiency of a genetic analysis. The merits of this approach to data management are discussed. We conclude that a standardized format for genetic analysis programs would greatly facilitate data analysis. PMID:3840122

  17. Single-Cell Genetic Analysis Using Automated Microfluidics to Resolve Somatic Mosaicism.

    PubMed

    Szulwach, Keith E; Chen, Peilin; Wang, Xiaohui; Wang, Jing; Weaver, Lesley S; Gonzales, Michael L; Sun, Gang; Unger, Marc A; Ramakrishnan, Ramesh

    2015-01-01

    Somatic mosaicism occurs throughout normal development and contributes to numerous disease etiologies, including tumorigenesis and neurological disorders. Intratumor genetic heterogeneity is inherent to many cancers, creating challenges for effective treatments. Unfortunately, analysis of bulk DNA masks subclonal phylogenetic architectures created by the acquisition and distribution of somatic mutations amongst cells. As a result, single-cell genetic analysis is becoming recognized as vital for accurately characterizing cancers. Despite this, methods for single-cell genetics are lacking. Here we present an automated microfluidic workflow enabling efficient cell capture, lysis, and whole genome amplification (WGA). We find that ~90% of the genome is accessible in single cells with improved uniformity relative to current single-cell WGA methods. Allelic dropout (ADO) rates were limited to 13.75% and variant false discovery rates (SNV FDR) were 4.11 × 10(-6), on average. Application to ER-/PR-/HER2+ breast cancer cells and matched normal controls identified novel mutations that arose in a subpopulation of cells and effectively resolved the segregation of known cancer-related mutations with single-cell resolution. Finally, we demonstrate effective cell classification using mutation profiles with 10X average exome coverage depth per cell. Our data demonstrate an efficient automated microfluidic platform for single-cell WGA that enables the resolution of somatic mutation patterns in single cells. PMID:26302375

  18. Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction

    PubMed Central

    Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-01-01

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be

  19. Automated, quantitative cognitive/behavioral screening of mice: for genetics, pharmacology, animal cognition and undergraduate instruction.

    PubMed

    Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-01-01

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be

  20. Automated synthesis of both the topology and numerical parameters for seven patented optical lens systems using genetic programming

    NASA Astrophysics Data System (ADS)

    Jones, Lee W.; Al-Sakran, Sameer H.; Koza, John R.

    2005-08-01

    This paper describes how genetic programming was used as an automated invention machine to synthesize both the topology and numerical parameters for seven previously patented optical lens systems, including one aspherical system and one issued in the 21st century. Two of the evolved optical lens systems infringe the claims of the patents and the others are novel solutions that satisfy the design goals stated in the patents. The automatic synthesis was done "from scratch"--that is, without starting from a pre-existing good design and without pre-specifying the number of lenses, the topological layout of the lenses, or the numerical parameters of the lenses. Genetic programming is a form of evolutionary computation used to automatically solve problems. It starts from a high-level statement of what needs to be done and progressively breeds a population of candidate individuals over many generations using the principles of Darwinian natural selection and genetic recombination. The paper describes how genetic programming created eyepieces that duplicated the functionality of seven previously patented lens systems. The seven designs were created in a substantially similar and routine way, suggesting that the use of genetic programming in the automated design of both the topology and numerical parameters for optical lens systems may have widespread utility.

  1. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics

    PubMed Central

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-01-01

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format. PMID:23685876

  2. Multistep sintering to synthesize fast lithium garnets

    NASA Astrophysics Data System (ADS)

    Xu, Biyi; Duan, Huanan; Xia, Wenhao; Guo, Yiping; Kang, Hongmei; Li, Hua; Liu, Hezhou

    2016-01-01

    A multistep sintering schedule is developed to synthesize Li7La3Zr2O12 (LLZO) doped with 0.2 mol% Al3+. The effect of the sintering steps on the phase, relative density and ionic conductivity of Al-doped LLZO has been evaluated using powder X-ray diffraction (XRD), scanning electron microscopy (SEM), 27Al magic-angle spinning nuclear magnetic resonance (NMR) spectroscopy and electrochemical impedance spectroscopy (EIS). The results show that by holding the sample at 900 °C for 6 h, a mixture of tetragonal and cubic garnet phases is obtained; by continuing to hold at 1100 °C for 6 h, the tetragonal phase transforms completely into the cubic phase; and by holding at 1200 °C, the relative density increases without decomposition of the cubic phase. The Al-LLZO pellets after multistep sintering exhibit the cubic phase, a relative density of 94.25% and an ionic conductivity of 4.5 × 10(-4) S cm(-1) at room temperature. Based on these observations, a sintering model is proposed and discussed.

  3. Synthesis of Silver Nanostructures by Multistep Methods

    PubMed Central

    Zhang, Tong; Song, Yuan-Jun; Zhang, Xiao-Yang; Wu, Jing-Yuan

    2014-01-01

    The shape of plasmonic nanostructures such as silver and gold is vital to their physical and chemical properties and potential applications. Recently, the preparation of complex, richly functional nanostructures by chemical multistep methods has become a hotspot of research. In this review we introduce three typical multistep methods to prepare silver nanostructures with well-controlled shapes: the double reductant method, the etching technique and the construction of core-shell nanostructures. The growth mechanism of the double reductant method is that different favorable facets of silver nanocrystals are produced in different reductants, which can be used to prepare complex nanostructures such as nanoflags with ultranarrow resonance bandwidths, as well as silver nanostructures that are difficult to prepare using other methods. The etching technique can selectively remove material from nanoparticles to achieve shape control and is widely used for the synthesis of nanoflowers and hollow nanostructures. Construction of core-shell nanostructures is another tool to control shape and size. These three methods can not only prepare various silver nanostructures with well-controlled shapes, which exhibit unique optical properties such as strong surface-enhanced Raman scattering (SERS) signals and the localized surface plasmon resonance (LSPR) effect, but also have potential applications in many areas. PMID:24670722

  4. Multistep Charge Method by Charge Arrays

    NASA Astrophysics Data System (ADS)

    Segami, Go; Kusawake, Hiroaki; Shimizu, Yasuhiro; Iwasa, Minoru; Kibe, Koichi

    2008-09-01

    We studied reduction of the size and weight of the Power Control Unit (PCU). In this study, we specifically examined the weight of the Battery Charge Regulator (BCR), which accounts for half of the PCU weight for a low earth orbit (LEO) satellite. We devised a multistep charge method using charge arrays, adopting a method similar to that used for GEO satellites, thereby enabling the BCR reduction. We found that the size and weight of the PCU can be reduced through more detailed design than that for a conventional PCU. However, this method decreases the state of charge (SOC) of the battery. Battery tests, a battery simulator test, and numerical analysis were used to evaluate the SOC decrease. We also studied the effects of this method on battery lifetime. The multistep charge method by charge arrays enabled charging to the same level of SOC as the conventional constant current/constant voltage (CC/CV) charge method for a LEO satellite.

  5. Multi-step avalanche chambers for final experiment E605

    NASA Astrophysics Data System (ADS)

    Hubbard, J. R.; Coutrakon, G.; Cribier, M.; Mangeot, Ph.; Martin, H.; Mullié, J.; Palanque, S.; Pelle, J.

    1980-10-01

    Physical processes in multi-step avalanche chambers, detector properties, and difficulties in operation are discussed. Advantages of multi-step chambers over classical MWPCs for specific experimental problems encountered in experiment E605 (a high-flux environment and Cherenkov imaging) are described. Some details of the detector design are presented.

  6. Scheme for multistep resonance photoionization of atoms

    NASA Astrophysics Data System (ADS)

    Liu, Bo; Ning, Xi-Jing

    2001-07-01

    Traditional schemes for multistep resonance photoionization of atoms let every laser beam employed interact with the atoms simultaneously. In such a situation, analyses via the time-dependent Schrödinger equation show that high ionization probability requires that all the laser beams be intense enough. In order to decrease the laser intensity, we propose a scheme in which the laser beam used to pump the excited atoms (in a higher bound state) into an autoionization state does not interact with the atoms until all the population has been transferred by the other lasers from the ground state to the bound state. As an interesting example, we examined three-step photoionization of 235U with our scheme, showing that the intensity of two of the laser beams can be lowered by two orders of magnitude without losing high ionization probability.

  7. Automated microscopy system for detection and genetic characterization of fetal nucleated red blood cells on slides

    NASA Astrophysics Data System (ADS)

    Ravkin, Ilya; Temov, Vladimir

    1998-04-01

    The detection and genetic analysis of fetal cells in maternal blood will permit noninvasive prenatal screening for genetic defects. Applied Imaging has developed and is currently evaluating a system for semiautomatic detection of fetal nucleated red blood cells on slides and acquisition of their DNA probe FISH images. The specimens are blood smears from pregnant women (9 - 16 weeks gestation) enriched for nucleated red blood cells (NRBC). The cells are identified by using labeled monoclonal antibodies directed to different types of hemoglobin chains (gamma, epsilon); the nuclei are stained with DAPI. The Applied Imaging system has been implemented with both Olympus BX and Nikon Eclipse series microscopes which were equipped with transmission and fluorescence optics. The system includes the following motorized components: stage, focus, transmission, and fluorescence filter wheels. A video camera with light integration (COHU 4910) permits low light imaging. The software capabilities include scanning, relocation, autofocusing, feature extraction, facilities for operator review, and data analysis. Detection of fetal NRBCs is achieved by employing a combination of brightfield and fluorescence images of nuclear and cytoplasmic markers. The brightfield and fluorescence images are all obtained with a single multi-bandpass dichroic mirror. A Z-stack of DNA probe FISH images is acquired by moving focus and switching excitation filters. This stack is combined to produce an enhanced image for presentation and spot counting.
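
    The abstract does not say how the Z-stack of FISH images is combined into the enhanced image; a maximum-intensity projection is one common choice for spot imaging. A minimal numpy sketch along those lines (the function name and toy data are illustrative, not the Applied Imaging implementation):

```python
import numpy as np

def enhance_zstack(stack):
    """Combine a Z-stack of FISH images into one enhanced image by
    maximum-intensity projection: a spot in focus at any plane survives."""
    return np.asarray(stack, dtype=float).max(axis=0)

# Toy stack: 3 focus planes; a FISH "spot" is in focus only in plane 1.
stack = np.zeros((3, 5, 5))
stack[1, 2, 2] = 200.0   # in-focus spot
stack[0, 2, 2] = 40.0    # same spot, defocused and dimmer
proj = enhance_zstack(stack)
print(proj[2, 2])  # 200.0
```

The projection keeps, per pixel, the brightest value across focus planes, which is why the in-focus spot dominates the result.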

  8. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
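
    The paper's tool is an Excel macro; as a language-neutral illustration of the underlying idea, the sketch below runs a small genetic algorithm over cluster assignments of a DSM, scoring a chromosome by the dependency weight left outside clusters. The operators, parameters and toy matrix are assumptions, not the paper's implementation:

```python
import random

def ga_cluster_dsm(dsm, n_clusters=2, pop_size=30, generations=60, seed=1):
    """Toy GA for DSM clustering: a chromosome assigns each element to a
    cluster; the cost counts dependency weight falling BETWEEN clusters.
    (Real DSM clustering objectives also penalize oversized clusters.)"""
    rng = random.Random(seed)
    n = len(dsm)

    def cost(chrom):
        return sum(dsm[i][j]
                   for i in range(n) for j in range(n)
                   if i != j and chrom[i] != chrom[j])

    pop = [[rng.randrange(n_clusters) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]              # one-point crossover
            if rng.random() < 0.3:                 # single-gene mutation
                child[rng.randrange(n)] = rng.randrange(n_clusters)
            children.append(child)
        pop = survivors + children
    best = min(pop, key=cost)
    return best, cost(best)

# Two obvious 2-element blocks, coupled internally and uncoupled across.
dsm = [[0, 1, 0, 0],
       [1, 0, 0, 0],
       [0, 0, 0, 1],
       [0, 0, 1, 0]]
assignment, c = ga_cluster_dsm(dsm)
print(assignment, c)  # e.g. [0, 0, 1, 1] with cost 0
```

On this toy matrix the GA recovers the two coupled blocks, leaving no dependency between clusters.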

  9. Spi-1/PU.1 transgenic mice develop multistep erythroleukemias.

    PubMed Central

    Moreau-Gachelin, F; Wendling, F; Molina, T; Denis, N; Titeux, M; Grimber, G; Briand, P; Vainchenker, W; Tavitian, A

    1996-01-01

    Insertional mutagenesis of the spi-1 gene is associated with the emergence of malignant proerythroblasts during Friend virus-induced acute erythroleukemia. To determine the role of spi-1/PU.1 in the genesis of leukemia, we generated spi-1 transgenic mice. In one founder line the transgene was overexpressed as an unexpected-size transcript in various mouse tissues. Homozygous transgenic animals gave rise to live-born offspring, but 50% of the animals developed a multistep erythroleukemia within 1.5 to 6 months of birth whereas the remainder survived without evidence of disease. At the onset of the disease, mice became severely anemic. Their hematopoietic tissues were massively invaded with nontumorigenic proerythroblasts that express a high level of Spi-1 protein. These transgenic proerythroblasts are partially blocked in differentiation and strictly dependent on erythropoietin for their proliferation both in vivo and in vitro. A complete but transient regression of the disease was observed after erythrocyte transfusion, suggesting that the constitutive expression of spi-1 is related to the block of the differentiation of erythroid precursors. At relapse, erythropoietin-independent malignant proerythroblasts arose. Growth factor autonomy could be partially explained by the autocrine secretion of erythropoietin; however, other genetic events appear to be necessary to confer the full malignant phenotype. These results reveal that overexpression of spi-1 is essential for malignant erythropoiesis and does not alter other hematopoietic lineages. PMID:8628313

  10. Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy.

    PubMed

    Welikala, R A; Fraz, M M; Dehmeshki, J; Hoppe, A; Tah, V; Mann, S; Williamson, T H; Barman, S A

    2015-07-01

    Proliferative diabetic retinopathy (PDR) is a condition that carries a high risk of severe visual impairment. The hallmark of PDR is the growth of abnormal new vessels. In this paper, an automated method for the detection of new vessels from retinal images is presented. This method is based on a dual classification approach. Two vessel segmentation approaches are applied to create two separate binary vessel maps, each of which holds vital information. Local morphology features are measured from each binary vessel map to produce two separate 4-D feature vectors. Independent classification is performed for each feature vector using a support vector machine (SVM) classifier. The system then combines these individual outcomes to produce a final decision. This is followed by the creation of additional features to generate 21-D feature vectors, which feed into a genetic algorithm based feature selection approach with the objective of finding feature subsets that improve the performance of the classification. Sensitivity and specificity results using a dataset of 60 images are 0.9138 and 0.9600, respectively, on a per patch basis and 1.000 and 0.975, respectively, on a per image basis. PMID:25841182
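
    As a hedged illustration of GA-based feature selection, the sketch below evolves boolean feature masks, scoring each with a simple nearest-centroid classifier standing in for the paper's SVM; the synthetic data and GA settings are assumptions, not the published pipeline:

```python
import numpy as np

def fitness(mask, X, y):
    """Accuracy of a nearest-centroid classifier on the selected features
    (a simple stand-in for the paper's SVM classifier)."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
    pred = np.linalg.norm(Xs - c1, axis=1) < np.linalg.norm(Xs - c0, axis=1)
    return float((pred == y).mean())

def ga_select(X, y, pop=20, gens=40, seed=0):
    """Evolve boolean feature masks toward higher classification accuracy."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    P = rng.random((pop, d)) < 0.5
    for _ in range(gens):
        P = P[np.argsort([-fitness(m, X, y) for m in P])]
        elite = P[:pop // 2]
        kids = []
        for _ in range(pop - len(elite)):
            a = elite[rng.integers(len(elite))]
            b = elite[rng.integers(len(elite))]
            cut = int(rng.integers(1, d))
            kid = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
            if rng.random() < 0.3:                     # bit-flip mutation
                j = int(rng.integers(d))
                kid[j] = ~kid[j]
            kids.append(kid)
        P = np.vstack([elite, kids])
    scores = [fitness(m, X, y) for m in P]
    best = int(np.argmax(scores))
    return P[best], scores[best]

# Synthetic data: feature 0 is informative, features 1-4 are noise.
rng = np.random.default_rng(42)
y = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 5))
X[:, 0] += 3.0 * y
mask, acc = ga_select(X, y)
print(mask, round(acc, 2))
```

The selected mask should retain the informative feature; noisy features tend to be dropped because they dilute the centroid distances.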

  11. Automated docking of peptides and proteins by using a genetic algorithm combined with a tabu search.

    PubMed

    Hou, T; Wang, J; Chen, L; Xu, X

    1999-08-01

    A genetic algorithm (GA) combined with a tabu search (TS) has been applied as a minimization method to search for the appropriate association sites for some biomolecular systems. In our docking procedure, surface complementarity and energetic complementarity of a ligand with its receptor have been considered separately in a two-stage docking method. The first stage was to find a set of potential association sites, mainly based on surface complementarity, using a genetic algorithm combined with a tabu search. This step corresponds to the process of finding the potential binding sites where pharmacophores will bind. In the second stage, several hundred GA minimization steps were performed for each association site derived from the first stage, mainly based on energetic complementarity. After calculations for both stages, we can offer several solutions of association sites for every complex. In this paper, seven biomolecular systems, including five bound complexes and two unbound complexes, were chosen from the Protein Data Bank (PDB) to test our method. The calculated results were very encouraging: the hybrid minimization algorithm successfully reaches the correct solutions near the best binding modes for these protein complexes. The docking results not only predict the bound complexes very well, but also yield a relatively accurate complexed conformation for unbound systems. For the five bound complexes, the results show that surface complementarity is enough to find the precise binding modes; the top solution from the tabu list generally corresponds to the correct binding mode. For the two unbound complexes, due to the conformational changes upon binding, it seems more difficult to get their correct binding conformations. The predicted results show that the correct binding mode also corresponds to a relatively large surface complementarity score. In these two test cases, the correct solution can be found in the top several solutions from the tabu list. For
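
    A toy sketch of the two-stage idea: a GA searches a synthetic score landscape, and a tabu list forces later runs away from sites already found, so several distinct candidate sites are collected rather than one. The landscape, operators and parameters are all illustrative, not the authors' docking code:

```python
import random, math

def score(p):
    """Toy 'complementarity' landscape with two distinct minima."""
    x, y = p
    return min((x - 2)**2 + y**2, (x + 2)**2 + y**2 + 0.5)

def ga_tabu(n_sites=2, pop_size=40, gens=80, tabu_radius=1.0, seed=3):
    """Run the GA n_sites times; each converged solution joins a tabu
    list, and later runs reject candidates within tabu_radius of it."""
    rng = random.Random(seed)
    tabu = []
    for _ in range(n_sites):
        def allowed(p):
            return all(math.dist(p, t) > tabu_radius for t in tabu)
        pop = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(pop_size)]
        pop = [p for p in pop if allowed(p)] or pop
        for _ in range(gens):
            pop.sort(key=score)
            elite = pop[:max(2, pop_size // 4)]
            pop, attempts = list(elite), 0
            while len(pop) < pop_size and attempts < 2000:
                attempts += 1
                a, b = rng.choice(elite), rng.choice(elite)
                child = ((a[0] + b[0]) / 2 + rng.gauss(0, 0.3),
                         (a[1] + b[1]) / 2 + rng.gauss(0, 0.3))
                if allowed(child):      # tabu rejection
                    pop.append(child)
        tabu.append(min(pop, key=score))
    return tabu

sites = ga_tabu()
print([round(x) for x, y in sites])  # two well-separated candidate sites
```

The first run converges to the global minimum near (2, 0); the tabu radius then excludes that basin, so the second run settles into the other minimum near (-2, 0).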

  12. Multistep, effective drug distribution within solid tumors

    PubMed Central

    Shemi, Amotz; Khvalevsky, Elina Zorde; Gabai, Rachel Malka; Domb, Abraham; Barenholz, Yechezkel

    2015-01-01

    The distribution of drugs within solid tumors presents a long-standing barrier for efficient cancer therapies. Tumors are highly resistant to diffusion, and the lack of blood and lymphatic flows suppresses convection. Prolonged, continuous intratumoral drug delivery from a miniature drug source offers an alternative to both systemic delivery and intratumoral injection. Presented here is a model of drug distribution from such a source, in a multistep process. At delivery onset the drug mainly affects the closest surroundings. Such ‘priming’ enables drug penetration to successive cell layers. Tumor ‘void volume’ (volume not occupied by cells) increases, facilitating lymphatic perfusion. The drug is then transported by hydraulic convection downstream along interstitial fluid pressure (IFP) gradients, away from the tumor core. After a week tumor cell death occurs throughout the entire tumor and IFP gradients are flattened. Then, the drug is transported mainly by ‘mixing’, powered by physiological bulk body movements. Steady state is achieved and the drug covers the entire tumor over several months. Supporting measurements are provided from the LODER™ system, releasing siRNA against mutated KRAS over months in pancreatic cancer in-vivo models. LODER™ was also successfully employed in a recent Phase 1/2 clinical trial with pancreatic cancer patients. PMID:26416413

  13. Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows

    NASA Astrophysics Data System (ADS)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2009-05-01

    bootstrap (MABB)) based on the explicit objective functions of minimizing the relative bias and relative root mean square error in estimating the storage capacity of the reservoir. The optimal parameter set of the hybrid model is obtained based on a search over a multi-dimensional parameter space (involving simultaneous exploration of the parametric (PAR(1)) as well as the non-parametric (MABB) components). This is achieved using an efficient evolutionary search based optimization tool, namely the non-dominated sorting genetic algorithm II (NSGA-II). This approach helps in reducing the drudgery involved in the process of manual selection of the hybrid model, in addition to predicting the basic summary statistics, dependence structure, marginal distribution and water-use characteristics accurately. The proposed optimization framework is used to model the multi-season streamflows of the River Beaver and River Weber in the USA. For both rivers, the proposed GA-based hybrid model yields a much better prediction of the storage capacity (where simultaneous exploration of both parametric and non-parametric components is done) when compared with the MLE-based hybrid models (where the hybrid model selection is done in two stages, thus probably resulting in a sub-optimal model). This framework can be further extended to include different linear/non-linear hybrid stochastic models at other temporal and spatial scales as well.
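
    The core of the NSGA-II tool referenced above is non-dominated sorting of candidate models under the two objectives (relative bias and relative RMSE, both minimized). A minimal sketch with illustrative objective values:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Rank points (tuples of minimized objectives) into Pareto fronts --
    the sorting step at the heart of NSGA-II."""
    fronts, remaining = [], set(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(sorted(front))
        remaining -= set(front)
    return fronts

# Objectives per candidate hybrid model: (|relative bias|, relative RMSE).
pts = [(0.1, 0.9), (0.2, 0.5), (0.5, 0.2), (0.6, 0.6), (0.05, 1.2)]
print(non_dominated_sort(pts))  # [[0, 1, 2, 4], [3]]
```

Model 3 is the only one dominated (model 1 beats it in both objectives); the rest form the first Pareto front, from which NSGA-II would then select by crowding distance.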

  14. Evaluating the impact of scoring parameters on the structure of intra-specific genetic variation using RawGeno, an R package for automating AFLP scoring

    PubMed Central

    Arrigo, Nils; Tuszynski, Jarek W; Ehrich, Dorothee; Gerdes, Tommy; Alvarez, Nadir

    2009-01-01

    Background Since the transfer and application of modern sequencing technologies to the analysis of amplified fragment-length polymorphisms (AFLP), evolutionary biologists have included an increasing number of samples and markers in their studies. Although justified in this context, the use of automated scoring procedures may result in technical biases that weaken the power and reliability of further analyses. Results Using a new scoring algorithm, RawGeno, we show that scoring errors – in particular "bin oversplitting" (i.e. when variant sizes of the same AFLP marker are not considered as homologous) and "technical homoplasy" (i.e. when two AFLP markers that differ slightly in size are mistakenly considered as being homologous) – induce a loss of discriminatory power, decrease the robustness of results and, in extreme cases, introduce erroneous information in genetic structure analyses. In the present study, we evaluate several descriptive statistics that can be used to optimize the scoring of the AFLP analysis, and we describe a new statistic, the information content per bin (Ibin) that represents a valuable estimator during the optimization process. This statistic can be computed at any stage of the AFLP analysis without requiring the inclusion of replicated samples. Finally, we show that downstream analyses are not equally sensitive to scoring errors. Indeed, although a reasonable amount of flexibility is allowed during the optimization of the scoring procedure without causing considerable changes in the detection of genetic structure patterns, notable discrepancies are observed when estimating genetic diversities from differently scored datasets. Conclusion Our algorithm appears to perform as well as a commercial program in automating AFLP scoring, at least in the context of population genetics or phylogeographic studies. To our knowledge, RawGeno is the only freely available public-domain software for fully automated AFLP scoring, from electropherogram

  15. Comparing multistep immobilized metal affinity chromatography and multistep TiO2 methods for phosphopeptide enrichment.

    PubMed

    Yue, Xiaoshan; Schunter, Alissa; Hummon, Amanda B

    2015-09-01

    Phosphopeptide enrichment from complicated peptide mixtures is an essential step for mass spectrometry-based phosphoproteomic studies to reduce sample complexity and ionization suppression effects. Typical methods for enriching phosphopeptides include immobilized metal affinity chromatography (IMAC) or titanium dioxide (TiO2) beads, which have selective affinity for and can interact with phosphopeptides. In this study, the IMAC enrichment method was compared with the TiO2 enrichment method, using a multistep enrichment strategy from whole cell lysate, to evaluate their abilities to enrich different types of phosphopeptides. The peptide-to-bead ratios were optimized for both IMAC and TiO2 beads. Both IMAC and TiO2 enrichments were performed for three rounds to enable the maximum extraction of phosphopeptides from the whole cell lysates. The phosphopeptides unique to IMAC enrichment, unique to TiO2 enrichment, and identified with both IMAC and TiO2 enrichment were analyzed for their characteristics. Both IMAC and TiO2 enriched similar amounts of phosphopeptides with comparable enrichment efficiency. However, phosphopeptides unique to IMAC enrichment showed a higher percentage of multiphosphopeptides, as well as a higher percentage of longer, basic, and hydrophilic phosphopeptides. The IMAC and TiO2 procedures also clearly enriched phosphopeptides with different motifs. Finally, further enriching with two rounds of TiO2 from the supernatant after IMAC enrichment, or with two rounds of IMAC from the supernatant after TiO2 enrichment, does not fully recover the phosphopeptides that are not identified with the corresponding multistep enrichment. PMID:26237447

  16. Multi-step motion planning: Application to free-climbing robots

    NASA Astrophysics Data System (ADS)

    Bretl, Timothy Wolfe

    This dissertation addresses the problem of planning the motion of a multi-limbed robot to "free-climb" vertical rock surfaces. Free-climbing relies on natural features and friction (such as holes or protrusions) rather than special fixtures or tools. It requires strength, but more importantly it requires deliberate reasoning: not only must the robot decide how to adjust its posture to reach the next feature without falling, it must plan an entire sequence of steps, where each one might have future consequences. This process of reasoning is called multi-step planning. A multi-step planning framework is presented for computing non-gaited, free-climbing motions. This framework derives from an analysis of a free-climbing robot's configuration space, which can be decomposed into constraint manifolds associated with each state of contact between the robot and its environment. An understanding of the adjacency between manifolds motivates a two-stage strategy that uses a candidate sequence of steps to direct the subsequent search for motions. Three algorithms are developed to support the framework. The first algorithm reduces the amount of time required to plan each potential step, a large number of which must be considered over an entire multi-step search. It extends the probabilistic roadmap (PRM) approach based on an analysis of the interaction between balance and the topology of closed kinematic chains. The second algorithm addresses a problem with the PRM approach, that it is unable to distinguish challenging steps (which may be critical) from impossible ones. This algorithm detects impossible steps explicitly, using automated algebraic inference and machine learning. The third algorithm provides a fast constraint checker (on which the PRM approach depends), in particular a test of balance at the initially unknown number of sampled configurations associated with each step. It is a method of incremental precomputation, fast because it takes advantage of the sample
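
    The probabilistic roadmap (PRM) approach the dissertation extends can be illustrated in its basic form: sample free configurations, connect nearby collision-free pairs, and search the resulting graph. The 2-D point robot and disc obstacle below are toy stand-ins for the climbing robot's far richer constraint manifolds:

```python
import math, random
from collections import deque

def collision_free(p, q, obstacle=((0.5, 0.5), 0.2), steps=20):
    """Segment p-q is free if no interpolated point lies inside the disc."""
    (cx, cy), r = obstacle
    for i in range(steps + 1):
        t = i / steps
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        if math.hypot(x - cx, y - cy) < r:
            return False
    return True

def prm(start, goal, n=150, k=8, seed=7):
    """PRM sketch: sample free configurations in the unit square, link each
    to its k nearest neighbours when the edge is free, then BFS start->goal."""
    rng = random.Random(seed)
    nodes = [start, goal]
    while len(nodes) < n:
        p = (rng.random(), rng.random())
        if collision_free(p, p):                 # point itself is free
            nodes.append(p)
    edges = {i: [] for i in range(len(nodes))}
    for i, p in enumerate(nodes):
        near = sorted(range(len(nodes)),
                      key=lambda j: math.dist(p, nodes[j]))[1:k + 1]
        for j in near:
            if collision_free(p, nodes[j]):
                edges[i].append(j)
                edges[j].append(i)
    prev, queue, seen = {0: None}, deque([0]), {0}
    while queue:                                 # BFS over the roadmap
        u = queue.popleft()
        if u == 1:
            path = []
            while u is not None:
                path.append(nodes[u])
                u = prev[u]
            return path[::-1]
        for v in edges[u]:
            if v not in seen:
                seen.add(v)
                prev[v] = u
                queue.append(v)
    return None

path = prm((0.05, 0.05), (0.95, 0.95))
print(path is not None, len(path) if path else 0)
```

The dissertation's contribution sits on top of this skeleton: faster per-step planning under closed-chain and balance constraints, and explicit detection of impossible steps that plain PRM cannot distinguish from merely hard ones.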

  17. Digital multi-step phase-shifting profilometry for three-dimensional ballscrew surface imaging

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-Yang; Yen, Tzu-Ping

    2016-05-01

    A digital multi-step phase-shifting profilometry for three-dimensional (3-D) ballscrew surface imaging is presented. The 3-D digital imaging system is capable of capturing fringe pattern images. The straight fringe patterns generated by software in the computer are projected onto the ballscrew surface by the DLP projector. The distorted fringe patterns are captured by the CCD camera at different detecting directions for reconstruction algorithms. The seven-step phase-shifting algorithm and quality guided path unwrapping algorithm are used to calculate absolute phase at each pixel position. The 3-D calibration method is used to obtain the relationship between the absolute phase map and ballscrew shape. The angular dependence of 3-D shape imaging for ballscrews is analyzed and characterized. The experimental results may provide a novel, fast, and high accuracy imaging system to inspect the surface features of the ballscrew without length limitation for automated optical inspection industry.
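
    The N-step phase-shifting computation itself is standard: with fringe images I_n = A + B·cos(φ + 2πn/N), the wrapped phase follows from the arctangent of two weighted sums over the images. A numpy sketch using synthetic fringes (not the paper's calibration or unwrapping stages):

```python
import numpy as np

def wrapped_phase(images):
    """N-step phase-shifting: recover the wrapped phase from N fringe
    images I_n = A + B*cos(phi + 2*pi*n/N). Phase unwrapping and the
    phase-to-shape calibration would follow this step."""
    images = np.asarray(images, dtype=float)
    N = images.shape[0]
    delta = 2 * np.pi * np.arange(N) / N
    S = np.tensordot(np.sin(delta), images, axes=1)   # sum_n I_n sin(d_n)
    C = np.tensordot(np.cos(delta), images, axes=1)   # sum_n I_n cos(d_n)
    return np.arctan2(-S, C)

# Synthesize 7 shifted fringe images of a known phase map and recover it.
N = 7
phi_true = np.linspace(-3, 3, 11)
imgs = [5 + 2 * np.cos(phi_true + 2 * np.pi * n / N) for n in range(N)]
phi = wrapped_phase(imgs)
print(np.allclose(phi, phi_true))  # True
```

Using all N = 7 steps averages out harmonics and noise better than the minimal three-step variant, which is one reason a seven-step algorithm is chosen for high-accuracy inspection.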

  18. Error behaviour of multistep methods applied to unstable differential systems

    NASA Technical Reports Server (NTRS)

    Brown, R. L.

    1978-01-01

    The problem of modelling a dynamic system described by a system of ordinary differential equations which has unstable components for limited periods of time is discussed. It is shown that the global error in a multistep numerical method is the solution to a difference equation initial value problem, and the approximate solution is given for several popular multistep integration formulae. Inspection of the solution leads to the formulation of four criteria for integrators appropriate to unstable problems. A sample problem is solved numerically using three popular formulae and two different stepsizes to illustrate the appropriateness of the criteria.
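
    The order of the global error can be checked numerically: for a two-step Adams-Bashforth method on the unstable test problem y' = y, halving the step size should reduce the error at t = 1 by roughly a factor of four. A short sketch (the test problem and step sizes are illustrative):

```python
import math

def ab2(f, y0, h, steps):
    """Two-step Adams-Bashforth; the first step is seeded with one Euler step."""
    ys = [y0, y0 + h * f(y0)]
    for _ in range(steps - 1):
        ys.append(ys[-1] + h * (1.5 * f(ys[-1]) - 0.5 * f(ys[-2])))
    return ys[-1]

f = lambda y: y                      # unstable test problem y' = y, y(0) = 1
errs = [abs(ab2(f, 1.0, h, round(1 / h)) - math.e) for h in (0.01, 0.005)]
print(errs[0] / errs[1])             # close to 4: global error is O(h^2)
```

Even though the exact solution grows, the error ratio still reflects the method's order; what the paper analyzes is how such error components themselves obey a difference equation and can be amplified over the unstable interval.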

  19. INDES User's guide multistep input design with nonlinear rotorcraft modeling

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The INDES computer program, a multistep input design program used as part of a data processing technique for rotorcraft systems identification, is described. Flight test inputs based on INDES improve the accuracy of parameter estimates. The input design algorithm, program input, and program output are presented.

  20. Genetics

    MedlinePlus

    Homozygous; Inheritance; Heterozygous; Inheritance patterns; Heredity and disease; Heritable; Genetic markers ... The chromosomes are made up of strands of genetic information called DNA. Each chromosome contains sections of ...

  2. Automated discovery of structural features of the optic nerve head on the basis of image and genetic data

    NASA Astrophysics Data System (ADS)

    Christopher, Mark; Tang, Li; Fingert, John H.; Scheetz, Todd E.; Abramoff, Michael D.

    2014-03-01

    Evaluation of optic nerve head (ONH) structure is a commonly used clinical technique for both diagnosis and monitoring of glaucoma. Glaucoma is associated with characteristic changes in the structure of the ONH. We present a method for computationally identifying ONH structural features using both imaging and genetic data from a large cohort of participants at risk for primary open angle glaucoma (POAG). Using 1054 participants from the Ocular Hypertension Treatment Study, ONH structure was measured by application of a stereo correspondence algorithm to stereo fundus images. In addition, the genotypes of several known POAG genetic risk factors were considered for each participant. ONH structural features were discovered using both a principal component analysis approach to identify the major modes of variance within structural measurements and a linear discriminant analysis approach to capture the relationship between genetic risk factors and ONH structure. The identified ONH structural features were evaluated based on the strength of their associations with genotype and development of POAG by the end of the OHTS study. ONH structural features with strong associations with genotype were identified for each of the genetic loci considered. Several identified ONH structural features were significantly associated (p < 0.05) with the development of POAG after Bonferroni correction. Further, incorporation of genetic risk status was found to substantially increase performance of early POAG prediction. These results suggest incorporating both imaging and genetic data into ONH structural modeling significantly improves the ability to explain POAG-related changes to ONH structure.
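
    The principal component step, identifying the major modes of variance in the structural measurements, can be sketched via the SVD. The synthetic measurement matrix below (rows = participants, columns = structural measurements) is illustrative, not OHTS data:

```python
import numpy as np

def principal_modes(X, k=2):
    """PCA via SVD: return the top-k modes of variance of the centered
    measurement matrix X, plus the fraction of variance each explains."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    return Vt[:k], explained[:k]

# Synthetic "structural measurements": one dominant mode plus small noise.
rng = np.random.default_rng(0)
scores = rng.normal(size=(200, 1))                     # per-participant score
mode = np.array([[1.0, 1.0, 0.0, 0.0]]) / np.sqrt(2)   # the true mode
X = scores @ mode + 0.05 * rng.normal(size=(200, 4))
modes, frac = principal_modes(X, k=1)
print(frac[0] > 0.9)  # the dominant mode captures most of the variance
```

The study pairs this unsupervised step with linear discriminant analysis, which instead seeks the direction of structural variation best separated by genotype.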

  3. Optimal Installation Locations for Automated External Defibrillators in Taipei 7-Eleven Stores: Using GIS and a Genetic Algorithm with a New Stirring Operator

    PubMed Central

    Wen, Tzai-Hung

    2014-01-01

    Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei. PMID:25045396
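
    As a hedged sketch of the underlying optimization, the toy GA below solves a maximal covering location problem: choose p sites so that the most demand points lie within a service radius. The demand points, candidate sites and Manhattan metric are assumptions; the paper's GIS data and stirring operator are not reproduced:

```python
import random

def ga_coverage(demand, stores, p=2, radius=2, pop=30, gens=40, seed=5):
    """Toy GA: evolve p-site selections maximizing the number of demand
    points within `radius` (Manhattan distance) of a selected site."""
    rng = random.Random(seed)

    def covered(sel):
        return sum(any(abs(dx - stores[s][0]) + abs(dy - stores[s][1]) <= radius
                       for s in sel)
                   for dx, dy in demand)

    P = [rng.sample(range(len(stores)), p) for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=covered, reverse=True)
        elite = P[:pop // 2]
        children = []
        while len(children) < pop - len(elite):
            child = list(rng.choice(elite))
            child[rng.randrange(p)] = rng.randrange(len(stores))  # mutate a site
            if len(set(child)) == p:
                children.append(child)
        P = elite + children
    best = max(P, key=covered)
    return sorted(best), covered(best)

# Hypothetical OHCA demand clusters and candidate 7-Eleven sites.
demand = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
stores = [(0, 0), (5, 5), (3, 3), (8, 8)]
best, n = ga_coverage(demand, stores)
print(best, n)  # [0, 1] 6
```

The GA places one AED in each demand cluster, covering all six incidents; the paper's model additionally distinguishes walking (100 m) and driving (300 m) conveyance radii and time-of-day incidence.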

  4. The Application of Baum-Welch Algorithm in Multistep Attack

    PubMed Central

    Zhang, Yanxue; Zhao, Dongmei; Liu, Jinxing

    2014-01-01

    The biggest difficulty in applying a hidden Markov model to multistep attacks is the determination of the observations. Research on the determination of observations is still lacking and shows a certain degree of subjectivity. In this regard, we integrate attack intentions with the hidden Markov model (HMM) and propose a method for forecasting multistep attacks based on the HMM. Firstly, we train the existing hidden Markov model(s) with the Baum-Welch algorithm. Then we recognize the alerts belonging to attack scenarios with the Forward algorithm. Finally, we forecast the next possible attack sequence with the Viterbi algorithm. The results of simulation experiments show that trained hidden Markov models outperform untrained ones in recognition and prediction. PMID:24991642
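
    Of the three HMM routines named, the Viterbi step can be sketched compactly; the two attack stages, alert alphabet and probabilities below are invented for illustration, not the paper's trained model:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Viterbi algorithm: most likely hidden state sequence (attack
    stages) given an observation sequence (alert types)."""
    V = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        V.append({})
        for s in states:
            prob, path = max(
                (V[-2][r][0] * trans_p[r][s] * emit_p[s][o], V[-2][r][1] + [s])
                for r in states)
            V[-1][s] = (prob, path)
    return max(V[-1].values())[1]

# Hypothetical 2-stage attack model: reconnaissance scan -> exploit.
states = ["scan", "exploit"]
start_p = {"scan": 0.9, "exploit": 0.1}
trans_p = {"scan":    {"scan": 0.6, "exploit": 0.4},
           "exploit": {"scan": 0.1, "exploit": 0.9}}
emit_p = {"scan":    {"probe": 0.8, "shell": 0.2},
          "exploit": {"probe": 0.1, "shell": 0.9}}
seq = viterbi(["probe", "probe", "shell"], states, start_p, trans_p, emit_p)
print(seq)  # ['scan', 'scan', 'exploit']
```

In the paper's setting the same recursion, run on the trained model, yields the most probable stage sequence behind an alert stream, from which the next attack step is forecast.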

  5. DEFORMATION DEPENDENT TUL MULTI-STEP DIRECT MODEL

    SciTech Connect

    WIENKE,H.; CAPOTE, R.; HERMAN, M.; SIN, M.

    2007-04-22

    The Multi-Step Direct (MSD) module TRISTAN in the nuclear reaction code EMPIRE has been extended in order to account for nuclear deformation. The new formalism was tested in calculations of neutron emission spectra emitted from the 232Th(n,xn) reaction. These calculations include vibration-rotational Coupled Channels (CC) for the inelastic scattering to low-lying collective levels, "deformed" MSD with quadrupole deformation for inelastic scattering to the continuum, Multi-Step Compound (MSC) and Hauser-Feshbach with advanced treatment of the fission channel. Prompt fission neutrons were also calculated. The comparison with experimental data shows clear improvement over the "spherical" MSD calculations and the JEFF-3.1 and JENDL-3.3 evaluations.

  6. Hypersonic flow over a multi-step afterbody

    NASA Astrophysics Data System (ADS)

    Menezes, V.; Kumar, S.; Maruta, K.; Reddy, K. P. J.; Takayama, K.

    2005-12-01

    Effect of a multi-step base on the total drag of a missile-shaped body was studied in a shock tunnel at a hypersonic Mach number of 5.75. Total drag over the body was measured using a single-component accelerometer force balance. Experimental results indicated a reduction of 8% in total drag over the body with a multi-step base in comparison with the baseline (flat base) configuration. The flow fields around the above bodies were simulated using a 2-D axisymmetric Navier-Stokes solver, and the simulated total drag was compared with the measured results. The simulated flow field pictures give an insight into the flow physics involved.

  7. Illumination system design with multi-step optimization

    NASA Astrophysics Data System (ADS)

    Magarill, Simon; Cassarly, William J.

    2015-08-01

    Automatic optimization algorithms can be used when designing illumination systems. For systems with many design variables, optimization using an adjustable set of variables at different steps of the process can provide different local minima. We present a few examples of implementing a multi-step optimization method. We have found that this approach can sometimes lead to more efficient solutions. In this paper we illustrate the effectiveness of using a commercially available optimization algorithm with a slightly modified procedure.

  8. A one-pot multistep cyclization yielding thiadiazoloimidazole derivatives

    PubMed Central

    Rana, Anup; Bats, Jan W

    2014-01-01

    Summary A versatile synthetic procedure is described to prepare the benzimidazole-fused 1,2,4-thiadiazoles 2a–c via a methanesulfonyl chloride initiated multistep cyclization involving the intramolecular reaction of an in-situ generated carbodiimide with a thiourea unit. The structure of the intricate heterocycle 2a was confirmed by single-crystal X-ray analysis and its mechanism of formation supported by DFT computations. PMID:25670969

  9. On the Dynamics of Implicit Linear Multistep Methods

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Sweby, P. K.; Rai, Man Mohan (Technical Monitor)

    1995-01-01

Some new guidelines on the use of implicit linear multistep methods (LMMs) as time-dependent approaches for obtaining steady-state numerical solutions in computational fluid dynamics (CFD) are explored. The implicit LMMs commonly used in CFD belong to the class of superstable time discretizations. It can be shown that the nonlinear asymptotic behavior of these schemes, in terms of bifurcation diagrams and basins of attraction, can provide an improved range of initial data and time step beyond the linearized stability limit.
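As background on why implicit LMMs suit steady-state calculations (a generic illustration, not the authors' schemes): an A-stable method such as BDF2 can march a stiff model problem to its steady state with a time step far beyond the explicit stability limit.

```python
def bdf2_steady_state(lam=1000.0, y_inf=2.0, y0=0.0, h=0.1, steps=200):
    """March dy/dt = lam*(y_inf - y) to steady state with the implicit
    two-step BDF2 scheme.  Explicit Euler would require h < 2/lam = 0.002;
    here h = 0.1 works because BDF2 is A-stable."""
    y_prev = y0
    # Start-up: one backward-Euler step supplies the second initial value.
    y = (y_prev + h * lam * y_inf) / (1.0 + h * lam)
    for _ in range(steps):
        # BDF2: y_{n+1} = (4*y_n - y_{n-1})/3 + (2h/3)*f(y_{n+1});
        # for linear f the implicit equation solves in closed form.
        y_prev, y = y, ((4 * y - y_prev) / 3 + (2 * h / 3) * lam * y_inf) \
                       / (1.0 + (2 * h / 3) * lam)
    return y

steady = bdf2_steady_state()   # converges to y_inf = 2.0
```

For nonlinear f, the closed-form solve would be replaced by a Newton iteration per step, which is where the bifurcation behavior discussed in the abstract enters.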

  10. Event-triggered logical flow control for comprehensive process integration of multi-step assays on centrifugal microfluidic platforms.

    PubMed

    Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens

    2014-07-01

The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. To orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling in the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable-film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to the completion of a preceding liquid transfer event, i.e., completely independently of external stimulus or changes in the disc rotation speed. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps, including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from
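The fluidics are physical, but the logical layer described here (valves firing only when preceding transfer events have completed, including AND-conditionals) can be abstracted as event-dependency resolution. A minimal sketch with hypothetical event names, not the authors' disc layout:

```python
def cascade(dependencies, initial_events):
    """Fire every event whose prerequisites have ALL completed
    (event-triggered, AND-conditional flow control in the abstract)."""
    fired = set(initial_events)
    changed = True
    while changed:
        changed = False
        for event, prereqs in dependencies.items():
            if event not in fired and prereqs <= fired:
                fired.add(event)
                changed = True
    return fired

# Hypothetical network: a mixing valve opens only after BOTH reservoirs drain.
network = {
    "mix_valve_opens": {"reservoir_A_drained", "reservoir_B_drained"},
    "elution_starts":  {"mix_valve_opens"},
}
partial = cascade(network, {"reservoir_A_drained"})   # AND-condition unsatisfied
full = cascade(network, {"reservoir_A_drained", "reservoir_B_drained"})
```

The point of the paper is that the dissolvable films implement exactly this conditional logic on-disc, with no external actuation.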

  11. Strengthening and softening of nanocrystalline nickel during multistep nanoindentation

    SciTech Connect

    Nieh, Tai-Gang; Pan, D; Chen, Ming-Wei

    2006-04-01

    Multistep load-unload nanoindentation was employed to address the effect of deformation-induced microstructural evolution on mechanical behavior of nanocrystalline Ni. Deformation discontinuity was deliberately introduced by unloading-reloading during nanoindentation testing, which allows us to examine the influence of microstructural evolution on the successive deformation. Strain strengthening/softening of nanocrystalline nickel, associated with the transition of deformation behavior from dislocation activity at high loading rates to a grain-boundary-mediated process at low loading rates, was uncovered by means of this experimental methodology.

  12. Hard-X-Ray-Induced Multistep Ultrafast Dissociation

    NASA Astrophysics Data System (ADS)

    Travnikova, Oksana; Marchenko, Tatiana; Goldsztejn, Gildas; Jänkälä, Kari; Sisourat, Nicolas; Carniato, Stéphane; Guillemin, Renaud; Journel, Loïc; Céolin, Denis; Püttner, Ralph; Iwayama, Hiroshi; Shigemasa, Eiji; Piancastelli, Maria Novella; Simon, Marc

    2016-05-01

Creation of deep core holes with very short (τ ≤ 1 fs) lifetimes triggers a chain of relaxation events leading to extensive nuclear dynamics on a few-femtosecond time scale. Here we demonstrate a general multistep ultrafast dissociation using the example of HCl following Cl 1s → σ* excitation. Intermediate states with one or multiple holes in the shallower core electron shells are generated in the course of the decay cascades. The repulsive character and large gradients of the potential energy surfaces of these intermediates enable ultrafast fragmentation after the absorption of a hard x-ray photon.

  13. Hard-X-Ray-Induced Multistep Ultrafast Dissociation.

    PubMed

    Travnikova, Oksana; Marchenko, Tatiana; Goldsztejn, Gildas; Jänkälä, Kari; Sisourat, Nicolas; Carniato, Stéphane; Guillemin, Renaud; Journel, Loïc; Céolin, Denis; Püttner, Ralph; Iwayama, Hiroshi; Shigemasa, Eiji; Piancastelli, Maria Novella; Simon, Marc

    2016-05-27

Creation of deep core holes with very short (τ ≤ 1 fs) lifetimes triggers a chain of relaxation events leading to extensive nuclear dynamics on a few-femtosecond time scale. Here we demonstrate a general multistep ultrafast dissociation using the example of HCl following Cl 1s → σ* excitation. Intermediate states with one or multiple holes in the shallower core electron shells are generated in the course of the decay cascades. The repulsive character and large gradients of the potential energy surfaces of these intermediates enable ultrafast fragmentation after the absorption of a hard x-ray photon. PMID:27284654

  14. Multistep matrix integrators for real-time simulation

    NASA Technical Reports Server (NTRS)

    De Abreu-Garcia, J. A.; Hartley, T. T.

    1990-01-01

An explicit linear multistep matrix-integration technique is presented for vector systems of ODEs; it employs the stability-region placement approach to permit the time step to be chosen independently of the system eigenvalues. Closed-form solutions are given for the general p-step method and for the case where the system matrix has zero eigenvalues. It is shown that system mode shapes are preserved through the integration process, and that the technique remains applicable to systems with eigenvalues at the origin without the need to compute a matrix inverse.

  15. Performance of "look-ahead" linear multistep methods

    NASA Astrophysics Data System (ADS)

    Mitsui, Taketomo

    2016-06-01

We are concerned with the initial-value problem for ordinary differential equations (ODEs). LALMM, which stands for "look-ahead" linear multistep methods, is a new class among the discrete variable methods (DVMs) for this problem. Here, DVMs refers to methods that yield a sequence of approximations yn ≈ y(xn) on the set of points xn+1 = xn + hn (n = 0, 1, 2, …). Along with the look-for value yn+k and the back-values yn, yn+1, …, yn+k-1, we include the look-ahead value yn+k+1 in the stepping mechanism of the method on equidistant step points {xn}. By employing two different linear multistep schemes, we approximate the look-for value with a predictor-corrector iteration. The core issue in the numerical analysis of new methods is whether they can perform better than existing methods. We derived several LALMM schemes of the two-step (k = 2) family (LALTM) and examined their performance on test examples of ODEs. We report results for several numerical examples and describe a possible way to overcome the difficulties they reveal.
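The look-ahead schemes themselves are specific to this paper, but the predictor-corrector mechanism they build on is standard. As orientation, a sketch of a conventional two-step pair (Adams-Bashforth predictor, trapezoidal Adams-Moulton corrector), not the LALTM scheme itself:

```python
import math

def pc2_solve(f, x0, y0, h, n):
    """Integrate y' = f(x, y) over n steps with an AB2 predictor and one
    trapezoidal-rule (Adams-Moulton) corrector application per step."""
    ys = [y0]
    # Heun (RK2) start-up step supplies the second back-value.
    k1 = f(x0, y0)
    k2 = f(x0 + h, y0 + h * k1)
    ys.append(y0 + h / 2 * (k1 + k2))
    for i in range(1, n):
        x = x0 + i * h
        fn, fn_1 = f(x, ys[i]), f(x - h, ys[i - 1])
        y_pred = ys[i] + h / 2 * (3 * fn - fn_1)            # AB2 predictor
        ys.append(ys[i] + h / 2 * (fn + f(x + h, y_pred)))  # AM corrector
    return ys

# y' = -y, y(0) = 1: the numerical solution should track exp(-x).
approx = pc2_solve(lambda x, y: -y, 0.0, 1.0, h=0.01, n=100)[-1]
```

A LALMM additionally carries the look-ahead value yn+k+1 through this iteration, which is the novelty the abstract evaluates.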

  16. Multigeometry Nanoparticle Engineering via Kinetic Control through Multistep assembly

    NASA Astrophysics Data System (ADS)

    Chen, Yingchao; Wang, Xiaojun; Zhang, Ke; Zhang, Fuwu; Mays, Jimmy; Wooley, Karen; Pochan, Darrin

    2014-03-01

Organization of block copolymers into complicated multicompartment (MCM) and multigeometry (MGM) nanostructures is of increasing interest. Multistep co-assembly methods under kinetic control were used to produce complex nanoparticles that cannot be obtained via other assembly methods. Vesicle-cylinder, separate vesicle and cylinder, disk-cylinder, and mixed vesicle nanoparticles were constructed from binary blends of distinct diblock copolymers. First, the vesicle former polyacrylic acid-polyisoprene and the cylinder former polyacrylic acid-polystyrene, which share the same hydrophilic domain but have immiscible hydrophobic domains, were blended in THF. Second, diamine molecules were added to associate with the common hydrophilic PAA. Importantly, and last, by tuning the kinetic addition rate of the selective, miscible solvent (water), the unlike hydrophobic blocks are kinetically trapped into one particle and eventually nanophase-separate to form multiple compartments and multigeometries. These effective bottom-up multistep assembly strategies can be applied to other binary/ternary blends, in which new vesicle-sphere, disk-disk, and cylinder-cylinder MCM/MGM nanoparticles were programmed. We are grateful for financial support from the National Science Foundation, DMR-0906815 (D.J.P. and K.L.W.), and NIST METROLOGY POCHAN 2012.

  17. Adaptation to Vocal Expressions Reveals Multistep Perception of Auditory Emotion

    PubMed Central

    Maurage, Pierre; Rouger, Julien; Latinus, Marianne; Belin, Pascal

    2014-01-01

    The human voice carries speech as well as important nonlinguistic signals that influence our social interactions. Among these cues that impact our behavior and communication with other people is the perceived emotional state of the speaker. A theoretical framework for the neural processing stages of emotional prosody has suggested that auditory emotion is perceived in multiple steps (Schirmer and Kotz, 2006) involving low-level auditory analysis and integration of the acoustic information followed by higher-level cognition. Empirical evidence for this multistep processing chain, however, is still sparse. We examined this question using functional magnetic resonance imaging and a continuous carry-over design (Aguirre, 2007) to measure brain activity while volunteers listened to non-speech-affective vocalizations morphed on a continuum between anger and fear. Analyses dissociated neuronal adaptation effects induced by similarity in perceived emotional content between consecutive stimuli from those induced by their acoustic similarity. We found that bilateral voice-sensitive auditory regions as well as right amygdala coded the physical difference between consecutive stimuli. In contrast, activity in bilateral anterior insulae, medial superior frontal cortex, precuneus, and subcortical regions such as bilateral hippocampi depended predominantly on the perceptual difference between morphs. Our results suggest that the processing of vocal affect recognition is a multistep process involving largely distinct neural networks. Amygdala and auditory areas predominantly code emotion-related acoustic information while more anterior insular and prefrontal regions respond to the abstract, cognitive representation of vocal affect. PMID:24920615

  18. Multi-objective genetic algorithm for the automated planning of a wireless sensor network to monitor a critical facility

    NASA Astrophysics Data System (ADS)

    Jourdan, Damien B.; de Weck, Olivier L.

    2004-09-01

This paper examines the optimal placement of nodes for a Wireless Sensor Network (WSN) designed to monitor a critical facility in a hostile region. The sensors are dropped from an aircraft, and they must be connected (directly or via hops) to a High Energy Communication Node (HECN), which serves as a relay from the ground to a satellite or a high-altitude aircraft. The sensors are assumed to have fixed communication and sensing ranges. The facility is modeled as circular and served by two roads. This simple model is used to benchmark the performance of the optimizer (a Multi-Objective Genetic Algorithm, or MOGA) in creating WSN designs that provide clear assessments of movements in and out of the facility, while minimizing both the likelihood of the sensors being discovered and the number of sensors to be dropped. The algorithm is also tested on two other scenarios: in the first, the WSN must detect movements in and out of a circular area, and in the second it must uniformly cover a square region. The MOGA again performs well on these scenarios, demonstrating its flexibility and possible application to more complex mission scenarios with multiple and diverse targets of observation.
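The MOGA's encoding and genetic operators are not reproduced here, but at the core of any multi-objective optimizer is Pareto-dominance selection: keeping the designs that no other design beats in every objective. A minimal filter over hypothetical (sensor count, miss probability) evaluations, both objectives to be minimized:

```python
def dominates(a, b):
    """a dominates b when a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only the nondominated designs."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# Hypothetical evaluated designs: (sensors dropped, miss probability in %).
evaluated = [(1, 5), (2, 3), (3, 1), (3, 4), (4, 2)]
front = pareto_front(evaluated)
```

In a full MOGA this filter would rank each generation's population before selection and recombination.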

  19. Rapid Identification of Fungi by Using the ITS2 Genetic Region and an Automated Fluorescent Capillary Electrophoresis System

    PubMed Central

    Turenne, Christine Y.; Sanche, Steven E.; Hoban, Daryl J.; Karlowsky, James A.; Kabani, Amin M.

    1999-01-01

Invasive fungal disease often plays an important role in the morbidity and mortality of immunocompromised patients. The poor sensitivity of current fungal blood culture and histological practices has led to the development of highly sensitive and specific molecular techniques, such as the PCR. Sequence variability of the internal transcribed spacer 2 (ITS2) region of fungi is potentially useful in rapid and accurate diagnosis of clinical fungal isolates. PCR with fungus-specific primers targeted toward conserved sequences of the 5.8S and 28S ribosomal DNA (rDNA) results in amplification of the species-specific ITS2 regions, which vary in amplicon length. We have made use of the ABI PRISM 310 genetic analyzer and the ABI PRISM 310 GeneScan analysis software to determine the variable size differences of the ITS2 region of clinically important fungi, including Candida and non-Candida yeasts, Aspergillus species, and a variety of dermatophytes. No cross-reaction occurred when samples were tested against human and bacterial genomic DNA. We have found that most clinically significant fungal isolates can be differentiated by this method; it therefore promises to be a useful tool for the rapid (<7 h) diagnosis of fungemia and other invasive fungal infections. PMID:10325335

  20. Genetics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The genus Capsicum represents one of several well characterized Solanaceous genera. A wealth of classical and molecular genetics research is available for the genus. Information gleaned from its cultivated relatives, tomato and potato, provide further insight for basic and applied studies. Early ...

  1. Genetics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Maintaining genetic variation in wild populations of Arctic organisms is fundamental to the long-term persistence of high latitude biodiversity. Variability is important because it provides options for species to respond to changing environmental conditions and novel challenges such as emerging path...

  2. Controlled multistep synthesis in a three-phase droplet reactor

    PubMed Central

    Nightingale, Adrian M.; Phillips, Thomas W.; Bannock, James H.; de Mello, John C.

    2014-01-01

    Channel-fouling is a pervasive problem in continuous flow chemistry, causing poor product control and reactor failure. Droplet chemistry, in which the reaction mixture flows as discrete droplets inside an immiscible carrier liquid, prevents fouling by isolating the reaction from the channel walls. Unfortunately, the difficulty of controllably adding new reagents to an existing droplet stream has largely restricted droplet chemistry to simple reactions in which all reagents are supplied at the time of droplet formation. Here we describe an effective method for repeatedly adding controlled quantities of reagents to droplets. The reagents are injected into a multiphase fluid stream, comprising the carrier liquid, droplets of the reaction mixture and an inert gas that maintains a uniform droplet spacing and suppresses new droplet formation. The method, which is suited to many multistep reactions, is applied to a five-stage quantum dot synthesis wherein particle growth is sustained by repeatedly adding fresh feedstock. PMID:24797034

  3. Regioselective multistep reconstructions of half-saturated zigzag carbon nanotubes.

    PubMed

    Wang, Wei-Wei; Dang, Jing-Shuang; Zhao, Xiang; Nagase, Shigeru

    2016-06-01

The open-edge reconstruction of a half-saturated (6,0) zigzag carbon nanotube (CNT) was investigated by density functional calculations. The multistep rearrangement was shown to be a regioselective process that generates a defective edge with alternating pentagons and heptagons. Not only was the thermal stability found to be significantly enhanced after reconstruction, but the total spin of the CNT was also shown to decrease gradually from a high-spin septet to a closed-shell singlet, revealing the critical role of the deformed edge in the geometric and magnetic properties of open-ended CNTs. Kinetically, the initial transformation was confirmed to be the rate-determining step, with the largest reaction barrier; the following steps can take place spontaneously. © 2016 Wiley Periodicals, Inc. PMID:26940857

  4. The functional basis of c-myc and bcl-2 complementation during multistep lymphomagenesis in vivo.

    PubMed

    Marin, M C; Hsu, B; Stephens, L C; Brisbay, S; McDonnell, T J

    1995-04-01

Oncogenes are known to be deregulated by chromosomal translocations occurring at high frequency in specific malignancies. Among the best characterized of these are c-myc, associated with the t(8;14) in Burkitt's lymphomas, and bcl-2, associated with the t(14;18) in follicular lymphomas. In addition to their role in regulating rates of proliferation, it is known that oncogenes and tumor suppressor genes can also regulate rates of apoptotic cell death. The contribution of c-myc and bcl-2 to the regulation of cell death during lymphomagenesis in vivo is assessed using bcl-2-Ig and Eμ-myc transgenic mice and bcl-2/myc hybrid transgenic mice. Translocations between the endogenous c-myc gene and immunoglobulin loci, e.g., the t(12;15), are common in lymphomas arising in the bcl-2-Ig mice. Furthermore, bcl-2/c-myc double transgenic mice exhibit accelerated lymphomagenesis, indicating cooperation between these two oncogenes. Genetic complementation of c-myc and bcl-2 during lymphomagenesis resulted from the suppression of c-myc-associated apoptosis. Other genes are likely involved in regulating cell death during multistep lymphomagenesis. PMID:7698223

  5. Properties of true quaternary fission of nuclei with allowance for its multistep and sequential character

    SciTech Connect

    Kadmensky, S. G. Titova, L. V.; Bulychev, A. O.

    2015-07-15

An analysis of the basic mechanisms of binary and ternary fission of nuclei led to the conclusion that true ternary and quaternary fission has a sequential two-step (three-step) character: at the first step, the fissile nucleus emits a third light particle (third and fourth light particles) under shakeup effects associated with the nonadiabatic character of its collective deformation motion, whereupon the residual nucleus undergoes fission into two fragments. Owing to this, the formulas derived earlier for the widths of sequential two- and three-step decays of nuclei, in constructing the theory of two-step two-proton decays and multistep decays in chains of genetically related nuclei, could be used to describe the relative yields and the angular and energy distributions of third and fourth light particles emitted in (α, α), (t, t), and (α, t) pairs upon the true quaternary spontaneous fission of ²⁵²Cf and the thermal-neutron-induced fission of ²³⁵U and ²³³U target nuclei. Mechanisms are proposed that explain the sharp decrease in the yield of the second-emitted particles of the light-particle pairs originating from true quaternary fission, relative to the yields of analogous particles in true ternary fission.

  6. Comprehensive Control of Networked Control Systems with Multistep Delay

    PubMed Central

    Jiang, Jie

    2014-01-01

In networked control systems with multistep delay, long time delays cause vacant sampling and make controller design difficult. To solve these problems, comprehensive control methods are proposed in this paper. Time-delay compensation control and linear-quadratic-Gaussian (LQG) optimal control are adopted, and the system switches between the two controllers in two different states. The LQG optimal controller is used with probability 1 − α in the normal state, which is shown to render the system mean-square exponentially stable. The time-delay compensation controller is used with probability α in the abnormal state to compensate for vacant sampling and long time delays. In addition, a buffer window is established at the actuator to store recent control inputs, which are used to estimate the control state of the present sampling period in vacant-sampling cases. The comprehensive control methods simplify the control design, making it easier to implement in engineering practice, and improve system performance. Simulation results verify the validity of the proposed theory. PMID:25101322

  7. Global model including multistep ionizations in helium plasmas

    NASA Astrophysics Data System (ADS)

    Oh, Seungju; Lee, Hyo-Chang; Chung, Chin-Wook

    2015-09-01

Particle and power balance equations including stepwise ionization are derived and solved for a helium plasma. The balance equations consider two metastable states (the 2³S₁ triplet and the 2¹S₀ singlet), and the following results are obtained. The plasma density increases linearly with absorbed power while the electron temperature remains nearly constant. It is also found that the contribution of multistep ionization relative to single-step ionization ranges from 8% to 23% as the gas pressure increases from 10 mTorr to 100 mTorr, with little variation in the collisional energy loss per electron-ion pair created (Ec). These results indicate that stepwise ionization is a minor effect in helium plasma compared with argon plasma, because helium has very small collisional cross sections and a higher inelastic-collision threshold energy, resulting in little variation in the collisional energy loss per electron-ion pair created.

  8. Statistical properties of multistep enzyme-mediated reactions

    SciTech Connect

    Nemenman, Ilya; Sinitsyn, Nikolai A; De Ronde, Wiet H; Daniels, Bryan C; Mugler, Andrew

    2008-01-01

Enzyme-mediated reactions may proceed through multiple intermediate conformational states before creating a final product molecule, and one often wishes to identify such intermediate structures from observations of the product creation. In this paper, we address this problem by solving the chemical master equations for various enzymatic reactions. We devise a perturbation theory analogous to that used in quantum mechanics that allows us to determine the first (mean ⟨n⟩) and the second (variance σ²) cumulants of the distribution of created product molecules as a function of the substrate concentration and the kinetic rates of the intermediate processes. The mean product flux V = d⟨n⟩/dt (or "dose-response" curve) and the Fano factor F = σ²/⟨n⟩ are both realistically measurable quantities, and while the mean flux can often appear the same for different reaction types, the Fano factor can be quite different. This suggests both qualitative and quantitative ways to discriminate between different reaction schemes, and we explore this possibility in the context of four sample multistep enzymatic reactions. We argue that measuring both the mean flux and the Fano factor can not only discriminate between reaction types but also provide detailed information about the internal, unobserved kinetic rates, and this can be done without measuring single-molecule transition events.
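To make the flux-versus-Fano-factor distinction concrete (a stochastic-simulation sketch under assumed kinetics, not the paper's master-equation calculation): for a single enzyme whose catalytic cycle is a chain of k identical exponential steps, product interarrivals are Erlang-distributed, and at long times the Fano factor of the product count approaches 1/k even though the mean flux (rate/k) looks the same as a one-step reaction with a slower rate.

```python
import random

def product_count_stats(k=2, rate=1.0, T=2000.0, trials=200, seed=1):
    """Simulate product counts for an enzyme whose cycle is k sequential
    exponential(rate) steps; return (mean count, Fano factor) over trials."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        t, n = 0.0, 0
        while True:
            # One full catalytic cycle = k sequential exponential waits.
            cycle = sum(rng.expovariate(rate) for _ in range(k))
            if t + cycle > T:
                break
            t += cycle
            n += 1
        counts.append(n)
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
    return mean, var / mean   # Fano factor, expected near 1/k

mean_n, fano = product_count_stats()   # k=2: mean near T*rate/k, Fano near 0.5
```

A one-step cycle (k=1) with rate/2 would give the same mean count but a Fano factor near 1, which is the discriminating signal the abstract describes.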

  9. Solvent recyclability in a multistep direct liquefaction process

    SciTech Connect

    Hetland, M.D.; Rindt, J.R.

    1995-12-31

Direct liquefaction research at the Energy & Environmental Research Center (EERC) has, for a number of years, concentrated on developing a direct liquefaction process specifically for low-rank coals (LRCs) through the use of hydrogen-donating solvents and solvents similar to coal-derived liquids, the water/gas shift reaction, and lower-severity reaction conditions. The underlying assumption of all of the research was that advantage could be taken of the reactivity and specific qualities of LRCs to produce a tetrahydrofuran (THF)-soluble material that might be easier to upgrade than the soluble residuum produced during direct liquefaction of high-rank coals. A multistep approach was taken to produce the THF-soluble material, consisting of (1) preconversion treatment to prepare the coal for solubilization, (2) solubilization of the coal in the solvent, and (3) polishing to complete solubilization of the remaining material. The product of these three steps can then be upgraded during a traditional hydrotreatment step. The results of the EERC's research indicated that additional studies to develop this process more fully were justified. Two areas were targeted for further research: (1) determination of the recyclability of the solvent used during solubilization and (2) determination of the minimum severity required for hydrotreatment of the liquid product. The current project was funded to investigate these two areas.

  10. Optimal spread spectrum watermark embedding via a multistep feasibility formulation.

    PubMed

    Altun, H Oktay; Orsdemir, Adem; Sharma, Gaurav; Bocko, Mark F

    2009-02-01

    We consider optimal formulations of spread spectrum watermark embedding where the common requirements of watermarking, such as perceptual closeness of the watermarked image to the cover and detectability of the watermark in the presence of noise and compression, are posed as constraints while one metric pertaining to these requirements is optimized. We propose an algorithmic framework for solving these optimal embedding problems via a multistep feasibility approach that combines projections onto convex sets (POCS) based feasibility watermarking with a bisection parameter search for determining the optimum value of the objective function and the optimum watermarked image. The framework is general and can handle optimal watermark embedding problems with convex and quasi-convex formulations of watermark requirements with assured convergence to the global optimum. The proposed scheme is a natural extension of set-theoretic watermark design and provides a link between convex feasibility and optimization formulations for watermark embedding. We demonstrate a number of optimal watermark embeddings in the proposed framework corresponding to maximal robustness to additive noise, maximal robustness to compression, minimal frequency weighted perceptual distortion, and minimal watermark texture visibility. Experimental results demonstrate that the framework is effective in optimizing the desired characteristic while meeting the constraints. The results also highlight both anticipated and unanticipated competition between the common requirements for watermark embedding. PMID:19131302
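The POCS projections are specific to the watermarking constraints, but the outer loop described here (bisection on the objective value, with a feasibility solve at each trial value) is generic. A sketch with a stand-in feasibility oracle; in the paper's setting, `feasible()` would be the POCS-based feasibility computation:

```python
def max_feasible(feasible, lo, hi, tol=1e-6):
    """Bisection search for the largest t with feasible(t) True, assuming
    feasibility is monotone: feasible(lo) holds and feasible(hi) fails."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if feasible(mid):
            lo = mid      # mid is achievable; push the objective higher
        else:
            hi = mid      # mid is infeasible; back off
    return lo

# Stand-in oracle: "robustness level t is achievable iff t^2 <= 2".
best = max_feasible(lambda t: t * t <= 2.0, lo=0.0, hi=2.0)
```

Each bisection step costs one feasibility solve, so the overall optimization converges to the global optimum of a quasi-convex objective in logarithmically many POCS runs.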

  11. Linear multistep methods, particle filtering and sequential Monte Carlo

    NASA Astrophysics Data System (ADS)

    Arnold, Andrea; Calvetti, Daniela; Somersalo, Erkki

    2013-08-01

    Numerical integration is the main bottleneck in particle filter methodologies for dynamic inverse problems to estimate model parameters, initial values, and non-observable components of an ordinary differential equation (ODE) system from partial, noisy observations, because proposals may result in stiff systems which first slow down or paralyze the time integration process, then end up being discarded. The immediate advantage of formulating the problem in a sequential manner is that the integration is carried out on shorter intervals, thus reducing the risk of long integration processes followed by rejections. We propose to solve the ODE systems within a particle filter framework with higher order numerical integrators which can handle stiffness and to base the choice of the variance of the innovation on estimates of the discretization errors. The application of linear multistep methods to particle filters gives a handle on the stability and accuracy of the propagation, and linking the innovation variance to the accuracy estimate helps keep the variance of the estimate as low as possible. The effectiveness of the methodology is demonstrated with a simple ODE system similar to those arising in biochemical applications.

  12. Mesoscopic statistical properties of multistep enzyme-mediated reactions.

    PubMed

    de Ronde, W H; Daniels, B C; Mugler, A; Sinitsyn, N A; Nemenman, I

    2009-09-01

Enzyme-mediated reactions may proceed through multiple intermediate conformational states before creating a final product molecule, and one often wishes to identify such intermediate structures from observations of the product creation. In this study, the authors address this problem by solving the chemical master equations for various enzymatic reactions. A perturbation theory analogous to that used in quantum mechanics allows the determination of the first (⟨n⟩, the mean) and the second (σ², the variance) cumulants of the distribution of created product molecules as a function of the substrate concentration and the kinetic rates of the intermediate processes. The mean product flux V = d⟨n⟩/dt (or "dose-response" curve) and the Fano factor F = σ²/⟨n⟩ are both realistically measurable quantities, and whereas the mean flux can often appear the same for different reaction types, the Fano factor can be quite different. This suggests both qualitative and quantitative ways to discriminate between different reaction schemes, and the authors explore this possibility in the context of four sample multistep enzymatic reactions. Measuring both the mean flux and the Fano factor can not only discriminate between reaction types, but can also provide some detailed information about the internal, unobserved kinetic rates, and this can be done without measuring single-molecule transition events. PMID:21028932

  13. A Multistep Chaotic Model for Municipal Solid Waste Generation Prediction

    PubMed Central

    Song, Jingwei; He, Jiaying

    2014-01-01

In this study, a univariate local chaotic model is proposed to make one-step and multistep forecasts for daily municipal solid waste (MSW) generation in Seattle, Washington. For MSW generation prediction with long history data, this forecasting model was created based on a nonlinear dynamic method called phase-space reconstruction. Compared with other nonlinear predictive models, such as artificial neural network (ANN) and partial least square–support vector machine (PLS-SVM), and a commonly used linear seasonal autoregressive integrated moving average (sARIMA) model, this method has demonstrated better prediction accuracy from 1-step ahead prediction to 14-step ahead prediction assessed by both mean absolute percentage error (MAPE) and root mean square error (RMSE). Max error, MAPE, and RMSE show that chaotic models were more reliable than the other three models. As chaotic models do not involve random walk, their performance does not vary while ANN and PLS-SVM make different forecasts in each trial. Moreover, this chaotic model was less time consuming than ANN and PLS-SVM models. PMID:25125942
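A minimal sketch of the phase-space-reconstruction idea (delay embedding plus nearest-neighbor averaging); the embedding dimension, delay, and neighbor count below are illustrative choices, not the parameters used in the paper:

```python
import math

def embed(series, dim, tau):
    """Delay-embed a scalar series into dim-dimensional state vectors."""
    span = (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim))
            for i in range(len(series) - span)]

def predict_next(series, dim=3, tau=1, k=3):
    """Forecast the next value by averaging the successors of the k
    historical states nearest to the current state."""
    vectors = embed(series, dim, tau)
    current = vectors[-1]
    span = (dim - 1) * tau
    neighbors = []
    for i, v in enumerate(vectors[:-1]):
        succ = i + span + 1          # index of the value that followed state i
        if succ < len(series):
            dist = sum((a - b) ** 2 for a, b in zip(v, current))
            neighbors.append((dist, series[succ]))
    neighbors.sort()
    return sum(val for _, val in neighbors[:k]) / k

# Deterministic toy series: the forecast should track the true next value.
history = [math.sin(0.3 * i) for i in range(200)]
forecast = predict_next(history)
```

Multistep forecasts are produced by appending each prediction to the series and repeating, which is why errors grow with the prediction horizon.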

  14. Hierarchical multi-step organization during viral capsid assembly.

    PubMed

    Lampel, Ayala; Varenik, Maxim; Regev, Oren; Gazit, Ehud

    2015-12-01

    Formation of the HIV-1 core by the association of capsid proteins is a critical, not fully understood, step in the viral life cycle. Understanding the early stages of the mechanism may improve treatment opportunities. Here, spectroscopic analysis (opacity) is used to follow the kinetics of capsid protein assembly, which shows three stages: a lag phase, followed by a linear increase stage and terminated by a plateau. Adding pre-incubated capsid proteins at the start of the lag phase shortens it and increases the rate of assembly at the linear stage, demonstrating autoacceleration and cooperative assembly. Cryogenic transmission electron microscopy is used to probe structural evolution at these three stages. At the beginning of the lag phase, short tubular assemblies are found alongside micron long tubes. Their elongation continues all throughout the lag phase, at the end of which tubes start to assemble into bundles. Based on these results, we suggest a multi-step self-assembly process including fast nucleation and elongation followed by tubes packing into arrays. PMID:26497114

  15. A Multistep Chaotic Model for Municipal Solid Waste Generation Prediction.

    PubMed

    Song, Jingwei; He, Jiaying

    2014-08-01

    In this study, a univariate local chaotic model is proposed to make one-step and multistep forecasts for daily municipal solid waste (MSW) generation in Seattle, Washington. For MSW generation prediction with long history data, this forecasting model was created based on a nonlinear dynamic method called phase-space reconstruction. Compared with other nonlinear predictive models, such as artificial neural network (ANN) and partial least square-support vector machine (PLS-SVM), and a commonly used linear seasonal autoregressive integrated moving average (sARIMA) model, this method demonstrated better prediction accuracy from 1-step ahead prediction to 14-step ahead prediction, as assessed by both mean absolute percentage error (MAPE) and root mean square error (RMSE). Max error, MAPE, and RMSE showed that chaotic models were more reliable than the other three models. Because chaotic models involve no random components, their forecasts do not vary between trials, whereas ANN and PLS-SVM produce different forecasts in each trial. Moreover, the chaotic model was less time consuming than the ANN and PLS-SVM models. PMID:25125942

  16. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw-materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges that can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  17. Germline mutations of STR-alleles include multi-step mutations as defined by sequencing of repeat and flanking regions.

    PubMed

    Dauber, Eva-Maria; Kratzer, Adelgunde; Neuhuber, Franz; Parson, Walther; Klintschar, Michael; Bär, Walter; Mayr, Wolfgang R

    2012-05-01

    Well-defined estimates of mutation rates are a prerequisite for the use of short tandem repeat (STR) loci in relationship testing. We investigated 65 isolated genetic inconsistencies, which were observed within 50,796 allelic transfers at 23 STR loci (ACTBP2 (SE33), CD4, CSF1PO, F13A1, F13B, FES, FGA, vWA, TH01, TPOX, D2S1338, D3S1358, D5S818, D7S820, D8S1132, D8S1179, D12S391, D13S317, D16S539, D17S976, D18S51, D19S433, D21S11) in Caucasoid families residing in Austria and Switzerland. Sequencing data of repeat and flanking regions and the median of all theoretically possible mutational steps provided valuable information for characterising the mutational events with regard to parental origin, change of repeat number (mutational step size) and direction of mutation (losses and gains of repeats). Apart from predominant single-step mutations, including one case with a double genetic inconsistency, two double-step and two apparent four-step mutations could be identified. More losses than gains of repeats and more mutations originating from the paternal than the maternal lineage were observed (31 losses, 22 gains, 12 losses or gains; 47 paternal, 11 maternal mutations and 7 of unclear parental origin). The mutation rate in the paternal germline was 3.3 times higher than in the maternal germline. The results of our study show that, apart from the vast majority of single-step mutations, rare multi-step mutations can be observed. Therefore, the interpretation of mutational events should not rigidly be restricted to the shortest possible mutational step, because rare but true multi-step mutations can easily be overlooked if haplotype analysis is not possible. PMID:21873136
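    For context, the counts above imply an overall mutation rate on the order of 1.3 × 10⁻³ per allelic transfer; a quick calculation (normal approximation to a Poisson count, not taken from the paper) gives:

```python
# 65 isolated inconsistencies observed in 50,796 allelic transfers
mutations, transfers = 65, 50_796
rate = mutations / transfers

# Approximate 95% confidence interval (normal approximation to Poisson)
half_width = 1.96 * mutations**0.5 / transfers
print(f"rate = {rate:.2e} per transfer "
      f"(95% CI {rate - half_width:.2e} .. {rate + half_width:.2e})")
```
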

  18. Stability with large step sizes for multistep discretizations of stiff ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Majda, George

    1986-01-01

    One-leg and multistep discretizations of variable-coefficient linear systems of ODEs having both slow and fast time scales are investigated analytically. The stability properties of these discretizations are obtained independent of ODE stiffness and compared. The results of numerical computations are presented in tables, and it is shown that for large step sizes the stability of one-leg methods is better than that of the corresponding linear multistep methods.

  19. Stochastic modeling of biochemical systems with multistep reactions using state-dependent time delay.

    PubMed

    Wu, Qianqian; Tian, Tianhai

    2016-01-01

    To deal with the growing scale of molecular systems, sophisticated modelling techniques have been designed in recent years to reduce the complexity of mathematical models. Among them, a widely used approach is delayed reaction for simplifying multistep reactions. However, recent research results suggest that a delayed reaction with constant time delay is unable to describe multistep reactions accurately. To address this issue, we propose a novel approach using state-dependent time delay to approximate multistep reactions. We first use stochastic simulations to calculate exactly the time delay arising from multistep reactions. Then we design algorithms to calculate the time delay precisely from the system dynamics. To demonstrate the power of the proposed method, two processes of mRNA degradation are used to investigate the function of time delay in determining system dynamics. In addition, a multistep pathway of metabolic synthesis is used to explore the potential of the proposed method to simplify multistep reactions with nonlinear reaction rates. Simulation results suggest that the state-dependent time delay is a promising and accurate approach to reduce model complexity and decrease the number of unknown parameters in the models. PMID:27553753
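    A minimal sketch of why a constant delay is only an approximation for a multistep chain (illustrative rates, not the paper's models): the traversal time of a linear chain with first-order rates k_i is a sum of independent exponentials, whose mean is Σ 1/k_i but whose full distribution — and any state dependence through shared resources — a single fixed delay cannot capture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear chain S0 -> S1 -> ... -> Sm with first-order rates k_i.
# A delayed-reaction model replaces the chain by one event fired after
# a delay; the exact mean of that delay is sum(1/k_i).
k = np.array([2.0, 0.5, 1.0, 4.0])
exact_mean_delay = np.sum(1.0 / k)

# Stochastic check: each traversal time is a sum of independent
# exponential step times (one column per step, broadcast scales).
samples = rng.exponential(1.0 / k, size=(20_000, k.size)).sum(axis=1)
print(exact_mean_delay, samples.mean(), samples.std())
```

    The spread of `samples` around its mean is what a constant-delay model discards, and what a state-dependent delay is designed to recover.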

  20. A new theory for multistep discretizations of stiff ordinary differential equations: Stability with large step sizes

    NASA Technical Reports Server (NTRS)

    Majda, G.

    1985-01-01

    A large set of variable-coefficient linear systems of ordinary differential equations possessing two different time scales, a slow one and a fast one, is considered. A small parameter epsilon characterizes the stiffness of these systems. A system of o.d.e.s. in this set is approximated by a general class of multistep discretizations which includes both one-leg and linear multistep methods. Sufficient conditions are determined under which each solution of a multistep method is uniformly bounded, with a bound which is independent of the stiffness of the system of o.d.e.s., when the step size resolves the slow time scale, but not the fast one. This property is called stability with large step sizes. The theory presented lets one compare properties of one-leg methods and linear multistep methods when they approximate variable-coefficient systems of stiff o.d.e.s. In particular, it is shown that one-leg methods have better stability properties with large step sizes than their linear multistep counterparts. The theory also allows one to relate the concept of D-stability to the usual notions of stability and stability domains and to the propagation of errors for multistep methods which use large step sizes.
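    The constant-coefficient analogue of "stability with large step sizes" is easy to check numerically (a standard textbook illustration, not the variable-coefficient analysis of the paper): apply an implicit and an explicit two-step method to the stiff test equation y' = λy with hλ large and negative, and compare the magnitudes of the characteristic roots of the resulting recurrences.

```python
import numpy as np

def max_root(coeffs):
    """Largest characteristic-root magnitude of a linear recurrence."""
    return max(abs(r) for r in np.roots(coeffs))

h_lam = -50.0   # test equation y' = lam*y, step size far beyond |h*lam| <= O(1)

# BDF2 (implicit two-step): (3/2 - h*lam) y_{n+1} - 2 y_n + (1/2) y_{n-1} = 0
bdf2 = max_root([1.5 - h_lam, -2.0, 0.5])

# Adams-Bashforth 2 (explicit):
#   y_{n+1} - (1 + 3/2 h*lam) y_n + (1/2) h*lam y_{n-1} = 0
ab2 = max_root([1.0, -(1.0 + 1.5 * h_lam), 0.5 * h_lam])

print(f"BDF2 root magnitude: {bdf2:.3f}  (bounded solutions)")
print(f"AB2  root magnitude: {ab2:.1f}  (explosive growth)")
```

    The implicit method's roots stay inside the unit circle however large |hλ| becomes, while the explicit method has a root far outside it, so its solution blows up at a rate set by the unresolved fast scale.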

  1. Stochastic modeling of biochemical systems with multistep reactions using state-dependent time delay

    PubMed Central

    Wu, Qianqian; Tian, Tianhai

    2016-01-01

    To deal with the growing scale of molecular systems, sophisticated modelling techniques have been designed in recent years to reduce the complexity of mathematical models. Among them, a widely used approach is delayed reaction for simplifying multistep reactions. However, recent research results suggest that a delayed reaction with constant time delay is unable to describe multistep reactions accurately. To address this issue, we propose a novel approach using state-dependent time delay to approximate multistep reactions. We first use stochastic simulations to calculate exactly the time delay arising from multistep reactions. Then we design algorithms to calculate the time delay precisely from the system dynamics. To demonstrate the power of the proposed method, two processes of mRNA degradation are used to investigate the function of time delay in determining system dynamics. In addition, a multistep pathway of metabolic synthesis is used to explore the potential of the proposed method to simplify multistep reactions with nonlinear reaction rates. Simulation results suggest that the state-dependent time delay is a promising and accurate approach to reduce model complexity and decrease the number of unknown parameters in the models. PMID:27553753

  2. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  3. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  4. Comprehensive Glycomics of a Multistep Human Brain Tumor Model Reveals Specific Glycosylation Patterns Related to Malignancy

    PubMed Central

    Okada, Kazue; Kimura, Taichi; Piao, Jinhua; Tanaka, Shinya; Shinohara, Yasuro

    2015-01-01

    Cancer cells frequently express glycans at different levels and/or with fundamentally different structures from those expressed by normal cells, and therefore elucidation and manipulation of these glycosylations may provide a beneficial approach to cancer therapy. However, the relationship between altered glycosylation and causal genetic alteration(s) is only partially understood. Here, we employed a unique approach that applies comprehensive glycomic analysis to a previously described multistep tumorigenesis model. Normal human astrocytes were transformed via the serial introduction of hTERT, SV40ER, H-RasV12, and myrAKT, thereby mimicking human brain tumor grades I-IV. More than 160 glycans derived from three major classes of cell surface glycoconjugates (N- and O-glycans on glycoproteins, and glycosphingolipids) were quantitatively explored, and specific glycosylation patterns related to malignancy were systematically identified. The sequential introduction of hTERT, SV40ER, H-RasV12, and myrAKT led to (i) temporal expression of pauci-mannose/mono-antennary type N-glycans and GD3 (hTERT); (ii) switching from ganglio- to globo-series glycosphingolipids and the appearance of Neu5Gc (hTERT and SV40ER); (iii) temporal expression of bisecting GlcNAc residues, α2,6-sialylation, and stage-specific embryonic antigen-4, accompanied by suppression of core 2 O-glycan biosynthesis (hTERT, SV40ER and Ras); and (iv) increased expression of (neo)lacto-series glycosphingolipids and fucosylated N-glycans (hTERT, SV40ER, Ras and AKT). These sequential and transient glycomic alterations may be useful for tumor grade diagnosis and tumor prognosis, and also for the prediction of treatment response. PMID:26132161

  5. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers, and to identify the human acceptance difficulties which may accompany the transition to a significantly changing work environment. The improved productivity and communications which result from application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated, and these will be explored in detail.

  6. Automated Urinalysis

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Information from NASA Tech Briefs assisted DiaSys Corporation in the development of the R/S 2000 which automates urinalysis, eliminating most manual procedures. An automatic aspirator is inserted into a standard specimen tube, the "Sample" button is pressed, and within three seconds a consistent amount of urine sediment is transferred to a microscope. The instrument speeds up, standardizes, automates and makes urine analysis safer. Additional products based on the same technology are anticipated.

  7. Modeling the Auto-Ignition of Biodiesel Blends with a Multi-Step Model

    SciTech Connect

    Toulson, Dr. Elisa; Allen, Casey M; Miller, Dennis J; McFarlane, Joanna; Schock, Harold; Lee, Tonghun

    2011-01-01

    There is growing interest in using biodiesel in place of or in blends with petrodiesel in diesel engines; however, biodiesel oxidation chemistry is complicated to model directly, and existing surrogate kinetic models are very large, making them computationally expensive. The present study describes a method for predicting the ignition behavior of blends of n-heptane and methyl butanoate, which have been used in the past as a surrogate for biodiesel. The autoignition is predicted using a multistep (8-step) model in order to reduce computational time and make this a viable tool for implementation in engine simulation codes. A detailed reaction mechanism for n-heptane-methyl butanoate blends was used as a basis for validating the multistep model results. The ignition delay trends predicted by the multistep model for the n-heptane-methyl butanoate blends matched well with those of the detailed CHEMKIN model for the majority of conditions tested.

  8. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  9. On the origin of multi-step spin transition behaviour in 1D nanoparticles

    NASA Astrophysics Data System (ADS)

    Chiruta, Daniel; Jureschi, Catalin-Maricel; Linares, Jorge; Dahoo, Pierre Richard; Garcia, Yann; Rotaru, Aurelian

    2015-09-01

    To investigate the spin state switching mechanism in spin crossover (SCO) nanoparticles, special attention is given to three-step thermally induced SCO behavior in 1D chains. An additional term is included in the standard Ising-like Hamiltonian to account for the border interaction between SCO molecules and their local environment. It is shown that this additional interaction, together with the short-range interaction, drives the multi-step thermal hysteretic behavior in 1D SCO systems. The relation between a polymeric matrix and this particular multi-step SCO phenomenon is discussed accordingly. Finally, the influence of the environment as a function of the SCO system's size is analyzed as well.
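    For orientation, the non-interacting limit of such an Ising-like SCO model reduces each molecule to a two-level system with energy gap Δ and high-spin degeneracy g, giving a gradual crossover n_HS(T) = g·e^(−Δ/k_BT) / (1 + g·e^(−Δ/k_BT)); the steps and hysteresis discussed above require the short-range and border interaction terms. A sketch with illustrative parameters (not values from the paper):

```python
import numpy as np

kB = 8.617e-5          # Boltzmann constant, eV/K
Delta = 0.05           # illustrative HS-LS energy gap, eV
g = 150.0              # illustrative effective degeneracy ratio

def n_HS(T):
    """High-spin fraction of independent two-level SCO molecules."""
    w = g * np.exp(-Delta / (kB * T))
    return w / (1.0 + w)

# Crossover temperature where the HS and LS free energies balance
T_half = Delta / (kB * np.log(g))
print(f"T_1/2 = {T_half:.0f} K, n_HS(T_1/2) = {n_HS(T_half):.2f}")
```

    Adding the ferro-/antiferro-like intermolecular coupling and the border term sharpens this smooth curve into the stepped, hysteretic transitions the abstract describes.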

  10. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    SciTech Connect

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  11. Sunitinib Prolongs Survival in Genetically Engineered Mouse Models of Multistep Lung Carcinogenesis

    PubMed Central

    Gandhi, Leena; McNamara, Kate L.; Li, Danan; Borgman, Christa L.; McDermott, Ultan; Brandstetter, Kathleyn A.; Padera, Robert F.; Chirieac, Lucian R.; Settleman, Jeffrey E.; Wong, Kwok-Kin

    2009-01-01

    Non–small cell lung cancer (NSCLC) has a poor prognosis, with substantial mortality rates even among patients diagnosed with early-stage disease. There are few effective measures to block the development or progression of NSCLC. Antiangiogenic drugs represent a new class of agents targeting multiple aspects of tumor progression, including cell proliferation, invasion, migration, and outgrowth of metastatic deposits. We tested the multitargeted angiogenesis inhibitor sunitinib in a novel endogenous mouse model of NSCLC, which expresses a conditional activating mutation in Kras with or without conditional deletion of Lkb1; both alterations are frequent in human NSCLC. We showed that daily treatment with sunitinib reduced tumor size, caused tumor necrosis, blocked tumor progression, and prolonged median survival in both the metastatic (Lkb1/Kras) and nonmetastatic (Kras) mouse models; median survival was not reached in the nonmetastatic model after 1 year. However, the incidence of local and distant metastases was similar in sunitinib-treated and untreated Lkb1/Kras mice, suggesting that prolonged survival with sunitinib in these mice was due to direct effects on primary tumor growth rather than to inhibition of metastatic progression. These collective results suggest that the use of angiogenesis inhibitors in early-stage disease for prevention of tumor development and growth may have major survival benefits in the setting of NSCLC. PMID:19336729

  12. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  13. Automated dispenser

    SciTech Connect

    Hollen, R.M.; Stalnaker, N.D.

    1989-04-06

    An automated dispenser having a conventional pipette attached to an actuating cylinder through a flexible cable for delivering precise quantities of a liquid through commands from remotely located computer software. The travel of the flexible cable is controlled by adjustable stops and a locking shaft. The pipette can be positioned manually or by the hands of a robot. 1 fig.

  14. Second derivative multistep method for solving first-order ordinary differential equations

    NASA Astrophysics Data System (ADS)

    Turki, Mohammed Yousif; Ismail, Fudziah; Senu, Norazak; Ibrahim, Zarina Bibi

    2016-06-01

    In this paper, a new second-derivative multistep method was constructed to solve first-order ordinary differential equations (ODEs). In particular, we used the new method as a corrector and the 5-step Adams-Bashforth method as a predictor. Numerical results were compared with those of existing methods, which clearly showed the efficiency of the new method.
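    The predictor-corrector structure can be sketched as follows. The paper's own second-derivative corrector formula is not reproduced here; as a stand-in, this sketch pairs the 5-step Adams-Bashforth predictor with a classical two-derivative trapezoidal (Obreshkov) corrector, applied to the test problem y' = −y with exact startup values:

```python
import numpy as np

# Test problem y' = f(y) = -y, for which the second derivative is y'' = y
f  = lambda y: -y
d2 = lambda y: y

h, N = 0.05, 20                       # integrate from t = 0 to t = 1
y = np.empty(N + 1)
y[:5] = np.exp(-h * np.arange(5))     # startup from the exact solution

for n in range(4, N):
    fs = f(y[n-4:n+1])                # f_{n-4} ... f_n
    # 5-step Adams-Bashforth predictor
    yp = y[n] + h/720 * (1901*fs[4] - 2774*fs[3] + 2616*fs[2]
                         - 1274*fs[1] + 251*fs[0])
    # two-derivative trapezoidal (Obreshkov) corrector, evaluated once
    # at the predicted value (PECE mode)
    y[n+1] = (y[n] + h/2 * (f(y[n]) + f(yp))
              - h**2/12 * (d2(yp) - d2(y[n])))

err = abs(y[N] - np.exp(-1.0))
print(f"y(1) error: {err:.2e}")
```

    Incorporating the second derivative raises the corrector's order without widening its stencil, which is the general motivation for second-derivative multistep methods.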

  15. Multi-step routes of capuchin monkeys in a laser pointer traveling salesman task.

    PubMed

    Howard, Allison M; Fragaszy, Dorothy M

    2014-09-01

    Prior studies have claimed that nonhuman primates plan their routes multiple steps in advance. However, a recent reexamination of multi-step route planning in nonhuman primates indicated that there is no evidence for planning more than one step ahead. We tested multi-step route planning in capuchin monkeys using a pointing device to "travel" to distal targets while stationary. This device enabled us to determine whether capuchins distinguish the spatial relationship between goals and themselves and spatial relationships between goals and the laser dot, allocentrically. In Experiment 1, two subjects were presented with identical food items in Near-Far (one item nearer to subject) and Equidistant (both items equidistant from subject) conditions with a laser dot visible between the items. Subjects moved the laser dot to the items using a joystick. In the Near-Far condition, one subject demonstrated a bias for items closest to self but the other subject chose efficiently. In the second experiment, subjects retrieved three food items in similar Near-Far and Equidistant arrangements. Both subjects preferred food items nearest the laser dot and showed no evidence of multi-step route planning. We conclude that these capuchins do not make choices on the basis of multi-step look ahead strategies. PMID:24700520

  16. Attention and Multistep Problem Solving in 24-Month-Old Children

    ERIC Educational Resources Information Center

    Carrico, Renee L.

    2013-01-01

    The current study examined the role of increased attentional load in 24 month-old children's multistep problem-solving behavior. Children solved an object-based nonspatial working-memory search task, to which a motor component of varying difficulty was added. Significant disruptions in search performance were observed with the introduction of…

  17. Multistep Synthesis of a Terphenyl Derivative Showcasing the Diels-Alder Reaction

    ERIC Educational Resources Information Center

    Davie, Elizabeth A. Colby

    2015-01-01

    An adaptable multistep synthesis project designed for the culmination of a second-year organic chemistry laboratory course is described. The target compound is a terphenyl derivative that is an intermediate in the synthesis of compounds used in organic light-emitting devices. Students react a conjugated diene with dimethylacetylene dicarboxylate…

  18. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

    This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. Evolution of the mechanical properties as well as the microstructural features, such as twins and textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of austenite. For the SIM, an Olson-Cohen-type equation was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured and weakened with increasing deformation.
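    The Olson-Cohen law referred to above expresses the strain-induced martensite fraction as a sigmoidal function of plastic strain; a sketch with illustrative fitting parameters (α, β, and n below are placeholders, not the values identified in the paper):

```python
import numpy as np

def sim_fraction(eps, alpha=6.0, beta=4.0, n=4.5):
    """Olson-Cohen form for the strain-induced martensite fraction:
    f(eps) = 1 - exp(-beta * (1 - exp(-alpha * eps))**n),
    where alpha controls shear-band formation and beta the probability
    that band intersections nucleate martensite."""
    return 1.0 - np.exp(-beta * (1.0 - np.exp(-alpha * eps)) ** n)

strain = np.array([0.0, 0.1, 0.2, 0.4])
print(np.round(sim_fraction(strain), 3))
```

    Fitting α and β per deformation cycle is one way such a law can track how intermediate heat treatments reset the transformation kinetics.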

  19. Use of Chiral Oxazolidinones for a Multi-Step Synthetic Laboratory Module

    ERIC Educational Resources Information Center

    Betush, Matthew P.; Murphree, S. Shaun

    2009-01-01

    Chiral oxazolidinone chemistry is used as a framework for an advanced multi-step synthesis lab. The cost-effective and robust preparation of chiral starting materials is presented, as well as the use of chiral auxiliaries in a synthesis scheme that is appropriate for students currently in the second semester of the organic sequence. (Contains 1…

  20. A Multistep Synthesis Featuring Classic Carbonyl Chemistry for the Advanced Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Duff, David B.; Abbe, Tyler G.; Goess, Brian C.

    2012-01-01

    A multistep synthesis of 5-isopropyl-1,3-cyclohexanedione is carried out from three commodity chemicals. The sequence involves an aldol condensation, Dieckmann-type annulation, ester hydrolysis, and decarboxylation. No purification is required until after the final step, at which point gravity column chromatography provides the desired product in…

  1. A Multistep Synthesis Incorporating a Green Bromination of an Aromatic Ring

    ERIC Educational Resources Information Center

    Cardinal, Pascal; Greer, Brandon; Luong, Horace; Tyagunova, Yevgeniya

    2012-01-01

    Electrophilic aromatic substitution is a fundamental topic taught in the undergraduate organic chemistry curriculum. A multistep synthesis that includes a safer and greener method for the bromination of an aromatic ring than traditional bromination methods is described. This experiment is multifaceted and can be used to teach students about…

  2. Automated rapid iterative negative geotaxis assay and its use in a genetic screen for modifiers of Aβ(42)-induced locomotor decline in Drosophila.

    PubMed

    Liu, Haiyan; Han, Meng; Li, Qingyi; Zhang, Xiao; Wang, Wen-An; Huang, Fu-De

    2015-10-01

    The negative-geotaxis climbing assay is used to efficiently study aging and neurodegeneration in Drosophila. To make it suitable for large-scale study, a method called the rapid iterative negative geotaxis (RING) assay has been established by simultaneously photographing the climbing of multiple groups of flies when they are manually tapped down in test tubes. Here, we automated the assay by using a well-controlled electric motor to drive the tapping, and a homemade program to analyze the climbing height of flies. Using the automated RING (aRING) assay, we found that the climbing ability of a strain of wild-type flies, males in particular, declined rapidly before day 21 after eclosion, but slowly from day 21 to 35. We also found that the expression of arctic mutant Aβ42 accelerated the age-dependent decline in the climbing ability of flies. Moreover, using aRING, we examined the effect of third chromosome deficiencies on the accelerated locomotor decline in Aβ42-expressing flies, and isolated 7 suppressors and 15 enhancers. PMID:26077703

  3. Automated Sampling Procedures Supported by High Persistence of Bacterial Fecal Indicators and Bacteroidetes Genetic Microbial Source Tracking Markers in Municipal Wastewater during Short-Term Storage at 5°C.

    PubMed

    Mayer, R E; Vierheilig, J; Egle, L; Reischer, G H; Saracevic, E; Mach, R L; Kirschner, A K T; Zessner, M; Sommer, R; Farnleitner, A H

    2015-08-01

    Because of high diurnal water quality fluctuations in raw municipal wastewater, the use of proportional autosampling over a period of 24 h at municipal wastewater treatment plants (WWTPs) to evaluate carbon, nitrogen, and phosphorus removal has become a standard in many countries. Microbial removal or load estimation at municipal WWTPs, however, is still based on manually recovered grab samples. The goal of this study was to establish basic knowledge regarding the persistence of standard bacterial fecal indicators and Bacteroidetes genetic microbial source tracking markers in municipal wastewater in order to evaluate their suitability for automated sampling, as the potential lack of persistence is the main argument against such procedures. Raw and secondary treated wastewater of municipal origin from representative and well-characterized biological WWTPs without disinfection (organic carbon and nutrient removal) was investigated in microcosm experiments at 5 and 21°C with a total storage time of 32 h (including a 24-h autosampling component and an 8-h postsampling phase). Vegetative Escherichia coli and enterococci, as well as Clostridium perfringens spores, were selected as indicators for cultivation-based standard enumeration. Molecular analysis focused on total (AllBac) and human-associated genetic Bacteroidetes (BacHum-UCD, HF183 TaqMan) markers by using quantitative PCR, as well as 16S rRNA gene-based next-generation sequencing. The microbial parameters showed high persistence in both raw and treated wastewater at 5°C under the storage conditions used. Surprisingly, and in contrast to results obtained with treated wastewater, persistence of the microbial markers in raw wastewater was also high at 21°C. On the basis of our results, 24-h autosampling procedures with 5°C storage conditions can be recommended for the investigation of fecal indicators or Bacteroidetes genetic markers at municipal WWTPs. Such autosampling procedures will contribute to better

  4. Automated Sampling Procedures Supported by High Persistence of Bacterial Fecal Indicators and Bacteroidetes Genetic Microbial Source Tracking Markers in Municipal Wastewater during Short-Term Storage at 5°C

    PubMed Central

    Mayer, R. E.; Vierheilig, J.; Egle, L.; Reischer, G. H.; Saracevic, E.; Mach, R. L.; Kirschner, A. K. T.; Zessner, M.; Farnleitner, A. H.

    2015-01-01

    Because of high diurnal water quality fluctuations in raw municipal wastewater, the use of proportional autosampling over a period of 24 h at municipal wastewater treatment plants (WWTPs) to evaluate carbon, nitrogen, and phosphorus removal has become a standard in many countries. Microbial removal or load estimation at municipal WWTPs, however, is still based on manually recovered grab samples. The goal of this study was to establish basic knowledge regarding the persistence of standard bacterial fecal indicators and Bacteroidetes genetic microbial source tracking markers in municipal wastewater in order to evaluate their suitability for automated sampling, as the potential lack of persistence is the main argument against such procedures. Raw and secondary treated wastewater of municipal origin from representative and well-characterized biological WWTPs without disinfection (organic carbon and nutrient removal) was investigated in microcosm experiments at 5 and 21°C with a total storage time of 32 h (including a 24-h autosampling component and an 8-h postsampling phase). Vegetative Escherichia coli and enterococci, as well as Clostridium perfringens spores, were selected as indicators for cultivation-based standard enumeration. Molecular analysis focused on total (AllBac) and human-associated genetic Bacteroidetes (BacHum-UCD, HF183 TaqMan) markers by using quantitative PCR, as well as 16S rRNA gene-based next-generation sequencing. The microbial parameters showed high persistence in both raw and treated wastewater at 5°C under the storage conditions used. Surprisingly, and in contrast to results obtained with treated wastewater, persistence of the microbial markers in raw wastewater was also high at 21°C. On the basis of our results, 24-h autosampling procedures with 5°C storage conditions can be recommended for the investigation of fecal indicators or Bacteroidetes genetic markers at municipal WWTPs. Such autosampling procedures will contribute to better

  5. AXIOME: automated exploration of microbial diversity

    PubMed Central

    2013-01-01

    Background Although high-throughput sequencing of small subunit rRNA genes has revolutionized our understanding of microbial ecosystems, these technologies generate data at depths that benefit from automated analysis. Here we present AXIOME (Automation, eXtension, and Integration Of Microbial Ecology), a highly flexible and extensible management tool for popular microbial ecology analysis packages that promotes reproducibility and customization in microbial research. Findings AXIOME streamlines and manages analysis of small subunit (SSU) rRNA marker data in QIIME and mothur. AXIOME also implements features including the PAired-eND Assembler for Illumina sequences (PANDAseq), non-negative matrix factorization (NMF), multi-response permutation procedures (MRPP), exploring and recovering phylogenetic novelty (SSUnique) and indicator species analysis. AXIOME has a companion graphical user interface (GUI) and is designed to be easily extended to facilitate customized research workflows. Conclusions AXIOME is an actively developed, open source project written in Vala and available from GitHub (http://neufeld.github.com/axiome) and as a Debian package. Axiometic, a companion GUI tool, is also freely available (http://neufeld.github.com/axiometic). Given that data analysis has become an important bottleneck for microbial ecology studies, the development of user-friendly computational tools remains a high priority. AXIOME represents an important step in this direction by automating multi-step bioinformatic analyses and enabling the customization of procedures to suit the diverse research needs of the microbial ecology community. PMID:23587322

  6. Production of the Cannibalism Toxin SDP Is a Multistep Process That Requires SdpA and SdpB

    PubMed Central

    Pérez Morales, Tiara G.; Ho, Theresa D.; Liu, Wei-Ting; Dorrestein, Pieter C.

    2013-01-01

    During the early stages of sporulation, a subpopulation of Bacillus subtilis cells secrete toxins that kill their genetically identical siblings in a process termed cannibalism. One of these toxins is encoded by the sdpC gene of the sdpABC operon. The active form of the SDP toxin is a 42-amino-acid peptide with a disulfide bond which is processed from an internal fragment of pro-SdpC. The factors required for the processing of pro-SdpC into mature SDP are not known. We provide evidence that pro-SdpC is secreted via the general secretory pathway and that signal peptide cleavage is a required step in the production of SDP. We also demonstrate that SdpAB are essential to produce mature SDP, which has toxin activity. Our data indicate that SdpAB are not required for secretion, translation, or stability of SdpC. Thus, SdpAB may participate in a posttranslational step in the production of SDP. The mature form of the SDP toxin contains a disulfide bond. Our data indicate that while the disulfide bond does increase activity of SDP, it is not essential for SDP activity. We demonstrate that the disulfide bond is formed independently of SdpAB. Taken together, our data suggest that SDP production is a multistep process and that SdpAB are required for SDP production likely by controlling, directly or indirectly, cleavage of SDP from the pro-SdpC precursor. PMID:23687264

  7. Production of the cannibalism toxin SDP is a multistep process that requires SdpA and SdpB.

    PubMed

    Pérez Morales, Tiara G; Ho, Theresa D; Liu, Wei-Ting; Dorrestein, Pieter C; Ellermeier, Craig D

    2013-07-01

    During the early stages of sporulation, a subpopulation of Bacillus subtilis cells secrete toxins that kill their genetically identical siblings in a process termed cannibalism. One of these toxins is encoded by the sdpC gene of the sdpABC operon. The active form of the SDP toxin is a 42-amino-acid peptide with a disulfide bond which is processed from an internal fragment of pro-SdpC. The factors required for the processing of pro-SdpC into mature SDP are not known. We provide evidence that pro-SdpC is secreted via the general secretory pathway and that signal peptide cleavage is a required step in the production of SDP. We also demonstrate that SdpAB are essential to produce mature SDP, which has toxin activity. Our data indicate that SdpAB are not required for secretion, translation, or stability of SdpC. Thus, SdpAB may participate in a posttranslational step in the production of SDP. The mature form of the SDP toxin contains a disulfide bond. Our data indicate that while the disulfide bond does increase activity of SDP, it is not essential for SDP activity. We demonstrate that the disulfide bond is formed independently of SdpAB. Taken together, our data suggest that SDP production is a multistep process and that SdpAB are required for SDP production likely by controlling, directly or indirectly, cleavage of SDP from the pro-SdpC precursor. PMID:23687264

  8. Automated lithocell

    NASA Astrophysics Data System (ADS)

    Englisch, Andreas; Deuter, Armin

    1990-06-01

    Integration and automation have gained more and more ground in modern IC manufacturing. It is difficult to calculate directly the profit these investments yield. On the other hand, the demands placed on people, machines, and technology have increased enormously of late, and it is not difficult to see that only by means of integration and automation can these demands be met. Some salient points: (1) the complexity and costs of the equipment and processes have risen significantly; (2) owing to the reduction of all dimensions, the tolerances within which the various process steps have to be carried out have become ever smaller, and adherence to these tolerances ever more difficult; (3) cycle time has become increasingly important, both for the development and control of new processes and, to a great extent, for rapid and reliable supply to the customer. For products to remain competitive under these conditions, costs of every kind have to be reduced and yield maximized. Computer-aided control of the equipment and the process, combined with automatic data collection and real-time SPC (statistical process control), has therefore become absolutely necessary for successful IC manufacturing. Automation removes human error from the execution of the various process steps, and the work time set free in this way allows human creativity to be employed on a larger scale in stabilizing the processes. Moreover, computer-aided equipment control can ensure optimal utilization of the equipment around the clock.

  9. Multi-step wrought processing of TiAl-based alloys

    SciTech Connect

    Fuchs, G.E.

    1997-04-01

    Wrought processing will likely be needed for fabrication of a variety of TiAl-based alloy structural components. Laboratory and development work has usually relied on one-step forging to produce test material. Attempts to scale up TiAl-based alloy processing have indicated that multi-step wrought processing is necessary. The purpose of this study was to examine potential multi-step processing routes, such as two-step isothermal forging and extrusion + isothermal forging. The effects of processing (I/M versus P/M), intermediate recrystallization heat treatments, and processing route on the tensile and creep properties of Ti-48Al-2Nb-2Cr alloys were examined. The results of the testing were then compared to samples from the same heats of material processed by one-step routes. Finally, by evaluating the effect of processing on microstructure and properties, optimized and potentially lower-cost processing routes could be identified.

  10. Progress in applying the FKK multistep reaction theory to intermediate-energy data evaluation

    SciTech Connect

    Chadwick, M.B.; Young, P.G.

    1994-07-01

    Recent developments in the physics modeling of the FKK-GNASH code system are reviewed. We describe modifications to include a linking of multistep direct and multistep compound processes, which are important when the incident energy is less than about 30 MeV. A model for multiple preequilibrium emission is given and compared with experimental measurements of proton reactions on {sup 90}Zr at 160 MeV. We also give some preliminary observations concerning FKK calculations which use both normal and non-normal DWBA matrix elements. We describe the application of the FKK-GNASH code to a range of nuclear data applications, including intermediate-energy reactions of importance in the accelerator transmutation of waste, and fast neutron and proton cancer radiation treatment. We outline areas where further work is needed for the accurate modeling of nuclear reactions using the FKK theory.

  11. The discrepancies in multistep damage evolution of yttria-stabilized zirconia irradiated with different ions

    SciTech Connect

    Yang, Tengfei; Taylor, Caitlin A.; Kong, Shuyan; Wang, Chenxu; Zhang, Yanwen; Huang, Xuejun; Xue, Jianming; Yan, Sha; Wang, Yugang

    2013-01-01

    This paper reports a comprehensive investigation of structural damage in yttria-stabilized zirconia irradiated with different ions over a wide fluence range. A similar multistep damage accumulation exists for irradiation with different ions, but the critical dose for the occurrence of the second damage step, characterized by a faster increase in damage fraction, and the maximum elastic strain at the first damage step vary with ion mass. For irradiation with heavier ions, the second damage step occurs at a higher dose with a lower critical elastic strain. Furthermore, larger extended defects were observed at the second damage step under heavy-ion irradiation. Drawing on other experimental results and a multistep damage accumulation model, the distinct discrepancies in the damage buildup under irradiation with different ions are interpreted in terms of electronic excitation, the energy of the primary knock-on atom, and the chemical contribution of the deposited ions.

  12. Efficient modularity optimization by multistep greedy algorithm and vertex mover refinement.

    PubMed

    Schuetz, Philipp; Caflisch, Amedeo

    2008-04-01

    Identifying strongly connected substructures in large networks provides insight into their coarse-grained organization. Several approaches based on the optimization of a quality function, e.g., the modularity, have been proposed. We present here a multistep extension of the greedy algorithm (MSG) that allows the merging of more than one pair of communities at each iteration step. The essential idea is to prevent the premature condensation into few large communities. Upon convergence of the MSG a simple refinement procedure called "vertex mover" (VM) is used for reassigning vertices to neighboring communities to improve the final modularity value. With an appropriate choice of the step width, the combined MSG-VM algorithm is able to find solutions of higher modularity than those reported previously. The multistep extension does not alter the scaling of computational cost of the greedy algorithm. PMID:18517695
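
    The merging scheme described above can be sketched in a few lines. The following is a minimal, illustrative Python implementation of the multistep-greedy idea (merge every non-overlapping community pair whose modularity gain is within a step-width fraction of the best gain found in that pass, rather than only the single best pair); the function names and the `tol` step-width parameter are our own, not from the paper, and the "vertex mover" refinement stage is omitted.

```python
from itertools import combinations

def modularity(adj, comms):
    """Newman modularity Q of a partition; adj maps vertex -> set of neighbors."""
    m = sum(len(n) for n in adj.values()) / 2          # number of edges
    q = 0.0
    for c in comms:
        intra = sum(1 for u, v in combinations(c, 2) if v in adj[u])
        deg = sum(len(adj[u]) for u in c)
        q += intra / m - (deg / (2 * m)) ** 2
    return q

def msg_step(adj, comms, tol):
    """One multistep-greedy pass: merge all disjoint community pairs whose
    modularity gain is positive and at least tol * (best gain this pass)."""
    gains = []
    for i, j in combinations(range(len(comms)), 2):
        merged = comms[:i] + comms[i + 1:j] + comms[j + 1:] + [comms[i] | comms[j]]
        dq = modularity(adj, merged) - modularity(adj, comms)
        if dq > 0:
            gains.append((dq, i, j))
    if not gains:
        return comms
    gains.sort(reverse=True)
    best = gains[0][0]
    used, pairs = set(), []
    for dq, i, j in gains:                             # several merges per pass
        if i not in used and j not in used and dq >= tol * best:
            used |= {i, j}
            pairs.append((i, j))
    return ([c for k, c in enumerate(comms) if k not in used]
            + [comms[i] | comms[j] for i, j in pairs])

# Toy graph: two triangles joined by the bridge edge (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
adj = {v: set() for v in range(6)}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

comms = [{v} for v in range(6)]
while True:
    nxt = msg_step(adj, comms, tol=0.6)
    if len(nxt) == len(comms):
        break
    comms = nxt
print(sorted(sorted(c) for c in comms))
```

    On this toy graph the step-width threshold keeps the low-gain bridge merge from happening in the first pass, so the algorithm recovers the two triangles instead of condensing prematurely.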

  13. Region-based multi-step optic disk and cup segmentation from color fundus image

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Lock, Jane; Manresa, Javier Moreno; Vignarajan, Janardhan; Tay-Kearney, Mei-Ling; Kanagasingam, Yogesan

    2013-02-01

    Retinal optic cup-to-disk ratio (CDR) is one of the important indicators of glaucomatous neuropathy. In this paper, we propose a novel multi-step 4-quadrant thresholding method for optic disk segmentation and a multi-step temporal-nasal segmenting method for optic cup segmentation, based on blood-vessel-inpainted HSL lightness images and green-channel images. The performance of the proposed methods was evaluated on a group of color fundus images and compared with manual outlining results from two experts. Dice scores between the automatically detected and manually outlined disk and cup regions were computed and compared. Vertical CDRs were also compared among the three results. The preliminary experiment has demonstrated the robustness of the method for automatic optic disk and cup segmentation and its potential value for clinical application.

  14. Color-Tunable Resonant Photoluminescence and Cavity-Mediated Multistep Energy Transfer Cascade.

    PubMed

    Okada, Daichi; Nakamura, Takashi; Braam, Daniel; Dao, Thang Duy; Ishii, Satoshi; Nagao, Tadaaki; Lorke, Axel; Nabeshima, Tatsuya; Yamamoto, Yohei

    2016-07-26

    Color-tunable resonant photoluminescence (PL) was attained from polystyrene microspheres doped with a single polymorphic fluorescent dye, boron-dipyrrin (BODIPY) 1. The color of the resonant PL depends on the assembling morphology of 1 in the microspheres, which can be selectively controlled from green to red by the initial concentration of 1 in the preparation process of the microspheres. Studies on intersphere PL propagation with multicoupled microspheres, prepared by micromanipulation technique, revealed that multistep photon transfer takes place through the microspheres, accompanying energy transfer cascade with stepwise PL color change. The intersphere energy transfer cascade is direction selective, where energy donor-to-acceptor down conversion direction is only allowed. Such cavity-mediated long-distance and multistep energy transfer will be advantageous for polymer photonics device application. PMID:27348045

  15. A nonparametric method of multi-step ahead forecasting in diffusion processes

    NASA Astrophysics Data System (ADS)

    Yamamura, Mariko; Shoji, Isao

    2010-06-01

    This paper provides a nonparametric model for multi-step ahead forecasting in diffusion processes. The model is constructed from the local linear model with a Gaussian kernel. Simulation studies evaluate its multi-step ahead forecasting performance against the global linear model, showing that the nonparametric model forecasts better. The paper also conducts an empirical analysis of forecasting using intraday data on the Japanese stock price index and a time series of heart rates. The results show that forecasting performance does not differ much for the Japanese stock price index, but that the nonparametric model performs significantly better on the heart rate series.
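
    As a sketch of the approach, a local linear model with a Gaussian kernel is a weighted least-squares fit of a line around the forecast origin, whose intercept is the one-step prediction; iterating it gives a multi-step ahead forecast. The Python below is an illustrative reconstruction under our own assumptions (a toy mean-reverting series and a fixed bandwidth `h`), not the authors' code or data.

```python
import math
import random

def local_linear_predict(xs, ys, x0, h):
    """Kernel-weighted least-squares fit of y ~ a + b*(x - x0); returns the
    intercept a, i.e. the local-linear estimate of E[y | x = x0]."""
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]   # Gaussian kernel
    s0 = sum(w)
    s1 = sum(wi * (x - x0) for wi, x in zip(w, xs))
    s2 = sum(wi * (x - x0) ** 2 for wi, x in zip(w, xs))
    t0 = sum(wi * y for wi, y in zip(w, ys))
    t1 = sum(wi * (x - x0) * y for wi, x, y in zip(w, xs, ys))
    det = s0 * s2 - s1 * s1
    return (s2 * t0 - s1 * t1) / det

# Toy series from a mean-reverting map x_{t+1} = 0.5*x_t + noise; one-step
# pairs (x_t, x_{t+1}) train the local model, which is then iterated forward.
random.seed(1)
series = [2.0]
for _ in range(400):
    series.append(0.5 * series[-1] + random.gauss(0, 0.05))
xs, ys = series[:-1], series[1:]

x = series[-1]
for _ in range(5):              # 5-step ahead forecast, iterated
    x = local_linear_predict(xs, ys, x, h=0.2)
print(round(x, 3))
```

    A useful sanity check on the estimator is that a local linear fit reproduces an exactly linear relationship without bias, whatever the bandwidth.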

  16. Teaching multi-step math skills to adults with disabilities via video prompting.

    PubMed

    Kellems, Ryan O; Frandsen, Kaitlyn; Hansen, Blake; Gabrielsen, Terisa; Clarke, Brynn; Simons, Kalee; Clements, Kyle

    2016-11-01

    The purpose of this study was to evaluate the effectiveness of teaching multi-step math skills to nine adults with disabilities in an 18-21 post-high school transition program using a video prompting intervention package. The dependent variable was the percentage of steps completed correctly. The independent variable was the video prompting intervention, which covered several multi-step math calculation skills: (a) calculating a tip (15%), (b) calculating item unit prices, and (c) adjusting a recipe for more or fewer people. Results indicated a functional relationship between the video prompting intervention package and the percentage of steps completed correctly. Eight of the nine adults showed significant gains immediately after receiving the video prompting intervention. PMID:27589151

  17. Optimal generalized multistep integration formulae for real-time digital simulation

    NASA Technical Reports Server (NTRS)

    Moerder, D. D.; Halyo, N.

    1985-01-01

    The problem of discretizing a dynamical system for real-time digital simulation is considered. Treating the system and its simulation as stochastic processes leads to a statistical characterization of simulator fidelity. A plant discretization procedure based on an efficient matrix generalization of explicit linear multistep discrete integration formulae is introduced, which minimizes a weighted sum of the mean squared steady-state and transient error between the system and simulator outputs.
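
    For reference, a classical explicit linear multistep formula of the kind generalized in this work is the two-step Adams-Bashforth method. The sketch below integrates dx/dt = -x with it; the paper's optimized, matrix-valued coefficients are replaced here by the textbook scalar coefficients, so this illustrates only the formula class, not the fidelity-optimized discretization itself.

```python
import math

def adams_bashforth2(f, x0, h, steps):
    """Explicit two-step linear multistep method:
    x_{n+1} = x_n + h * (3/2 * f(x_n) - 1/2 * f(x_{n-1})),
    bootstrapped with a single Euler step."""
    xs = [x0, x0 + h * f(x0)]
    for _ in range(steps - 1):
        xs.append(xs[-1] + h * (1.5 * f(xs[-1]) - 0.5 * f(xs[-2])))
    return xs

# Integrate dx/dt = -x, x(0) = 1 over [0, 1]; exact solution is exp(-t).
h, T = 0.01, 1.0
xs = adams_bashforth2(lambda x: -x, 1.0, h, int(T / h))
err = abs(xs[-1] - math.exp(-T))
print(err)   # second-order method: the error shrinks roughly 4x when h is halved
```

    The same two-step template generalizes to the matrix-coefficient form of the paper by replacing the scalar weights 3/2 and -1/2 with matrices chosen to minimize the simulator's output error.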

  18. An Undergraduate Organic Chemistry Laboratory Experiment: The Multistep Synthesis of a Modified Nucleoside

    NASA Astrophysics Data System (ADS)

    Delannoy, Peter; Howell, Joseph

    1997-08-01

    We have designed and integrated the multistep synthesis of a modified nucleoside into an undergraduate organic chemistry laboratory. The laboratory was designed as a multidisciplinary approach towards a single synthetic problem. Here we report the synthesis and subsequent purification of 5'-O-dimethoxytrityl-2'-O-methyluridine and 5'-O-dimethoxytrityl-3'-O-methyluridine directly from the literature by a protocol that is appropriate in a small school setting.

  19. Contaminant source and release history identification in groundwater: a multi-step approach.

    PubMed

    Gzyl, G; Zanini, A; Frączek, R; Kura, K

    2014-02-01

    The paper presents a new multi-step approach aimed at source identification and release history estimation. The approach consists of three steps: performing integral pumping tests, identifying sources, and recovering the release history by means of a geostatistical approach. The paper shows the results obtained from applying the approach to a complex case study in Poland in which several areal sources were identified. The investigated site is situated in the vicinity of a former chemical plant in southern Poland, in the city of Jaworzno in the valley of the Wąwolnica River; the plant has been in operation since the First World War, producing various chemicals. From an environmental point of view, the most relevant activity was the production of pesticides, especially lindane. The application of the multi-step approach enabled a significant increase in knowledge of contamination at the site. Some suspected contamination sources have been shown to have a minor effect on the overall contamination; others have been shown to be of key significance. Some areas not previously taken into consideration have now been identified as key sources. The method also enabled estimation of the magnitude of the sources, and a list of priority reclamation actions will be drawn up as a result. The multi-step approach has proven effective and may be applied to other complicated contamination cases. Moreover, the paper shows the capability of the geostatistical approach to manage a complex real case study. PMID:24365394

  20. Contaminant source and release history identification in groundwater: A multi-step approach

    NASA Astrophysics Data System (ADS)

    Gzyl, G.; Zanini, A.; Frączek, R.; Kura, K.

    2014-02-01

    The paper presents a new multi-step approach aimed at source identification and release history estimation. The approach consists of three steps: performing integral pumping tests, identifying sources, and recovering the release history by means of a geostatistical approach. The paper shows the results obtained from applying the approach to a complex case study in Poland in which several areal sources were identified. The investigated site is situated in the vicinity of a former chemical plant in southern Poland, in the city of Jaworzno in the valley of the Wąwolnica River; the plant has been in operation since the First World War, producing various chemicals. From an environmental point of view, the most relevant activity was the production of pesticides, especially lindane. The application of the multi-step approach enabled a significant increase in knowledge of contamination at the site. Some suspected contamination sources have been shown to have a minor effect on the overall contamination; others have been shown to be of key significance. Some areas not previously taken into consideration have now been identified as key sources. The method also enabled estimation of the magnitude of the sources, and a list of priority reclamation actions will be drawn up as a result. The multi-step approach has proven effective and may be applied to other complicated contamination cases. Moreover, the paper shows the capability of the geostatistical approach to manage a complex real case study.

  1. Multistep-Ahead Air Passengers Traffic Prediction with Hybrid ARIMA-SVMs Models

    PubMed Central

    Ming, Wei; Xiong, Tao

    2014-01-01

    Hybrid ARIMA-SVMs prediction models have been established recently, taking advantage of the complementary strengths of ARIMA and SVMs in linear and nonlinear modeling, respectively. Building on such hybrid ARIMA-SVMs models, this study extends them to multistep-ahead prediction of air passenger traffic using the two most commonly used multistep-ahead prediction strategies: the iterated strategy and the direct strategy. Additionally, the effectiveness of data preprocessing approaches, such as deseasonalization and detrending, is investigated and verified for both strategies. Real data sets comprising four selected airlines' monthly series were collected to test the effectiveness of the proposed approach. Empirical results demonstrate that the direct strategy performs better for long-term prediction, while the iterated strategy performs better for short-term prediction. Furthermore, both deseasonalization and detrending can significantly improve the prediction accuracy for both strategies, indicating the necessity of data preprocessing. As such, this study serves as a full reference for planners in the air transportation industry on how to tackle multistep-ahead prediction tasks with either prediction strategy. PMID:24723814
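
    The two strategies compared above differ only in how models and horizons pair up, which a short sketch makes concrete. The Python below is a deliberately simplified illustration using a plain least-squares autoregression in place of the ARIMA-SVMs hybrid (the toy data and function names are ours, not the study's): the iterated strategy reuses one one-step model recursively, while the direct strategy fits a separate model per forecast horizon.

```python
import random

def fit_linear(x, y):
    """Ordinary least squares y ~ a + b*x (closed form, single regressor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def iterated_forecast(series, horizon):
    """Fit one one-step model and apply it recursively `horizon` times."""
    a, b = fit_linear(series[:-1], series[1:])
    x = series[-1]
    for _ in range(horizon):
        x = a + b * x
    return x

def direct_forecast(series, horizon):
    """Fit a separate model mapping y_t directly to y_{t+horizon}."""
    a, b = fit_linear(series[:-horizon], series[horizon:])
    return a + b * series[-1]

# Toy AR(1) series; both strategies should give similar multistep forecasts here.
random.seed(0)
series = [1.0]
for _ in range(500):
    series.append(0.8 * series[-1] + random.gauss(0, 0.1))

h = 6
print(iterated_forecast(series, h), direct_forecast(series, h))
```

    On an exactly autoregressive series the two strategies coincide; they diverge, as the study explores, when the one-step model is misspecified or noisy, which is why the better strategy depends on the horizon.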

  2. Evidence for the multistep nature of in vitro human epithelial cell carcinogenesis

    SciTech Connect

    Rhim, J.S.; Yoo, J.H.; Park, J.H.; Thraves, P.; Salehi, Z.; Dritschilo, A. )

    1990-09-01

    In keeping with the multistep development of human cancer in vivo, a stepwise approach to neoplastic transformation in vitro presents a reasonable strategy. We have recently developed an in vitro multistep model suitable for the study of human epithelial cell carcinogenesis. Upon infection with the adenovirus 12-simian virus 40 hybrid virus, primary human epidermal keratinocytes acquired an indefinite life span in culture but did not undergo malignant conversion. Subsequent addition of Kirsten murine sarcoma virus and human ras oncogene or chemical carcinogens (N-methyl-N{prime}-nitro-N-nitrosoguanidine or 4-nitroquinoline 1-oxide) to these cells induced morphological alterations and the acquisition of neoplastic properties. Subsequently it was found that this line could be transformed neoplastically by a variety of retroviruses containing H-ras, bas, fes, fms, erbB, and src oncogenes. In addition, we found that the immortalized human epidermal keratinocyte (RHEK-1) line can be transformed neoplastically by exposure to ionizing radiation. Thus, this in vitro system may be useful in studying the interaction of a variety of carcinogenic agents and human epithelial cells. These findings demonstrate the malignant transformation of human primary epithelial cells in culture by the combined action of viruses, oncogenes, chemical carcinogens, or X-ray irradiation and support a multistep process for neoplastic conversion.

  3. Multi-Step Ahead Predictions for Critical Levels in Physiological Time Series.

    PubMed

    ElMoaqet, Hisham; Tilbury, Dawn M; Ramachandran, Satya Krishna

    2016-07-01

    Standard modeling and evaluation methods have been classically used in analyzing engineering dynamical systems where the fundamental problem is to minimize the (mean) error between the real and predicted systems. Although these methods have been applied to multi-step ahead predictions of physiological signals, it is often more important to predict clinically relevant events than just to match these signals. Adverse clinical events, which occur after a physiological signal breaches a clinically defined critical threshold, are a popular class of such events. This paper presents a framework for multi-step ahead predictions of critical levels of abnormality in physiological signals. First, a performance metric is presented for evaluating multi-step ahead predictions. Then, this metric is used to identify personalized models optimized with respect to predictions of critical levels of abnormality. To address the paucity of adverse events, weighted support vector machines and cost-sensitive learning are used to optimize the proposed framework with respect to statistical metrics that can take into account the relative rarity of such events. PMID:27244754
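
    Cost-sensitive learning of the kind mentioned above can be illustrated with a class-weighted logistic regression standing in for the weighted support vector machines (the data, weights, and function names here are our own, purely illustrative): gradient updates for the rare "abnormal" class are multiplied by a cost factor so the classifier stops ignoring the minority events.

```python
import math
import random

def train_weighted_logreg(X, y, w_pos, lr=0.1, epochs=300):
    """Cost-sensitive logistic regression via SGD: the gradient of errors on
    the rare positive class (y = 1) is multiplied by w_pos."""
    d = len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))
            g = (p - yi) * (w_pos if yi == 1 else 1.0)
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def recall(w, b):
    """Fraction of true positives classified as positive."""
    hits = sum(1 for xi, yi in zip(X, y)
               if yi == 1 and sum(wj * xj for wj, xj in zip(w, xi)) + b > 0)
    return hits / sum(y)

# Imbalanced toy data: "abnormal" samples (y = 1) are 10x rarer.
random.seed(0)
X = ([[random.gauss(0, 1)] for _ in range(200)]
     + [[random.gauss(2, 1)] for _ in range(20)])
y = [0] * 200 + [1] * 20

plain = recall(*train_weighted_logreg(X, y, w_pos=1.0))
weighted = recall(*train_weighted_logreg(X, y, w_pos=10.0))
print(plain, weighted)
```

    Up-weighting the positive class shifts the decision boundary toward the majority class, so recall on the rare events should not decrease, at the cost of more false alarms; this is the same trade-off the statistical metrics in the framework above are meant to balance.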

  4. Induction of Pectinase Hyper Production by Multistep Mutagenesis Using a Fungal Isolate--Aspergillus flavipes.

    PubMed

    Akbar, Sabika; Prasuna, R Gyana; Khanam, Rasheeda

    2014-04-01

    Aspergillus flavipes, a slow-growing pectinase-producing ascomycete, was isolated from soil and identified and characterised in previous preliminary studies. Optimisation studies revealed that citrus peel--groundnut oil cake [CG] production medium is the best medium for producing high levels of pectinase, up to 39 U/ml, using the wild strain of A. flavipes. The aim of this project was to improve this isolated strain for enhanced pectinase production using a multistep mutagenesis procedure. To this end, the wild strain of A. flavipes was treated with both physical (UV irradiation) and chemical [colchicine, ethidium bromide, H2O2] mutagens to obtain first-generation mutants. The mutants obtained were assayed and differentiated based on pectinase productivity. The better pectinase-producing strains were further subjected to multistep mutagenesis to attain stability in the mutants. The goal of the project was achieved by obtaining the best pectinase-secreting mutant, UV80, producing 45 U/ml, compared with the wild strain and sister mutants. This was confirmed by quantitative analysis of third-generation mutants obtained after multistep mutagenesis. PMID:26563068

  5. Both Automation and Paper.

    ERIC Educational Resources Information Center

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  6. Epigenetic Genes and Emotional Reactivity to Daily Life Events: A Multi-Step Gene-Environment Interaction Study

    PubMed Central

    Pishva, Ehsan; Drukker, Marjan; Viechtbauer, Wolfgang; Decoster, Jeroen; Collip, Dina; van Winkel, Ruud; Wichers, Marieke; Jacobs, Nele; Thiery, Evert; Derom, Catherine; Geschwind, Nicole; van den Hove, Daniel; Lataster, Tineke; Myin-Germeys, Inez; van Os, Jim

    2014-01-01

    Recent human and animal studies suggest that epigenetic mechanisms mediate the impact of environment on development of mental disorders. Therefore, we hypothesized that polymorphisms in epigenetic-regulatory genes impact stress-induced emotional changes. A multi-step, multi-sample gene-environment interaction analysis was conducted to test whether 31 single nucleotide polymorphisms (SNPs) in epigenetic-regulatory genes, i.e. three DNA methyltransferase genes DNMT1, DNMT3A, DNMT3B, and methylenetetrahydrofolate reductase (MTHFR), moderate emotional responses to stressful and pleasant stimuli in daily life as measured by Experience Sampling Methodology (ESM). In the first step, main and interactive effects were tested in a sample of 112 healthy individuals. Significant associations in this discovery sample were then investigated in a population-based sample of 434 individuals for replication. SNPs showing significant effects in both the discovery and replication samples were subsequently tested in three other samples of: (i) 85 unaffected siblings of patients with psychosis, (ii) 110 patients with psychotic disorders, and (iii) 126 patients with a history of major depressive disorder. Multilevel linear regression analyses showed no significant association between SNPs and negative affect or positive affect. No SNPs moderated the effect of pleasant stimuli on positive affect. Three SNPs of DNMT3A (rs11683424, rs1465764, rs1465825) and 1 SNP of MTHFR (rs1801131) moderated the effect of stressful events on negative affect. Only rs11683424 of DNMT3A showed consistent directions of effect in the majority of the 5 samples. These data provide the first evidence that emotional responses to daily life stressors may be moderated by genetic variation in the genes involved in the epigenetic machinery. PMID:24967710

  7. Multistep cascade annihilations of dark matter and the Galactic Center excess

    DOE PAGESBeta

    Elor, Gilly; Rodd, Nicholas L.; Slatyer, Tracy R.

    2015-05-26

    If dark matter is embedded in a non-trivial dark sector, it may annihilate and decay to lighter dark-sector states which subsequently decay to the Standard Model. Such scenarios - with annihilation followed by cascading dark-sector decays - can explain the apparent excess GeV gamma-rays identified in the central Milky Way, while evading bounds from dark matter direct detection experiments. Each 'step' in the cascade will modify the observable signatures of dark matter annihilation and decay, shifting the resulting photons and other final state particles to lower energies and broadening their spectra. We explore, in a model-independent way, the effect of multi-step dark-sector cascades on the preferred regions of parameter space to explain the GeV excess. We find that the broadening effects of multi-step cascades can admit final states dominated by particles that would usually produce too sharply peaked photon spectra; in general, if the cascades are hierarchical (each particle decays to substantially lighter particles), the preferred mass range for the dark matter is in all cases 20-150 GeV. Decay chains that have nearly-degenerate steps, where the products are close to half the mass of the progenitor, can admit much higher DM masses. We map out the region of mass/cross-section parameter space where cascades (degenerate, hierarchical or a combination) can fit the signal, for a range of final states. In the current paper, we study multi-step cascades in the context of explaining the GeV excess, but many aspects of our results are general and can be extended to other applications.

  9. Laboratory investigation of borehole breakouts and Multi-step failure model

    NASA Astrophysics Data System (ADS)

    Ruan, Xiao-Ping; Mao, Ji-Zheng; Cui, Zhan-Tao

    1993-05-01

    Based on our borehole breakout experiments on a group of sandstone samples described in this paper, a multi-step failure model of borehole breakouts is proposed to quantitatively explain the relationship between the cross-sectional shape of borehole breakouts and the state of crustal stress. In this model, borehole spalling is related not only to the state of stress at a single point but also to the state of stress in its neighboring area. Comparison between the experimental results of borehole breakouts and the calculated results shows good agreement.

  10. Multi-step loading/unloading experiments that challenge constitutive models of glassy polymers

    NASA Astrophysics Data System (ADS)

    Caruthers, James; Medvedev, Grigori

    2014-03-01

    The mechanical response of glassy polymers depends on their thermal and deformation history, and the resulting relaxation phenomena remain a significant challenge for constitutive modeling. In strain-controlled experiments, the stress response is measured during loading/unloading ramps and constant-strain holds. By judiciously combining these basic steps, a set of multi-step experiments has been designed to challenge existing constitutive models for glassy polymers. A particular example is the ``stress memory'' experiment, i.e., loading through yield, unloading to zero stress, and holding at the final strain, where the subsequent evolution of the stress exhibits an overshoot. The observed dependence of the overshoot on the loading strain rate cannot be explained by models in which the relaxation time is a function of stress or strain. Another discriminating multi-step history experiment involves strain accumulation to test the common assumption that the phenomenon of strain hardening is caused by a purely elastic contribution to stress. Experimental results will be presented for a low-Tg epoxy system, and the data will be used to critically analyze the predictions of both traditional viscoelastic/viscoplastic constitutive models and a recently developed Stochastic Constitutive Model.

  11. Multistep-shaping control based on the static and dynamic behavior of nonlinear optical torsional micromirror

    NASA Astrophysics Data System (ADS)

    Bai, Cheng; Huang, Jin

    2014-05-01

    Electrostatically driven torsional micromirrors are suitable for optical microelectromechanical systems due to their good dynamic response, low adhesion, and simple structure for large-scale-integrated applications. For these devices, eliminating excessive residual vibration in order to achieve more accurate positioning and faster switching is an important research topic. Because of known nonlinearity issues, traditional shaping techniques based on linear theories are not suitable for nonlinear torsional micromirrors. In addition, due to the difficulty of calculating energy dissipation, existing nonlinear command-shaping techniques using the energy method have neglected the effect of damping. We analyze the static and dynamic behavior of electrostatically actuated torsional micromirrors. Based on the response of these devices, a multistep-shaping control that accounts for damping effects and the nonlinearity is proposed. Compared to conventional closed-loop control, the proposed multistep-shaping control is a feedforward approach that can yield sufficiently good performance without extra sensors and actuators. Simulation results show that, without changing the system structure, the preshaped input reduces the settling time from 4.3 to 0.97 ms and decreases the overshoot percentage of the mirror response from 33.2% to 0.2%.
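As background, the linear baseline that such nonlinear shapers extend is the classic two-impulse zero-vibration (ZV) shaper. The sketch below simulates a damped linear oscillator with illustrative parameters (not the paper's micromirror model) and shows the shaped step command suppressing residual vibration relative to a plain step:

```python
import numpy as np

def simulate(u, wn, zeta, t_end=0.01, dt=1e-6):
    """Semi-implicit Euler integration of
    x'' + 2*zeta*wn*x' + wn^2 * x = wn^2 * u(t)."""
    n = int(t_end / dt)
    x = v = 0.0
    xs = np.empty(n)
    for i in range(n):
        a = wn**2 * (u(i * dt) - x) - 2 * zeta * wn * v
        v += a * dt
        x += v * dt
        xs[i] = x
    return xs

wn, zeta = 2 * np.pi * 1000.0, 0.05            # illustrative 1 kHz mode
wd = wn * np.sqrt(1 - zeta**2)
K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta**2))
A1, A2, t2 = 1 / (1 + K), K / (1 + K), np.pi / wd   # ZV: two delayed steps

unshaped = simulate(lambda t: 1.0, wn, zeta)
shaped = simulate(lambda t: A1 + (A2 if t >= t2 else 0.0), wn, zeta)

tail = slice(len(unshaped) // 2, None)         # residual vibration in the tail
print("residual, unshaped: %.3f" % np.abs(unshaped[tail] - 1).max())
print("residual, shaped:   %.4f" % np.abs(shaped[tail] - 1).max())
```

The second impulse is timed half a damped period after the first, so the vibration it excites cancels the residue of the first; the paper's contribution is extending this idea to the damped, nonlinear electrostatic case.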

  12. Multistep Model of Cervical Cancer: Participation of miRNAs and Coding Genes

    PubMed Central

    López, Angelica Judith Granados; López, Jesús Adrián

    2014-01-01

    Aberrant miRNA expression is well recognized as an important step in the development of cancer. Close to 70 microRNAs (miRNAs) have been implicated in cervical cancer to date; nevertheless, it is unknown whether aberrant miRNA expression causes the onset of cervical cancer. One of the best ways to address this issue is through a multistep model of carcinogenesis. The progression of cervical cancer involves three well-established steps toward cancer, which we use in the model proposed here. The first step of the model comprises the gene changes that occur as normal cells are transformed into immortal cells (CIN 1), the second comprises the changes from immortal to tumorigenic cells (CIN 2), the third includes cell changes that increase tumorigenic capacity (CIN 3), and the final step covers the changes from tumorigenic to carcinogenic cells. Altered miRNAs and their target genes are located in each of the four steps of this multistep model of carcinogenesis. Because miRNA expression results have shown discrepancies across studies, the model includes only miRNAs with similar results in at least two studies. The present model is a useful framework for studying potential prognostic, diagnostic, and therapeutic miRNAs. PMID:25192291

  14. Forming Prediction of Magnesium Alloy Sheets using a Continuum Damage Mechanics Multistep Inverse Approach

    SciTech Connect

    Bapanapalli, Satish K.; Nguyen, Ba Nghiep

    2008-06-30

    This paper applies a multistep inverse approach, using a new method to generate the intermediate configurations, to analyze the press forming of magnesium alloys. The developed approach considers a final configuration to be formed from a flat blank sheet. It accounts for a series of intermediate configurations that are estimated from the initial and final configurations as well as the tooling conditions using optimization techniques. The approach is based on minimizing the surface area of the sheet metal subject to the constraints that the punch and die surfaces are not penetrated. Because of the limited formability of magnesium alloys, it is important to estimate the intermediate configurations realistically so that a damage mechanics approach can be used to predict the damage accumulation that can cause rupture of the sheet during forming. Elastic-plastic constitutive laws are used with the modified Hill’s criterion and the deformation theory of plasticity to describe the behavior of AZ31 magnesium alloys. Damage is captured by a damage variable that governs the equivalent stress. A coupled damage-plasticity approach is employed for the integration of the constitutive equations. The strain increment computed between two consecutive intermediate configurations is used to predict the resulting damage accumulation during forming. The continuum damage mechanics multistep inverse approach is applied to predict the forming of AZ31 magnesium alloys.
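The coupling of damage to the stress response can be sketched with a minimal Lemaitre-style update, in which a damage variable grows with accumulated equivalent plastic strain and degrades the flow stress. The hardening modulus and failure strain below are illustrative placeholders, not the paper's calibrated values for AZ31:

```python
def damage_update(strain_increments, sigma_y=160.0, H=500.0, eps_f=0.30):
    """Illustrative damage-plasticity update: linear hardening for the
    undamaged flow stress, damage D growing linearly with accumulated
    equivalent plastic strain, and the carried stress reduced by the
    (1 - D) degradation factor."""
    eps_p, history = 0.0, []
    for d_eps in strain_increments:
        eps_p += d_eps                           # accumulate plastic strain
        D = min(eps_p / eps_f, 1.0)              # damage accumulation
        sigma = (1.0 - D) * (sigma_y + H * eps_p)  # damaged flow stress (MPa)
        history.append((eps_p, D, sigma))
    return history

hist = damage_update([0.02] * 10)
for eps_p, D, sigma in hist[-3:]:
    print(f"eps_p={eps_p:.2f}  D={D:.2f}  sigma={sigma:.1f} MPa")
```

Feeding this update with the strain increments between consecutive intermediate configurations, as the abstract describes, gives a running damage estimate that flags likely rupture locations before D reaches 1.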

  15. Uniform metal patterning on micromachined 3D surfaces using multistep exposure of UV light

    NASA Astrophysics Data System (ADS)

    Suriadi, Arief; Berauer, Frank; Yasunaga, Akari; Pan, Alfred I.; Vander Plas, Hubert A.

    2002-07-01

    Focal-depth limitations prevent the use of normal lithography tools and processes on three-dimensional structures. A relatively little-known form of uniform metal trace patterning over extreme 3-D structured wafers by a multi-step exposure method, called stitching technology, has recently been developed by the Hewlett-Packard Company, with equipment support from the Ultratech Stepper Company; the results are reported in this paper. The basic idea is to slice the metal lines to be patterned into topographic layers that can each be exposed in one step. Patches of patterned metal lines can thus be stitched to one another (hence the term stitching). Exposure of one photoresist layer by stitching takes several individual exposures at different focus planes. A patent for this method has been applied for on behalf of the Hewlett-Packard Company. The results of the present investigation demonstrate the superior uniformity of metal trace patterns over 350-um-deep trenches produced by multi-step exposure, compared to the conventional single-step exposure method typically used on planar semiconductor wafers. The integrated method offers an enabling technology for patterning the extensive topography typically required for a multitude of MEMS structures and designs, novel interconnect structures, and advanced packaging applications. The method is simple, accurate, and relatively low-cost in comparison with other available 3-D exposure techniques capable of 3-D structure patterning.
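The core bookkeeping of the stitching approach, slicing the topography into one exposure per depth-of-focus slab, can be sketched as follows. The 50-um depth of focus is an assumed value for illustration, not a figure from the paper:

```python
import math

def exposure_plan(trench_depth_um, dof_um):
    """Slice a deep 3-D topography into focus planes for multi-step
    ("stitching") exposure: one exposure per depth-of-focus slab,
    focused at the centre of each slab."""
    n = math.ceil(trench_depth_um / dof_um)
    # focus offsets measured down from the wafer top surface
    return [(i + 0.5) * trench_depth_um / n for i in range(n)]

plan = exposure_plan(350.0, 50.0)
print(f"{len(plan)} exposures at focus depths (um):",
      [round(z, 1) for z in plan])
```

Each exposure then prints only the patch of the metal line lying within its slab, and adjacent patches are stitched where the slabs meet.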

  16. Forming Analysis of AZ31 Magnesium Alloy Sheets by Means of a Multistep Inverse Approach

    SciTech Connect

    Nguyen, Ba Nghiep; Bapanapalli, Satish K.

    2009-04-01

    This paper applies a multi-step inverse approach to predict the forming of AZ31 magnesium alloy sheets. An in-house finite element code named “INAPH”, which implements the inverse approach formulation by Guo et al. (Int. J. Numer. Methods Eng., 30, 1385-1401), has been used for the forming analysis. This inverse approach uses the deformation theory of plasticity and assumes that the deformation is independent of the loading history. Failure during forming is predicted by a stress-based criterion or a forming limit diagram-based criterion. The INAPH predictions have been compared with the experimental results of Takuda et al. (Journal of Materials Processing Technology, 89-90:135-140) and with incremental analysis using ABAQUS. The multi-step inverse analysis has been shown to predict stress, plastic strain, thickness distributions, and failure locations on deeply drawn AZ31 magnesium alloy parts quickly and fairly accurately. The capability of INAPH to predict the formability of magnesium alloys has also been demonstrated at various temperatures. As magnesium alloys possess very limited formability at room temperature, and their formability improves at higher temperatures (>100 °C), the inverse analysis constitutes an efficient and valuable tool to predict the forming of magnesium alloy parts as a function of temperature. In addition, other processing and design parameters, such as the initial dimensions, final desired shape, blank holder forces, and friction, can be quickly adjusted to assess forming feasibility.

  17. Pressed Paper-Based Dipstick for Detection of Foodborne Pathogens with Multistep Reactions.

    PubMed

    Park, Juhwan; Shin, Joong Ho; Park, Je-Kyun

    2016-04-01

    This paper presents a pressed paper-based dipstick that enables detection of foodborne pathogens with multistep reactions by exploiting delayed fluid flow and channel partition formation on a nitrocellulose (NC) membrane. Fluid behavior is easily modified by controlling the amount of pressure and the position of the pressed region on the NC membrane. The detection region of the dipstick is optimized by controlling flow rate and delay time based on Darcy's law. All the reagents required for the assay are dried on the NC membrane and are sequentially rehydrated at the prepartitioned regions when the device is dipped into the sample solution. In this manner, multistep reactions can be carried out by one-step dipping of the dipstick into the sample solution. As a proof of concept, we performed detection of two fatal foodborne pathogens (Escherichia coli O157:H7 and Salmonella typhimurium) with signal enhancement. In addition, we expanded the utilization of channel partitions by developing the pressed paper-based dipstick into a dual detection format. PMID:26977712
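The flow-delay design can be illustrated with the closely related Lucas-Washburn approximation for capillary wicking in a porous membrane. All membrane and fluid parameters below are assumed for illustration, not taken from the paper:

```python
import math  # noqa: F401  (kept for clarity; only arithmetic is used)

def washburn_time(distance_m, pore_radius_m=2e-6,
                  surface_tension=0.072, viscosity=1.0e-3, contact_cos=0.6):
    """Lucas-Washburn estimate of the time for fluid to wick a given
    distance L along a porous strip:
    L^2 = (r * gamma * cos(theta) / (2 * mu)) * t."""
    coeff = pore_radius_m * surface_tension * contact_cos / (2 * viscosity)
    return distance_m**2 / coeff

for mm in (5, 10, 20):
    t = washburn_time(mm * 1e-3)
    print(f"{mm:2d} mm reached in ~{t:5.1f} s")
```

Because wicking time grows quadratically with distance, moving a dried-reagent region farther along the strip is an effective way to delay its rehydration relative to upstream reactions, which is the timing principle the dipstick exploits.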

  18. Automated External Defibrillator

    MedlinePlus

    An automated external defibrillator (AED) is a portable device that ...

  19. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  20. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  1. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to modern process automation systems, some milestones are highlighted. Special attention is given to the influence of standards and guidelines on the development of automation systems. PMID:11092132

  2. Multistep modeling of protein structure: application towards refinement of tyr-tRNA synthetase

    NASA Technical Reports Server (NTRS)

    Srinivasan, S.; Shibata, M.; Roychoudhury, M.; Rein, R.

    1987-01-01

    The scope of multistep modeling (MSM) is expanded by adding a least-squares minimization step to the procedure to fit a backbone reconstruction consistent with a set of C-alpha coordinates. The analytical solution for the Phi and Psi angles that fit the C-alpha X-ray coordinates is used for tyr-tRNA synthetase. Phi and Psi angles for the regions where the above-mentioned method fails are obtained by minimizing the differences in C-alpha distances between the computed model and the crystal structure in a least-squares sense. We present a stepwise application of this part of MSM to the determination of the complete backbone geometry of the 321 N-terminal residues of tyrosine tRNA synthetase to a root-mean-square deviation of 0.47 angstroms from the crystallographic C-alpha coordinates.
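The quoted 0.47-angstrom agreement is a root-mean-square deviation after optimal superposition, conventionally computed with the Kabsch algorithm. A minimal sketch on mock coordinates (the paper's actual structures are not reproduced here):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal rigid-body
    superposition (Kabsch algorithm): find the proper rotation R
    minimizing ||P R - Q|| after centering both sets."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))     # guard against an improper rotation
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    diff = P @ R - Q
    return np.sqrt((diff**2).sum() / len(P))

# mock C-alpha trace vs a noisy copy of itself
rng = np.random.default_rng(0)
ca_model = rng.normal(size=(321, 3)) * 10.0
ca_xray = ca_model + rng.normal(scale=0.3, size=ca_model.shape)
print(f"RMSD: {kabsch_rmsd(ca_model, ca_xray):.2f} A")
```

By construction the result is invariant to any rigid rotation or translation of either coordinate set, which is what makes it a fair measure of backbone agreement.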

  3. Variation of nanopore diameter along porous anodic alumina channels by multi-step anodization.

    PubMed

    Lee, Kwang Hong; Lim, Xin Yuan; Wai, Kah Wing; Romanato, Filippo; Wong, Chee Cheong

    2011-02-01

    In order to form tapered nanocapillaries, we investigated a method to vary the nanopore diameter along porous anodic alumina (PAA) channels using multi-step anodization. By anodizing the aluminum in either a single acid (H3PO4) or multiple acids (H2SO4, oxalic acid, and H3PO4) with increasing or decreasing voltage, the diameter of the nanopore along the PAA channel can be varied systematically in accordance with the applied voltage. The pore size along the channel can be enlarged or shrunk within the range of 20 nm to 200 nm. Structural engineering of the template along the film-growth direction can be achieved by deliberately designing a suitable voltage and electrolyte schedule together with the anodization time. PMID:21456152
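The voltage-to-diameter mapping can be sketched using the commonly cited proportionality of roughly 1.29 nm per volt for porous anodic alumina; the exact factor depends on the electrolyte and conditions, so treat it as an assumption:

```python
def pore_profile(voltage_steps, nm_per_volt=1.29):
    """Estimated pore diameter for each anodization step, assuming the
    commonly cited ~1.29 nm/V proportionality for porous anodic alumina
    (the exact factor varies with the electrolyte)."""
    return [v * nm_per_volt for v in voltage_steps]

# tapered channel: step the anodization voltage down over time
voltages = [160, 120, 80, 40, 20]
for v, d in zip(voltages, pore_profile(voltages)):
    print(f"{v:3d} V -> ~{d:5.1f} nm pore diameter")
```

Stepping the voltage down during growth therefore yields a channel that narrows with depth, spanning roughly the 20-200 nm range reported in the abstract.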

  4. A Multi-Step Assessment Scheme for Seismic Network Site Selection in Densely Populated Areas

    NASA Astrophysics Data System (ADS)

    Plenkers, Katrin; Husen, Stephan; Kraft, Toni

    2015-10-01

    We developed a multi-step assessment scheme for improved site selection during seismic network installation in densely populated areas. Site selection is a complex process where different aspects (seismic background noise, geology, and financing) have to be taken into account. In order to improve this process, we developed a step-wise approach that allows quantifying the quality of a site by using, in addition to expert judgement and test measurements, two weighting functions as well as reference stations. Our approach ensures that the recording quality aimed for is reached and makes different sites quantitatively comparable to each other. Last but not least, it is an easy way to document the decision process, because all relevant parameters are listed, quantified, and weighted.
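A minimal sketch of such a weighted site-quality score follows, with hypothetical weights, criteria, and normalizations; the paper's actual weighting functions are not reproduced:

```python
def site_score(noise_db, geology_class, cost_keur,
               w_noise=0.5, w_geo=0.3, w_cost=0.2,
               ref_noise_db=-150.0):
    """Illustrative multi-criteria site quality score in [0, 1]:
    a weighted combination of noise level relative to a reference
    station, geology class, and installation cost."""
    # noise: 1.0 at (or below) the reference level, 0.0 when 30 dB worse
    s_noise = max(0.0, min(1.0, 1.0 - (noise_db - ref_noise_db) / 30.0))
    s_geo = {"hard rock": 1.0, "consolidated": 0.6, "sediment": 0.2}[geology_class]
    s_cost = max(0.0, min(1.0, 1.0 - cost_keur / 100.0))
    return w_noise * s_noise + w_geo * s_geo + w_cost * s_cost

candidates = {"quarry": (-145.0, "hard rock", 60.0),
              "city park": (-128.0, "sediment", 20.0)}
for name, args in candidates.items():
    print(f"{name:9s} score {site_score(*args):.2f}")
```

Normalizing every criterion to the same [0, 1] scale is what makes sites quantitatively comparable, and writing the weights down explicitly documents the decision process, as the abstract emphasizes.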

  5. The solution of Parrondo’s games with multi-step jumps

    NASA Astrophysics Data System (ADS)

    Saakian, David B.

    2016-04-01

    We consider the general case of Parrondo’s games, when there is a finite probability to stay in the current state as well as multi-step jumps. We introduce a modification of the model: the transition probabilities between different games depend on the choice of the game in the previous round. We calculate the rate of capital growth as well as the variance of the distribution, following large deviation theory. The modified model allows higher capital growth rates than in standard Parrondo games for the range of parameters considered in the key articles about these games, and positive capital growth is possible for a much wider regime of parameters of the model.
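The standard single-step form of Parrondo's paradox, two individually losing games that win when alternated, can be reproduced with a short simulation. The probabilities below are the classic modulo-3 values, not the multi-step jump variant analyzed in the paper:

```python
import random

def play(sequence, eps=0.005, rounds=100_000, seed=7):
    """Simulate Parrondo's games. Game A: a fair coin minus eps.
    Game B: a capital-dependent coin (the classic modulo-3 rule).
    sequence is a string of game labels cycled over, or "random"."""
    rng = random.Random(seed)
    capital = 0
    for i in range(rounds):
        game = rng.choice("AB") if sequence == "random" \
               else sequence[i % len(sequence)]
        if game == "A":
            p = 0.5 - eps
        else:  # game B
            p = (0.10 - eps) if capital % 3 == 0 else (0.75 - eps)
        capital += 1 if rng.random() < p else -1
    return capital

for s in ("A", "B", "random"):
    print(f"strategy {s:6s}: final capital {play(s):+d}")
```

Played alone, both A and B drift downward, but random mixing changes how often game B is played from its unfavorable capital state, producing positive expected growth; the paper generalizes this machinery to multi-step jumps and history-dependent game switching.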

  6. Multistep electrochemical deposition of hierarchical platinum alloy counter electrodes for dye-sensitized solar cells

    NASA Astrophysics Data System (ADS)

    Zhang, Junjun; Ma, Mingming; Tang, Qunwei; Yu, Liangmin

    2016-01-01

    The preferred platinum counter electrode (CE) has been a burden for the commercialization of dye-sensitized solar cells (DSSCs) due to its high cost and chemical corrosion by the liquid electrolyte. In the current study, we have successfully realized the multistep deposition of platinum alloy CEs, including PtNi, PtFe, and PtCo, for liquid-junction DSSC applications. The preliminary results demonstrate that the enhanced electrochemical activities are attributable to the high charge-transfer ability of the PtM (M = Ni, Fe, Co) alloy CEs and the match of their work functions to the redox potential of the I-/I3- electrolyte. The resultant DSSCs yield impressive power conversion efficiencies of 8.65%, 7.48%, and 7.08% with PtNi, PtFe, and PtCo CEs, respectively. Owing to the competitive reactions between the transition metals and the liquid electrolyte, the PtM alloy CEs display enhanced long-term stability.

  7. Multistep Optimization of Composite Drive Shaft Subject to Strength, Buckling, Vibration and Manufacturing Constraints

    NASA Astrophysics Data System (ADS)

    Cherniaev, Aleksandr; Komarov, Valeriy

    2014-09-01

    Composite drive shafts are extensively used in automotive and aeronautical applications due to their light weight combined with exceptional strength and stiffness. The complexity of the drive shaft design problem is associated with the need to determine rational values for multiple parameters characterizing the composite material (fiber orientation angles, stacking sequence, and ply thicknesses), as well as with the fact that multiple conflicting design constraints must be considered simultaneously. In this paper we approach this problem by considering carbon/epoxy drive shaft design as a multistep optimization process. It includes the following steps: 1) determination of fiber orientation angles and laminate stacking sequence based on analysis of loading conditions and analytical expressions predicting the buckling load and minimal natural frequency of an idealized drive shaft; 2) finding rational ply thicknesses using a formal optimization procedure utilizing response surface approximations and a gradient-based optimization algorithm; and 3) verification analysis of the optimized configuration using nonlinear buckling analysis to ensure satisfaction of the stability constraint.

  9. Structure of magnesium alloy MA14 after multistep isothermal forging and subsequent isothermal rolling

    NASA Astrophysics Data System (ADS)

    Nugmanov, D. R.; Sitdikov, O. Sh.; Markushev, M. V.

    2015-10-01

    Optical metallography and electron microscopy have been used to analyze the structural changes in magnesium alloy MA14 subjected to processing that combines multistep isothermal forging and isothermal rolling. It has been found that forging of a bulk workpiece leads to the formation of a structure, 85-90% of which consists of recrystallized grains with an average size of less than 5 µm. Subsequent rolling results in a completely recrystallized structure with a grain size of 1-2 µm. It is shown that the resultant structural states are characterized by a grain-size nonuniformity inherited from the initial hot-pressed semi-finished product. The nature and features of the recrystallization processes that take place in the alloy during processing are discussed.

  10. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.
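The flavor of such an evolutionary search can be sketched on a single qubit: evolve short gate sequences toward a target unitary using a trace-overlap fitness. The gate set, fitness function, and mutate-the-survivors scheme below are illustrative simplifications of genetic programming, not the paper's algorithm:

```python
import numpy as np

# single-qubit gate set (identity lets sequences vary in effective length)
GATES = {
    "I": np.eye(2, dtype=complex),
    "H": np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2),
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "T": np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]], dtype=complex),
}

def unitary(seq):
    u = np.eye(2, dtype=complex)
    for g in seq:
        u = GATES[g] @ u
    return u

def fitness(seq, target):
    # equals 1.0 iff the circuit matches the target up to a global phase
    return abs(np.trace(target.conj().T @ unitary(seq))) / 2

def evolve(target, length=4, pop=40, gens=200, seed=3):
    """Toy evolutionary search: keep the fitter half of the population,
    refill it with point-mutated copies of the survivors."""
    rng = np.random.default_rng(seed)
    names = list(GATES)
    popn = [list(rng.choice(names, size=length)) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda s: -fitness(s, target))
        if fitness(popn[0], target) > 0.999:
            break
        survivors = popn[: pop // 2]
        children = []
        for s in survivors:
            c = s.copy()
            c[rng.integers(length)] = names[rng.integers(len(names))]
            children.append(c)
        popn = survivors + children
    return popn[0], fitness(popn[0], target)

# target: the phase gate S = T*T, which is not itself in the gate set
S = np.array([[1, 0], [0, 1j]], dtype=complex)
best, f = evolve(S)
print("best circuit:", best, "fitness:", round(f, 4))
```

Even this toy version shows the paper's point: the search examines far fewer candidates than enumerating all gate sequences, because fitness gradients in the population guide it toward the target.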

  11. Multicomponent and multistep radioactive decay modeling module for groundwater flow and contaminant transport computer code

    NASA Astrophysics Data System (ADS)

    Kharkhordin, I. L.

    2013-12-01

    Correct calculation of multistep radioactive decay is important for forecasting radionuclide transport at contaminated sites and for designing radionuclide storage facilities, as well as for a number of applications of natural radioactive tracers to understanding groundwater flow in complex hydrogeological systems. Radioactive chains can involve a number of branches with certain decay probabilities and up to fourteen steps. A general description of radioactive decay in a complex system can be presented as a system of linear differential equations. Numerical solution of this system encounters difficulties connected with the wide range of variation of the radioactive decay constants. In the present work, a database with 1253 records of radioactive isotope decay parameters for 97 elements was created. An algorithm for constructing and evaluating analytical solutions was elaborated for an arbitrary radioactive isotope system, taking into account possible chain branching and connection. The algorithm is based on radionuclide decay graphs. The main steps of the algorithm are as follows: (a) searching for all possible isotopes in the database and creating the full isotope list; (b) identifying the main parent isotopes; (c) constructing all possible radioactive chains; (d) finding branchings and connections in the decay chains, marking links as primary (the left chain in the graph for the main parent isotope), secondary (after a connection), or recurring (before a branching); (e) constructing and calculating the coefficients for the analytical solutions. The developed computer code was tested on a few simple systems: Cs-135 (one-step decay), Sr-90 (Y-90) (two-step decay), and a U-238 + U-235 mixture (complex decay with branching). Calculation of radiogenic He-4 is also possible, which could be an important application for groundwater flow and transport model calibration using natural tracers. The computer code for multistep radioactive decay calculation was elaborated for incorporation into NIMFA, a parallel computer code.
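For a linear chain without branching, the analytical solution such an algorithm constructs reduces to the classical Bateman equations, sketched here for the Sr-90 chain (decay constants are approximate, for illustration):

```python
import numpy as np

def bateman(n0, lambdas, t):
    """Number of atoms of each member of a linear decay chain at time t,
    via the classical Bateman solution. lambdas[i] is the decay constant
    of species i (use 0.0 for a stable end member); all lambdas must be
    distinct. The chain starts with n0 atoms of species 0."""
    n = len(lambdas)
    out = np.zeros(n)
    for i in range(n):
        # N_i(t) = N_0(0) * prod_{j<i} lambda_j
        #          * sum_k exp(-lambda_k t) / prod_{j!=k} (lambda_j - lambda_k)
        coeff = np.prod(lambdas[:i]) if i > 0 else 1.0
        s = 0.0
        for k in range(i + 1):
            denom = np.prod([lambdas[j] - lambdas[k]
                             for j in range(i + 1) if j != k])
            s += np.exp(-lambdas[k] * t) / denom
        out[i] = n0 * coeff * s
    return out

# Sr-90 -> Y-90 -> Zr-90 (stable); decay constants in 1/year (approximate)
lam = [np.log(2) / 28.8, np.log(2) / 0.00732, 0.0]
for t in (1.0, 10.0, 50.0):
    n_sr, n_y, n_zr = bateman(1.0, lam, t)
    print(f"t={t:4.0f} y  Sr-90={n_sr:.3f}  Y-90={n_y:.2e}  Zr-90={n_zr:.3f}")
```

Because the last member is stable, the total number of atoms is conserved, which is a convenient correctness check; handling branching and reconnecting chains, as the abstract describes, requires generalizing these coefficients over the decay graph.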

  12. Star sub-pixel centroid calculation based on multi-step minimum energy difference method

    NASA Astrophysics Data System (ADS)

    Wang, Duo; Han, YanLi; Sun, Tengfei

    2013-09-01

    The star centroid plays a vital role in celestial navigation. Star images taken during daytime have a low SNR due to the strong sky background, and the star targets are nearly submerged in the background, which makes centroid localization difficult. Traditional methods, such as the moment method and the weighted centroid method, are simple but have large errors, especially at low SNR; the Gaussian method has high positioning accuracy but is computationally complex. Based on an analysis of the energy distribution in star images, a localization method for star target centroids based on multi-step minimum energy difference is proposed. The method uses linear superposition to narrow the centroid area and, within that narrowed area, interpolates a certain number of pixels for pixel segmentation. Then, using the symmetry of the stellar energy distribution, it tentatively locates the centroid: the current pixel is assumed to be the star centroid, and the difference between the sums of energy in symmetric directions (here, the transverse and longitudinal directions) over an equal step length (chosen according to conditions; this paper uses a step length of 9) is calculated; the centroid position in that direction is taken where the minimum difference appears, and likewise for the other directions. Validation comparisons on simulated star images against several traditional methods show that the positioning accuracy of the method reaches 0.001 pixel, with good performance for computing centroids under low-SNR conditions. The method was also applied to a star map acquired at a fixed observation site during daytime in the near-infrared band; comparison of the results with the known positions of the stars shows that the multi-step minimum energy difference method achieves a better…
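For comparison, the baseline weighted-centroid method the paper benchmarks against can be sketched on a synthetic low-SNR star image; the PSF, background, and noise parameters are invented for illustration:

```python
import numpy as np

def weighted_centroid(img, threshold=None):
    """Background-subtracted, intensity-weighted centroid (the baseline
    'weighted centroid calculation' the paper compares against)."""
    work = img - np.median(img)            # crude sky-background removal
    if threshold is None:
        threshold = 3 * work.std()
    work[work < threshold] = 0.0           # suppress sky noise pixels
    ys, xs = np.indices(work.shape)
    total = work.sum()
    return (xs * work).sum() / total, (ys * work).sum() / total

# synthetic star: Gaussian PSF at a sub-pixel position plus sky noise
rng = np.random.default_rng(5)
true_x, true_y, sigma = 15.37, 14.81, 1.8
ys, xs = np.indices((32, 32))
img = 200.0 * np.exp(-((xs - true_x)**2 + (ys - true_y)**2) / (2 * sigma**2))
img += rng.normal(50.0, 2.0, size=img.shape)   # sky background + noise

cx, cy = weighted_centroid(img)
print(f"recovered ({cx:.2f}, {cy:.2f}) vs true ({true_x}, {true_y})")
```

This recovers the centroid to a fraction of a pixel on a clean synthetic frame; the paper's point is that at much lower SNR its symmetry-based multi-step search degrades far more gracefully than this simple weighting.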

  13. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    PubMed Central

    2014-01-01

    Background Cross-cultural adaptation is a necessary process for effectively using existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including their translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice for achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, the Breastfeeding Self-Efficacy Scale-Short Form, and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn’s recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin’s back-translation approach was utilized, followed by a committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to ensure effective cross-cultural adaptation increases our confidence that the conclusions we draw from our self-report instrument(s) will be stronger. In this way, our aim of achieving strong cross-cultural adaptation of our consolidated instruments was met while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population…
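Lynn's item-level content validity index mentioned above is straightforward to compute: the proportion of experts rating an item 3 or 4 on a 4-point relevance scale. A sketch with a hypothetical expert panel follows; the retention cut-off shown is one common choice, not necessarily the study's:

```python
def item_cvi(ratings):
    """Item-level content validity index (Lynn): the proportion of
    expert ratings of 3 or 4 on a 4-point relevance scale."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

# hypothetical panel of 6 experts rating three instrument items
panel = {"item_1": [4, 4, 3, 4, 3, 4],
         "item_2": [4, 3, 3, 4, 4, 3],
         "item_3": [2, 3, 4, 2, 3, 2]}
for item, ratings in panel.items():
    cvi = item_cvi(ratings)
    flag = "retain" if cvi >= 0.83 else "revise"   # 5-of-6 agreement cut-off
    print(f"{item}: I-CVI={cvi:.2f} -> {flag}")
```

Items falling below the cut-off are flagged for revision or deletion before translation, which is how the index feeds into the multi-step adaptation workflow the abstract describes.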

  14. Synthesis of 10-Ethyl Flavin: A Multistep Synthesis Organic Chemistry Laboratory Experiment for Upper-Division Undergraduate Students

    ERIC Educational Resources Information Center

    Sichula, Vincent A.

    2015-01-01

    A multistep synthesis of 10-ethyl flavin was developed as an organic chemistry laboratory experiment for upper-division undergraduate students. Students synthesize 10-ethyl flavin as a bright yellow solid via a five-step sequence. The experiment introduces students to various hands-on experimental organic synthetic techniques, such as column…

  15. Synthesis of Frontalin, the Aggregation Pheromone of the Southern Pine Beetle: A Multistep Organic Synthesis for Undergraduate Students.

    ERIC Educational Resources Information Center

    Bartlett, Paul A.; And Others

    1984-01-01

    Background information and experimental procedures are provided for the multistep synthesis of frontalin. The experiment exposes students to a range of practical laboratory problems and important synthetic reactions and provides experiences in working on a medium-size, as well as a relatively small-size scale. (JN)

  16. Synthesis of Two Local Anesthetics from Toluene: An Organic Multistep Synthesis in a Project-Oriented Laboratory Course

    ERIC Educational Resources Information Center

    Demare, Patricia; Regla, Ignacio

    2012-01-01

    This article describes one of the projects in the advanced undergraduate organic chemistry laboratory course concerning the synthesis of two local anesthetic drugs, prilocaine and benzocaine, with a common three-step sequence starting from toluene. Students undertake, in a several-week independent project, the multistep synthesis of a…

  17. A Multistep Organocatalysis Experiment for the Undergraduate Organic Laboratory: An Enantioselective Aldol Reaction Catalyzed by Methyl Prolinamide

    ERIC Educational Resources Information Center

    Wade, Edmir O.; Walsh, Kenneth E.

    2011-01-01

    In recent years, there has been an explosion of research concerning the area of organocatalysis. A multistep capstone laboratory project that combines traditional reactions frequently found in organic laboratory curriculums with this new field of research is described. In this experiment, the students synthesize a prolinamide-based organocatalyst…

  18. Recent advances in molecular genetics of melanoma progression: implications for diagnosis and treatment.

    PubMed

    Yeh, Iwei

    2016-01-01

    According to the multi-step carcinogenesis model of cancer, initiation results in a benign tumor and subsequent genetic alterations lead to tumor progression and the acquisition of the hallmarks of cancer. This article will review recent discoveries in our understanding of initiation and progression in melanocytic neoplasia and the impact on diagnostic dermatopathology. PMID:27408703

  19. Recent advances in molecular genetics of melanoma progression: implications for diagnosis and treatment

    PubMed Central

    Yeh, Iwei

    2016-01-01

    According to the multi-step carcinogenesis model of cancer, initiation results in a benign tumor and subsequent genetic alterations lead to tumor progression and the acquisition of the hallmarks of cancer. This article will review recent discoveries in our understanding of initiation and progression in melanocytic neoplasia and the impact on diagnostic dermatopathology. PMID:27408703

  20. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate chemistries different from those of existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can automate cost-effectively as the laboratory grows.
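
The central-controller design described above is essentially a discrete-event system: plates flow through a fixed sequence of stations, and each station processes one plate at a time. A toy sketch of that scheduling idea follows; the station names, durations, and single-plate-capacity assumption are all invented for illustration, not ORNL's configuration:

```python
import heapq

# Hypothetical station sequence with made-up processing times (minutes).
STATIONS = [("purify", 30.0), ("cycle_seq", 45.0),
            ("centrifuge", 5.0), ("sequence", 120.0)]

def simulate(n_plates):
    """Discrete-event loop: advance each plate through STATIONS in order,
    waiting whenever the next station is still busy with another plate."""
    events = [(0.0, plate, 0) for plate in range(n_plates)]  # (time, plate, stage)
    heapq.heapify(events)
    free_at = [0.0] * len(STATIONS)   # earliest time each station is free
    done = {}
    while events:
        t, plate, stage = heapq.heappop(events)
        if stage == len(STATIONS):    # plate has cleared every station
            done[plate] = t
            continue
        start = max(t, free_at[stage])        # wait if the station is busy
        finish = start + STATIONS[stage][1]
        free_at[stage] = finish
        heapq.heappush(events, (finish, plate, stage + 1))
    return done

print(simulate(3))   # completion time per plate
```

Even this crude model shows the sequencer becoming the bottleneck, the kind of throughput question the central controller has to manage.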

  1. Multistep nature of X-ray-induced neoplastic transformation in golden hamster embryo cells: expression of transformed phenotypes and stepwise changes in karyotypes

    SciTech Connect

    Suzuki, K.; Suzuki, F.; Watanabe, M.; Nikaido, O.

    1989-04-15

    We have examined the expression of transformed phenotypes and genetic changes associated with the expression of each transformed phenotype after X-ray irradiation. Unirradiated cells grown at a constant growth rate until 8 passages (population doubling number, 15) exhibited little morphological change and ceased to divide thereafter. X-irradiated cells escaped from senescence and showed morphological alteration and anchorage independence after a population doubling number of 20. The acquisition of tumorigenicity in nude mice was observed much later (35 population doublings after irradiation). From cytogenetic analysis, all anchorage-independent clones were consistently found to have trisomy of chromosome 7. Furthermore, cells derived from tumors contained three copies of chromosome 9q in addition to the trisomy of chromosome 7. We have not detected any augmented expression of v-Ha-ras- and v-myc-related oncogenes with RNA dot-blot analysis and could not find activation of any type of oncogenes by NIH3T3 transfection experiments. Our studies demonstrated that X-ray-induced neoplastic transformation is a multistep phenomenon and that the numerical change of specific chromosomes may play an important role in the expression of each transformed phenotype. The results suggest that different endogenous oncogenes, other than the ras gene family and myc oncogene, could be responsible for the progressive nature of neoplastic transformation.

  2. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. PMID:26065792

  3. Management Planning for Workplace Automation.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  4. Automated DNA extraction from pollen in honey.

    PubMed

    Guertler, Patrick; Eicheldinger, Adelina; Muschler, Paul; Goerlich, Ottmar; Busch, Ulrich

    2014-04-15

    In recent years, honey has become a subject of DNA analysis due to potential risks posed by microorganisms, allergens or genetically modified organisms. However, so far, only a few DNA extraction procedures are available, most of them time-consuming and laborious. Therefore, we developed an automated method for DNA extraction from pollen in honey, based on a CTAB buffer-based DNA extraction using the Maxwell 16 instrument and the Maxwell 16 FFS Nucleic Acid Extraction System, Custom-Kit. We altered several components and extraction parameters and compared the optimised method with a manual CTAB buffer-based DNA isolation method. The automated DNA extraction was faster and resulted in higher DNA yield and sufficient DNA purity. Real-time PCR results obtained after automated DNA extraction are comparable to results after manual DNA extraction. No PCR inhibition was observed. The applicability of this method was further confirmed by analysis of different routine honey samples. PMID:24295710

  5. Multi-step regionalization technique and regional model validation for climate studies

    NASA Astrophysics Data System (ADS)

    Argüeso, D.; Hidalgo-Muñoz, J. M.; Calandria-Hernández, D.; Gámiz-Fortis, S. R.; Esteban-Parra, M. J.; Castro-Díez, Y.

    2010-09-01

    A regionalization procedure is proposed to define affinity regions in Andalusia (Southern Spain) for maximum and minimum temperature and precipitation, in order to validate a regional climate model (WRF). In situ observations are not suitable for model validation unless they are somehow upscaled. Therefore, a regionalization methodology was adopted to overcome the representation error that arises from the spatial-scale disagreement between site-specific observations and model outputs. An observational daily dataset comprising 412 rain gauges and 120 maximum and minimum temperature series across Andalusia was used. The observations covered a 10-year period ranging from 1990 to 1999 with no more than 10% missing values. The original dataset, composed of 716 series for precipitation and 243 for temperature, was employed to fill the gaps using a correlation method. Precipitation and temperature have been processed separately using the multi-step regionalization methodology, which comprises three main stages. Firstly, an S-Mode Principal Component Analysis (PCA) was applied to the correlation matrix obtained from daily values to retain the principal modes of variability and discard possible information redundancy. Secondly, rotated normalized loadings were used to classify the stations via an agglomerative Clustering Analysis (CA) method to set the number of regions and the centroids associated with those regions. Finally, using the centroids calculated in the previous step, and once the appropriate number of regions was identified, a non-hierarchical k-means algorithm was applied to obtain the definitive climate division of Andalusia. The combination of methods attempts to take advantage of their benefits and eliminate their shortcomings when used individually. This multi-step methodology achieves a noticeable reduction of subjectivity in the regionalization process. 
Furthermore, it is a methodology only based on the data analyzed to perform the regionalization with no
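
The three-stage procedure above (S-Mode PCA on the correlation matrix, clustering of the loadings, then a k-means refinement) can be sketched on synthetic data. Everything in the sketch is an illustrative assumption: the station counts, the noise level, and a restarted k-means++ initialization standing in for the agglomerative seeding stage; it is not the 412-gauge Andalusian dataset or the study's exact configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the station dataset: 30 daily series drawn
# from 3 latent regional signals plus station-level noise.
n_days, n_stations, k = 365, 30, 3
regions_true = np.repeat(np.arange(k), n_stations // k)
signals = rng.normal(size=(n_days, k))
data = signals[:, regions_true] + 0.2 * rng.normal(size=(n_days, n_stations))

# Stage 1: S-mode PCA on the station correlation matrix; keep k modes.
corr = np.corrcoef(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)           # ascending order
order = np.argsort(eigvals)[::-1][:k]             # top-k modes
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])

# Stages 2-3: cluster stations by their loading patterns. The paper seeds
# k-means with agglomerative-clustering centroids; a restarted k-means++
# initialization stands in for that seeding step here.
def kmeans(X, k, seed, iters=100):
    r = np.random.default_rng(seed)
    centroids = [X[r.integers(len(X))]]           # k-means++ init
    for _ in range(k - 1):
        d2 = ((X[:, None] - np.array(centroids)) ** 2).sum(-1).min(1)
        centroids.append(X[r.choice(len(X), p=d2 / d2.sum())])
    centroids = np.array(centroids)
    for _ in range(iters):                        # Lloyd iterations
        labels = ((X[:, None] - centroids) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(0)
    inertia = ((X - centroids[labels]) ** 2).sum()
    return labels, inertia

# Keep the best of several restarts (lowest inertia).
labels, _ = min((kmeans(loadings, k, s) for s in range(10)),
                key=lambda res: res[1])
print(labels)
```

On this well-separated synthetic data the recovered labels partition the stations exactly along the three latent regions, which is the sense in which the combined PCA-plus-clustering pipeline reduces subjectivity: the region count and membership come from the data rather than from hand-drawn boundaries.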

  6. Development of Multistep and Degenerate Variational Integrators for Applications in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Ellison, Charles Leland

    Geometric integrators yield high-fidelity numerical results by retaining conservation laws in the time advance. A particularly powerful class of geometric integrators is symplectic integrators, which are widely used in orbital mechanics and accelerator physics. An important application presently lacking symplectic integrators is the guiding center motion of magnetized particles represented by non-canonical coordinates. Because guiding center trajectories are foundational to many simulations of magnetically confined plasmas, geometric guiding center algorithms have high potential for impact. The motivation is compounded by the need to simulate long-pulse fusion devices, including ITER, and opportunities in high performance computing, including the use of petascale resources and beyond. This dissertation uses a systematic procedure for constructing geometric integrators --- known as variational integration --- to deliver new algorithms for guiding center trajectories and other plasma-relevant dynamical systems. These variational integrators are non-trivial because the Lagrangians of interest are degenerate - the Euler-Lagrange equations are first-order differential equations and the Legendre transform is not invertible. The first contribution of this dissertation is that variational integrators for degenerate Lagrangian systems are typically multistep methods. Multistep methods admit parasitic mode instabilities that can ruin the numerical results. These instabilities motivate the second major contribution: degenerate variational integrators. By replicating the degeneracy of the continuous system, degenerate variational integrators avoid parasitic mode instabilities. The new methods are therefore robust geometric integrators for degenerate Lagrangian systems. These developments in variational integration theory culminate in one-step degenerate variational integrators for non-canonical magnetic field line flow and guiding center dynamics. 
The guiding center integrator
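
The variational-integration construction referred to above can be summarized in standard discrete-mechanics notation (this is the generic recipe, not the dissertation's guiding-center-specific derivation):

```latex
% Approximate the action integral by a sum of one-step quadratures:
S[q] = \int_{t_0}^{t_N} L(q,\dot q)\,dt
     \;\approx\; \sum_{k=0}^{N-1} L_d(q_k, q_{k+1}),
\qquad
L_d(q_k, q_{k+1}) \approx \int_{t_k}^{t_{k+1}} L(q,\dot q)\,dt .

% Requiring the discrete action to be stationary with respect to each
% interior point q_k gives the discrete Euler--Lagrange equations,
% which serve as the update rule of the integrator:
D_2 L_d(q_{k-1}, q_k) + D_1 L_d(q_k, q_{k+1}) = 0 .

% For a degenerate (phase-space) Lagrangian such as
% L = p\,\dot q - H(q,p), the Legendre transform is non-invertible, so
% this three-level relation cannot in general be reduced to a one-step
% map: the scheme is genuinely multistep, which is the origin of the
% parasitic modes discussed above.
```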

  7. Automated drilling draws interest

    SciTech Connect

    Not Available

    1985-05-01

    Interest in subsea technology includes recent purchase of both a British yard and Subsea Technology, a Houston-based BOP manufacturer. In France, key personnel from the former Comex Industries have been acquired and a base reinstalled in Marseille. ACB is also investing heavily, with the Norwegians, in automated drilling programs. These automated drilling programs are discussed.

  8. Library Automation Style Guide.

    ERIC Educational Resources Information Center

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  9. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  10. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Sherron, Gene T.

    1982-01-01

    The steps taken toward office automation by the University of Maryland are described. Office automation is defined and some types of word processing systems are described. Policies developed in the writing of a campus plan are listed, followed by a section on procedures adopted to implement the plan. (Author/MLW)

  11. The Automated Office.

    ERIC Educational Resources Information Center

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  12. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  13. Automation, Manpower, and Education.

    ERIC Educational Resources Information Center

    Rosenberg, Jerry M.

    Each group in our population will be affected by automation and other forms of technological advancement. This book seeks to identify the needs of these various groups, and to present ways in which educators can best meet them. The author corrects certain prevalent misconceptions concerning manpower utilization and automation. Based on the…

  14. Spontaneous formation of the unlocked state of the ribosome is a multistep process

    PubMed Central

    Munro, James B.; Altman, Roger B.; Tung, Chang-Shung; Cate, Jamie H. D.; Sanbonmatsu, Kevin Y.; Blanchard, Scott C.

    2010-01-01

    The mechanism of substrate translocation through the ribosome is central to the rapid and faithful translation of mRNA into proteins. The rate-limiting step in translocation is an unlocking process that includes the formation of an “unlocked” intermediate state, which requires the convergence of large-scale conformational events within the ribosome including tRNA hybrid states formation, closure of the ribosomal L1 stalk domain, and subunit ratcheting. Here, by imaging of the pretranslocation ribosome complex from multiple structural perspectives using two- and three-color single-molecule fluorescence resonance energy transfer, we observe that tRNA hybrid states formation and L1 stalk closure, events central to the unlocking mechanism, are not tightly coupled. These findings reveal that the unlocked state is achieved through a stochastic-multistep process, where the extent of conformational coupling depends on the nature of tRNA substrates. These data suggest that cellular mechanisms affecting the coupling of conformational processes on the ribosome may regulate the process of translation elongation. PMID:20018653

  15. Enzyme-Instructed Self-Assembly: A Multistep Process for Potential Cancer Therapy

    PubMed Central

    2015-01-01

    The central dogma of the action of current anticancer drugs is that the drug tightly binds to its molecular target for inhibition. The reliance on tight ligand–receptor binding, however, is also the major root of drug resistance in cancer therapy. In this article, we highlight enzyme-instructed self-assembly (EISA)—the integration of enzymatic transformation and molecular self-assembly—as a multistep process for the development of cancer therapy. Using apoptosis as an example, we illustrate that the combination of enzymatic transformation and self-assembly, in fact, is an inherent feature of apoptosis. After the introduction of EISA of small molecules in the context of supramolecular hydrogelation, we describe several key studies to underscore the promises of EISA for developing cancer therapy. Particularly, we will highlight that EISA allows one to develop approaches to target “undruggable” targets or “untargetable” features of cancer cells and provides the opportunity for simultaneously interacting with multiple targets. We envision that EISA, used separately or in combination with current anticancer therapeutics, will ultimately lead to a paradigm shift for developing anticancer medicine that inhibit multiple hallmark capabilities of cancer. PMID:25933032

  16. Exact free vibration of multi-step Timoshenko beam system with several attachments

    NASA Astrophysics Data System (ADS)

    Farghaly, S. H.; El-Sayed, T. A.

    2016-05-01

    This paper deals with the analysis of the natural frequencies and mode shapes of an axially loaded multi-step Timoshenko beam combined system carrying several attachments. The influence of system design and of the proposed sub-system non-dimensional parameters on the combined system characteristics is the major part of this investigation. The effects of material properties, rotary inertia and shear deformation of the beam system are included for each span. The end masses are elastically supported against rotation and translation at an offset point from the point of attachment. A sub-system having two degrees of freedom is located at the beam ends and at any of the intermediate stations and acts as a support and/or a suspension. The boundary conditions of the ordinary differential equation governing the lateral deflections and slope due to bending of the beam system, including the shear force term due to the sub-system, have been formulated. Exact global coefficient matrices for the combined modal frequencies, the modal shapes and the discrete sub-system have been derived. Based on these formulae, detailed parametric studies of the combined system are carried out. The applied mathematical model is valid for a wide range of applications, especially in the mechanical, naval and structural engineering fields.

  17. Multistep Compositional Remodeling of Supported Lipid Membranes by Interfacially Active Phosphatidylinositol Kinases.

    PubMed

    Tabaei, Seyed R; Guo, Feng; Rutaganira, Florentine U; Vafaei, Setareh; Choong, Ingrid; Shokat, Kevan M; Glenn, Jeffrey S; Cho, Nam-Joon

    2016-05-17

    The multienzyme catalytic phosphorylation of phosphatidylinositol (PI) in a supported lipid membrane platform is demonstrated for the first time. One-step treatment with PI 4-kinase IIIβ (PI4Kβ) yielded PI 4-phosphate (PI4P), while a multistep enzymatic cascade of PI4Kβ followed by PIP 5-kinase produced PI-4,5-bisphosphate (PI(4,5)P2 or PIP2). By employing quartz crystal microbalance with dissipation monitoring, we were able to track membrane association of kinase enzymes for the first time as well as detect PI4P and PI(4,5)P2 generation based on subsequent antibody binding to the supported lipid bilayers. Pharmacologic inhibition of PI4Kβ by a small molecule inhibitor was also quantitatively assessed, yielding an EC50 value that agrees well with conventional biochemical readout. Taken together, the development of a PI-containing supported membrane platform coupled with surface-sensitive measurement techniques for kinase studies opens the door to exploring the rich biochemistry and pharmacological targeting of membrane-associated phosphoinositides. PMID:27118725

  18. Transformation of quiescent adult oligodendrocyte precursor cells into malignant glioma through a multistep reactivation process

    PubMed Central

    Galvao, Rui Pedro; Kasina, Anita; McNeill, Robert S.; Harbin, Jordan E.; Foreman, Oded; Verhaak, Roel G. W.; Nishiyama, Akiko; Miller, C. Ryan; Zong, Hui

    2014-01-01

    How malignant gliomas arise in a mature brain remains a mystery, hindering the development of preventive and therapeutic interventions. We previously showed that oligodendrocyte precursor cells (OPCs) can be transformed into glioma when mutations are introduced perinatally. However, adult OPCs rarely proliferate compared with their perinatal counterparts. Whether these relatively quiescent cells have the potential to transform is unknown, which is a critical question considering the late onset of human glioma. Additionally, the premalignant events taking place between initial mutation and a fully developed tumor mass are particularly poorly understood in glioma. Here we used a temporally controllable Cre transgene to delete p53 and NF1 specifically in adult OPCs and demonstrated that these cells consistently give rise to malignant gliomas. To investigate the transforming process of quiescent adult OPCs, we then tracked these cells throughout the premalignant phase, which revealed a dynamic multistep transformation, starting with rapid but transient hyperproliferative reactivation, followed by a long period of dormancy, and then final malignant transformation. Using pharmacological approaches, we discovered that mammalian target of rapamycin signaling is critical for both the initial OPC reactivation step and late-stage tumor cell proliferation and thus might be a potential target for both glioma prevention and treatment. In summary, our results firmly establish the transforming potential of adult OPCs and reveal an actionable multiphasic reactivation process that turns slowly dividing OPCs into malignant gliomas. PMID:25246577

  19. Multistep continuous-flow synthesis of (R)- and (S)-rolipram using heterogeneous catalysts

    NASA Astrophysics Data System (ADS)

    Tsubogo, Tetsu; Oyamada, Hidekazu; Kobayashi, Shū

    2015-04-01

    Chemical manufacturing is conducted using either batch systems or continuous-flow systems. Flow systems have several advantages over batch systems, particularly in terms of productivity, heat and mixing efficiency, safety, and reproducibility. However, for over half a century, pharmaceutical manufacturing has used batch systems because the synthesis of complex molecules such as drugs has been difficult to achieve with continuous-flow systems. Here we describe the continuous-flow synthesis of drugs using only columns packed with heterogeneous catalysts. Commercially available starting materials were successively passed through four columns containing achiral and chiral heterogeneous catalysts to produce (R)-rolipram, an anti-inflammatory drug and one of the family of γ-aminobutyric acid (GABA) derivatives. In addition, simply by replacing a column packed with a chiral heterogeneous catalyst with another column packed with the opposing enantiomer, we obtained antipole (S)-rolipram. Similarly, we also synthesized (R)-phenibut, another drug belonging to the GABA family. These flow systems are simple and stable with no leaching of metal catalysts. Our results demonstrate that multistep (eight steps in this case) chemical transformations for drug synthesis can proceed smoothly under flow conditions using only heterogeneous catalysts, without the isolation of any intermediates and without the separation of any catalysts, co-products, by-products, and excess reagents. We anticipate that such syntheses will be useful in pharmaceutical manufacturing.

  20. Discovery of Novel New Delhi Metallo-β-Lactamases-1 Inhibitors by Multistep Virtual Screening

    PubMed Central

    Wang, Xuequan; Lu, Meiling; Shi, Yang; Ou, Yu; Cheng, Xiaodong

    2015-01-01

    The emergence of NDM-1-containing multi-antibiotic-resistant "Superbugs" necessitates the development of novel NDM-1 inhibitors. In this study, we report the discovery of novel NDM-1 inhibitors by multi-step virtual screening. From a 2,800,000-compound virtual drug-like library selected from the ZINC database, we generated a focused NDM-1 inhibitor library containing 298 compounds, of which 44 were purchased and evaluated experimentally for their ability to inhibit NDM-1 in vitro. Three novel NDM-1 inhibitors with micromolar IC50 values were validated. The most potent inhibitor, VNI-41, inhibited NDM-1 with an IC50 of 29.6 ± 1.3 μM. Molecular dynamics simulation revealed that VNI-41 interacted extensively with the active site. In particular, the sulfonamide group of VNI-41 interacts directly with the metal ion Zn1 that is critical for catalysis. These results demonstrate the feasibility of applying virtual screening methodologies to identify novel inhibitors of NDM-1, a metallo-β-lactamase with a malleable active site, and provide a mechanistic basis for the rational design of NDM-1 inhibitors using sulfonamide as a functional scaffold. PMID:25734558

  1. Multistep Reaction Based De Novo Drug Design: Generating Synthetically Feasible Design Ideas.

    PubMed

    Masek, Brian B; Baker, David S; Dorfman, Roman J; DuBrucq, Karen; Francis, Victoria C; Nagy, Stephan; Richey, Bree L; Soltanshahi, Farhad

    2016-04-25

    We describe a "multistep reaction driven" evolutionary algorithm approach to de novo molecular design. Structures generated by the approach include a proposed synthesis path intended to aid the chemist in assessing the synthetic feasibility of the ideas that are generated. The methodology is independent of how the design ideas are scored, allowing multicriteria drug design to address multiple issues including activity at one or more pharmacological targets, selectivity, physical and ADME properties, and off target liabilities; the methods are compatible with common computer-aided drug discovery "scoring" methodologies such as 2D- and 3D-ligand similarity, docking, desirability functions based on physiochemical properties, and/or predictions from 2D/3D QSAR or machine learning models and combinations thereof to be used to guide design. We have performed experiments to assess the extent to which known drug space can be covered by our approach. Using a library of 88 generic reactions and a database of ∼20 000 reactants, we find that our methods can identify "close" analogs for ∼50% of the known small molecule drugs with molecular weight less than 300. To assess the quality of the in silico generated synthetic pathways, synthesis chemists were asked to rate the viability of synthesis pathways: both "real" and in silico generated. In silico reaction schemes generated by our methods were rated as very plausible with scores similar to known literature synthesis schemes. PMID:27031173
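
The evolutionary loop described above can be caricatured in a few lines. In the sketch below everything is a placeholder: fragments are single letters, "reactions" are string operations, and the score is a character-match stand-in for the 2D/3D similarity or docking scores the authors mention; none of it reflects their 88-reaction library or reactant database:

```python
import random

FRAGMENTS = ["A", "B", "C", "D"]          # stand-ins for purchasable reactants
REACTIONS = {                              # stand-ins for generic reactions
    "couple":  lambda x, y: x + y,
    "prepend": lambda x, y: y + x,
}
TARGET = "ABCD"                            # a hypothetical "known drug" to re-create

def score(product):
    # Placeholder scoring: fraction of aligned characters matching TARGET.
    hits = sum(a == b for a, b in zip(product, TARGET))
    return hits / max(len(product), len(TARGET))

def random_route(rng, steps=3):
    """A design idea = starting fragment plus a list of (reaction, reagent) steps,
    i.e. the structure carries its own proposed synthesis path."""
    return [rng.choice(FRAGMENTS)] + [
        (rng.choice(list(REACTIONS)), rng.choice(FRAGMENTS)) for _ in range(steps)
    ]

def run_route(route):
    product = route[0]
    for rxn, reagent in route[1:]:
        product = REACTIONS[rxn](product, reagent)
    return product

def mutate(route, rng):
    new = list(route)
    i = rng.randrange(len(new))
    if i == 0:
        new[0] = rng.choice(FRAGMENTS)     # swap the starting material
    else:
        new[i] = (rng.choice(list(REACTIONS)), rng.choice(FRAGMENTS))
    return new

rng = random.Random(7)
population = [random_route(rng) for _ in range(30)]
for _ in range(40):                        # elitist selection + point mutation
    population.sort(key=lambda r: score(run_route(r)), reverse=True)
    population = population[:10] + [
        mutate(rng.choice(population[:10]), rng) for _ in range(20)
    ]

best = max(population, key=lambda r: score(run_route(r)))
print("product:", run_route(best), "score:", round(score(run_route(best)), 2))
```

The point of the caricature is structural: because candidates are only ever built by applying reactions to available reactants, every surviving design idea comes with a synthesis route attached, which is what makes the output assessable for synthetic feasibility.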

  2. Multi-step deformations - a stringent test for constitutive models for polymer glasses

    NASA Astrophysics Data System (ADS)

    Medvedev, Grigori; Caruthers, James

    A number of constitutive models have been proposed to describe mechanical behavior of polymer glasses, where the focus has been on the stress-strain curve observed in a constant strain rate deformation. The stress-strain curve possesses several prominent features, including yield, post-yield softening, flow, and hardening, which have proven challenging to predict. As a result, both viscoplastic and nonlinear viscoelastic constitutive models have become quite intricate, where a new mechanism is invoked for each bend of the stress-strain curve. We demonstrate on several examples that when the models are used to describe the multi-step deformations vs. the more common single strain rate deformation, they produce responses that are qualitatively incorrect, revealing the existing models to be parameterizations of a single-step curve. A recently developed stochastic constitutive model has fewer problems than the traditional viscoelastic/viscoplastic models, but it also has difficulties. The implications for the mechanics and physics of glassy polymers will be discussed.
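
The distinction the authors draw can be illustrated with the simplest possible rate model: even a linear Maxwell element responds very differently to a single constant-rate history than to a multi-step history with a rate reversal. The modulus, relaxation time, and strain rates below are invented illustrative values; real glassy-polymer constitutive models are far more elaborate:

```python
# Linear Maxwell element: sigma_dot = E * eps_dot - sigma / tau,
# integrated with forward Euler. Parameter values are illustrative only.
E, tau, dt = 1.0, 1.0, 1e-3

def stress_history(rate_of, t_end):
    """Integrate the stress response for a prescribed strain-rate history."""
    sigma, t, out = 0.0, 0.0, []
    while t < t_end:
        sigma += dt * (E * rate_of(t) - sigma / tau)
        t += dt
        out.append(sigma)
    return out

single = stress_history(lambda t: 1.0, 2.0)                      # constant rate
multi = stress_history(lambda t: 1.0 if t < 1.0 else -1.0, 2.0)  # rate reversal at t = 1

print(round(single[-1], 3), round(multi[-1], 3))
```

A model fitted only to the single-rate curve says nothing about the reversal response, which is the sense in which multi-step deformations are the more stringent test: they probe the model away from the history it was parameterized on.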

  3. Multistep continuous-flow synthesis of (R)- and (S)-rolipram using heterogeneous catalysts.

    PubMed

    Tsubogo, Tetsu; Oyamada, Hidekazu; Kobayashi, Shū

    2015-04-16

    Chemical manufacturing is conducted using either batch systems or continuous-flow systems. Flow systems have several advantages over batch systems, particularly in terms of productivity, heat and mixing efficiency, safety, and reproducibility. However, for over half a century, pharmaceutical manufacturing has used batch systems because the synthesis of complex molecules such as drugs has been difficult to achieve with continuous-flow systems. Here we describe the continuous-flow synthesis of drugs using only columns packed with heterogeneous catalysts. Commercially available starting materials were successively passed through four columns containing achiral and chiral heterogeneous catalysts to produce (R)-rolipram, an anti-inflammatory drug and one of the family of γ-aminobutyric acid (GABA) derivatives. In addition, simply by replacing a column packed with a chiral heterogeneous catalyst with another column packed with the opposing enantiomer, we obtained antipole (S)-rolipram. Similarly, we also synthesized (R)-phenibut, another drug belonging to the GABA family. These flow systems are simple and stable with no leaching of metal catalysts. Our results demonstrate that multistep (eight steps in this case) chemical transformations for drug synthesis can proceed smoothly under flow conditions using only heterogeneous catalysts, without the isolation of any intermediates and without the separation of any catalysts, co-products, by-products, and excess reagents. We anticipate that such syntheses will be useful in pharmaceutical manufacturing. PMID:25877201

  4. Enzyme-instructed self-assembly: a multistep process for potential cancer therapy.

    PubMed

    Zhou, Jie; Xu, Bing

    2015-06-17

    The central dogma of the action of current anticancer drugs is that the drug tightly binds to its molecular target for inhibition. The reliance on tight ligand-receptor binding, however, is also the major root of drug resistance in cancer therapy. In this article, we highlight enzyme-instructed self-assembly (EISA)-the integration of enzymatic transformation and molecular self-assembly-as a multistep process for the development of cancer therapy. Using apoptosis as an example, we illustrate that the combination of enzymatic transformation and self-assembly is, in fact, an inherent feature of apoptosis. After introducing EISA of small molecules in the context of supramolecular hydrogelation, we describe several key studies to underscore the promise of EISA for developing cancer therapy. In particular, we highlight that EISA allows one to develop approaches to target "undruggable" targets or "untargetable" features of cancer cells and provides the opportunity to interact with multiple targets simultaneously. We envision that EISA, used separately or in combination with current anticancer therapeutics, will ultimately lead to a paradigm shift toward anticancer medicines that inhibit multiple hallmark capabilities of cancer. PMID:25933032

  5. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, J.L.

    1990-07-10

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed of. 7 figs.

  6. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, John L.

    1990-01-01

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed of.

  7. Spontaneous formation of the unlocked state of the ribosome is a multistep process.

    PubMed

    Munro, James B; Altman, Roger B; Tung, Chang-Shung; Cate, Jamie H D; Sanbonmatsu, Kevin Y; Blanchard, Scott C

    2010-01-12

    The mechanism of substrate translocation through the ribosome is central to the rapid and faithful translation of mRNA into proteins. The rate-limiting step in translocation is an unlocking process that includes the formation of an "unlocked" intermediate state, which requires the convergence of large-scale conformational events within the ribosome including tRNA hybrid states formation, closure of the ribosomal L1 stalk domain, and subunit ratcheting. Here, by imaging of the pretranslocation ribosome complex from multiple structural perspectives using two- and three-color single-molecule fluorescence resonance energy transfer, we observe that tRNA hybrid states formation and L1 stalk closure, events central to the unlocking mechanism, are not tightly coupled. These findings reveal that the unlocked state is achieved through a stochastic-multistep process, where the extent of conformational coupling depends on the nature of tRNA substrates. These data suggest that cellular mechanisms affecting the coupling of conformational processes on the ribosome may regulate the process of translation elongation. PMID:20018653

  8. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    SciTech Connect

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; Grove, Robert E.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.
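    MS-CADIS itself requires a deterministic adjoint solver, but the importance-sampling principle that all CADIS-style methods build on can be shown in miniature: estimate a rare "detector response" by sampling from a biased distribution and undoing the bias with likelihood-ratio weights. The toy problem below (P(X > 3) for a standard normal) and all its numbers are my own illustration, not the SDDR calculation.

```python
import math
import random

# Toy illustration of importance sampling, the principle behind CADIS-style
# variance reduction: estimate p = P(X > 3) for X ~ N(0, 1).
random.seed(0)
N = 100_000

# Biased sampling: draw from N(3, 1), which covers the rare region well, and
# re-weight each hit by the likelihood ratio w(y) = phi(y)/phi(y-3) = exp(4.5-3y).
total = 0.0
for _ in range(N):
    y = random.gauss(3.0, 1.0)
    if y > 3.0:
        total += math.exp(4.5 - 3.0 * y)
p_is = total / N

# Exact value for comparison: 1 - Phi(3)
p_exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))
print(p_is, p_exact)   # both ~1.35e-3
```

With the same sample budget, plain Monte Carlo would score only ~135 hits here; the biased estimator scores on roughly half its samples, which is the same efficiency gain (in spirit) that an SDDR-tailored importance function buys.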

  9. Investigation of the reactions of acrylamide during in vitro multistep enzymatic digestion of thermally processed foods.

    PubMed

    Hamzalıoğlu, Aytül; Gökmen, Vural

    2015-01-01

    This study investigated the fate of acrylamide in thermally processed foods after ingestion. An in vitro multistep enzymatic digestion system simulating gastric, duodenal and colon phases was used to understand the fate of acrylamide in bakery and fried potato products. Acrylamide levels gradually decreased through gastric, duodenal and colon phases during in vitro digestion of biscuits. At the end of digestion, acrylamide reduction was between 49.2% and 73.4% in biscuits. Binary model systems composed of acrylamide and amino acids were used to understand the mechanism of acrylamide reduction. High-resolution mass spectrometry analyses confirmed Michael addition of amino acids to acrylamide during digestion. In contrast to bakery products, acrylamide levels increased significantly during gastric digestion of fried potatoes. The Schiff base formed between reducing sugars and asparagine disappeared rapidly, whereas the acrylamide level increased during the gastric phase. This suggests that intermediates like the Schiff base that accumulate in potatoes during frying are potential precursors of acrylamide under gastric conditions. PMID:25468219

  10. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    DOE PAGESBeta

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; Grove, Robert E.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.

  11. Method to Improve Indium Bump Bonding via Indium Oxide Removal Using a Multi-Step Plasma Process

    NASA Technical Reports Server (NTRS)

    Greer, H. Frank (Inventor); Jones, Todd J. (Inventor); Vasquez, Richard P. (Inventor); Hoenk, Michael E. (Inventor); Dickie, Matthew R. (Inventor); Nikzad, Shouleh (Inventor)

    2012-01-01

    A process for removing indium oxide from indium bumps in a flip-chip structure to reduce contact resistance, by a multi-step plasma treatment. A first plasma treatment of the indium bumps with an argon, methane and hydrogen plasma reduces indium oxide, and a second plasma treatment with an argon and hydrogen plasma removes residual organics. The multi-step plasma process for removing indium oxide from the indium bumps is more effective in reducing the oxide, and yet does not require the use of halogens, does not change the bump morphology, does not attack the bond pad material or under-bump metallization layers, and creates no new mechanisms for open circuits.

  12. Surface Modified Particles By Multi-Step Michael-Type Addition And Process For The Preparation Thereof

    SciTech Connect

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2005-05-03

    A new class of surface modified particles and a multi-step Michael-type addition surface modification process for the preparation of the same is provided. The multi-step Michael-type addition surface modification process involves two or more reactions to compatibilize particles with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through reactive organic linking groups. Specifically, these reactive groups are activated carbon-carbon pi bonds and carbon and non-carbon nucleophiles that react via Michael or Michael-type additions.

  13. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  14. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag and drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  15. A Family of Symmetric Linear Multistep Methods for the Numerical Solution of the Schroedinger Equation and Related Problems

    SciTech Connect

    Anastassi, Z. A.; Simos, T. E.

    2010-09-30

    We develop a new family of explicit symmetric linear multistep methods for the efficient numerical solution of the Schroedinger equation and related problems with oscillatory solution. The new methods are trigonometrically fitted and have improved intervals of periodicity as compared to the corresponding classical method with constant coefficients and other methods from the literature. We also apply the methods along with other known methods to real periodic problems, in order to measure their efficiency.
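    The best-known member of this class is the explicit two-step Störmer method y_{n+1} = 2y_n − y_{n−1} + h²f(t_n, y_n), which is symmetric and a natural baseline for y'' = f with oscillatory solutions; the trigonometrically fitted methods of the paper refine the coefficients of such schemes. A minimal sketch on the test equation y'' = −ω²y (my own example, not one of the paper's problems):

```python
import math

# Two-step symmetric (Stoermer) method for y'' = f(y), here f(y) = -omega^2 * y,
# whose exact solution with y(0) = 1, y'(0) = 0 is cos(omega * t).
omega, h, steps = 1.0, 0.01, 1000

def f(y):
    return -omega**2 * y

y_prev = 1.0                  # y_0 = y(0)
y_curr = math.cos(omega * h)  # y_1 seeded with the exact value
for n in range(1, steps):
    y_next = 2.0 * y_curr - y_prev + h * h * f(y_curr)
    y_prev, y_curr = y_curr, y_next

t_end = steps * h
print(y_curr, math.cos(omega * t_end))  # phase error stays small over 10 time units
```

The classical method carries a slow phase drift proportional to h²; trigonometric fitting tunes the coefficients to the dominant frequency so that this drift is suppressed for oscillatory problems.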

  16. Progress in Cherenkov ring imaging: Part 1. Detection and localization of photons with the multistep proportional chamber

    NASA Astrophysics Data System (ADS)

    Bouclier, R.; Charpak, G.; Cattai, A.; Million, G.; Peisert, A.; Santiard, J. C.; Sauli, F.; Coutrakon, G.; Hubbard, J. R.; Mangeot, Ph.; Mullie, J.; Tichit, J.; Glass, H.; Kirz, J.; McCarthy, R.

    1983-02-01

    The multistep proportional chamber, operated with a photosensitive gas filling, makes it possible to obtain stable multiplication factors in excess of 10^6 and can be used for the detection of single photoelectrons released in the gas. The efficiency and localization properties of the device in the detection of vacuum ultraviolet photons are discussed here, in view of its use for particle identification exploiting the Cherenkov ring-imaging method.

  17. Fabrication of low-loss, single-mode-channel waveguide with DNA-CTMA biopolymer by multistep processing technology.

    PubMed

    Zhou, Jun; Wang, Zhen Yong; Yang, Xin; Wong, C-Y; Pun, Edwin Y B

    2010-05-15

    A multistep processing and reactive ion etching technique has been developed to fabricate optical channel waveguides based on deoxyribonucleic acid-cetyltrimethylammonium biopolymer material. The channel waveguides exhibit excellent single-mode output and high confinement of light because of the sharp waveguide profile with very smooth surfaces and vertical sidewalls. The measurement results show that these channel waveguides have low propagation losses and small polarization dependent losses at 633, 1310, and 1550 nm wavelengths. PMID:20479792

  18. DE-FG02-05ER64001 Overcoming the hurdles of multi-step targeting (MST) for effective radioimmunotherapy of solid tumors

    SciTech Connect

    Larson, Steven M. (P.I.); Cheung, Nai-Kong (Co-P.I.)

    2009-09-21

    The 4 specific aims of this project are: (1) Optimization of MST to increase tumor uptake; (2) Antigen heterogeneity; (3) Characterization and reduction of renal uptake; and (4) Validation in vivo of optimized MST targeted therapy. This project focused on optimizing multistep immune targeting strategies for the treatment of cancer. Two multi-step targeting constructs were explored during this funding period: (1) anti-Tag-72 and (2) anti-GD2.

  19. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
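    One task in that envisaged pipeline, the meta-analysis calculation, is already trivial to automate. A minimal fixed-effect (inverse-variance) pooling sketch; the study effects and standard errors below are invented for illustration:

```python
import math

# Fixed-effect inverse-variance meta-analysis: pool per-study effect estimates,
# weighting each study by 1/se^2. The study numbers are invented placeholders.
studies = [  # (effect estimate, standard error)
    (0.30, 0.10),
    (0.10, 0.20),
    (0.25, 0.15),
]

weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(pooled, pooled_se)   # ~0.257 and ~0.077
```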

  20. The automation of clinical trial serious adverse event reporting workflow

    PubMed Central

    London, Jack W; Smalley, Karl J; Conner, Kyle; Smith, J Bruce

    2011-01-01

    Background The reporting of serious adverse events (SAEs) is a requirement when conducting a clinical trial involving human subjects, necessary for the protection of the participants. The reporting process is a multi-step procedure, involving a number of individuals from initiation to final review, and must be completed in a timely fashion. Purpose The purpose of this project was to automate the adverse event reporting process, replacing paper-based processes with computer-based processes, so that the personnel effort and time required for serious adverse event reporting were reduced, and the monitoring of reporting performance and adverse event characteristics was facilitated. Methods Use case analysis was employed to understand the reporting workflow and generate software requirements. The automation of the workflow was then implemented, employing computer databases, web-based forms, electronic signatures, and email communication. Results In the initial year (2007) of full deployment, 588 SAE reports were processed by the automated system, eSAEy™. The median time from initiation to Principal Investigator electronic signature was less than 2 days (mean 7 ± 0.7 days). This was a significant reduction from the prior paper-based system, which had a median time for signature of 24 days (mean of 45 ± 5.7 days). With eSAEy™, reports on adverse event characteristics (type, grade, etc.) were easily obtained and had consistent values based on standard terminologies. Limitation The automated system described was designed specifically for the workflow at Thomas Jefferson University. While the system design methodology and the requirements derived from common clinical trial adverse-event reporting procedures are generally applicable, specific workflow details may not be relevant at other institutions. Conclusion The system facilitated analysis of individual investigator reporting performance, as well as the aggregation and analysis of the nature of reported adverse
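    A workflow of this shape (initiation through review to final signature, with timestamps for performance monitoring) maps naturally onto a small state machine. The states, roles, and dates below are a generic sketch, not eSAEy's actual schema:

```python
from datetime import datetime, timedelta

# Generic sketch of an adverse-event report workflow as a timestamped state
# machine, so time-to-signature can be monitored. All state names and dates
# are invented for illustration; they are not eSAEy's actual design.
TRANSITIONS = {
    "initiated": "coordinator_review",
    "coordinator_review": "pi_signature",
    "pi_signature": "submitted",
}

class SAEReport:
    def __init__(self, opened):
        self.state = "initiated"
        self.log = [("initiated", opened)]

    def advance(self, when):
        self.state = TRANSITIONS[self.state]   # KeyError if already final
        self.log.append((self.state, when))

    def days_to_signature(self):
        start = self.log[0][1]
        signed = next(t for s, t in self.log if s == "pi_signature")
        return (signed - start).days

report = SAEReport(datetime(2007, 1, 1))
report.advance(datetime(2007, 1, 1) + timedelta(hours=6))  # coordinator review
report.advance(datetime(2007, 1, 2))                       # PI signs next day
print(report.state, report.days_to_signature())  # pi_signature 1
```

Keeping the full transition log, rather than only the current state, is what makes the per-investigator performance reports in the abstract a simple query.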

  1. A multistep approach in the cytologic evaluation of liver biopsy samples of dogs with hepatic diseases.

    PubMed

    Stockhaus, C; Van Den Ingh, T; Rothuizen, J; Teske, E

    2004-09-01

    Cytologic criteria were evaluated for their diagnostic value in liver disease in dogs. To this end, histopathologic and cytologic examinations were performed on liver biopsy samples of 73 dogs with liver diseases and 28 healthy dogs. Logistic regression analysis was used to select the measured parameters to be included in a multistep approach. With the logistic regression method, different characteristic cytologic parameters could be defined for each histopathologic diagnosis. Malignant lymphoma of the liver was characterized by large numbers of lymphoblasts, comprising a minimum of 5% of all cells. Clusters of epithelial cells with several cytologic characteristics of malignancy intermixed with normal hepatocytes were indicative of metastatic carcinoma or cholangiocellular carcinoma. Liver cells in hepatocellular carcinoma were characterized by a high nucleus/cytoplasm ratio, large cell diameters, increased numbers of nucleoli per nucleus, small numbers of cytoplasmic vacuoles, and frequently, small numbers of lymphocytes. Extrahepatic cholestasis was characterized by excessive extracellular bile pigment in the form of biliary casts, an increased number of nucleoli within hepatocytes, decreased hepatic cell size, and low numbers of lymphocytes. In destructive cholangiolitis, increased numbers of neutrophils and a small mean nuclear size within hepatocytes were seen. Acute and nonspecific reactive hepatitis were diagnosed based on the presence of moderate reactive nuclear patterns, including more pronounced chromatin, prominent nucleoli, increased numbers of inflammatory cells, excluding lymphocytes, and the absence of increased numbers of bile duct cell clusters. An increased number of mast cells was also indicative of nonspecific reactive hepatitis. Important cytologic criteria for the diagnosis of liver cirrhosis, in addition to chronic hepatitis, are intracellular bile accumulation and increased numbers of bile duct cell clusters. In summary, the stepwise approach
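    A multistep approach of this kind, applying the most discriminating criterion first, behaves like a rule cascade. A schematic sketch: only the ">= 5% lymphoblasts" cutoff comes from the abstract; every other rule, threshold, and field name here is a hypothetical placeholder.

```python
# Schematic rule cascade for a stepwise cytologic evaluation. Only the
# ">= 5% lymphoblasts -> malignant lymphoma" cutoff is taken from the abstract;
# the remaining rules and field names are hypothetical placeholders.
def classify(sample: dict) -> str:
    if sample.get("lymphoblast_pct", 0.0) >= 5.0:
        return "malignant lymphoma"
    if sample.get("malignant_epithelial_clusters", False):
        return "carcinoma (metastatic or cholangiocellular)"
    if sample.get("biliary_casts", False):
        return "extrahepatic cholestasis"
    return "no specific diagnosis (continue work-up)"

print(classify({"lymphoblast_pct": 7.0}))   # malignant lymphoma
print(classify({"biliary_casts": True}))    # extrahepatic cholestasis
```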

  2. Detection of Heterogeneous Small Inclusions by a Multi-Step MUSIC Method

    NASA Astrophysics Data System (ADS)

    Solimene, Raffaele; Dell'Aversano, Angela; Leone, Giovanni

    2014-05-01

    In this contribution the problem of detecting and localizing scatterers with small (in terms of wavelength) cross sections by collecting their scattered field is addressed. The problem is dealt with for a two-dimensional and scalar configuration where the background is given as a two-layered cylindrical medium. In more detail, while scattered field data are taken in the outermost layer, inclusions are embedded within the inner layer. Moreover, the case of heterogeneous inclusions (i.e., having different scattering coefficients) is addressed. A pertinent application context is the diagnosis of concrete pillars, in order to detect and locate rebars, ducts, and other small inhomogeneities that can populate the interior of the pillar. The nature of the inclusions influences the scattering coefficients. For example, the field scattered by rebars is stronger than the one due to ducts. Accordingly, it is expected that the more weakly scattering inclusions can be difficult to detect, as their scattered fields tend to be overwhelmed by those of strong scatterers. In order to circumvent this problem, in this contribution a multi-step MUltiple SIgnal Classification (MUSIC) detection algorithm is adopted [1]. In particular, the first stage aims at detecting rebars. Once rebars have been detected, their positions are exploited to update the Green's function and to subtract the scattered field due to their presence. The procedure is repeated until all the inclusions are detected. The analysis is conducted by numerical experiments for a multi-view/multi-static single-frequency configuration and the synthetic data are generated by an FDTD forward solver. Acknowledgement This work benefited from networking activities carried out within the EU funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar." [1] R. Solimene, A. Dell'Aversano and G. Leone, "MUSIC algorithms for rebar detection," J. of Geophysics and Engineering, vol. 10, pp. 1
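    The detect-then-subtract strategy of such a multi-step algorithm can be sketched with a toy successive-cancellation example: find the strongest component in a record, remove its fitted contribution, then search the residual for a weaker one. This sketch uses a plain DFT peak search in place of the actual MUSIC pseudospectrum, and all signal parameters are invented:

```python
import cmath

# Toy detect-and-subtract sketch: a strong tone (the "rebar") masks a weak one
# (the "duct"); estimate the strong component first, subtract it, then search
# the residual. A DFT peak search stands in for the MUSIC pseudospectrum.
N = 128
strong_f, weak_f = 0.20, 0.35   # invented normalized frequencies
x = [cmath.exp(2j * cmath.pi * strong_f * n)
     + 0.05 * cmath.exp(2j * cmath.pi * weak_f * n)
     for n in range(N)]

def corr(sig, f):
    """Complex correlation of sig with a unit tone at frequency f."""
    return sum(s * cmath.exp(-2j * cmath.pi * f * n)
               for n, s in enumerate(sig)) / N

grid = [k / 1000 for k in range(501)]   # frequency grid, 0 .. 0.5

f1 = max(grid, key=lambda f: abs(corr(x, f)))         # strongest component
a1 = corr(x, f1)                                      # its fitted amplitude
residual = [s - a1 * cmath.exp(2j * cmath.pi * f1 * n)
            for n, s in enumerate(x)]
f2 = max(grid, key=lambda f: abs(corr(residual, f)))  # now the weak one shows
print(f1, f2)   # ~0.20 and ~0.35
```

A single peak search on the raw record would report only the strong component; subtracting its fitted contribution is what exposes the weak one, mirroring the rebar-then-duct staging described in the abstract.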

  3. Planning for Office Automation.

    ERIC Educational Resources Information Center

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  4. Space station automation II

    SciTech Connect

    Chiou, W.C.

    1986-01-01

    This book contains the proceedings of a conference on space station automation. Topics include the following: distributed artificial intelligence for space station energy management systems and computer architecture for telerobots in earth orbit.

  5. Automated data analysis.

    NASA Astrophysics Data System (ADS)

    Teuber, D.

    Automated data analysis assists the astronomer in the decision-making processes applied for extracting astronomical information from data. It is the step between image processing and model interpretation. Tools developed in AI are applied (classification, expert systems). Programming languages and computers are chosen to fulfil the increasing requirements. Expert systems have begun to appear in astronomy. Data banks permit the astronomical community to share the large body of resulting information.

  6. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  7. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low cost automated systems can provide air traffic and aviation weather advisory information at high density uncontrolled airports. The system was designed to enhance the see and be seen rule of flight, and pilots who used the system preferred it over the self announcement system presently used at uncontrolled airports.

  8. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  9. Shielded cells transfer automation

    SciTech Connect

    Fisher, J J

    1984-01-01

    Nuclear waste from shielded cells is removed, packaged, and transferred manually in many nuclear facilities. Radiation exposure is absorbed by operators during these operations and limited only through procedural controls. Technological advances in automation using robotics have allowed a production waste removal operation to be automated to reduce radiation exposure. The robotic system bags waste containers out of a glove box and transfers them to a shielded container. Operators control the system from outside the work area via television cameras. 9 figures.

  10. Automated imagery orthorectification pilot

    NASA Astrophysics Data System (ADS)

    Slonecker, E. Terrence; Johnson, Brad; McMahon, Joe

    2009-10-01

    Automated orthorectification of raw image products is now possible based on the comprehensive metadata collected by Global Positioning Systems and Inertial Measurement Unit technology aboard aircraft and satellite digital imaging systems, and based on emerging pattern-matching and automated image-to-image and control point selection capabilities in many advanced image processing systems. Automated orthorectification of standard aerial photography is also possible if a camera calibration report and sufficient metadata are available. Orthorectification of historical imagery, for which only limited metadata was available, was also attempted and found to require some user input, creating a semi-automated process that still has significant potential to reduce processing time and expense for the conversion of archival historical imagery into geospatially enabled, digital formats, facilitating preservation and utilization of a vast archive of historical imagery. Over 90 percent of the frames of historical aerial photos used in this experiment were successfully orthorectified to the accuracy of the USGS 100K base map series utilized for the geospatial reference of the archive. The accuracy standard for the 100K series maps is approximately 167 feet (51 meters). The main problems associated with orthorectification failure were cloud cover, shadow and historical landscape change, which confused automated image-to-image matching processes. Further research is recommended to optimize automated orthorectification methods and enable broad operational use, especially as related to historical imagery archives.

  11. Automated Groundwater Screening

    SciTech Connect

    Taylor, Glenn A.; Collard, Leonard, B.

    2005-10-31

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.
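    An extension for radionuclides with significant daughters amounts, in outline, to folding the daughters' dose factors into the parent's screening factor. A schematic sketch only: the nuclide names, branching fractions, and dose factors below are placeholders, not values from the NCRP methodology.

```python
# Schematic parent-plus-daughters screening factor: fold each daughter's dose
# factor into the parent's, weighted by its branching fraction. All nuclide
# names and numbers here are invented placeholders, not NCRP values.
dose_factor = {"Parent-1": 2.0e-6, "Daughter-1": 5.0e-7, "Daughter-2": 1.0e-8}
chain = {"Parent-1": [("Daughter-1", 1.0), ("Daughter-2", 0.3)]}

def screening_factor(nuclide):
    sf = dose_factor[nuclide]
    for daughter, branching in chain.get(nuclide, []):
        sf += branching * dose_factor[daughter]
    return sf

print(screening_factor("Parent-1"))   # 2.0e-6 + 1.0*5.0e-7 + 0.3*1.0e-8
```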

  12. Genetic control of bacterial biofilms.

    PubMed

    Wolska, Krystyna I; Grudniak, Anna M; Rudnicka, Zofia; Markowska, Katarzyna

    2016-05-01

    Nearly all bacterial species, including pathogens, have the ability to form biofilms. Biofilms are defined as structured ecosystems in which microbes are attached to surfaces and embedded in a matrix composed of polysaccharides, eDNA, and proteins; their development is a multistep process. Bacterial biofilms constitute a large medical problem due to their extremely high resistance to various types of therapeutics, including conventional antibiotics. Several environmental and genetic signals control every step of biofilm development and dispersal. Among the latter, quorum sensing, cyclic diguanosine-5'-monophosphate, and small RNAs are considered the main regulators. The present review describes the regulatory roles of these three regulators in the life cycles of biofilms built by Pseudomonas aeruginosa, Staphylococcus aureus, Salmonella enterica serovar Typhimurium, and Vibrio cholerae. The interconnections between their activities are shown. Compounds and strategies that target the activity of these regulators, mainly quorum sensing inhibitors, and their potential role in therapy are also assessed. PMID:26294280

  13. Disposable and removable nucleic acid extraction and purification cartridges for automated flow-through systems

    DOEpatents

    Regan, John Frederick

    2014-09-09

    Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size-exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.

  14. Analysis of amyotrophic lateral sclerosis as a multistep process: a population-based modelling study

    PubMed Central

    Al-Chalabi, Ammar; Calvo, Andrea; Chio, Adriano; Colville, Shuna; Ellis, Cathy M; Hardiman, Orla; Heverin, Mark; Howard, Robin S; Huisman, Mark H B; Keren, Noa; Leigh, P Nigel; Mazzini, Letizia; Mora, Gabriele; Orrell, Richard W; Rooney, James; Scott, Kirsten M; Scotton, William J; Seelen, Meinie; Shaw, Christopher E; Sidle, Katie S; Swingler, Robert; Tsuda, Miho; Veldink, Jan H; Visser, Anne E; van den Berg, Leonard H; Pearce, Neil

    2014-01-01

    Background Amyotrophic lateral sclerosis shares characteristics with some cancers, such as onset being more common in later life, progression usually being rapid, the disease affecting a particular cell type, and showing complex inheritance. We used a model originally applied to cancer epidemiology to investigate the hypothesis that amyotrophic lateral sclerosis is a multistep process. Methods We generated incidence data by age and sex from amyotrophic lateral sclerosis population registers in Ireland (registration dates 1995–2012), the Netherlands (2006–12), Italy (1995–2004), Scotland (1989–98), and England (2002–09), and calculated age and sex-adjusted incidences for each register. We regressed the log of age-specific incidence against the log of age with least squares regression. We did the analyses within each register, and also did a combined analysis, adjusting for register. Findings We identified 6274 cases of amyotrophic lateral sclerosis from a catchment population of about 34 million people. We noted a linear relationship between log incidence and log age in all five registers: England r2=0·95, Ireland r2=0·99, Italy r2=0·95, the Netherlands r2=0·99, and Scotland r2=0·97; overall r2=0·99. All five registers gave similar estimates of the linear slope ranging from 4·5 to 5·1, with overlapping confidence intervals. The combination of all five registers gave an overall slope of 4·8 (95% CI 4·5–5·0), with similar estimates for men (4·6, 4·3–4·9) and women (5·0, 4·5–5·5). Interpretation A linear relationship between the log incidence and log age of onset of amyotrophic lateral sclerosis is consistent with a multistage model of disease. The slope estimate suggests that amyotrophic lateral sclerosis is a six-step process. Identification of these steps could lead to preventive and therapeutic avenues. Funding UK Medical Research Council; UK Economic and Social Research Council; Ireland Health Research Board; The
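    The regression behind the six-step conclusion can be sketched in a few lines. This is an illustrative reconstruction with synthetic incidence values (not the registry data), using the multistage-model relation that an n-step process gives incidence rising roughly as age^(n-1):

```python
import numpy as np

# Illustrative reconstruction of the paper's log-log regression with
# synthetic data (not the registry data).  Under an n-step multistage
# model, incidence rises roughly as age^(n-1), so the log-log slope
# estimates n - 1.  Data generated for a six-step process (slope 5).
ages = np.arange(45.0, 80.0, 5.0)          # midpoints of age bands (years)
incidence = 1e-9 * ages ** 5               # hypothetical cases/person-year

slope, intercept = np.polyfit(np.log(ages), np.log(incidence), 1)
n_steps = round(slope) + 1
print(f"log-log slope = {slope:.2f}, implied number of steps = {n_steps}")
```

    On the real data the fitted slope was 4·8, which, rounded to 5, likewise implies a six-step process.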

  15. Achieving effective terminal exciton delivery in quantum dot antenna-sensitized multistep DNA photonic wires.

    PubMed

    Spillmann, Christopher M; Ancona, Mario G; Buckhout-White, Susan; Algar, W Russ; Stewart, Michael H; Susumu, Kimihiro; Huston, Alan L; Goldman, Ellen R; Medintz, Igor L

    2013-08-27

    exciton transfer efficiencies approaching 100% are seen when the dye spacings are 0.5 × R0. However, as additional dyes are included in each wire, strong nonidealities appear that are suspected to arise predominantly from the poor photophysical performance of the last two acceptor dyes (Cy5 and Cy5.5). The results are discussed in the context of improving exciton transfer efficiency along photonic wires and the contributions these architectures can make to understanding multistep FRET processes. PMID:23844838
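    The near-ideal single-step efficiency at a 0.5 × R0 dye spacing is what standard Förster theory predicts. A back-of-the-envelope sketch using the textbook point-dipole formula (an assumption for illustration, not the authors' multistep model):

```python
# Textbook Forster point-dipole formula (an assumption; the paper
# models multistep transfer in much more detail): pairwise FRET
# efficiency E = 1 / (1 + (r/R0)^6).
def fret_efficiency(r, R0=1.0):
    return 1.0 / (1.0 + (r / R0) ** 6)

# At a dye spacing of 0.5 x R0, a single hop is ~98% efficient,
# consistent with the near-unity transfer described above.
E = fret_efficiency(0.5)
print(f"single-step FRET efficiency at 0.5*R0: {E:.3f}")
```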

  16. Transarterial chemoembolization using iodized oil for unresectable hepatocellular carcinoma: perspective from multistep hepatocarcinogenesis

    PubMed Central

    Yoshimitsu, Kengo

    2014-01-01

    Transarterial chemoembolization (TACE) using iodized oil (Lipiodol®) (Lp-TACE) as a carrier of chemotherapeutic agents has been routinely performed to control hepatocellular carcinomas (HCC) in Japan, and its use has yielded fairly beneficial therapeutic results. Lipiodol is thought to pass through the tumor sinusoids of HCC and reach the outflow drainage areas, namely, the portal venous side of the tumor. By doing this, Lipiodol blocks not only the tumor’s arterial inflow but also its portal venous outflow, providing sufficient ischemic effects. It is known that the inflow blood system, tumor sinusoids, and outflow blood system change drastically during the process of multistep hepatocarcinogenesis; thus, it is reasonable to postulate that the distribution of Lipiodol and the subsequent therapeutic effect of Lp-TACE may also change during that process. Arterial inflow to HCC is highest for moderately differentiated HCC (mHCC) and is relatively low in well or poorly differentiated HCC (wHCC and pHCC, respectively). It has been suggested that the metabolic state of wHCC and mHCC is aerobic, while that of pHCC is anaerobic. The tumor sinusoids in wHCC and mHCC are small in size and large in number, while those in pHCC are large in size and small in number. This finding results in a greater chance of tumor cell exposure to chemotherapeutic agents in the former and a lesser chance in the latter. The outflow tract, namely, the drainage system via the residual portal venous branches within the pseudocapsule, is more complete in mHCC and pHCC and less so in wHCC. Considering all of these components of HCC of different histological grades, Lp-TACE should have the greatest effect on mHCC and a relatively low effect on wHCC and pHCC. To achieve consistently high therapeutic results, it is important to consider these components, which affect the sensitivity of HCC to Lp-TACE, to maximize both the chemotherapeutic and ischemic effects of this therapy. PMID:25114603

  17. Characterization and multi-step transketolase-ω-transaminase bioconversions in an immobilized enzyme microreactor (IEMR) with packed tube.

    PubMed

    Halim, Amanatuzzakiah Abdul; Szita, Nicolas; Baganz, Frank

    2013-12-01

    The concept of de novo metabolic engineering through novel synthetic pathways offers new directions for multi-step enzymatic synthesis of complex molecules. This has been complemented by recent progress in performing enzymatic reactions using immobilized enzyme microreactors (IEMR). This work is concerned with the construction of de novo designed enzyme pathways in a microreactor synthesizing chiral molecules. An interesting compound, commonly used as a building block in several pharmaceutical syntheses, is a single diastereoisomer of 2-amino-1,3,4-butanetriol (ABT). This chiral amino alcohol can be synthesized from simple achiral substrates using two enzymes, transketolase (TK) and transaminase (TAm). Here we describe the development of an IEMR using His6-tagged TK and TAm immobilized onto Ni-NTA agarose beads and packed into tubes to enable multi-step enzyme reactions. The kinetic parameters of both enzymes were first determined using single IEMRs evaluated with a kinetic model developed for packed-bed reactors. The Km(app) for both enzymes appeared to be flow-rate dependent, while the turnover number kcat was reduced 3-fold compared with solution-phase TK and TAm reactions. For the multi-step enzyme reaction, single IEMRs were cascaded in series, whereby the first enzyme, TK, catalyzed a model reaction of lithium hydroxypyruvate (HPA) and glycolaldehyde (GA) to L-erythrulose (ERY), and the second IEMR unit, with immobilized TAm, converted ERY into ABT using (S)-α-methylbenzylamine (MBA) as the amine donor. With an initial substrate mixture of 60 mM HPA and GA (each) and 6 mM MBA, the coupled reaction reached approximately 83% conversion in 20 min at the lowest flow rate. The ability to synthesize a chiral pharmaceutical intermediate such as ABT in a relatively short time demonstrates that this IEMR system is a powerful tool for the construction and evaluation of de novo pathways as well as for the determination of enzyme kinetics. PMID:24055435
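    The flow-rate dependence of conversion in a packed-bed reactor can be illustrated with a rough plug-flow sketch using Michaelis-Menten kinetics. The rate form and every parameter value below are hypothetical, not the kinetics fitted in the paper:

```python
# Rough plug-flow sketch of the conversion/flow-rate trend; the
# Michaelis-Menten form and all parameter values here are hypothetical,
# not those determined in the paper.
def conversion(residence_time, S0=60.0, kcat=1.0, E0=5.0, Km=20.0, dt=0.01):
    """Fraction of substrate consumed after a given residence time."""
    S, t = S0, 0.0
    while t < residence_time:
        v = kcat * E0 * S / (Km + S)   # Michaelis-Menten rate (mM/min)
        S = max(S - v * dt, 0.0)       # explicit Euler step
        t += dt
    return 1.0 - S / S0

# A lower flow rate means a longer residence time, hence higher conversion,
# matching the trend reported in the abstract.
print(conversion(5.0), conversion(20.0))
```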

  18. Multi-Step Ka/Ka Dichroic Plate with Rounded Corners for NASA's 34m Beam Waveguide Antenna

    NASA Technical Reports Server (NTRS)

    Veruttipong, Watt; Khayatian, Behrouz; Hoppe, Daniel; Long, Ezra

    2013-01-01

    A multi-step Ka/Ka dichroic plate Frequency Selective Surface (FSS) structure is designed, manufactured, and tested for use in NASA's Deep Space Network (DSN) 34m Beam Waveguide (BWG) antennas. The proposed design allows ease of manufacturing and the ability to handle the increase in transmit power (reflected off the FSS) of the DSN BWG antennas from 20 kW to 100 kW. The dichroic plate is designed using HFSS, and the results agree well with measured data, considering the manufacturing tolerances that could be achieved on the dichroic.

  19. [Genetics and genetic counseling].

    PubMed

    Izzi, Claudia; Liut, Francesca; Dallera, Nadia; Mazza, Cinzia; Magistroni, Riccardo; Savoldi, Gianfranco; Scolari, Francesco

    2016-01-01

    Autosomal Dominant Polycystic Kidney Disease (ADPKD) is the most frequent genetic kidney disease, characterized by progressive development of bilateral renal cysts. Two causative genes have been identified: PKD1 and PKD2. The ADPKD phenotype is highly variable. Typically, ADPKD is an adult-onset disease; occasionally, however, it manifests as very-early-onset disease. The phenotypic variability of ADPKD can be explained at three genetic levels: genic, allelic, and gene-modifier effects. Recent advances in molecular screening for PKD gene mutations and the introduction of the new next-generation sequencing (NGS)-based genotyping approach have considerably improved knowledge of the genetic basis of ADPKD. The purpose of this article is to provide a comprehensive review of the genetics of ADPKD, focusing on new insights into genotype-phenotype correlation and exploring novel clinical approaches to genetic testing. Evaluation of this new genetic information requires a multidisciplinary approach involving a nephrologist and a clinical geneticist. PMID:27067213

  20. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target whose fiducial marks have known 3D locations, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
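    A toy example of what such calibration data encode, under an assumed one-parameter pinhole model u = f·X/Z + cx (ACAL's actual camera models are richer): known 3D fiducial locations and measured 2D image points determine the camera parameters by least squares.

```python
import numpy as np

# Toy calibration sketch under an assumed 1D pinhole model
# u = f * X/Z + cx (not ACAL's camera model): synthetic 3D-to-2D
# correspondences determine the focal length by least squares.
cx, f_true = 320.0, 800.0
pts3d = np.array([[0.1, 0.0, 1.0], [0.3, 0.0, 2.0],
                  [-0.2, 0.0, 1.5], [0.5, 0.0, 2.5]])
u = f_true * pts3d[:, 0] / pts3d[:, 2] + cx   # synthetic 2D measurements

A = (pts3d[:, 0] / pts3d[:, 2]).reshape(-1, 1)
f_est = np.linalg.lstsq(A, u - cx, rcond=None)[0][0]
print(f"recovered focal length: {f_est:.1f} px")
```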

  1. Automated telescope scheduling

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    With the ever-increasing level of automation of astronomical telescopes, the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on those systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for the Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. The Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling ground-based telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  2. Automated telescope scheduling

    NASA Astrophysics Data System (ADS)

    Johnston, Mark D.

    1988-08-01

    With the ever-increasing level of automation of astronomical telescopes, the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on those systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for the Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. The Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling ground-based telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  3. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned among the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high performance flexibility, the need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology-readiness date for the space station.

  4. Automated Factor Slice Sampling.

    PubMed

    Tibbits, Matthew M; Groendyke, Chris; Haran, Murali; Liechty, John C

    2014-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the "factor slice sampler", a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002
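    For context, the univariate slice sampler that the factor slice sampler generalizes (by sampling along tuned coordinate directions, the "factors") can be sketched with standard textbook stepping-out and shrinkage procedures; this is generic code, not the authors' implementation:

```python
import math
import random

# Generic univariate slice sampler (stepping-out plus shrinkage), the
# building block that the factor slice sampler generalizes.  Textbook
# code, not the authors' implementation.
def slice_sample(logp, x0, w=1.0, n=5000, seed=1):
    rng = random.Random(seed)
    xs, x = [], x0
    for _ in range(n):
        logy = logp(x) + math.log(1.0 - rng.random())  # auxiliary height
        L = x - w * rng.random()                       # random placement
        R = L + w
        while logp(L) > logy:                          # step out left
            L -= w
        while logp(R) > logy:                          # step out right
            R += w
        while True:                                    # shrink and sample
            x1 = L + rng.random() * (R - L)
            if logp(x1) > logy:
                x = x1
                break
            if x1 < x:
                L = x1
            else:
                R = x1
        xs.append(x)
    return xs

samples = slice_sample(lambda x: -0.5 * x * x, 0.0)    # standard normal target
mean = sum(samples) / len(samples)
print(f"sample mean: {mean:.3f}")
```

    The slice width w is the tuning parameter the paper automates; in the factor slice sampler the same one-dimensional update is applied along each learned factor direction.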

  5. Automated Factor Slice Sampling

    PubMed Central

    Tibbits, Matthew M.; Groendyke, Chris; Haran, Murali; Liechty, John C.

    2013-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the “factor slice sampler”, a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002

  6. Automated segmentation and reconstruction of patient-specific cardiac anatomy and pathology from in vivo MRI*

    NASA Astrophysics Data System (ADS)

    Ringenberg, Jordan; Deo, Makarand; Devabhaktuni, Vijay; Filgueiras-Rama, David; Pizarro, Gonzalo; Ibañez, Borja; Berenfeld, Omer; Boyers, Pamela; Gold, Jeffrey

    2012-12-01

    This paper presents an automated method to segment left ventricle (LV) tissues from functional and delayed-enhancement (DE) cardiac magnetic resonance imaging (MRI) scans using a sequential multi-step approach. First, a region of interest (ROI) is computed to create a subvolume around the LV using morphological operations and image arithmetic. From the subvolume, the myocardial contours are automatically delineated using difference of Gaussians (DoG) filters and GSV snakes. These contours are used as a mask to identify pathological tissues, such as fibrosis or scar, within the DE-MRI. The presented automated technique is able to accurately delineate the myocardium and identify the pathological tissue in patient sets. The results were validated by two expert cardiologists, and in one set the automated results are quantitatively and qualitatively compared with expert manual delineation. Furthermore, the method is patient-specific, performed on an entire patient MRI series. Thus, in addition to providing a quick analysis of individual MRI scans, the fully automated segmentation method is used for effectively tagging regions in order to reconstruct computerized patient-specific 3D cardiac models. These models can then be used in electrophysiological studies and surgical strategy planning.
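    The difference-of-Gaussians (DoG) filtering named above can be sketched generically; this is an illustrative band-pass filter built from scratch, not the authors' pipeline. Subtracting a wider Gaussian blur from a narrower one suppresses both flat regions and fine noise, highlighting boundaries such as myocardial contours:

```python
import numpy as np

# Generic difference-of-Gaussians (DoG) filter (a sketch, not the
# authors' pipeline): subtracting a wider Gaussian blur from a
# narrower one highlights boundaries in an image.
def gaussian_kernel1d(sigma):
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    k = gaussian_kernel1d(sigma)
    r = len(k) // 2
    conv = lambda v: np.convolve(np.pad(v, r, mode="edge"), k, mode="valid")
    out = np.apply_along_axis(conv, 1, img)   # blur along each row
    return np.apply_along_axis(conv, 0, out)  # then along each column

def dog(img, sigma_fine=1.0, sigma_coarse=2.0):
    return blur(img, sigma_fine) - blur(img, sigma_coarse)

img = np.zeros((32, 32))
img[:, 16:] = 1.0                             # vertical step edge
resp = dog(img)
row, col = np.unravel_index(np.abs(resp).argmax(), resp.shape)
print(f"strongest DoG response at column {col}")
```

    In the paper the DoG response is only the first stage; the delineation is then refined with snakes and the resulting contours mask the DE-MRI for scar identification.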

  7. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).

  8. Automation of analytical isotachophoresis

    NASA Technical Reports Server (NTRS)

    Thormann, Wolfgang

    1985-01-01

    The basic features of automation of analytical isotachophoresis (ITP) are reviewed. Experimental setups consisting of narrow bore tubes which are self-stabilized against thermal convection are considered. Sample detection in free solution is discussed, listing the detector systems presently used or expected to be of potential use in the near future. The combination of a universal detector measuring the evolution of ITP zone structures with detector systems specific to desired components is proposed as a concept of an automated chemical analyzer based on ITP. Possible miniaturization of such an instrument by means of microlithographic techniques is discussed.

  9. Automating the CMS DAQ

    SciTech Connect

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  10. Automated gas chromatography

    DOEpatents

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  11. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert-system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built comprising a program library/database with methods for browsing the stored designs; a system for graphical specification of designs, including a capability for hierarchical refinement and definition in a graphical design system; and an automated code-generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  12. Grouped and Multistep Nanoheteroepitaxy: Toward High-Quality GaN on Quasi-Periodic Nano-Mask.

    PubMed

    Feng, Xiaohui; Yu, Tongjun; Wei, Yang; Ji, Cheng; Cheng, Yutian; Zong, Hua; Wang, Kun; Yang, Zhijian; Kang, Xiangning; Zhang, Guoyi; Fan, Shoushan

    2016-07-20

    A novel nanoheteroepitaxy method, namely, the grouped and multistep nanoheteroepitaxy (GM-NHE), is proposed to attain a high-quality gallium nitride (GaN) epilayer by metal-organic vapor phase epitaxy. This method combines the effects of sub-100 nm nucleation and multistep lateral growth by using a low-cost but unique carbon nanotube mask, which consists of nanoscale growth windows with a quasi-periodic 2D fill factor. It is found that GM-NHE can facilely reduce threading dislocation density (TDD) and modulate residual stress on foreign substrate without any regrowth. As a result, high-quality GaN epilayer is produced with homogeneously low TDD of 4.51 × 10(7) cm(-2) and 2D-modulated stress, and the performance of the subsequent 410 nm near-ultraviolet light-emitting diode is greatly boosted. In this way, with the facile fabrication of nanomask and the one-off epitaxy procedure, GaN epilayer is prominently improved with the assistance of nanotechnology, which demonstrates great application potential for high-efficiency TDD-sensitive optoelectronic and electronic devices. PMID:27351723

  13. Abundance and composition of indigenous bacterial communities in a multi-step biofiltration-based drinking water treatment plant.

    PubMed

    Lautenschlager, Karin; Hwang, Chiachi; Ling, Fangqiong; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Egli, Thomas; Hammes, Frederik

    2014-10-01

    Indigenous bacterial communities are essential for biofiltration processes in drinking water treatment systems. In this study, we examined the microbial community composition and abundance of three different biofilter types (rapid sand, granular activated carbon, and slow sand filters) and their respective effluents in a full-scale, multi-step treatment plant (Zürich, CH). Detailed analysis of organic carbon degradation underpinned biodegradation as the primary function of the biofilter biomass. The biomass was present in concentrations ranging from 2 to 5 × 10(15) cells/m(3) in all filters but was phylogenetically, enzymatically, and metabolically diverse. Based on 16S rRNA gene-based 454 pyrosequencing analysis of microbial community composition, similar microbial taxa (predominantly Proteobacteria, Planctomycetes, Acidobacteria, Bacteroidetes, Nitrospira, and Chloroflexi) were present in all biofilters and in their respective effluents, but the ratio of microbial taxa was different in each filter type. This change was also reflected in the cluster analysis, which revealed a change of 50-60% in microbial community composition between the different filter types. This study documents the direct influence of the filter biomass on the microbial community composition of the final drinking water, particularly when the water is distributed without post-disinfection. The results provide new insights into the complexity of indigenous bacteria colonizing drinking water systems, especially in different biofilters of a multi-step treatment plant. PMID:24937356
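    As a toy illustration of quantifying a shift in community composition between filter types, one can compute a Bray-Curtis dissimilarity between taxa abundance profiles. Both the metric and the abundance values below are assumptions for illustration; the abstract does not specify the distance behind its cluster analysis:

```python
# Toy quantification of a compositional shift between two filter types.
# Bray-Curtis dissimilarity and the abundance values are assumptions
# for illustration, not the paper's analysis.
def bray_curtis(a, b):
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(x + y for x, y in zip(a, b))
    return num / den

rapid_sand = [40, 25, 15, 10, 5, 5]        # relative abundances (%) per taxon
activated_carbon = [15, 30, 25, 10, 15, 5]
d = bray_curtis(rapid_sand, activated_carbon)
print(f"Bray-Curtis dissimilarity = {d:.2f}")
```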

  14. Elastic Frustration Causing Two-Step and Multistep Transitions in Spin-Crossover Solids: Emergence of Complex Antiferroelastic Structures.

    PubMed

    Paez-Espejo, Miguel; Sy, Mouhamadou; Boukheddaden, Kamel

    2016-03-01

    Two-step and multistep spin transitions are frequently observed in switchable cooperative molecular solids. They offer the advantage of opening the way to three- or several-bit electronics. Despite extensive experimental studies, their theoretical description has to date been only phenomenological, based on Ising models with competing ferromagnetic- and antiferromagnetic-like interactions, even though it is recognized that elastic interactions are at the heart of the spin-transition phenomenon, owing to the volume change between the low- and high-temperature phases. To remedy this shortcoming, we designed the first consistent elastic model taking into account both the volume change upon spin transition and elastic frustration. This ingredient proved powerful, reproducing all experimentally observed configurations in a consistent way. Thus, according to the strength of the elastic frustration, the system may undergo a first-order transition with hysteresis; gradual, hysteretic two-step or multistep transitions; or incomplete transitions. Furthermore, analysis of the spatial organization of the HS and LS species in the plateau regions revealed the emergence of complex antiferroelastic patterns, ranging from simple antiferromagnetic-like order to long-range spatial modulations of the high-spin fraction. These results enabled us to identify elastic frustration as the fundamental mechanism behind the very recent experimental observations showing organized spatial modulations of the high-spin fraction inside the plateau of two-step spin transitions. PMID:26860531

  15. Human Factors In Aircraft Automation

    NASA Technical Reports Server (NTRS)

    Billings, Charles

    1995-01-01

    Report presents survey of state of art in human factors in automation of aircraft operation. Presents examination of aircraft automation and effects on flight crews in relation to human error and aircraft accidents.

  16. A multi-step system for screening and localization of hard exudates in retinal images

    NASA Astrophysics Data System (ADS)

    Bopardikar, Ajit S.; Bhola, Vishal; Raghavendra, B. S.; Narayanan, Rangavittal

    2012-03-01

    The number of people affected by diabetes mellitus worldwide is increasing at an alarming rate. Monitoring of the diabetic condition and its effects on the human body are therefore of great importance. Of particular interest is diabetic retinopathy (DR), which is a result of prolonged, unchecked diabetes and affects the visual system. DR is a leading cause of blindness throughout the world. At any point in time, 25-44% of people with diabetes are afflicted by DR. Automation of the screening and monitoring process for DR is therefore essential for efficient utilization of healthcare resources and optimal treatment of affected individuals. Such automation would use retinal images and detect the presence of specific artifacts such as hard exudates, hemorrhages, and soft exudates to gauge the severity of DR. In this paper, we focus on the detection of hard exudates. We propose a two-step system that consists of a screening step, which classifies retinal images as normal or abnormal based on the presence of hard exudates, and a detection stage, which localizes these artifacts in an abnormal retinal image. The proposed screening step automatically detects the presence of hard exudates with high sensitivity and positive predictive value (PPV). The detection/localization step uses a k-means-based clustering approach to localize hard exudates in the retinal image. Suitable feature vectors are chosen based on their ability to isolate hard exudates while minimizing false detections. The algorithm was tested on a benchmark dataset (DIARETDB1) and was seen to provide superior performance compared with existing methods. The two-step process described in this paper can be embedded in a tele-ophthalmology system to aid speedy detection and diagnosis of the severity of DR.
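    The k-means localization idea can be sketched on a single assumed feature, pixel intensity; the paper's actual feature vectors are richer, so this is purely illustrative. Bright exudate-like pixels separate into their own cluster:

```python
import numpy as np

# Toy sketch of the k-means localization step using a single assumed
# feature (pixel intensity); the paper's feature vectors are richer.
def kmeans_1d(x, k=2, iters=10):
    centers = np.quantile(x, np.linspace(0, 1, k))   # deterministic init
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels, centers

# Synthetic fundus-like intensities: dark background plus a few bright
# exudate-like pixels.
pixels = np.concatenate([np.full(990, 0.2), np.full(10, 0.9)])
labels, centers = kmeans_1d(pixels)
exudate_cluster = int(np.argmax(centers))
n_candidates = int((labels == exudate_cluster).sum())
print(f"candidate exudate pixels: {n_candidates}")
```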

  17. ATC automation concepts

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1990-01-01

    Information on the design of human-centered tools for terminal-area air traffic control (ATC) is given in viewgraph form. Information is given on payoffs and products, guidelines, ATC as a team process, automation tools for ATC, and the traffic management advisor.

  18. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  19. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  20. Automated Management Of Documents

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1995-01-01

    Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.

  1. Automating Small Libraries.

    ERIC Educational Resources Information Center

    Swan, James

    1996-01-01

    Presents a four-phase plan for small libraries strategizing for automation: inventory and weeding, data conversion, implementation, and enhancements. Other topics include selecting a system, MARC records, compatibility, ease of use, industry standards, searching capabilities, support services, system security, screen displays, circulation modules,…

  2. Automated solvent concentrator

    NASA Technical Reports Server (NTRS)

    Griffith, J. S.; Stuart, J. L.

    1976-01-01

    Designed for automated drug identification system (AUDRI), device increases concentration by a factor of 100. Sample is first filtered, removing particulate contaminants and reducing water content of sample. Sample is extracted from filtered residue by specific solvent. Concentrator provides input material to analysis subsystem.

  3. Automated Essay Scoring

    ERIC Educational Resources Information Center

    Dikli, Semire

    2006-01-01

    The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, such as word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…

  4. Automation in haemostasis.

    PubMed

    Huber, A R; Méndez, A; Brunner-Agten, S

    2013-01-01

    Automatia, the ancient Greek goddess of luck who makes things happen by themselves, of her own will and without human engagement, is present in our daily life in the medical laboratory. Automation was introduced and perfected by clinical chemistry and has since expanded into other fields such as haematology, immunology, molecular biology and coagulation testing. The initial small and relatively simple standalone instruments have been replaced by more complex systems that allow for multitasking. Integration of automated coagulation testing into total laboratory automation has become possible in recent years. Automation has many strengths and opportunities, provided its weaknesses and threats are respected. On the positive side, standardization, reduction of errors, reduction of cost and increased throughput are clearly beneficial. Dependence on manufacturers, high initial cost and somewhat expensive maintenance are less favourable factors. Today's laboratory technicians and academic personnel add no value for the physician and the patient by spending most of their time behind the machines. In the future, the laboratory needs to contribute at the bedside, suggesting laboratory testing and providing support and interpretation of the obtained results. The human factor will continue to play an important role in haemostasis testing, albeit under different circumstances. PMID:23460141

  5. Automated conflict resolution issues

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  6. Mining Your Automated System.

    ERIC Educational Resources Information Center

    Larsen, Patricia M., Ed.; And Others

    1996-01-01

    Four articles address issues of collecting, compiling, reporting, and interpreting statistics generated by automated library systems for administrative decision making. Topics include using a management information system to forecast growth and assess areas for downsizing; statistics for collection development and analysis; and online system…

  7. Automated galaxy recognition

    NASA Astrophysics Data System (ADS)

    Rappaport, Barry; Anderson, Kurt

    Previous approaches to automated image processing have used both deterministic and nondeterministic techniques. These have not used any form of conceptual learning nor have they employed artificial intelligence techniques. Addition of such techniques to the task of image processing may significantly enhance the efficiencies and accuracies of the recognition and classification processes. In our application, the objects to be recognized and classified are galaxies.

  8. Automated Administrative Data Bases

    NASA Technical Reports Server (NTRS)

    Marrie, M. D.; Jarrett, J. R.; Reising, S. A.; Hodge, J. E.

    1984-01-01

    Improved productivity and more effective response to information requirements for internal management, NASA Centers, and Headquarters resulted from using automated techniques. Modules developed to provide information on manpower, RTOPS, full time equivalency, and physical space reduced duplication, increased communication, and saved time. There is potential for greater savings by sharing and integrating with those who have the same requirements.

  9. Automating Food Service.

    ERIC Educational Resources Information Center

    Kavulla, Timothy A.

    1986-01-01

    The Wichita, Kansas, Public Schools' Food Service Department Project Reduction in Paperwork (RIP) is designed to automate certain paperwork functions, thus reducing cost and flow of paper. This article addresses how RIP manages free/reduced meal applications and meets the objectives of reducing paper and increasing accuracy, timeliness, and…

  10. Program automated documentation methods

    NASA Technical Reports Server (NTRS)

    Lanzano, B. C.

    1970-01-01

    The mission analysis and trajectory simulation program is summarized; it provides an understanding of the size and complexity of one simulation for which documentation is mandatory. Programs for automating documentation of subroutines, flow charts, and internal cross reference information are also included.

  11. Automated Geospatial Watershed Assessment

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool is a Geographic Information Systems (GIS) interface jointly developed by the U.S. Environmental Protection Agency, the U.S. Department of Agriculture (USDA) Agricultural Research Service, and the University of Arizona to a...

  12. Microcontroller for automation application

    NASA Technical Reports Server (NTRS)

    Cooper, H. W.

    1975-01-01

    The description of a microcontroller currently being developed for automation application was given. It is basically an 8-bit microcomputer with a 40K byte random access memory/read only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  13. Automated CCTV Tester

    Energy Science and Technology Software Center (ESTSC)

    2000-09-13

    The purpose of an automated CCTV tester is to automatically and continuously monitor multiple perimeter security cameras for changes in a camera's measured resolution and alignment (camera looking at the proper area). It shall track and record the image quality and position of each camera and produce an alarm when a camera is out of specification.
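    A resolution check of the kind described above can be approximated by a focus metric compared against a reference; sketched here with a variance-of-Laplacian metric on synthetic data (an illustrative assumption, not the tester's actual measurement method):

```python
import numpy as np

def sharpness(img):
    """Variance of the discrete Laplacian: drops when the image defocuses."""
    lap = (-4.0 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def box_blur(img):
    """3x3 mean filter, a crude stand-in for a defocused camera."""
    h, w = img.shape
    return sum(img[i:i + h - 2, j:j + w - 2]
               for i in range(3) for j in range(3)) / 9.0

rng = np.random.default_rng(1)
reference = rng.random((64, 64))     # high-frequency reference scene
degraded = box_blur(reference)       # same scene, out of focus
alarm = sharpness(degraded) < 0.5 * sharpness(reference)
```

    Tracking this metric per camera over time, as the tester does with resolution and alignment, lets an alarm fire when a camera drifts out of specification.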

  14. Validating Automated Speaking Tests

    ERIC Educational Resources Information Center

    Bernstein, Jared; Van Moere, Alistair; Cheng, Jian

    2010-01-01

    This paper presents evidence that supports the valid use of scores from fully automatic tests of spoken language ability to indicate a person's effectiveness in spoken communication. The paper reviews the constructs, scoring, and the concurrent validity evidence of "facility-in-L2" tests, a family of automated spoken language tests in Spanish,…

  15. Automated EEG acquisition

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Hillman, C. E., Jr.

    1977-01-01

    Automated self-contained portable device can be used by technicians with minimal training. Data acquired from patient at remote site are transmitted to centralized interpretation center using conventional telephone equipment. There, diagnostic information is analyzed, and results are relayed back to remote site.

  16. Automation in Libraries.

    ERIC Educational Resources Information Center

    Canadian Library Association, Ottawa (Ontario).

    The fourth Canadian Association of College and University Libraries (CACUL) Conference on Library Automation was held in Hamilton, June 20-21, 1970, as a pre-conference workshop of the Canadian Library Association (CLA). The purpose of the conference was to present papers on current projects and to discuss the continuing need for this type of…

  17. Staff Reactions to Automation.

    ERIC Educational Resources Information Center

    Winstead, Elizabeth B.

    1994-01-01

    Describes two surveys of three libraries on a university campus, one conducted in 1987 and one in 1993, that investigated how library staff reacted to the library automation process. The hypotheses that were tested are discussed, and results are compared to a similar survey conducted in 1985. (LRW)

  18. Medical genetics

    SciTech Connect

    Nora, J.J.; Fraser, F.C.

    1989-01-01

    This book presents a discussion of medical genetics for the practitioner treating or counseling patients with genetic disease. It includes discussions of the relationship between heredity and disease, the chromosomal basis of heredity, gene frequencies, and the genetics of development and maldevelopment. The authors also cover teratology, somatic cell genetics, genetics and cancer, and the genetics of behavior.

  19. A universal method for automated gene mapping

    PubMed Central

    Zipperlen, Peder; Nairz, Knud; Rimann, Ivo; Basler, Konrad; Hafen, Ernst; Hengartner, Michael; Hajnal, Alex

    2005-01-01

    Small insertions or deletions (InDels) constitute a ubiquitous class of sequence polymorphisms found in eukaryotic genomes. Here, we present an automated high-throughput genotyping method that relies on the detection of fragment-length polymorphisms (FLPs) caused by InDels. The protocol utilizes standard sequencers and genotyping software. We have established genome-wide FLP maps for both Caenorhabditis elegans and Drosophila melanogaster that facilitate genetic mapping with a minimum of manual input and at comparatively low cost. PMID:15693948
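    The core of FLP genotyping is calling alleles from observed fragment sizes. A simple size-matching rule can sketch the idea (a hypothetical helper, not the authors' genotyping software; lengths and tolerance are illustrative):

```python
def call_genotype(peaks, wt_len, indel_len, tol=2):
    """Call a genotype from observed fragment-length peaks.

    peaks: observed fragment sizes (bp) from the sequencer trace
    wt_len / indel_len: expected sizes of the two alleles
    tol: sizing tolerance in bp
    """
    has_wt = any(abs(p - wt_len) <= tol for p in peaks)
    has_indel = any(abs(p - indel_len) <= tol for p in peaks)
    if has_wt and has_indel:
        return "heterozygous"
    if has_wt:
        return "wild-type"
    if has_indel:
        return "homozygous indel"
    return "no call"
```

    For example, a trace showing peaks at both allele sizes would be called heterozygous, which is what makes InDel markers convenient for mapping crosses.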

  20. Automating spectral measurements

    NASA Astrophysics Data System (ADS)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  1. Comparison of GenomEra C. difficile and Xpert C. difficile as Confirmatory Tests in a Multistep Algorithm for Diagnosis of Clostridium difficile Infection

    PubMed Central

    Reigadas, Elena; Marín, Mercedes; Fernández-Chico, Antonia; Catalán, Pilar; Bouza, Emilio

    2014-01-01

    We compared two multistep diagnostic algorithms based on C. Diff Quik Chek Complete and, as confirmatory tests, GenomEra C. difficile and Xpert C. difficile. The sensitivity, specificity, positive predictive value, and negative predictive value were 87.2%, 99.7%, 97.1%, and 98.3%, respectively, for the GenomEra-based algorithm and 89.7%, 99.4%, 95.5%, and 98.6%, respectively, for the Xpert-based algorithm. GenomEra represents an alternative to Xpert as a confirmatory test of a multistep algorithm for Clostridium difficile infection (CDI) diagnosis. PMID:25392360
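    The four figures reported for each algorithm all derive from a 2x2 confusion table. A minimal sketch with made-up counts (illustrative only, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among the diseased
        "specificity": tn / (tn + fp),   # true negatives among the healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for illustration only:
m = diagnostic_metrics(tp=34, fp=1, tn=340, fn=5)
```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on how common CDI is in the tested population, which is why confirmatory tests are evaluated within a multistep algorithm rather than alone.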

  2. Algorithms for automated DNA assembly

    PubMed Central

    Densmore, Douglas; Hsiau, Timothy H.-C.; Kittleson, Joshua T.; DeLoache, Will; Batten, Christopher; Anderson, J. Christopher

    2010-01-01

    Generating a defined set of genetic constructs within a large combinatorial space provides a powerful method for engineering novel biological functions. However, the process of assembling more than a few specific DNA sequences can be costly, time consuming and error prone. Even if a correct theoretical construction scheme is developed manually, it is likely to be suboptimal by any number of cost metrics. Modular, robust and formal approaches are needed for exploring these vast design spaces. By automating the design of DNA fabrication schemes using computational algorithms, we can eliminate human error while reducing redundant operations, thus minimizing the time and cost required for conducting biological engineering experiments. Here, we provide algorithms that optimize the simultaneous assembly of a collection of related DNA sequences. We compare our algorithms to an exhaustive search on a small synthetic dataset and our results show that our algorithms can quickly find an optimal solution. Comparison with random search approaches on two real-world datasets show that our algorithms can also quickly find lower-cost solutions for large datasets. PMID:20335162
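    The savings described above come from building shared intermediates only once across related constructs. A much-simplified sketch (left-to-right joins only, hypothetical part names; the paper's algorithms search a far larger space of assembly trees):

```python
def assembly_steps(targets):
    """Count binary-join steps to build all targets left to right,
    building each shared intermediate (prefix) only once."""
    naive = sum(len(t) - 1 for t in targets)      # no sharing at all
    intermediates = set()
    for t in targets:
        for i in range(2, len(t) + 1):
            intermediates.add(t[:i])              # one join per new prefix
    return naive, len(intermediates)

# Two constructs sharing a promoter-RBS intermediate (hypothetical parts):
naive, shared = assembly_steps([
    ("promoter", "rbs", "gfp", "terminator"),
    ("promoter", "rbs", "rfp", "terminator"),
])
```

    Here sharing the common promoter-rbs intermediate saves one join; over a large combinatorial library of related constructs, such reuse is where the cost reduction multiplies.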

  3. Automation model of sewerage rehabilitation planning.

    PubMed

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, a process that is tedious and time-consuming. This paper proposes an automation model that plans optimal sewerage rehabilitation strategies for the sewer system by integrating image processing, clustering technology, optimization, and visualization. First, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Second, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction or even collapse. Finally, the results of the automation model can be visualized in a geographic information system in which essential information about the sewer system and the sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan. PMID:17302324
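    The genetic-algorithm step, picking a rehabilitation option per pipe section to minimize cost, can be sketched as follows (the toy cost model and GA parameters are assumptions for illustration, not the paper's formulation):

```python
import random

def genetic_search(cost, n_genes, n_options, pop=30, gens=60, seed=3):
    """Minimal elitist GA over discrete option assignments."""
    rng = random.Random(seed)
    population = [[rng.randrange(n_options) for _ in range(n_genes)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=cost)          # best-first
        parents = population[: pop // 2]   # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes)            # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                     # mutation
                child[rng.randrange(n_genes)] = rng.randrange(n_options)
            children.append(child)
        population = parents + children
    return min(population, key=cost)

# Toy model: each pipe section has a condition grade, and the matching
# rehabilitation option (0=patch, 1=lining, 2=replacement) is cheapest.
grades = [2, 0, 1, 2, 1, 0]
def rehab_cost(plan):
    return sum((option - grade) ** 2 for option, grade in zip(plan, grades))

best_plan = genetic_search(rehab_cost, n_genes=len(grades), n_options=3)
```

    A real planner would score each plan with inspection-derived condition grades and material costs, but the chromosome-per-pipe-section encoding carries over directly.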

  4. Genetics Home Reference: tyrosinemia

    MedlinePlus

    ... in the multistep process that breaks down the amino acid tyrosine, a building block of most proteins. If ... Resources MedlinePlus (4 links) Encyclopedia: Aminoaciduria Health Topic: Amino Acid Metabolism Disorders Health Topic: Liver Diseases Health Topic: ...

  5. Automated Pollution Control

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Patterned after the Cassini Resource Exchange (CRE), Sholtz and Associates established the Automated Credit Exchange (ACE), an Internet-based concept that automates the auctioning of "pollution credits" in Southern California. An early challenge of the Jet Propulsion Laboratory's Cassini mission was allocating the spacecraft's resources. To support the decision-making process, the CRE was developed. The system removes the need for the science instrument manager to know the individual instruments' requirements for the spacecraft resources. Instead, by utilizing principles of exchange, the CRE induces the instrument teams to reveal their requirements. In doing so, they arrive at an efficient allocation of spacecraft resources by trading among themselves. A Southern California RECLAIM air pollution credit trading market has been set up using the same bartering methods utilized in the Cassini mission, in order to help companies keep pollution and costs down.

  6. Automated macromolecular crystallization screening

    DOEpatents

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.

  7. Automated Assembly Center (AAC)

    NASA Technical Reports Server (NTRS)

    Stauffer, Robert J.

    1993-01-01

    The objectives of this project are as follows: to integrate advanced assembly and assembly support technology under a comprehensive architecture; to implement automated assembly technologies in the production of high-visibility DOD weapon systems; and to document the improved cost, quality, and lead time. This will enhance the production of DOD weapon systems by utilizing the latest commercially available technologies combined into a flexible system that will be able to readily incorporate new technologies as they emerge. Automated assembly encompasses the following areas: product data, process planning, information management policies and framework, three schema architecture, open systems communications, intelligent robots, flexible multi-ability end effectors, knowledge-based/expert systems, intelligent workstations, intelligent sensor systems, and PDES/PDDI data standards.

  8. Automated breeder fuel fabrication

    SciTech Connect

    Goldmann, L.H.; Frederickson, J.R.

    1983-09-01

    The objective of the Secure Automated Fabrication (SAF) Project is to develop remotely operated equipment for the processing and manufacturing of breeder reactor fuel pins. The SAF line will be installed in the Fuels and Materials Examination Facility (FMEF). The FMEF is presently under construction at the Department of Energy's (DOE) Hanford site near Richland, Washington, and is operated by the Westinghouse Hanford Company (WHC). The fabrication and support systems of the SAF line are designed for computer-controlled operation from a centralized control room. Remote and automated fuel fabrication operations will result in: reduced radiation exposure to workers; enhanced safeguards; improved product quality; near real-time accountability; and increased productivity. The present schedule calls for installation of SAF line equipment in the FMEF beginning in 1984, with qualifying runs starting in 1986 and production commencing in 1987. 5 figures.

  9. Automated assembly in space

    NASA Technical Reports Server (NTRS)

    Srivastava, Sandanand; Dwivedi, Suren N.; Soon, Toh Teck; Bandi, Reddy; Banerjee, Soumen; Hughes, Cecilia

    1989-01-01

    The installation of robots and their use for assembly in space will create an exciting and promising future for the U.S. space program. Assembly in space is complicated and error-prone, and it is not possible unless the various parts and modules are suitably designed for automation. Certain guidelines are developed for part design and for easy precision assembly. Major design problems associated with automated assembly are considered, and solutions to these problems are evaluated in the guidelines format. Methods for gripping and for part feeding are developed with regard to the absence of gravity in space. Guidelines for part orientation, adjustments, compliances and various assembly constructions are discussed. Design modifications of various fasteners and fastening methods are also investigated.

  10. Terminal automation system maintenance

    SciTech Connect

    Coffelt, D.; Hewitt, J.

    1997-01-01

    Nothing has improved petroleum product loading in recent years more than terminal automation systems. The presence of terminal automation systems (TAS) at loading racks has increased operational efficiency and safety and enhanced their accounting and management capabilities. However, like all finite systems, they occasionally malfunction or fail. Proper servicing and maintenance can minimize this. And in the unlikely event a TAS breakdown does occur, prompt and effective troubleshooting can reduce its impact on terminal productivity. To accommodate around-the-clock loading at racks, increasingly unattended by terminal personnel, TAS maintenance, servicing and troubleshooting has become increasingly demanding. It has also become increasingly important. After 15 years of trial and error at petroleum and petrochemical storage and transfer terminals, a number of successful troubleshooting programs have been developed. These include 24-hour "help hotlines," internal (terminal company) and external (supplier) support staff, and "layered" support. These programs are described.

  11. Automated gas chromatography

    DOEpatents

    Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.

    1999-07-13

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.

  12. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving through production and fulfillment, to evaluating results is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user can easily run a successful campaign from the desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted is how the complexity of running a targeted campaign is hidden from the user through technology, while providing the benefits of a professionally managed campaign.

  13. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.

  14. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  15. Automated RTOP Management System

    NASA Technical Reports Server (NTRS)

    Hayes, P.

    1984-01-01

    The structure of NASA's Office of Aeronautics and Space Technology electronic information system network from 1983 to 1985 is illustrated. The RTOP automated system takes advantage of existing hardware, software, and expertise, and provides: (1) computerized cover sheet and resources forms; (2) electronic signature and transmission; (3) a data-based information system; (4) graphics; (5) intercenter communications; (6) management information; and (7) text editing. The system is coordinated with Headquarters efforts in codes R, E, and T.

  16. Components for automated microscopy

    NASA Astrophysics Data System (ADS)

    Determann, H.; Hartmann, H.; Schade, K. H.; Stankewitz, H. W.

    1980-12-01

    A number of devices aiming at the automated analysis of microscopic objects with regard to their morphometrical parameters or photometrical values were developed. These comprise: (1) a device for automatic focusing tuned to maximum contrast; (2) a feedback system for automatic optimization of microscope illumination; and (3) microscope lenses with adjustable pupil distances for use in the two previous devices. An extensive test program on histological and cytological applications proves the wide applicability of the autofocusing device.

  17. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.

    1984-01-01

    The purpose of phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.

  18. Cavendish Balance Automation

    NASA Technical Reports Server (NTRS)

    Thompson, Bryan

    2000-01-01

    This is the final report for a project carried out to modify a manual commercial Cavendish balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf, manually operated Cavendish balance to allow automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in the study of the effects of superconducting materials on the local gravitational field strength, to determine whether the strength of gravitational fields can be reduced. A Cavendish balance was chosen because it is a fairly simple piece of equipment for measuring gravity, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) All the components necessary to hold and automate the Cavendish balance in a cryostat were designed. Engineering drawings were made of custom parts to be fabricated; other off-the-shelf parts were procured. (2) Software was written in LabView to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing. (3) Software was written to take the data collected from the Cavendish balance and reduce it to give a value for the gravitational constant. (4) The components of the system were assembled and fitted to a cryostat, along with the LabView hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling. (5) The system was operated for a number of periods, and data were collected and reduced to give an average value for the gravitational constant.

  19. Automated Testing System

    Energy Science and Technology Software Center (ESTSC)

    2006-05-09

    ATS is a Python-language program for automating test suites for software programs that do not interact with their users, such as scripted scientific simulations. ATS features a decentralized approach especially suited to larger projects. In its multinode mode it can utilize many nodes of a cluster in order to run many tests in parallel. It has features for submitting longer-running tests to a batch system and would have to be customized for use elsewhere.
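    ATS itself is not reproduced here, but its core idea, running independent non-interactive tests in parallel and collecting pass/fail results, can be sketched in a few lines of Python. The `make_test` helper and test names below are hypothetical stand-ins, not part of ATS:

```python
import concurrent.futures

# Hypothetical stand-ins for scripted, non-interactive tests; a real
# harness such as ATS would launch external simulation runs instead.
def make_test(name, passes):
    def run():
        return (name, passes)   # (test name, pass/fail result)
    return run

tests = [make_test("t1", True), make_test("t2", True), make_test("t3", False)]

def run_suite(tests, workers=2):
    """Run independent tests in parallel; collect name -> pass/fail."""
    results = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        for name, ok in pool.map(lambda t: t(), tests):
            results[name] = ok
    return results

results = run_suite(tests)
failed = sorted(n for n, ok in results.items() if not ok)
```

    A batch-submission feature like the one ATS describes would replace the thread pool with job submission to a scheduler, leaving the result-collection logic unchanged.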

  20. Discovery of novel, non-acidic mPGES-1 inhibitors by virtual screening with a multistep protocol

    PubMed Central

    Noha, Stefan M.; Fischer, Katrin; Koeberle, Andreas; Garscha, Ulrike; Werz, Oliver; Schuster, Daniela

    2015-01-01

    Microsomal prostaglandin E2 synthase-1 (mPGES-1) inhibitors are considered as potential therapeutic agents for the treatment of inflammatory pain and certain types of cancer. So far, several series of acidic as well as non-acidic inhibitors of mPGES-1 have been discovered. Acidic inhibitors, however, may have issues, such as loss of potency in human whole blood and in vivo, stressing the importance of the design and identification of novel, non-acidic chemical scaffolds of mPGES-1 inhibitors. Using a multistep virtual screening protocol, the Vitas-M compound library (∼1.3 million entries) was filtered and 16 predicted compounds were experimentally evaluated in a biological assay in vitro. This approach yielded two molecules active in the low micromolar range (IC50 values: 4.5 and 3.8 μM, respectively). PMID:26088337

  1. Statistical investigation of a blank holder force distribution system for a multi-step deep drawing process

    NASA Astrophysics Data System (ADS)

    Tommerup, Søren; Endelt, Benny; Nielsen, Karl Brian

    2013-12-01

    This paper investigates process control possibilities obtained from a new tool concept for adaptive blank holder force (BHF) distribution. The investigation concerns the concept's application to a multi-step deep drawing process exemplified by the NUMISHEET2014 benchmark 2: Springback of draw-redraw pan. An actuator system, in which several cavities are embedded into the blank holder plate, is used. By independently controlling the pressure of hydraulic fluid in these cavities, a controlled deflection of the blank holder plate surface can be achieved, whereby the distribution of the BHF can be controlled. Using design of experiments, a full 3-level factorial experiment is conducted with respect to the cavity pressures, and the effects and interactions are evaluated.
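    A full 3-level factorial design over the cavity pressures simply enumerates every combination of pressure levels, giving 3^k runs for k factors. A minimal sketch, with illustrative levels and cavity count rather than the benchmark's actual values:

```python
from itertools import product

# Illustrative three-level factorial design over cavity pressures.
levels = [0.5, 1.0, 1.5]   # relative pressure levels: low, mid, high
n_cavities = 3             # number of independently controlled cavities

runs = list(product(levels, repeat=n_cavities))  # every level combination
# A full 3-level factorial over k factors has 3**k runs.
assert len(runs) == 3 ** n_cavities
```

    Each tuple in `runs` is one experiment; effects and interactions are then estimated by comparing responses across runs that differ in one or more factors.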

  2. Strong textured SmCo5 nanoflakes with ultrahigh coercivity prepared by multistep (three steps) surfactant-assisted ball milling

    NASA Astrophysics Data System (ADS)

    Zuo, Wen-Liang; Zhao, Xin; Xiong, Jie-Fu; Zhang, Ming; Zhao, Tong-Yun; Hu, Feng-Xia; Sun, Ji-Rong; Shen, Bao-Gen

    2015-08-01

    A high coercivity of 26.2 kOe for SmCo5 nanoflakes is obtained by multistep (three-step) surfactant-assisted ball milling. The magnetic properties, phase structure, and morphology are studied by VSM, XRD, and SEM, respectively. The results demonstrate that three-step ball milling preserves more complete crystallinity (relatively fewer defects) during milling than one-step high-energy ball milling, which enhances the texture degree and coercivity. In addition, the mechanism of coercivity is also studied through the temperature dependence of demagnetization curves for an aligned SmCo5 nanoflake/resin composite; the result indicates that magnetization reversal could be controlled by co-existing pinning and nucleation mechanisms.

  3. A Multistep Synthesis of 4-Nitro-1-ethynylbenzene Involving Palladium Catalysis, Conformational Analysis, Acetal Hydrolysis, and Oxidative Decarbonylation

    NASA Astrophysics Data System (ADS)

    Goodwin, Thomas E.; Hurst, Eva M.; Ross, Ashley S.

    1999-01-01

    Palladium-catalyzed reactions, particularly carbon-carbon bond formations, are rapidly becoming a mainstay of organic synthesis in industrial and academic laboratories. Although these important procedures are covered in advanced organic chemistry texts, they have rarely permeated into introductory organic texts or laboratory manuals. One of the more useful processes involves the coupling of a terminal alkyne to an aromatic bromide or iodide. We describe a convenient coupling procedure for the preparation of the tetrahydropyranyl ether of a propargyl alcohol derivative. This product can be easily hydrolyzed to the propargyl alcohol, then oxidatively decarbonylated to produce 4-nitro-1-ethynylbenzene. Several important topics may be illustrated and discussed in conjunction with the multistep microscale reaction series which has been developed. These include the following: palladium-catalyzed carbon-carbon bond formation, conformational analysis by NMR spectroscopy and molecular modeling, hydrolysis of an acetal, and oxidative decarbonylation via nucleophilic acyl substitution.

  4. Strong textured SmCo5 nanoflakes with ultrahigh coercivity prepared by multistep (three steps) surfactant-assisted ball milling

    PubMed Central

    Zuo, Wen-Liang; Zhao, Xin; Xiong, Jie-Fu; Zhang, Ming; Zhao, Tong-Yun; Hu, Feng-Xia; Sun, Ji-Rong; Shen, Bao-Gen

    2015-01-01

    A high coercivity of 26.2 kOe for SmCo5 nanoflakes is obtained by multistep (three-step) surfactant-assisted ball milling. The magnetic properties, phase structure, and morphology are studied by VSM, XRD, and SEM, respectively. The results demonstrate that three-step ball milling preserves more complete crystallinity (relatively fewer defects) during milling than one-step high-energy ball milling, which enhances the texture degree and coercivity. In addition, the mechanism of coercivity is also studied through the temperature dependence of demagnetization curves for an aligned SmCo5 nanoflake/resin composite; the result indicates that magnetization reversal could be controlled by co-existing pinning and nucleation mechanisms. PMID:26272186

  5. Ultra-fast consensus of discrete-time multi-agent systems with multi-step predictive output feedback

    NASA Astrophysics Data System (ADS)

    Zhang, Wenle; Liu, Jianchang

    2016-04-01

    This article addresses the ultra-fast consensus problem of high-order discrete-time multi-agent systems based on a unified consensus framework. A novel multi-step predictive output mechanism is proposed under a directed communication topology containing a spanning tree. By predicting the outputs of a network several steps ahead and adding this information into the consensus protocol, it is shown that the asymptotic convergence factor is improved by a power of q + 1 compared to the routine consensus. The difficult problem of selecting the optimal control gain is solved well by introducing a variable called convergence step. In addition, the ultra-fast formation achievement is studied on the basis of this new consensus protocol. Finally, the ultra-fast consensus with respect to a reference model and robust consensus is discussed. Some simulations are performed to illustrate the effectiveness of the theoretical results.
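    The multi-step predictive protocol itself is not reproduced here, but the baseline routine consensus it accelerates can be sketched as a simple discrete-time update on a directed cycle (which contains a spanning tree). The three-agent topology, gain, and initial states below are illustrative assumptions:

```python
# Routine discrete-time consensus on a 3-agent directed cycle; the
# multi-step predictive protocol accelerates this baseline convergence.
def consensus_step(x, eps=0.5):
    # each agent moves toward its single in-neighbour on the cycle
    n = len(x)
    return [x[i] + eps * (x[(i + 1) % n] - x[i]) for i in range(n)]

x = [0.0, 3.0, 6.0]        # illustrative initial agent states
for _ in range(100):
    x = consensus_step(x)

spread = max(x) - min(x)   # shrinks toward 0 as the agents agree
```

    The asymptotic convergence factor of this baseline is the second-largest eigenvalue modulus of the update matrix; the article's mechanism improves that factor by a power of q + 1 by folding q predicted outputs into each update.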

  6. Improvement of Mechanical Properties in the Functionally Graded Aluminum Matrix Nanocomposites Fabricated via a Novel Multistep Friction Stir Processing

    NASA Astrophysics Data System (ADS)

    Salehi, Mojtaba; Farnoush, Hamidreza; Heydarian, Arash; Aghazadeh Mohandesi, Jamshid

    2015-02-01

    In the present study, functionally graded bulk Al-SiC nanocomposites were successfully fabricated by applying a novel multistep friction stir processing. Microstructural observations by scanning electron microscopy indicated a proper distribution of SiC nanoparticles in the Al 6061 matrix. Microhardness profiles descended from 160 to 50 Hv due to the formation of a compositional gradient of SiC nanoparticles along the thickness. The tensile behavior of graded samples revealed a simultaneous enhancement of ultimate tensile strength (44 pct), strain at maximum stress (244 pct), and work of fracture (492 pct) with respect to the homogeneous sample. Furthermore, the graded samples sustained up to 4 pct strain after initiation of primary cracking, while catastrophic fracture occurred instantly after cracking in the homogeneous sample. A dimple-like ductile fracture surface was observed for the graded layers, in which an increase in the SiC particle content resulted in smaller dimple size.

  7. Improvement of Mechanical Properties in the Functionally Graded Aluminum Matrix Nanocomposites Fabricated via a Novel Multistep Friction Stir Processing

    NASA Astrophysics Data System (ADS)

    Salehi, Mojtaba; Farnoush, Hamidreza; Heydarian, Arash; Aghazadeh Mohandesi, Jamshid

    2014-09-01

    In the present study, functionally graded bulk Al-SiC nanocomposites were successfully fabricated by applying a novel multistep friction stir processing. Microstructural observations by scanning electron microscopy indicated a proper distribution of SiC nanoparticles in the Al 6061 matrix. Microhardness profiles descended from 160 to 50 Hv due to the formation of a compositional gradient of SiC nanoparticles along the thickness. The tensile behavior of graded samples revealed a simultaneous enhancement of ultimate tensile strength (44 pct), strain at maximum stress (244 pct), and work of fracture (492 pct) with respect to the homogeneous sample. Furthermore, the graded samples sustained up to 4 pct strain after initiation of primary cracking, while catastrophic fracture occurred instantly after cracking in the homogeneous sample. A dimple-like ductile fracture surface was observed for the graded layers, in which an increase in the SiC particle content resulted in smaller dimple size.

  8. Automation in biological crystallization

    PubMed Central

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  9. Autonomy, Automation, and Systems

    NASA Astrophysics Data System (ADS)

    Turner, Philip R.

    1987-02-01

    Aerospace industry interest in autonomy and automation, given fresh impetus by the national goal of establishing a Space Station, is becoming a major item of research and technology development. The promise of new technology arising from research in Artificial Intelligence (AI) has focused much attention on its potential in autonomy and automation. These technologies can improve performance in autonomous control functions that involve planning, scheduling, and fault diagnosis of complex systems. There are, however, many aspects of system and subsystem design in an autonomous system that impact AI applications, but do not directly involve AI technology. Development of a system control architecture, establishment of an operating system within the design, providing command and sensory data collection features appropriate to automated operation, and the use of design analysis tools to support system engineering are specific examples of major design issues. Aspects such as these must also receive attention and technology development support if we are to implement complex autonomous systems within the realistic limitations of mass, power, cost, and available flight-qualified technology that are all-important to a flight project.

  10. Medical genetics

    SciTech Connect

    Jorde, L.B.; Carey, J.C.; White, R.L.

    1995-10-01

    This book on the subject of medical genetics is a textbook aimed at a very broad audience: principally, medical students, nursing students, graduate, and undergraduate students. The book is actually a primer of general genetics as applied to humans and provides a well-balanced introduction to the scientific and clinical basis of human genetics. The twelve chapters include: Introduction, Basic Cell Biology, Genetic Variation, Autosomal Dominant and Recessive Inheritance, Sex-linked and Mitochondrial Inheritance, Clinical Cytogenetics, Gene Mapping, Immunogenetics, Cancer Genetics, Multifactorial Inheritance and Common Disease, Genetic Screening, Genetic Diagnosis and Gene Therapy, and Clinical Genetics and Genetic Counseling.

  11. Reaction and catalyst engineering to exploit kinetically controlled whole-cell multistep biocatalysis for terminal FAME oxyfunctionalization.

    PubMed

    Schrewe, Manfred; Julsing, Mattijs K; Lange, Kerstin; Czarnotta, Eik; Schmid, Andreas; Bühler, Bruno

    2014-09-01

    The oxyfunctionalization of unactivated C−H bonds can selectively and efficiently be catalyzed by oxygenase-containing whole-cell biocatalysts. Recombinant Escherichia coli W3110 containing the alkane monooxygenase AlkBGT and the outer membrane protein AlkL from Pseudomonas putida GPo1 have been shown to efficiently catalyze the terminal oxyfunctionalization of renewable fatty acid methyl esters yielding bifunctional products of interest for polymer synthesis. In this study, AlkBGTL-containing E. coli W3110 is shown to catalyze the multistep conversion of dodecanoic acid methyl ester (DAME) via terminal alcohol and aldehyde to the acid, exhibiting Michaelis-Menten-type kinetics for each reaction step. In two-liquid phase biotransformations, the product formation pattern was found to be controlled by DAME availability. Supplying DAME as bulk organic phase led to accumulation of the terminal alcohol as the predominant product. Limiting DAME availability via application of bis(2-ethylhexyl)phthalate (BEHP) as organic carrier solvent enabled almost exclusive acid accumulation. Furthermore, utilization of BEHP enhanced catalyst stability by reducing toxic effects of substrate and products. A further shift towards the overoxidized products was achieved by co-expression of the gene encoding the alcohol dehydrogenase AlkJ, which was shown to catalyze efficient and irreversible alcohol to aldehyde oxidation in vivo. With DAME as organic phase, the aldehyde accumulated as main product using resting cells containing AlkBGT, AlkL, as well as AlkJ. This study highlights the versatility of whole-cell biocatalysis for synthesis of industrially relevant bifunctional building blocks and demonstrates how integrated reaction and catalyst engineering can be implemented to control product formation patterns in biocatalytic multistep reactions. PMID:24852702
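    The reported step-wise Michaelis-Menten behaviour of the DAME → alcohol → aldehyde → acid chain can be illustrated with a toy simulation, where each step has rate v = Vmax·S/(Km + S) and the pools are integrated with explicit Euler. All rate parameters, amounts, and times below are hypothetical, not the paper's fitted values:

```python
# Toy sequential Michaelis-Menten chain: DAME -> alcohol -> aldehyde -> acid.
# Parameters are illustrative assumptions, not measured kinetics.
def mm_rate(s, vmax, km):
    return vmax * s / (km + s)

def simulate(t_end=50.0, dt=0.01):
    params = [(1.0, 0.5), (0.8, 0.4), (0.6, 0.3)]  # (Vmax, Km) per step
    pools = [10.0, 0.0, 0.0, 0.0]  # [DAME, alcohol, aldehyde, acid]
    t = 0.0
    while t < t_end:
        rates = [mm_rate(pools[i], *params[i]) for i in range(3)]
        pools[0] -= rates[0] * dt                 # ester consumed
        pools[1] += (rates[0] - rates[1]) * dt    # alcohol intermediate
        pools[2] += (rates[1] - rates[2]) * dt    # aldehyde intermediate
        pools[3] += rates[2] * dt                 # acid end product
        t += dt
    return pools

pools = simulate()
```

    The study's control handle maps onto this picture: restricting substrate supply (e.g., via the BEHP carrier phase) keeps the early steps unsaturated, letting mass run through to the acid, whereas excess substrate saturates the first step and piles up alcohol.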

  12. Global proteomic profiling in multistep hepatocarcinogenesis and identification of PARP1 as a novel molecular marker in hepatocellular carcinoma.

    PubMed

    Xu, Xiao; Liu, Zhikun; Wang, Jianguo; Xie, Haiyang; Li, Jie; Cao, Jili; Zhou, Lin; Zheng, Shusen

    2016-03-22

    More accurate biomarkers have long been desired for hepatocellular carcinoma (HCC). Here, we characterized global large-scale proteomics of multistep hepatocarcinogenesis in an attempt to identify novel biomarkers for HCC. Quantitative data on 37874 sequences and 3017 proteins during hepatocarcinogenesis were obtained in cohort 1 of 75 samples (5 pooled groups: normal livers, hepatitis livers, cirrhotic livers, peritumoral livers, and HCC tissues) by iTRAQ 2D LC-MS/MS. The diagnostic performance of the six most upregulated proteins in the HCC group, with HSP70 as reference, was subsequently validated in cohort 2 of 114 samples (hepatocarcinogenesis from normal livers to HCC) using immunohistochemistry. Of the seven candidate protein markers, PARP1, GS, and NDRG1 showed the best diagnostic performance for HCC. PARP1, as a novel marker, showed diagnostic performance comparable to that of the classic markers GS and NDRG1 (AUCs = 0.872, 0.856, and 0.792, respectively). A significantly higher AUC of 0.945 was achieved when the three markers were combined. For diagnosis of HCC, the sensitivity and specificity were 88.2% and 81.0% when at least two of the markers were positive. Similar diagnostic values of PARP1, GS, and NDRG1 were confirmed by immunohistochemistry in cohort 3 of 180 HCC patients. Further analysis indicated that PARP1 and NDRG1 were associated with some clinicopathological features and were independent prognostic factors for HCC patients. Overall, global large-scale proteomics across the spectrum of multistep hepatocarcinogenesis were obtained. PARP1 is a promising novel diagnostic/prognostic marker for HCC, and a three-marker panel (PARP1, GS, and NDRG1) with excellent diagnostic performance for HCC was established. PMID:26883192

  13. Global proteomic profiling in multistep hepatocarcinogenesis and identification of PARP1 as a novel molecular marker in hepatocellular carcinoma

    PubMed Central

    Wang, Jianguo; Xie, Haiyang; Li, Jie; Cao, Jili; Zhou, Lin; Zheng, Shusen

    2016-01-01

    More accurate biomarkers have long been desired for hepatocellular carcinoma (HCC). Here, we characterized global large-scale proteomics of multistep hepatocarcinogenesis in an attempt to identify novel biomarkers for HCC. Quantitative data on 37874 sequences and 3017 proteins during hepatocarcinogenesis were obtained in cohort 1 of 75 samples (5 pooled groups: normal livers, hepatitis livers, cirrhotic livers, peritumoral livers, and HCC tissues) by iTRAQ 2D LC-MS/MS. The diagnostic performance of the six most upregulated proteins in the HCC group, with HSP70 as reference, was subsequently validated in cohort 2 of 114 samples (hepatocarcinogenesis from normal livers to HCC) using immunohistochemistry. Of the seven candidate protein markers, PARP1, GS, and NDRG1 showed the best diagnostic performance for HCC. PARP1, as a novel marker, showed diagnostic performance comparable to that of the classic markers GS and NDRG1 (AUCs = 0.872, 0.856, and 0.792, respectively). A significantly higher AUC of 0.945 was achieved when the three markers were combined. For diagnosis of HCC, the sensitivity and specificity were 88.2% and 81.0% when at least two of the markers were positive. Similar diagnostic values of PARP1, GS, and NDRG1 were confirmed by immunohistochemistry in cohort 3 of 180 HCC patients. Further analysis indicated that PARP1 and NDRG1 were associated with some clinicopathological features and were independent prognostic factors for HCC patients. Overall, global large-scale proteomics across the spectrum of multistep hepatocarcinogenesis were obtained. PARP1 is a promising novel diagnostic/prognostic marker for HCC, and a three-marker panel (PARP1, GS, and NDRG1) with excellent diagnostic performance for HCC was established. PMID:26883192

  14. Genetic algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts are introduced, genetic algorithm applications are discussed, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
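    The basic concepts the report introduces, a population subjected to fitness-based selection, crossover, and mutation, can be sketched with a minimal "one-max" genetic algorithm (evolving bit strings toward all ones). The problem and all parameters are illustrative, not from the report:

```python
import random

# Minimal genetic-algorithm sketch on the one-max problem.
random.seed(0)  # deterministic run for reproducibility

def fitness(ind):
    return sum(ind)  # count of 1-bits; maximum is the string length

def evolve(n_bits=20, pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # tournament selection: fitter of two random individuals wins
            a, b = random.choice(pop), random.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = random.randrange(1, n_bits)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(n_bits):                  # point mutation
                if random.random() < 0.01:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

    The same selection/crossover/mutation loop generalizes to any representation once a fitness function is defined, which is what makes the approach so broadly applicable.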

  15. Continuous Video Modeling to Assist with Completion of Multi-Step Home Living Tasks by Young Adults with Moderate Intellectual Disability

    ERIC Educational Resources Information Center

    Mechling, Linda C.; Ayres, Kevin M.; Bryant, Kathryn J.; Foster, Ashley L.

    2014-01-01

    The current study evaluated a relatively new video-based procedure, continuous video modeling (CVM), to teach multi-step cleaning tasks to high school students with moderate intellectual disability. CVM in contrast to video modeling and video prompting allows repetition of the video model (looping) as many times as needed while the user completes…

  16. AUTOMATED INADVERTENT INTRUDER APPLICATION

    SciTech Connect

    Koffman, L.; Lee, P.; Cook, J.; Wilhite, E.

    2007-05-29

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell, and finding these errors was tedious and time-consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements.
The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and …
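    The "straightforward algebraic calculation" being automated can be sketched as a sum of concentration × intake × dose-conversion-factor terms over nuclides and pathways. Every number, nuclide, and pathway name below is hypothetical, not an SRS parameter:

```python
# Illustrative pathway-dose sum of the kind the application automates:
# dose = sum over nuclides and pathways of conc * intake * DCF.
# All values are hypothetical placeholders, not SRS assessment inputs.
def scenario_dose(concentrations, intakes, dcfs):
    """Total annual dose for one intruder exposure scenario."""
    total = 0.0
    for nuclide, conc in concentrations.items():    # activity per gram
        for pathway, intake in intakes.items():     # grams contacted per year
            total += conc * intake * dcfs[(nuclide, pathway)]
    return total

conc = {"Cs-137": 2.0}                              # pCi/g (hypothetical)
intakes = {"ingestion": 100.0, "inhalation": 10.0}  # g/yr (hypothetical)
dcfs = {("Cs-137", "ingestion"): 5e-5,              # mrem/pCi (hypothetical)
        ("Cs-137", "inhalation"): 3e-5}

dose = scenario_dose(conc, intakes, dcfs)
```

    Centralizing the inputs in controlled tables like these, rather than copying formulas cell by cell, is precisely the error mode the application was specified to eliminate.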

  17. Automated genotyping of dinucleotide repeat markers

    SciTech Connect

    Perlin, M.W.; Hoffman, E.P.

    1994-09-01

    The dinucleotide repeats (i.e., microsatellites) such as CA-repeats are a highly polymorphic, highly abundant class of PCR-amplifiable markers that have greatly streamlined genetic mapping experimentation. It is expected that over 30,000 such markers (including tri- and tetranucleotide repeats) will be characterized for routine use in the next few years. Since only size determination, and not sequencing, is required to determine alleles, in principle, dinucleotide repeat genotyping is easily performed on electrophoretic gels, and can be automated using DNA sequencers. Unfortunately, PCR stuttering with these markers generates not one band for each allele, but a pattern of bands. Since closely spaced alleles must be disambiguated by human scoring, this poses a key obstacle to full automation. We have developed methods that overcome this obstacle. Our model is that the observed data is generated by arithmetic superposition (i.e., convolution) of multiple allele patterns. By quantitatively measuring the size of each component band, and exploiting the unique stutter pattern associated with each marker, closely spaced alleles can be deconvolved; this unambiguously reconstructs the "true" allele bands, with stutter artifact removed. We used this approach in a system for automated diagnosis of (X-linked) Duchenne muscular dystrophy; four multiplexed CA-repeats within the dystrophin gene were assayed on a DNA sequencer. Our method accurately detected small variations in gel migration that shifted the allele size estimate. In 167 nonmutated alleles, 89% (149/167) showed no size variation, 9% (15/167) showed 1 bp variation, and 2% (3/167) showed 2 bp variation. We are currently developing a library of dinucleotide repeat patterns; together with our deconvolution methods, this library will enable fully automated genotyping of dinucleotide repeats from sizing data.
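    The superposition model can be sketched directly: each allele contributes a fixed ladder of stutter bands, the observed lane is the sum of two ladders, and deconvolution searches for the allele pair whose modeled lane best fits the data. The stutter weights and allele sizes below are hypothetical, not a calibrated marker library:

```python
# Toy convolution model of CA-repeat stutter and its deconvolution.
# Stutter weights are illustrative, not measured marker patterns.
STUTTER = [0.6, 0.3, 0.1]   # intensity at allele size, -2 bp, -4 bp

def allele_pattern(size, n_bins=30):
    lane = [0.0] * n_bins
    for k, w in enumerate(STUTTER):
        lane[size - 2 * k] += w   # stutter bands step down by 2 bp
    return lane

def superpose(a, b):
    # observed lane = arithmetic superposition of both allele patterns
    return [x + y for x, y in zip(allele_pattern(a), allele_pattern(b))]

def deconvolve(observed, candidates):
    """Return the allele pair whose superposed pattern best fits observed."""
    def err(pair):
        model = superpose(*pair)
        return sum((m - o) ** 2 for m, o in zip(model, observed))
    pairs = [(a, b) for a in candidates for b in candidates if a <= b]
    return min(pairs, key=err)

observed = superpose(20, 22)   # heterozygote with closely spaced alleles
called = deconvolve(observed, candidates=range(16, 27, 2))
```

    Because the stutter ladders of the two alleles overlap, the 20/22 heterozygote's lane cannot be read band by band, yet the fit over the whole pattern recovers the pair unambiguously, which is the essence of the automated scoring approach.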

  18. PaR-PaR Laboratory Automation Platform

    SciTech Connect

    Linshiz, G; Stawski, N; Poust, S; Bi, CH; Keasling, JD; Hilson, NJ

    2013-05-01

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  19. Automated DNA diagnostics using an ELISA-based oligonucleotide ligation assay.

    PubMed Central

    Nickerson, D A; Kaiser, R; Lappin, S; Stewart, J; Hood, L; Landegren, U

    1990-01-01

    DNA diagnostics, the detection of specific DNA sequences, will play an increasingly important role in medicine as the molecular basis of human disease is defined. Here, we demonstrate an automated, nonisotopic strategy for DNA diagnostics using amplification of target DNA segments by the polymerase chain reaction (PCR) and the discrimination of allelic sequence variants by a colorimetric oligonucleotide ligation assay (OLA). We have applied the automated PCR/OLA procedure to diagnosis of common genetic diseases, such as sickle cell anemia and cystic fibrosis (delta F508 mutation), and to genetic linkage mapping of gene segments in the human T-cell receptor beta-chain locus. The automated PCR/OLA strategy provides a rapid system for diagnosis of genetic, malignant, and infectious diseases as well as a powerful approach to genetic linkage mapping of chromosomes and forensic DNA typing. PMID:2247466

  20. Automated Proactive Fault Isolation: A Key to Automated Commissioning

    SciTech Connect

    Katipamula, Srinivas; Brambley, Michael R.

    2007-07-31

    In this paper, we present a generic model for automated continuous commissioning and then delve in detail into one of its processes, proactive testing for fault isolation, which is key to automating commissioning. The automated commissioning process uses passive observation-based fault detection and diagnostic techniques, followed by automated proactive testing for fault isolation, automated fault evaluation, and automated reconfiguration of controls, together keeping equipment controlled and running as intended. Only when hard failures occur or a physical replacement is required does the process require human intervention, and then sufficient information is provided by the automated commissioning system to target manual maintenance where it is needed. We then focus on fault isolation by presenting detailed logic that can be used to automatically isolate faults in valves, a common component in HVAC systems, as an example of how automated proactive fault isolation can be accomplished. We conclude the paper with a discussion of how this approach to isolating faults can be applied to other common HVAC components and their automated commissioning, and a summary of the paper's key conclusions.
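    A proactive stroke test of the kind described, commanding the valve through its range and classifying the observed response, can be sketched as follows. The linear flow model, tolerance, and fault labels are illustrative assumptions, not the paper's published isolation logic:

```python
# Illustrative proactive valve stroke test: drive the valve across its
# range and classify the (simulated) flow response. Model and thresholds
# are assumptions for the sketch, not the paper's detailed logic.
def proactive_valve_test(command_positions, observed_flows, tol=0.1):
    """Classify a valve as 'ok', 'stuck', or 'degraded' from a stroke test."""
    span = max(observed_flows) - min(observed_flows)
    if span < tol:
        return "stuck"   # flow never responds to the commanded stroke
    # healthy-valve assumption: normalized flow tracks normalized position
    err = max(abs(o - c) for o, c in zip(observed_flows, command_positions))
    return "ok" if err < tol else "degraded"

cmds = [0.0, 0.25, 0.5, 0.75, 1.0]          # commanded stroke positions
verdict = proactive_valve_test(cmds, [0.5, 0.5, 0.5, 0.5, 0.5])
```

    The point of the proactive step is visible in the sketch: a passive observer seeing constant flow cannot tell a stuck valve from a steady load, but commanding the stroke and watching the response isolates the fault.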

  1. New Genetics

    MedlinePlus

    ... human genome, behavioral genetics, pharmacogenetics, drug resistance, biofilms, computer modeling. Chapter 5, 21st-Century Genetics: covers systems biology, GFP, genetic testing, privacy concerns, DNA forensics, ...

  2. Genetic Counseling

    MedlinePlus

    ... informed decisions about testing and treatment. Reasons for Genetic Counseling: There are many reasons that people go ...

  3. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices

    PubMed Central

    Toley, Bhushan J.; Wang, Jessica A.; Gupta, Mayuri; Buser, Joshua R.; Lafleur, Lisa K.; Lutz, Barry R.; Fu, Elain; Yager, Paul

    2015-01-01

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically a) after a certain period of time, or b) after the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate, and the time required to switch from one state to another ranges from ~5 s for short wicks to ~50 s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods; both can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device. PMID:25606810

  4. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  5. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-04

    ... Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly Known as Simplified Entry) AGENCY... Automated Commercial Environment (ACE). Originally, the test was known as the Simplified Entry Test because...'s (CBP's) National Customs Automation Program (NCAP) test concerning Automated...

  6. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Automated Commercial Environment (ACE) Simplified Entry: Modification of Participant Selection Criteria and... (NCAP) test concerning the simplified entry functionality in the Automated Commercial Environment (ACE...) National Customs Automation Program (NCAP) test concerning Automated Commercial Environment...

  7. AUTOMATING SHALLOW SEISMIC IMAGING

    SciTech Connect

    Steeples, Don W.

    2003-09-14

    The current project is a continuation of an effort to develop ultrashallow seismic imaging as a cost-effective method potentially applicable to DOE facilities. The objective of the present research is to develop and demonstrate the use of a cost-effective, automated method of conducting shallow seismic surveys, an approach that represents a significant departure from conventional seismic-survey field procedures. Initial testing of a mechanical geophone-planting device suggests that large numbers of geophones can be placed both quickly and automatically. The development of such a device could make the application of shallow seismic reflection (SSR) considerably more efficient and less expensive. The imaging results obtained using automated seismic methods will be compared with results obtained using classical seismic techniques. Although this research falls primarily into the field of seismology, for comparison and quality-control purposes, some GPR data will be collected as well. In the final year of the research, demonstration surveys at one or more DOE facilities will be performed. An automated geophone-planting device of the type under development would not necessarily be limited to the use of shallow seismic reflection methods; it also would be capable of collecting data for seismic-refraction and possibly for surface-wave studies. Another element of our research plan involves monitoring the cone of depression of a pumping well that is being used as a proxy site for fluid-flow at a contaminated site. Our next data set will be collected at a well site where drawdown equilibrium has been reached. Noninvasive, in-situ methods such as placing geophones automatically and using near-surface seismic methods to identify and characterize the hydrologic flow regimes at contaminated sites support the prospect of developing effective, cost-conscious cleanup strategies for DOE and others.

  8. Automating CPM-GOMS

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the approaches available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the
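The scheduling idea behind CPM-GOMS predictions can be sketched directly: predicted task time is the length of the critical (longest) path through the dependency graph of interleaved cognitive, perceptual, and motor operators, which is what the generated PERT charts visualize. The operator names and durations below are invented for illustration and are not taken from Apex or the paper:

```python
# Toy CPM-GOMS-style operator graph: name -> (duration in ms, predecessors).
# Predicted task time is the longest (critical) path through the graph.
ops = {
    "perceive-target": (100, []),
    "cognit-initiate": (50,  ["perceive-target"]),
    "move-cursor":     (300, ["cognit-initiate"]),
    "perceive-cursor": (100, ["cognit-initiate"]),
    "verify-position": (50,  ["perceive-cursor", "move-cursor"]),
    "click-mouse":     (100, ["verify-position"]),
}

def critical_path_length(ops):
    """Longest-path length via memoized finish times (graph must be acyclic)."""
    finish = {}
    def end_time(name):
        if name not in finish:
            dur, preds = ops[name]
            finish[name] = dur + max((end_time(p) for p in preds), default=0)
        return finish[name]
    return max(end_time(n) for n in ops)

print(critical_path_length(ops))  # predicted task time in ms
```

Here the motor operator `move-cursor` dominates the perceptual branch, so the prediction is driven by the motor path, the kind of interleaving bookkeeping that Apex automates.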

  9. Virtual director: automating a webcast

    NASA Astrophysics Data System (ADS)

    Machnicki, Erik; Rowe, Lawrence A.

    2001-12-01

    This paper presents a system designed to automate the production of webcasts, the Virtual Director. It automates simple tasks such as control of recording equipment, stream broadcasting, and camera control. It also automates content decisions, such as which camera view to broadcast. Directors can specify the content decisions using an automation specification language. The Virtual Director also uses a question monitor service to automatically identify questions and move the cameras to show the audience member asking the question. We discuss the implementation of the Virtual Director and present the results of its use in the production of a university seminar series.

  10. World-wide distribution automation systems

    SciTech Connect

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; and computer modeling of systems.

  11. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    2000-01-01

    An automated propellant blending apparatus and method that uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation is discussed. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.

  12. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    1999-01-01

    An automated propellant blending apparatus and method uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.
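As described in both patent abstracts, the control algorithm inverts an empirically measured profile of binder precipitation versus countersolvent-solvent ratio to plan countersolvent addition near the cloud point. A minimal sketch of that inversion step, using an entirely hypothetical profile:

```python
import numpy as np

# Hypothetical empirical profile: fraction of binder precipitated versus
# countersolvent-to-solvent ratio. The sharp rise marks the cloud point.
ratio  = np.array([0.0, 0.2,   0.4,  0.5,  0.6,  0.7,  0.8])
precip = np.array([0.0, 0.005, 0.02, 0.10, 0.45, 0.85, 0.98])

def ratio_for_precipitation(target):
    """Invert the monotone profile: the countersolvent ratio expected to
    yield the target precipitated fraction (linear interpolation)."""
    return float(np.interp(target, precip, ratio))

# Plan addition setpoints that walk precipitation up in controlled
# increments as the mixture approaches the cloud point.
setpoints = [ratio_for_precipitation(p) for p in (0.05, 0.10, 0.25, 0.45)]
print([round(r, 3) for r in setpoints])
```

The patented system drives metered countersolvent addition hardware from such a profile; the lookup above only illustrates the inversion, not the full control loop.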

  13. Methods for Multisweep Automation

    SciTech Connect

    SHEPHERD,JASON F.; MITCHELL,SCOTT A.; KNUPP,PATRICK; WHITE,DAVID R.

    2000-09-14

    Sweeping has become the workhorse algorithm for creating conforming hexahedral meshes of complex models. This paper describes progress on the automatic, robust generation of MultiSwept meshes in CUBIT. MultiSweeping extends the class of volumes that may be swept to include those with multiple source and multiple target surfaces. While not yet perfect, CUBIT's MultiSweeping has recently become more reliable and has been extended to assemblies of volumes. Sweep Forging automates the process of making a volume (multi) sweepable; Sweep Verification takes the given source and target surfaces and automatically classifies curve and vertex types so that sweep layers are well formed and progress from sources to targets.

  14. Automated fiber pigtailing machine

    DOEpatents

    Strand, Oliver T.; Lowry, Mark E.

    1999-01-01

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems.

  15. Automated Hazard Analysis

    Energy Science and Technology Software Center (ESTSC)

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  16. Automated fiber pigtailing machine

    DOEpatents

    Strand, O.T.; Lowry, M.E.

    1999-01-05

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems. 26 figs.

  17. [Automated anesthesia record system].

    PubMed

    Zhu, Tao; Liu, Jin

    2005-12-01

    Based on a client/server architecture, automated anesthesia record system software running under the Windows operating system on a network was developed with Microsoft Visual C++ 6.0, Visual Basic 6.0, and SQL Server. The system manages the patient's information throughout anesthesia. It automatically collects and integrates data in real time from several kinds of medical equipment, such as monitors, infusion pumps, and anesthesia machines, and then generates the anesthesia record sheets automatically. The record system makes the anesthesia record more accurate and complete and improves the anesthesiologist's working efficiency. PMID:16422117

  18. Automated System Marketplace 1995: The Changing Face of Automation.

    ERIC Educational Resources Information Center

    Barry, Jeff; And Others

    1995-01-01

    Discusses trends in the automated system marketplace with specific attention to online vendors and their customers: academic, public, school, and special libraries. Presents vendor profiles; tables and charts on computer systems and sales; and sidebars that include a vendor source list and the differing views on procuring an automated library…

  19. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) is gradually disappearing from clinical practice with the mercury sphygmomanometer now considered to be an environmental hazard. Manual BP is also subject to measurement error on the part of the physician/nurse and patient-related anxiety which can result in poor quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension. PMID:22265230
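The AOBP principle reduces to a short average-and-compare: take multiple readings with a fully automated recorder while the patient rests alone, average them (one common device protocol, used by the BpTRU, discards the first reading), and apply the 135/85 mm Hg cut point quoted above. A sketch with invented readings:

```python
# Invented automated readings as (systolic, diastolic) in mm Hg.
readings = [(142, 92), (134, 86), (132, 84), (130, 82)]

def aobp_average(readings):
    """Average the readings after dropping the first, per a common
    automated-office protocol (e.g. the BpTRU)."""
    kept = readings[1:]
    sbp = round(sum(r[0] for r in kept) / len(kept))
    dbp = round(sum(r[1] for r in kept) / len(kept))
    return sbp, dbp

sbp, dbp = aobp_average(readings)
# Normal AOBP cut point from the abstract: < 135/85 mm Hg.
elevated = sbp >= 135 or dbp >= 85
print(sbp, dbp, elevated)
```

Note how discarding the elevated first reading (a typical alerting response) pulls the average under the cut point, the same effect the resting, unattended protocol is designed to achieve.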

  20. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  1. Holton automates its longwall

    SciTech Connect

    Brezovec, D.

    1987-07-01

    Westmoreland Coal Co.'s underground mines in Virginia are putting automated longwalls to work, and have in the process boosted productivity from 16 to 20 clean tons per man-day in the last five years. The longwall face installed at Westmoreland's Holton mine on Aug. 28, 1985, theoretically could operate with only three workers at the face: the shearer operator, a mechanic, and the headgate operator. Advancing the shields and the face conveyor, a job that now occupies four workers on most longwall faces, would be accomplished entirely by remote control. The automated roof support advance system relies on a microprocessor located next to the stageloader, programmed to coordinate the movement of the shields and face conveyor as the shearer passes. The article describes how a sensor-activated disc located at the end of the shearer's haulage motor shaft counts the rotations of the shearer and relays information on how far the shearer has moved, and in what direction, to the microprocessor through the trailing cable. The computer determines the location of the shearer and issues commands through a data transmission line that connects the microprocessor to control units located on the shields. The shields and face conveyor move in a sequence programmed into the microprocessor.

  2. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in the front-end process in their fabs; in fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used, but in the coming years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing, yet integrating varied equipment in a production plant demands unified handling of data flow and interfaces. Only agile vision systems can reconcile these conflicting demands: fast, reliable, adaptable, scalable, and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but computationally intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection, performed in 3D and including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment, and control tasks in IC package production and handling equipment, such as Trim&Form, Tape&Reel and Pick&Place machines.

  3. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The usage of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. To use AA technology for the augmentation of system management functions requires a development model which consists of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well designed and documented KBS software.

  4. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  5. An automated field phenotyping pipeline for application in grapevine research.

    PubMed

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  6. Separation of collagen-bound and porous bone water transverse relaxation in mice: proposal of a multi-step approach.

    PubMed

    Marcon, Magda; Keller, Daniel; Wurnig, Moritz C; Eberhardt, Christian; Weiger, Markus; Eberli, Daniel; Boss, Andreas

    2016-07-01

    The separation and quantification of collagen-bound water (CBW) and pore water (PW) components of the cortical bone signal are important because of their different contribution to bone mechanical properties. Ultrashort TE (UTE) imaging can be used to exploit the transverse relaxation from CBW and PW, allowing their quantification. We tested, for the first time, the feasibility of UTE measurements in mice for the separation and quantification of the transverse relaxation of CBW and PW in vivo using three different approaches for T2* determination. UTE sequences were acquired at 4.7 T in six mice with 10 different TEs (50-5000 μs). The transverse relaxation time T2* of CBW (T2*cbw) and PW (T2*pw) and the CBW fraction (bwf) were computed using a mono-exponential (i), a standard bi-exponential (ii), and a new multi-step bi-exponential (iii) approach. Regions of interest were drawn at multiple levels of the femur and vertebral body cortical bone for each mouse. The sum of the normalized squared residuals (Res) and the homogeneity of variance were tested to compare the different methods. In the femur, approach (i) yielded mean T2* ± standard deviation (SD) of 657 ± 234 μs. With approach (ii), T2*cbw, T2*pw and bwf were 464 ± 153 μs, 15 777 ± 10 864 μs and 57.6 ± 9.9%, respectively. For approach (iii), T2*cbw, T2*pw and bwf were 387 ± 108 μs, 7534 ± 2765 μs and 42.5 ± 6.2%, respectively. Similar values were obtained from vertebral bodies. Res with approach (ii) was lower than with the two other approaches (p < 0.007), but T2*pw and bwf variance was lower with approach (iii) than with approach (ii) (p < 0.048). We demonstrated that the separation and quantification of cortical bone water components with UTE sequences is feasible in vivo in mouse models. The direct bi-exponential approach exhibited the best approximation to the measured signal curve with the lowest residuals; however, the newly
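A plausible sketch of the multi-step idea (an illustrative reconstruction, not necessarily the authors' exact algorithm) is a two-stage fit: estimate the slowly decaying pore-water pool from the long-TE tail, where the fast collagen-bound pool has already vanished, subtract its extrapolation, and fit the fast pool to the short-TE residual. All data and cutoffs below are synthetic:

```python
import numpy as np

# Synthetic UTE magnitudes from a two-pool model, with parameters of the
# same order as the femur values above (T2*cbw ~ 400 us, T2*pw ~ 8000 us,
# bwf ~ 40%). None of these numbers are the study's measured data.
te = np.array([50., 100., 200., 400., 600., 1000., 1500., 2500., 4000., 5000.])
signal = 0.4 * np.exp(-te / 400.0) + 0.6 * np.exp(-te / 8000.0)

# Step 1: at long TE only pore water remains, so a log-linear fit of the
# tail isolates the slow component.
tail = te >= 2000
slope, intercept = np.polyfit(te[tail], np.log(signal[tail]), 1)
t2s_pw, a_pw = -1.0 / slope, np.exp(intercept)

# Step 2: subtract the extrapolated PW signal and fit the short-TE
# residual to obtain the collagen-bound-water component.
head = te <= 600
residual = signal[head] - a_pw * np.exp(-te[head] / t2s_pw)
slope2, intercept2 = np.polyfit(te[head], np.log(residual), 1)
t2s_cbw, a_cbw = -1.0 / slope2, np.exp(intercept2)

bwf = 100.0 * a_cbw / (a_cbw + a_pw)  # bound-water fraction, percent
print(round(t2s_cbw), round(t2s_pw), round(bwf, 1))
```

On this noiseless synthetic curve the two-stage fit recovers the simulated T2*cbw, T2*pw, and bwf to within a few percent; with real noisy data the choice of tail/head cutoffs becomes the critical tuning step.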

  7. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would provide other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs may be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in its early stages was that of cost. It took a significant investment in software and hardware for the prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to someone going for a radiograph or ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard.

  8. Automation's Effect on Library Personnel.

    ERIC Educational Resources Information Center

    Dakshinamurti, Ganga

    1985-01-01

    Reports on survey studying the human-machine interface in Canadian university, public, and special libraries. Highlights include position category and educational background of 118 participants, participants' feelings toward automation, physical effects of automation, diffusion in decision making, interpersonal communication, future trends,…

  9. Suddenly Last Decade! Automation Arrives.

    ERIC Educational Resources Information Center

    Epstein, Susan Baerg

    1983-01-01

    Discusses concerns of librarians entering field of library automation emphasizing issues surrounding automated circulation control systems and online catalogs. Factors which have contributed to dramatic growth in these areas are enumerated: MARC II format, reduced computer costs, commercial vendors, scarce resources, and turnkey systems. (EJS)

  10. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
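The mixed-integer formulation can be illustrated with a toy 0/1 selection problem: maximize total item information at a target ability level subject to a fixed test length and minimum content coverage. The item bank below is hypothetical, and brute-force enumeration stands in for the branch-and-bound search a real MIP solver would perform:

```python
from itertools import combinations

# Hypothetical mini item bank: (item_id, information at the target ability,
# content area). Real banks hold thousands of items.
bank = [
    ("i1", 0.52, "algebra"), ("i2", 0.47, "algebra"), ("i3", 0.31, "algebra"),
    ("i4", 0.58, "geometry"), ("i5", 0.22, "geometry"),
    ("i6", 0.49, "numbers"),  ("i7", 0.35, "numbers"),
]

TEST_LEN = 4       # exactly four items on the form
MIN_PER_AREA = 1   # at least one item from every content area
areas = {area for _, _, area in bank}

def feasible(form):
    """Check the content-coverage constraints for a candidate form."""
    counts = {a: 0 for a in areas}
    for _, _, area in form:
        counts[area] += 1
    return all(c >= MIN_PER_AREA for c in counts.values())

# Enumerate every assignment satisfying the length constraint; a MIP
# solver explores this 0/1 space implicitly via branch-and-bound.
best = max((f for f in combinations(bank, TEST_LEN) if feasible(f)),
           key=lambda f: sum(info for _, info, _ in f))

print(sorted(item_id for item_id, _, _ in best))
```

In practice the same objective and constraints are handed to a mixed-integer solver rather than enumerated; the sketch only shows what the 0/1 decision variables and constraints encode.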

  11. Automated Circulation. SPEC Kit 43.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    Of the 64 libraries responding to a 1978 Association of Research Libraries (ARL) survey, 37 indicated that they used automated circulation systems; half of these were commercial systems, and most were batch-process or combination batch process and online. Nearly all libraries without automated systems cited lack of funding as the reason for not…

  12. Progress Toward Automated Cost Estimation

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1992-01-01

    Report discusses efforts to develop standard system of automated cost estimation (ACE) and computer-aided design (CAD). Advantage of system is time saved and accuracy enhanced by automating extraction of quantities from design drawings, consultation of price lists, and application of cost and markup formulas.

  13. Robotics/Automated Systems Technicians.

    ERIC Educational Resources Information Center

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  14. Opening up Library Automation Software

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  15. The Library Administrator's Automation Handbook.

    ERIC Educational Resources Information Center

    Boss, Richard W.

    One of the most significant decisions in a library administrator's career is the decision to automate one or more of a library's operations. This book describes the present state of local library automation; the planning, selection, and implementation process; and the library administrator's role in the process. The bulk of the text provides a…

  16. Comparative genomics reveals multistep pathogenesis of E2A-PBX1 acute lymphoblastic leukemia

    PubMed Central

    Duque-Afonso, Jesús; Feng, Jue; Scherer, Florian; Lin, Chiou-Hong; Wong, Stephen H.K.; Wang, Zhong; Iwasaki, Masayuki; Cleary, Michael L.

    2015-01-01

    Acute lymphoblastic leukemia (ALL) is the most common childhood cancer; however, its genetic diversity limits investigation into the molecular pathogenesis of disease and development of therapeutic strategies. Here, we engineered mice that conditionally express the E2A-PBX1 fusion oncogene, which results from chromosomal translocation t(1;19) and is present in 5% to 7% of pediatric ALL cases. The incidence of leukemia in these mice varied from 5% to 50%, dependent on the Cre-driving promoter (Cd19, Mb1, or Mx1) used to induce E2A-PBX1 expression. Two distinct but highly similar subtypes of B cell precursor ALLs that differed by their pre–B cell receptor (pre-BCR) status were induced and displayed maturation arrest at the pro-B/large pre–B II stages of differentiation, similar to human E2A-PBX1 ALL. Somatic activation of E2A-PBX1 in B cell progenitors enhanced self-renewal and led to acquisition of multiple secondary genomic aberrations, including prominent spontaneous loss of Pax5. In preleukemic mice, conditional Pax5 deletion cooperated with E2A-PBX1 to expand progenitor B cell subpopulations, increasing penetrance and shortening leukemia latency. Recurrent secondary activating mutations were detected in key signaling pathways, most notably JAK/STAT, that leukemia cells require for proliferation. These data support conditional E2A-PBX1 mice as a model of human ALL and suggest targeting pre-BCR signaling and JAK kinases as potential therapeutic strategies. PMID:26301816

  17. An open source multistep model to predict mutagenicity from statistical analysis and relevant structural alerts

    PubMed Central

    2010-01-01

    Background Mutagenicity is the capability of a substance to cause genetic mutations. This property is of high public concern because it has a close relationship with carcinogenicity and potentially with reproductive toxicity. Experimentally, mutagenicity can be assessed by the Ames test on Salmonella with an estimated experimental reproducibility of 85%; this intrinsic limitation of the in vitro test, along with the need for faster and cheaper alternatives, opens the way to other types of assessment methods, such as in silico structure-activity prediction models. A widely used method checks for the presence of known structural alerts for mutagenicity. However, the presence of such alerts alone is not a definitive way to prove the mutagenicity of a compound towards Salmonella, since other parts of the molecule can influence and potentially change the classification. Hence, statistically based methods are proposed, with the final objective of obtaining a cascade of modeling steps with custom-made properties, such as the reduction of false negatives. Results A cascade model has been developed and validated on a large public set of molecular structures and their associated Salmonella mutagenicity outcomes. The first step consists in the derivation of a statistical model and mutagenicity prediction, followed by further checks for specific structural alerts in the "safe" subset of the prediction outcome space. In terms of accuracy (i.e., overall correct predictions of both negatives and positives), the obtained model approached the 85% reproducibility of the experimental Ames mutagenicity test. Conclusions The model and the documentation for regulatory purposes are freely available on the CAESAR website. The input is simply a file of molecular structures and the output is the classification result. PMID:20678181
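
The two-step cascade logic the abstract describes can be sketched as follows. Everything concrete here is a placeholder assumption: the toy scoring function stands in for the trained statistical model, and the alert string and naive substring match stand in for a real substructure search (e.g. with a cheminformatics toolkit).

```python
# Minimal sketch of the cascade: a statistical model screens first, then
# structural alerts re-check compounds the model called "safe". The
# predictor and alert below are hypothetical placeholders.
NITRO_AROMATIC = "c1ccccc1[N+](=O)[O-]"  # illustrative alert pattern

def statistical_model(features):
    # Placeholder: a toy score standing in for the trained model.
    return sum(features) > 1.0

def has_structural_alert(smiles, alerts):
    # Placeholder match; real systems use substructure search, not substrings.
    return any(a in smiles for a in alerts)

def cascade_predict(smiles, features, alerts):
    """Two-step cascade: the alert check only fires on the model's
    negative ("safe") outcomes, reducing false negatives as described."""
    if statistical_model(features):
        return "mutagenic"
    if has_structural_alert(smiles, alerts):
        return "mutagenic"
    return "non-mutagenic"
```

Note the asymmetry of the design: positives from the statistical model pass through untouched, so the alert step can only move compounds from "safe" to "mutagenic", which is exactly the false-negative-reducing property the authors target.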

  18. Space station automation and autonomy

    SciTech Connect

    Carlisle, R.F.

    1984-08-01

    Mission definition and technology assessment studies support the necessity of incorporating increasing degrees of automation in a space station. As presently envisioned, a space station will evolve over 10-20 years. As the complexity of the space station grows, decision-making must be transferred from the crew to an on-board computer system in order to increase the productivity of the man/machine system. Thus, growth considerations require that provision be made for increasing degrees of automation as the space station evolves. Awareness by the planners and technologists of automated system interactions, of the functional role of automation and autonomy, and of design concepts that permit growth will significantly affect technology and system choices. The power system is an excellent case study for examining its possible evolution from manual to automated and continued evolution towards autonomous control. The purpose of this paper is to give an overview of the requirements for this evolution from the systems perspective.

  19. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  20. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

    Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectrometry (ESI/MS), capillary electrophoresis (CE), and biological assays where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and a volume of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by flowing sequentially 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral, and eliminates cross sample contamination. Afterward, 2.3 mL of extracted sample is then loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a

  1. Baculovirus expression system and method for high throughput expression of genetic material

    DOEpatents

    Clark, Robin; Davies, Anthony

    2001-01-01

    The present invention provides novel recombinant baculovirus expression systems for expressing foreign genetic material in a host cell. Such expression systems are readily adapted to an automated method for expressing foreign genetic material in a high-throughput manner. In other aspects, the present invention features a novel automated method for determining the function of foreign genetic material by transfecting the same into a host by way of the recombinant baculovirus expression systems according to the present invention.

  2. Genetic control of Drosophila nerve cord development

    NASA Technical Reports Server (NTRS)

    Skeath, James B.; Thor, Stefan

    2003-01-01

    The Drosophila ventral nerve cord has been a central model system for studying the molecular genetic mechanisms that control CNS development. Studies show that the generation of neural diversity is a multistep process initiated by the patterning and segmentation of the neuroectoderm. These events act together with the process of lateral inhibition to generate precursor cells (neuroblasts) with specific identities, distinguished by the expression of unique combinations of regulatory genes. The expression of these genes in a given neuroblast restricts the fate of its progeny, by activating specific combinations of downstream genes. These genes in turn specify the identity of any given postmitotic cell, which is evident by its cellular morphology and choice of neurotransmitter.

  3. Automated Defect Classification (ADC)

    Energy Science and Technology Software Center (ESTSC)

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.

  4. Automated satellite image navigation

    NASA Astrophysics Data System (ADS)

    Bassett, Robert M.

    1992-12-01

    The automated satellite image navigation method (Auto-Avian) developed and tested by Spaulding (1990) at the Naval Postgraduate School is investigated. The Auto-Avian method replaced the manual procedure of selecting Ground Control Points (GCP's) with an autocorrelation process that utilizes the World Vector Shoreline (WVS) provided by the Defense Mapping Agency (DMA) as a string of GCP's to rectify satellite images. The automatic cross-correlation of binary reference (WVS) and search (image) windows eliminated the subjective error associated with the manual selection of GCP's and produced accuracies comparable to the manual method. The scope of Spaulding's (1990) research was expanded. The worldwide application of the Auto-Avian method was demonstrated in three world regions (eastern North Pacific Ocean, eastern North Atlantic Ocean, and Persian Gulf). Using five case studies, the performance of the Auto-Avian method on 'less than optimum' images (i.e., islands, coastlines affected by lateral distortion and/or cloud cover) was investigated.
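
The automatic cross-correlation of binary reference and search windows described above can be illustrated with a toy version: slide the small binary (shoreline) reference over the larger binary image window and keep the offset with the highest overlap count. The arrays below are made-up examples, not WVS or satellite data.

```python
# Hedged sketch of binary window cross-correlation: find the offset of the
# reference window inside the search window with the best overlap score.
def best_offset(reference, search):
    """Return (dy, dx) maximizing the match count between a binary
    reference window and a larger binary search window."""
    rh, rw = len(reference), len(reference[0])
    sh, sw = len(search), len(search[0])
    best, best_score = (0, 0), -1
    for dy in range(sh - rh + 1):
        for dx in range(sw - rw + 1):
            # Count positions where both windows contain shoreline pixels.
            score = sum(
                reference[y][x] * search[y + dy][x + dx]
                for y in range(rh) for x in range(rw)
            )
            if score > best_score:
                best, best_score = (dy, dx), score
    return best
```

The winning offset plays the role a manually selected GCP match would play, which is why the automated correlation removes the subjective error of hand selection.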

  5. Automated synthetic scene generation

    NASA Astrophysics Data System (ADS)

    Givens, Ryan N.

    Physics-based simulations generate synthetic imagery to help organizations anticipate system performance of proposed remote sensing systems. However, manually constructing synthetic scenes which are sophisticated enough to capture the complexity of real-world sites can take days to months, depending on the size of the site and the desired fidelity of the scene. This research, sponsored by the Air Force Research Laboratory's Sensors Directorate, successfully developed an automated approach to fuse high-resolution RGB imagery, lidar data, and hyperspectral imagery and then extract the necessary scene components. The method greatly reduces the time and money required to generate realistic synthetic scenes and introduces new approaches to improve material identification using information from all three input datasets.

  6. Distributed Experiment Automation System

    NASA Astrophysics Data System (ADS)

    Lebedev, Gennadi

    2003-03-01

    A module-based distributed system for controlling and automating scientific experiments was developed. The system is divided into five main layers: 1. Data processing and presentation modules; 2. Controllers, which support primary command evaluation, data analysis, and synchronization between device drivers; 3. Data server, which provides real-time data storage and management; 4. Device drivers, which support communication, preliminary signal acquisition, and control of peripheral devices; 5. Utilities for batch processing, logging, execution-error handling, persistent storage and management of experimental data, module and device monitoring, alarm states, and remote-component messaging and notification processing. The system uses networking (the DCOM protocol) for communication between distributed modules. The configuration, module parameters, and data and command links are defined in a scripting file (XML format). This modular structure allows great flexibility and extensibility, as modules can be added and configured as required without any extensive programming.
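
An XML configuration of the kind the abstract mentions, declaring modules and the links between them, might be loaded as sketched below. The element and attribute names (`module`, `link`, `layer`, etc.) are illustrative assumptions, not the system's actual schema.

```python
# Sketch of loading a module/link configuration from an XML scripting file.
# Schema names here are hypothetical.
import xml.etree.ElementTree as ET

CONFIG = """
<experiment>
  <module name="driver1" layer="device-driver" device="spectrometer"/>
  <module name="ctrl1" layer="controller"/>
  <link source="driver1" target="ctrl1" kind="data"/>
</experiment>
"""

def load_config(xml_text):
    """Parse modules and links, checking that links reference declared modules."""
    root = ET.fromstring(xml_text)
    modules = {m.get("name"): m.get("layer") for m in root.iter("module")}
    links = [(l.get("source"), l.get("target")) for l in root.iter("link")]
    for src, dst in links:
        assert src in modules and dst in modules, "undeclared module in link"
    return modules, links
```

Keeping the wiring in a declarative file is what lets modules "be added and configured as required without any extensive programming": only the XML changes, not the module code.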

  7. Robust automated knowledge capture.

    SciTech Connect

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high consequent national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  8. Expedition automated flow fluorometer

    NASA Astrophysics Data System (ADS)

    Krikun, V. A.; Salyuk, P. A.

    2015-11-01

    This paper describes the design and operation of an automated flow-through dual-channel fluorometer for studying the fluorescence of dissolved organic matter, and the fluorescence of phytoplankton cells with open and closed reaction centers, in sea areas with oligotrophic and eutrophic water types. Step-by-step excitation by two semiconductor lasers or two light-emitting diodes is implemented in the current device. The excitation wavelengths are 405 nm and 532 nm in the default configuration. The excitation radiation of each light source can be varied in duration, intensity, and repetition rate. Registration of the fluorescence signal is carried out by two photomultipliers with different optical filters covering the 580-600 nm and 680-700 nm band-pass ranges. The configuration of the excitation sources and the registered spectral ranges can be changed to suit the tasks at hand.

  9. Berkeley automated supernova search

    SciTech Connect

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.
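
The real-time comparison step can be sketched as simple image differencing: subtract the stored reference frame from the new galaxy frame and flag pixels that have brightened past a threshold. The arrays and threshold below are toy values; a real pipeline would also handle registration, noise, and photometric calibration.

```python
# Toy version of the compare-against-reference step for transient detection.
def detect_transients(new, reference, threshold):
    """Return (row, col) pixels where new - reference exceeds threshold."""
    hits = []
    for r, (new_row, ref_row) in enumerate(zip(new, reference)):
        for c, (n, v) in enumerate(zip(new_row, ref_row)):
            if n - v > threshold:
                hits.append((r, c))
    return hits
```

A pixel that was galaxy-brightness in the reference but markedly brighter in the new image is the candidate supernova the minicomputer flags for follow-up.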

  10. [From automation to robotics].

    PubMed

    1985-01-01

    The introduction of automation into the biology laboratory seems to be unavoidable. But at what cost, if it is necessary to purchase a new machine for every new application? Fortunately, the same image processing techniques, belonging to a theoretical framework called Mathematical Morphology, may be used in visual inspection tasks both in the car industry and in the biology lab. Since the market for industrial robotics applications is much larger than the market for biomedical applications, the price of image processing devices drops, and sometimes becomes less than the price of a complete microscope setup. The power of the image processing methods of Mathematical Morphology will be illustrated by various examples, such as automatic silver grain counting in autoradiography, determination of HLA genotype, electrophoretic gel analysis, and automatic screening of cervical smears. Thus several heterogeneous applications may share the same image processing device, provided there is a separate and dedicated workstation for each of them. PMID:4091303
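
One step of the grain-counting example above can be illustrated with connected-component labeling on a binary image: each connected blob of foreground pixels is counted as one object. This is a simplified sketch with a made-up image; a real morphology pipeline would first filter the image with erosions and dilations before counting.

```python
# Simplified grain counting: count 4-connected foreground components
# in a binary image (illustrative data, not autoradiography output).
def count_grains(image):
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] and not seen[r][c]:
                count += 1  # found a new, unvisited grain
                stack = [(r, c)]
                while stack:  # flood-fill the whole component
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and image[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count
```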

  11. Health care automation companies.

    PubMed

    1995-12-01

    Health care automation companies: card transaction processing/EFT/EDI-capable banks; claims auditing/analysis; claims processors/clearinghouses; coding products/services; computer hardware; computer networking/LAN/WAN; consultants; data processing/outsourcing; digital dictation/transcription; document imaging/optical disk storage; executive information systems; health information networks; hospital/health care information systems; interface engines; laboratory information systems; managed care information systems; patient identification/credit cards; pharmacy information systems; POS terminals; radiology information systems; software--claims related/computer-based patient records/home health care/materials management/supply ordering/physician practice management/translation/utilization review/outcomes; telecommunications products/services; telemedicine/teleradiology; value-added networks. PMID:10153839

  12. Automating Frame Analysis

    SciTech Connect

    Sanfilippo, Antonio P.; Franklin, Lyndsey; Tratz, Stephen C.; Danielson, Gary R.; Mileson, Nicholas D.; Riensche, Roderick M.; McGrath, Liam

    2008-04-01

    Frame Analysis has come to play an increasingly important role in the study of social movements in Sociology and Political Science. While significant steps have been made in providing a theory of frames and framing, a systematic characterization of the frame concept is still largely lacking, and there are no recognized criteria and methods that can be used to identify and marshal frame evidence reliably and in a time- and cost-effective manner. Consequently, current Frame Analysis work is still too reliant on manual annotation and subjective interpretation. The goal of this paper is to present an approach to the representation, acquisition, and analysis of frame evidence which leverages Content Analysis, Information Extraction, and Semantic Search methods to provide a systematic treatment of Frame Analysis and automate frame annotation.

  13. Automated stereotactic positioning system.

    PubMed

    Goerss, S J; Kelly, P J; Kall, B A

    1987-01-01

    An automated stereotactic machine has been interfaced to a surgical computer to complete a totally interactive surgical system capable of locating tumor volumes. Stepper motors, activated by the host computer, drive a three-dimensional slide to position the patient's head with respect to a fixed arc, locating the surgical target. Linear encoders on each axis create a closed-loop positioning system and a digital display for visual inspection of the slide's position. The 160-mm arc directs all instrumentation to its isocenter, regardless of the two angular settings, providing maximum freedom in selecting a safe trajectory to the target. Phantom test points compatible with computerized tomographic and magnetic resonance imaging were repeatedly scanned to determine the overall system accuracy, which approached 0.6 mm, depending on the spatial resolution of the image. This stereotactic device may be used to perform stereotactic laser craniotomies, biopsies, 192Ir implants for interstitial radiation, third ventriculostomies and functional procedures. PMID:3329830

  14. Automated Analysis Workstation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Information from NASA Tech Briefs of work done at Langley Research Center and the Jet Propulsion Laboratory assisted DiaSys Corporation in manufacturing their first product, the R/S 2000. Since then, the R/S 2000 and R/S 2003 have followed. Recently, DiaSys released their fourth workstation, the FE-2, which automates the process of making and manipulating wet-mount preparation of fecal concentrates. The time needed to read the sample is decreased, permitting technologists to rapidly spot parasites, ova and cysts, sometimes carried in the lower intestinal tract of humans and animals. Employing the FE-2 is non-invasive, can be performed on an out-patient basis, and quickly provides confirmatory results.

  15. Automated Electrostatics Environmental Chamber

    NASA Technical Reports Server (NTRS)

    Calle, Carlos; Lewis, Dean C.; Buchanan, Randy K.; Buchanan, Aubri

    2005-01-01

    The Mars Electrostatics Chamber (MEC) is an environmental chamber designed primarily to create atmospheric conditions like those at the surface of Mars to support experiments on electrostatic effects in the Martian environment. The chamber is equipped with a vacuum system, a cryogenic cooling system, an atmospheric-gas replenishing and analysis system, and a computerized control system that can be programmed by the user and that provides both automation and options for manual control. The control system can be set to maintain steady Mars-like conditions or to impose temperature and pressure variations of a Mars diurnal cycle at any given season and latitude. In addition, the MEC can be used in other areas of research because it can create steady or varying atmospheric conditions anywhere within the wide temperature, pressure, and composition ranges between the extremes of Mars-like and Earth-like conditions.

  16. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and possibly different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  17. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
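
The terminal-to-computer path described in the patent abstract can be sketched as follows: each decimal keypress is converted to a 4-bit binary form, held in a per-terminal buffer, and a multiplexer drains the buffers to the computer. The class and function names are illustrative, not the patent's terminology.

```python
# Sketch of terminals converting decimal digits to binary, buffering them,
# and a multiplexer transferring buffered digits to the computer.
class DataTerminal:
    def __init__(self):
        self.buffer = []  # temporary storage, one entry per digit

    def press(self, digit):
        # Convert the decimal digit to a 4-bit binary (BCD-style) word.
        self.buffer.append(format(digit, "04b"))

def multiplex(terminals):
    """Drain all terminal buffers digit by digit, round-robin,
    yielding (terminal_id, binary_digit) pairs for the computer."""
    out = []
    while any(t.buffer for t in terminals):
        for tid, t in enumerate(terminals):
            if t.buffer:
                out.append((tid, t.buffer.pop(0)))
    return out
```

The round-robin drain mirrors the digit-by-digit transfer via the multiplexer: no terminal monopolizes the link, and each buffered digit reaches the computer tagged with its source.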

  18. Automating the multiprocessing environment

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.

    1989-01-01

    An approach to automate the programming and operation of tree-structured networks of multiprocessor systems is discussed. A conceptual, knowledge-based operating environment is presented, and requirements for two major technology elements are identified as follows: (1) An intelligent information translator is proposed for implementing information transfer between dissimilar hardware and software, thereby enabling independent and modular development of future systems and promoting language-independence of codes and information; (2) A resident system activity manager, which recognizes the systems' capabilities and monitors the status of all systems within the environment, is proposed for integrating dissimilar systems into effective parallel processing resources to optimally meet user needs. Finally, key computational capabilities which must be provided before the environment can be realized are identified.

  19. Solar array automation limitations

    NASA Technical Reports Server (NTRS)

    Trumble, Terry M.

    1990-01-01

    Significant progress in the automation of spacecraft electrical power systems has been made within the past few years. This is especially important with the development of the space station and the increasing demand on the electrical power systems of future satellites. The key element of the spacecraft power system, the solar arrays which supply the power, will have to grow to supply many tens of kilowatts of power within the next twenty years. This growth will be accompanied by the problems associated with large distributed power systems. The growth of the arrays, on-array management problems, and potential solutions to array degradation or failure are discussed. Multikilowatt arrays for unmanned spacecraft are discussed, with comments on the implications of array degradation for manned spacecraft.

  20. Automating the analytical laboratory via the Chemical Analysis Automation paradigm

    SciTech Connect

    Hollen, R.; Rzeszutko, C.

    1997-10-01

    To address the need for standardization within the analytical chemistry laboratories of the nation, the Chemical Analysis Automation (CAA) program within the US Department of Energy, Office of Science and Technology's Robotic Technology Development Program is developing laboratory sample analysis systems that will automate the environmental chemical laboratories. The current laboratory automation paradigm consists of islands-of-automation that do not integrate into a system architecture. Thus, today the chemist must perform most aspects of environmental analysis manually using instrumentation that generally cannot communicate with other devices in the laboratory. CAA is working towards a standardized and modular approach to laboratory automation based upon the Standard Analysis Method (SAM) architecture. Each SAM system automates a complete chemical method. The building block of a SAM is known as the Standard Laboratory Module (SLM). The SLM, either hardware or software, automates a subprotocol of an analysis method and can operate as a standalone or as a unit within a SAM. The CAA concept allows the chemist to easily assemble an automated analysis system, from sample extraction through data interpretation, using standardized SLMs without the worry of hardware or software incompatibility or the necessity of generating complicated control programs. A Task Sequence Controller (TSC) software program schedules and monitors the individual tasks to be performed by each SLM configured within a SAM. The chemist interfaces with the operation of the TSC through the Human Computer Interface (HCI), a logical, icon-driven graphical user interface. The CAA paradigm has successfully been applied in automating EPA SW-846 Methods 3541/3620/8081 for the analysis of PCBs in a soil matrix utilizing commercially available equipment in tandem with SLMs constructed by CAA.

  1. Automated imatinib immunoassay

    PubMed Central

    Beumer, Jan H.; Kozo, Daniel; Harney, Rebecca L.; Baldasano, Caitlin N.; Jarrah, Justin; Christner, Susan M.; Parise, Robert; Baburina, Irina; Courtney, Jodi B.; Salamone, Salvatore J.

    2014-01-01

    Background Imatinib pharmacokinetic variability and the relationship of trough concentrations with clinical outcomes have been extensively reported. Though physical methods to quantitate imatinib exist, they are not widely available for routine use. An automated homogenous immunoassay for imatinib has been developed, facilitating routine imatinib testing. Methods Imatinib-selective monoclonal antibodies, without substantial cross-reactivity to the N-desmethyl metabolite or N-desmethyl conjugates, were produced. The antibodies were conjugated to 200 nm particles to develop immunoassay reagents on the Beckman Coulter AU480™ analyzer. These reagents were analytically validated using Clinical Laboratory Standards Institute protocols. Method comparison to LC-MS/MS was conducted using 77 plasma samples collected from subjects receiving imatinib. Results The assay requires 4 µL of sample without pre-treatment. The non-linear calibration curve ranges from 0 to 3,000 ng/mL. With automated sample dilution, concentrations of up to 9,000 ng/mL can be quantitated. The AU480 produces the first result in 10 minutes, and up to 400 tests per hour. Repeatability ranged from 2.0 to 6.0% coefficient of variation (CV), and within-laboratory reproducibility ranged from 2.9 to 7.4% CV. Standard curve stability was two weeks and on-board reagent stability was 6 weeks. For clinical samples with imatinib concentrations from 438 – 2,691 ng/mL, method comparison with LC-MS/MS gave a slope of 0.995 with a y-intercept of 24.3 and a correlation coefficient of 0.978. Conclusion The immunoassay is suitable for quantitating imatinib in human plasma, demonstrating good correlation with a physical method. Testing for optimal imatinib exposure can now be performed on routine clinical analyzers. PMID:25551407

  2. Automated endoscope reprocessors.

    PubMed

    Desilets, David; Kaul, Vivek; Tierney, William M; Banerjee, Subhas; Diehl, David L; Farraye, Francis A; Kethu, Sripathi R; Kwon, Richard S; Mamula, Petar; Pedrosa, Marcos C; Rodriguez, Sarah A; Wong Kee Song, Louis-Michel

    2010-10-01

    The ASGE Technology Committee provides reviews of existing, new, or emerging endoscopic technologies that have an impact on the practice of GI endoscopy. Evidence-based methodology is used, with a MEDLINE literature search to identify pertinent clinical studies on the topic and a MAUDE (U.S. Food and Drug Administration Center for Devices and Radiological Health) database search to identify the reported complications of a given technology. Both are supplemented by accessing the "related articles" feature of PubMed and by scrutinizing pertinent references cited by the identified studies. Controlled clinical trials are emphasized, but in many cases data from randomized, controlled trials are lacking. In such cases, large case series, preliminary clinical studies, and expert opinions are used. Technical data are gathered from traditional and Web-based publications, proprietary publications, and informal communications with pertinent vendors. Technology Status Evaluation Reports are drafted by 1 or 2 members of the ASGE Technology Committee, reviewed and edited by the committee as a whole, and approved by the Governing Board of the ASGE. When financial guidance is indicated, the most recent coding data and list prices at the time of publication are provided. For this review, the MEDLINE database was searched through February 2010 for articles related to automated endoscope reprocessors, using the words endoscope reprocessing, endoscope cleaning, automated endoscope reprocessors, and high-level disinfection. Technology Status Evaluation Reports are scientific reviews provided solely for educational and informational purposes. Technology Status Evaluation Reports are not rules and should not be construed as establishing a legal standard of care or as encouraging, advocating, requiring, or discouraging any particular treatment or payment for such treatment. PMID:20883843

  3. Integrated approach using multistep enzyme digestion, TiO2 enrichment, and database search for in-depth phosphoproteomic profiling.

    PubMed

    Han, Dohyun; Jin, Jonghwa; Yu, Jiyoung; Kim, Kyunggon; Kim, Youngsoo

    2015-01-01

    Protein phosphorylation is a major PTM that regulates important cell signaling mechanisms. In-depth phosphoproteomic analysis provides a method of examining this complex interplay, yielding a mechanistic understanding of the cellular processes and pathogenesis of various diseases. However, the analysis of protein phosphorylation is challenging, due to the low concentration of phosphoproteins in highly complex mixtures and the high variability of phosphorylation sites. Thus, typical phosphoproteome studies that are based on MS require large amounts of starting material and extensive fractionation steps to reduce the sample complexity. To this end, we present a simple strategy (integrated multistep enzyme digestion, enrichment, database search; iMEED) to improve phosphoproteome coverage from smaller amounts of starting material, faster than other commonly used approaches. It is inexpensive, adaptable to low sample amounts, and saves time and effort in sample preparation and mass spectrometric analysis, allowing samples to be prepared without prefractionation or specific instruments such as HPLC. All MS data have been deposited in the ProteomeXchange with identifier PXD001033 (http://proteomecentral.proteomexchange.org/dataset/PXD001033). PMID:25159016

  4. Multiwavelength Observations of a Slow Rise, Multi-Step X1.6 Flare and the Associated Eruption

    NASA Astrophysics Data System (ADS)

    Yurchyshyn, V.

    2015-12-01

    Using multi-wavelength observations we studied a slow-rise, multi-step X1.6 flare that began on November 7, 2014 as a localized eruption of core fields inside a δ-sunspot and later engulfed the entire active region. This flare event was associated with the formation of two systems of post-eruption arcades (PEAs) and several J-shaped flare ribbons showing extremely fine details and irreversible changes in the photospheric magnetic fields, and it was accompanied by a fast and wide coronal mass ejection. Data from the Solar Dynamics Observatory and IRIS spacecraft, along with ground-based data from the New Solar Telescope (NST), present evidence that i) the flare and the eruption were directly triggered by a flux emergence that occurred inside a δ-sunspot at the boundary between two umbrae; ii) this event represented an example of the in-situ formation of an unstable flux rope observed only in hot AIA channels (131 and 94 Å) and LASCO C2 coronagraph images; iii) the global PEA system spanned the entire AR and was due to global-scale reconnection occurring at heights of about one solar radius, indicating the global spatial and temporal scale of the eruption.

  5. Multiwavelength Observations of a Slow-rise, Multistep X1.6 Flare and the Associated Eruption

    NASA Astrophysics Data System (ADS)

    Yurchyshyn, V.; Kumar, P.; Cho, K.-S.; Lim, E.-K.; Abramenko, V. I.

    2015-10-01

    Using multiwavelength observations, we studied a slow-rise, multistep X1.6 flare that began on 2014 November 7 as a localized eruption of core fields inside a δ-sunspot and later engulfed the entire active region (AR). This flare event was associated with formation of two systems of post-eruption arcades (PEAs) and several J-shaped flare ribbons showing extremely fine details, irreversible changes in the photospheric magnetic fields, and it was accompanied by a fast and wide coronal mass ejection. Data from the Solar Dynamics Observatory and IRIS spacecraft, along with the ground-based data from the New Solar Telescope, present evidence that (i) the flare and the eruption were directly triggered by a flux emergence that occurred inside a δ-sunspot at the boundary between two umbrae; (ii) this event represented an example of the formation of an unstable flux rope observed only in hot AIA channels (131 and 94 Å) and LASCO C2 coronagraph images; (iii) the global PEA spanned the entire AR and was due to global-scale reconnection occurring at heights of about one solar radius, indicating the global spatial and temporal scale of the eruption.

  6. Neutrophilic Cathepsin C Is Maturated by a Multistep Proteolytic Process and Secreted by Activated Cells during Inflammatory Lung Diseases.

    PubMed

    Hamon, Yveline; Legowska, Monika; Hervé, Virginie; Dallet-Choisy, Sandrine; Marchand-Adam, Sylvain; Vanderlynden, Lise; Demonte, Michèle; Williams, Rich; Scott, Christopher J; Si-Tahar, Mustapha; Heuzé-Vourc'h, Nathalie; Lalmanach, Gilles; Jenne, Dieter E; Lesner, Adam; Gauthier, Francis; Korkmaz, Brice

    2016-04-15

    The cysteine protease cathepsin C (CatC) activates granule-associated proinflammatory serine proteases in hematopoietic precursor cells. Its early inhibition in the bone marrow is regarded as a new therapeutic strategy for treating proteolysis-driven chronic inflammatory diseases, but its complete inhibition is elusive in vivo. Controlling the activity of CatC may be achieved by directly inhibiting its activity with a specific inhibitor and/or by preventing its maturation. We have investigated immunochemically and kinetically the occurrence of CatC and its proform in human hematopoietic precursor cells and in differentiated mature immune cells in lung secretions. The maturation of proCatC obeys a multistep mechanism that can be entirely managed by CatS in neutrophilic precursor cells. CatS inhibition by a cell-permeable inhibitor abrogated the release of the heavy and light chains from proCatC and blocked ∼80% of CatC activity. Under these conditions the activity of neutrophil serine proteases, however, was not abolished in precursor cell cultures. In patients with neutrophilic lung inflammation, mature CatC is found in large amounts in sputa. It is secreted by activated neutrophils, as confirmed through lipopolysaccharide administration in a nonhuman primate model. CatS inhibitors currently in clinical trials are expected to decrease the activity of neutrophilic CatC without affecting those of elastase-like serine proteases. PMID:26884336

  7. Trace determination of gadolinium in biomedical samples by diode laser-based multi-step resonance ionization mass spectrometry.

    PubMed

    Blaum, K; Geppert, C; Schreiber, W G; Hengstler, J G; Müller, P; Nörtershäuser, W; Wendt, K; Bushaw, B A

    2002-04-01

    The application of high-resolution multi-step resonance ionization mass spectrometry (RIMS) to the trace determination of the rare earth element gadolinium is described. Utilizing three-step resonant excitation into an autoionizing level, both isobaric and isotopic selectivity of >10⁷ were attained. An overall detection efficiency of approximately 10⁻⁷ and an isotope-specific detection limit of 1.5 × 10⁹ atoms have been demonstrated. When targeting the major isotope ¹⁵⁸Gd, this corresponds to a total Gd detection limit of 1.6 pg. Additionally, linear response has been demonstrated over a dynamic range of six orders of magnitude. The method has been used to determine the Gd content in various normal and tumor tissue samples, taken from a laboratory mouse shortly after injection of gadolinium diethylenetriaminepentaacetic acid dimeglumine (Gd-DTPA), which is used as a contrast agent for magnetic resonance imaging (MRI). The RIMS results show Gd concentrations that vary by more than two orders of magnitude (0.07–11.5 µg mL⁻¹) depending on the tissue type. This variability is similar to that observed in MRI scans that depict Gd-DTPA content in the mouse prior to dissection, and illustrates the potential for quantitative trace analysis in microsamples of biomedical materials. PMID:12012186
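The quoted figures are internally consistent: the isotope-specific limit of 1.5 × 10⁹ atoms of ¹⁵⁸Gd, scaled by that isotope's natural abundance, reproduces the stated ~1.6 pg total-Gd limit. A quick Python check (molar mass and abundance taken from standard isotope tables, not from the abstract):

```python
N_A = 6.022e23            # Avogadro's number, atoms/mol
atoms_158 = 1.5e9         # isotope-specific detection limit (atoms)
molar_mass_158 = 157.924  # molar mass of 158Gd, g/mol
abundance_158 = 0.2484    # natural abundance of 158Gd

mass_158_pg = atoms_158 * molar_mass_158 / N_A * 1e12  # pg of 158Gd
mass_total_pg = mass_158_pg / abundance_158            # pg of total Gd
# mass_total_pg comes out near 1.6 pg, matching the quoted limit
```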

  8. A pilot randomized, placebo-controlled, double-blind trial of a multistep herbal program for assisting smokers to quit.

    PubMed

    James, Gary D; Britton, Geraldine R; Sobczak, Janet; Rhodes-Keefe, Joyce; Sprague, Lori; Gueldner, Sarah H

    2012-12-01

    This pilot randomized controlled trial was designed to evaluate the effectiveness of an over-the-counter multistep herbal smoking cessation regimen, SmokeRx, that employs four different herbal formulations taken at different times during the program. Twenty-two subjects were randomized to a placebo group and 20 to the SmokeRx program. The results show that the odds of reduced or validated cessation of smoking were not significantly different between the groups at any juncture over the 6 months of the trial but that there was a trend for higher odds in the SmokeRx group. Subjects were also more likely to drop out of the placebo group (p = .06), suggesting a possible positive effect of the SmokeRx regimen. Overall, early dropouts (at 2 week follow-up) appeared less motivated to quit smoking, as they were more likely to be younger, had smoked more than 5 years, had greater difficulty refraining from smoking in places where it is forbidden, had fewer previous quit attempts, did not intend to quit smoking in the next month, and exercised fewer hours per week. These results suggest that a larger trial of SmokeRx may be warranted and that more studies that assess the efficacy of herbal formulas are needed to provide valid data for non-nicotine smoking cessation options. PMID:24622491

  9. Automatic Generation of Wide Dynamic Range Image without Pseudo-Edge Using Integration of Multi-Steps Exposure Images

    NASA Astrophysics Data System (ADS)

    Migiyama, Go; Sugimura, Atsuhiko; Osa, Atsushi; Miike, Hidetoshi

    Digital cameras are advancing rapidly, yet a captured image still differs from the scene as perceived by the naked eye: photographs of wide-dynamic-range scenes suffer blown-out highlights and crushed blacks, artifacts that rarely occur in human vision. The cause is the gap between the dynamic range of image sensors such as CCD and CMOS devices and that of the human visual system; the dynamic range of the captured image is far narrower than that of the perceived scene. To solve this problem, we propose an automatic method that determines an effective exposure range from the superposition of edges and integrates multi-step exposure images accordingly. In addition, we suppress pseudo-edges by blending exposure values. The result is a pseudo wide dynamic range image obtained automatically.
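The integration step can be illustrated with a simplified weighted blend of the exposure stack. This sketch uses a generic per-pixel "well-exposedness" weight rather than the authors' edge-superposition criterion, and the sigma value is illustrative:

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Blend a stack of multi-step exposure images (floats in [0, 1])
    into one wide-dynamic-range image. Each pixel is a weighted average
    across exposures, favoring values well inside the sensor range
    (neither blown out nor crushed)."""
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    # Gaussian weight centred on mid-grey; epsilon avoids divide-by-zero
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2)) + 1e-12
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)

# Toy stack: under-, mid-, and over-exposed frames of a flat scene.
frames = [np.full((4, 4), v) for v in (0.0, 0.5, 1.0)]
fused = fuse_exposures(frames)  # the well-exposed middle frame dominates
```

A per-region variant of this weighting, driven by where edges survive across exposures, is closer in spirit to the method described above.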

  10. Segmenting the Femoral Head and Acetabulum in the Hip Joint Automatically Using a Multi-Step Scheme

    NASA Astrophysics Data System (ADS)

    Wang, Ji; Cheng, Yuanzhi; Fu, Yili; Zhou, Shengjun; Tamura, Shinichi

    We describe a multi-step approach for automatic segmentation of the femoral head and the acetabulum in the hip joint from three-dimensional (3D) CT images. Our segmentation method consists of the following steps: 1) construction of the valley-emphasized image by subtracting valleys from the original images; 2) initial segmentation of the bone regions by using conventional techniques including the initial threshold and binary morphological operations from the valley-emphasized image; 3) further segmentation of the bone regions by using the iterative adaptive classification with the initial segmentation result; 4) detection of the rough bone boundaries based on the segmented bone regions; 5) 3D reconstruction of the bone surface using the rough bone boundaries obtained in step 4) by a network of triangles; 6) correction of all vertices of the 3D bone surface based on the normal direction of vertices; 7) adjustment of the bone surface based on the corrected vertices. We evaluated our approach on 35 CT patient data sets. Our experimental results show that our segmentation algorithm is more accurate and robust against noise than other conventional approaches for automatic segmentation of the femoral head and the acetabulum. The average root-mean-square (RMS) distance from manual reference segmentations created by experienced users was approximately 0.68 mm (in-plane resolution of the CT data).
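Steps 1 and 2 of the scheme can be sketched as follows. The structuring-element size and threshold are hypothetical, and `scipy.ndimage` stands in for whatever implementation the authors used; a grey-scale closing minus the original (black top-hat) is one standard way to realize "subtracting valleys":

```python
import numpy as np
from scipy import ndimage

def valley_emphasized(image, size=3):
    """Step 1: detect intensity valleys with a grey-scale closing
    (black top-hat) and subtract them from the original, deepening the
    thin dark gap between femoral head and acetabulum."""
    img = np.asarray(image, dtype=np.float64)
    valleys = ndimage.grey_closing(img, size=(size, size)) - img
    return img - valleys

def initial_bone_mask(image, threshold, size=3):
    """Step 2: initial bone segmentation by thresholding the
    valley-emphasized image, plus binary morphological clean-up."""
    mask = valley_emphasized(image, size) >= threshold
    return ndimage.binary_fill_holes(ndimage.binary_opening(mask))
```

On a synthetic bright region split by a one-pixel dark valley, the valley pixels are pushed further below the bone intensity, so a single threshold separates the two bone regions more reliably.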

  11. A multi-step reaction model for ignition of fully-dense Al-CuO nanocomposite powders

    NASA Astrophysics Data System (ADS)

    Stamatis, D.; Ermoline, A.; Dreizin, E. L.

    2012-12-01

    A multi-step reaction model is developed to describe heterogeneous processes occurring upon heating of an Al-CuO nanocomposite material prepared by arrested reactive milling. The reaction model couples a previously derived Cabrera-Mott oxidation mechanism describing initial, low temperature processes and an aluminium oxidation model including formation of different alumina polymorphs at increased film thicknesses and higher temperatures. The reaction model is tuned using traces measured by differential scanning calorimetry. Ignition is studied for thin powder layers and individual particles using respectively the heated filament (heating rates of 10³-10⁴ K s⁻¹) and laser ignition (heating rate ∼10⁶ K s⁻¹) experiments. The developed heterogeneous reaction model predicts a sharp temperature increase, which can be associated with ignition when the laser power approaches the experimental ignition threshold. In experiments, particles ignited by the laser beam are observed to explode, indicating a substantial gas release accompanying ignition. For the heated filament experiments, the model predicts exothermic reactions at the temperatures at which ignition is observed experimentally; however, strong thermal contact between the metal filament and powder prevents the model from predicting the thermal runaway. It is suggested that oxygen gas release from decomposing CuO, as observed from particles exploding upon ignition in the laser beam, disrupts the thermal contact of the powder and filament; this phenomenon must be included in the filament ignition model to enable prediction of the temperature runaway.
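The interplay of an imposed heating rate with multi-step Arrhenius self-heating can be caricatured with a simple Euler integration. The two-step kinetic parameters below are invented for illustration only, not the fitted Cabrera-Mott or polymorph values from the paper:

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Invented two-step Arrhenius parameters (A: 1/s, Ea: J/mol, q: J/kg);
# stand-ins for the paper's fitted multi-step kinetics.
STEPS = [
    {"A": 1e8,  "Ea": 1.2e5, "q": 2.0e5},   # low-temperature step
    {"A": 1e12, "Ea": 2.0e5, "q": 1.8e6},   # high-temperature step
]
CP = 1000.0  # effective specific heat, J/(kg K)

def chem_heating(temp):
    """Chemical self-heating rate (K/s) summed over all steps."""
    return sum(s["q"] * s["A"] * math.exp(-s["Ea"] / (R * temp))
               for s in STEPS) / CP

def ignition_temperature(beta, t0=300.0, dt=1e-5):
    """Euler-integrate dT/dt = beta + chem_heating(T); report the
    temperature at which self-heating first exceeds the imposed heating
    rate beta by 10x, a crude proxy for the sharp runaway the model
    predicts near the ignition threshold."""
    temp = t0
    while temp < 3000.0:
        q = chem_heating(temp)
        if q > 10.0 * beta:
            return temp
        temp += (beta + q) * dt
    return temp
```

Even this toy model shows the qualitative trend discussed above: a faster imposed heating rate (laser-like, ~10⁶ K/s) pushes the runaway to a higher temperature than filament-like heating (~10⁴ K/s).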

  12. Terahertz spectroscopic imaging using noncollinear electro-optic sampling and a multistep mirror without shifting the object

    NASA Astrophysics Data System (ADS)

    Itani, Norihiko; Maruyama, Kazunori; Hasegawa, Shin-ya; Wakana, Shinichi

    2012-10-01

    We previously developed a high-speed terahertz spectroscopic imaging method based on electro-optic sampling with a noncollinear geometry of the THz beam and probe laser beam, using a multistep mirror (MAST) in the path of the probe beam. We set the incident probe laser onto the MAST at a 45° angle to prevent interference between adjacent beams. However, this produced beam vignetting, so imaging had to be performed twice, between sample movements, which increased the imaging time accordingly. We therefore improved the probe-laser imaging system after reflection from the MAST to correct for the effects of diffraction. This prevents interference from adjacent beams and allows the angle of incidence on the MAST to be set to 0°, enabling the entire sample surface to be imaged in one measurement. As a result, we are able to perform measurements in 40 seconds, half the time of the previous method, and obtain a 28 × 28-pixel spectral image with a spatial resolution of 1.07 mm. To verify the imaging performance, we also measured test samples, showing that the shape and thickness of items inside an opaque plastic case can be distinguished using amplitude and phase images, and that metallic foreign objects can be detected. We also evaluated the method and were able to show the validity of the spectral imaging results by distinguishing the transmission or blocking of arbitrary frequency components.

  13. Multi-step approach for comparing the local air pollution contributions of conventional and innovative MSW thermo-chemical treatments.

    PubMed

    Ragazzi, M; Rada, E C

    2012-10-01

    In the sector of municipal solid waste management the debate on the performances of conventional and novel thermo-chemical technologies is still relevant. When a plant must be constructed, decision makers often select a technology prior to analyzing the local environmental impact of the available options, as this type of study is generally developed when the design of the plant has been carried out. Additionally, in the literature there is a lack of comparative analyses of the contributions to local air pollution from different technologies. The present study offers a multi-step approach, based on pollutant emission factors and atmospheric dilution coefficients, for a local comparative analysis. With this approach it is possible to check if some assumptions related to the advantages of the novel thermo-chemical technologies, in terms of local direct impact on air quality, can be applied to municipal solid waste treatment. The selected processes concern combustion, gasification and pyrolysis, alone or in combination. The pollutants considered are both carcinogenic and non-carcinogenic. A case study is presented concerning the location of a plant in an alpine region and its contribution to the local air pollution. Results show that differences among technologies are less than expected. The performance of each technology is discussed in detail. PMID:22795304
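The core of an emission-factor-times-dilution comparison reduces to one multiplication per pollutant and technology. A sketch of that calculation; every number below is a placeholder for illustration, not a value from the paper:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Placeholder emission factors (mg of pollutant per tonne of MSW treated);
# the paper's actual per-technology, per-pollutant factors are not shown.
EMISSION_FACTORS = {
    "combustion":   50.0,
    "gasification": 30.0,
    "pyrolysis":    20.0,
}

def local_contribution(ef_mg_per_t, waste_t_per_year, dilution_s_per_m3):
    """Ground-level concentration contribution (mg/m^3), computed as
    mean emission rate (mg/s) x atmospheric dilution coefficient
    (s/m^3) -- the multiplication at the heart of the multi-step
    comparative approach."""
    emission_mg_per_s = ef_mg_per_t * waste_t_per_year / SECONDS_PER_YEAR
    return emission_mg_per_s * dilution_s_per_m3

# Example: a 100,000 t/yr plant with a dilution coefficient of 1e-6 s/m^3
impact = {tech: local_contribution(ef, 1e5, 1e-6)
          for tech, ef in EMISSION_FACTORS.items()}
```

Comparing `impact` across technologies for each pollutant, carcinogenic and non-carcinogenic, gives the kind of side-by-side local contribution the study describes.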

  14. Multi-step reaction mechanism for F atom interactions with organosilicate glass and SiO x films

    NASA Astrophysics Data System (ADS)

    Mankelevich, Yuri A.; Voronina, Ekaterina N.; Rakhimova, Tatyana V.; Palov, Alexander P.; Lopaev, Dmitry V.; Zyryanov, Sergey M.; Baklanov, Mikhail R.

    2016-09-01

    An ab initio approach with the density functional theory (DFT) method was used to study F atom interactions with organosilicate glass (OSG)-based low-k dielectric films. Because of the complexity and significant modifications of the OSG surface structure during the interaction with radicals and etching, a variety of reactions between the surface groups and thermal F atoms can happen. For OSG film etching and damage, we propose a multi-step mechanism based on DFT static and dynamic simulations, which is consistent with the previously reported experimental observations. The important part of the proposed mechanism is the formation of pentavalent Si atoms on the OSG surface due to a quasi-chemisorption of the incident F atoms. The revealed mechanism of F atom incorporation into the OSG matrix explains the experimentally observed phenomena of fast fluorination without significant modification of the chemical structure. We demonstrate that the pentavalent Si states induce the weakening of adjacent Si–O bonds and their breaking under F atom flux. The calculated results allow us to propose a set of elementary chemical reactions of successive removal of CH3 and CH2 groups and fluorinated SiO x matrix etching.

  15. Altered expression of CKs 14/20 is an early event in a rat model of multistep bladder carcinogenesis.

    PubMed

    Gil da Costa, Rui M; Oliveira, Paula A; Vasconcelos-Nóbrega, Carmen; Arantes-Rodrigues, Regina; Pinto-Leite, Rosário; Colaço, Aura A; de la Cruz, Luis F; Lopes, Carlos

    2015-10-01

    Cytokeratins (CKs) 14 and 20 are promising markers for diagnosing urothelial lesions and for studying their prognosis and histogenesis. This work aimed to study the immunohistochemical staining patterns of CK14/20 during multistep carcinogenesis leading to papillary bladder cancer in a rat model. Thirty female Fischer 344 rats were divided into three groups: group 1 (control); group 2, which received N-butyl-N-(4-hydroxybutyl)nitrosamine (BBN) for 20 weeks plus 1 week without treatment; and group 3, which received BBN for 20 weeks plus 8 weeks without treatment. Bladder lesions were classified histologically. CK14 and CK20 immunostaining was assessed according to its distribution and intensity. In control animals, 0-25% of basal cells and umbrella cells stained positive for CK14 and CK20, respectively. In groups 2 and 3, nodular hyperplastic lesions showed normal CK20 and moderately increased CK14 staining (26-50% of cells). Dysplasia, squamous metaplasia, papilloma, papillary tumours of low malignant potential and low- and high-grade papillary carcinomas showed increased CK14 and CK20 immunostaining in all epithelial layers. Altered CK14 and CK20 expression is an early event in urothelial carcinogenesis and is present in a wide spectrum of urothelial superficial neoplastic and preneoplastic lesions. PMID:26515584

  16. Anisotropic multi-step etching for large-area fabrication of surface microstructures on stainless steel to control thermal radiation

    NASA Astrophysics Data System (ADS)

    Shimizu, M.; Yamada, T.; Sasaki, K.; Takada, A.; Nomura, H.; Iguchi, F.; Yugami, H.

    2015-04-01

    Controlling the thermal radiation spectra of materials is one of the promising ways to advance energy system efficiency. It is well known that the thermal radiation spectrum can be controlled through the introduction of periodic surface microstructures. Herein, a method for the large-area fabrication of periodic microstructures based on multi-step wet etching is described. The method consists of three main steps, i.e., resist mask fabrication via photolithography, electrochemical wet etching, and side wall protection. Using this method, high-aspect micro-holes (0.82 aspect ratio) arrayed with hexagonal symmetry were fabricated on a stainless steel substrate. The conventional wet etching process method typically provides an aspect ratio of 0.3. The optical absorption peak attributed to the fabricated micro-hole array appeared at 0.8 μm, and the peak absorbance exceeded 0.8 for the micro-holes with a 0.82 aspect ratio. While argon plasma etching in a vacuum chamber was used in the present study for the formation of the protective layer, atmospheric plasma etching should be possible and will expand the applicability of this new method for the large-area fabrication of high-aspect materials.

  17. Evaluation of a statewide program in genetic diseases.

    PubMed

    Mitchell, J A; Petroski, G

    1998-07-01

    We used the Genetics Office Automation System (GOAS), a database management system designed to facilitate collection and analysis of medical genetic data, to evaluate the Missouri Genetics Disease Program (MGDP). From 1985 through 1995, patient data were collected at four tertiary care genetic centers. The number of genetic visits per 100,000 people more than doubled from 1985 through 1995. The results of subpopulation analyses indicate that the MGDP has facilitated improvements in: (1) services for newborns and infants, (2) rural outreach programs, and (3) evaluation of the incidence and impact of genetic disorders. PMID:9677054

  18. Multi-step ion beam etching of sub-30 nm magnetic tunnel junctions for reducing leakage and MgO barrier damage

    SciTech Connect

    Chun, Sung-woo; Kim, Daehong; Kwon, Jihun; Kim, Bongho; Choi, Seonjun; Lee, Seung-Beck

    2012-04-01

    We have demonstrated the fabrication of sub-30 nm magnetic tunnel junctions (MTJs) with perpendicular magnetic anisotropy. The multi-step ion beam etching (IBE) process performed for 18 min between 45° and 30°, at 500 V combined ion supply voltage, resulted in a 55 nm tall MTJ with 28 nm diameter. We used a negative tone electron beam resist as the hard mask, which maintained its lateral dimension during the IBE, allowing almost vertical pillar side profiles. The measurement results showed a tunnel magneto-resistance ratio of 13% at 1 kΩ junction resistance. With further optimization of IBE energy and the multi-step etching process, it will be possible to fabricate perpendicularly oriented MTJs for future sub-30 nm non-volatile magnetic memory applications.
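The quoted figure of merit is the tunnel magneto-resistance ratio, (R_AP − R_P)/R_P. A one-liner makes the definition concrete; taking the quoted ~1 kΩ junction resistance as the parallel-state resistance R_P is an assumption for illustration:

```python
def tmr_ratio(r_parallel, r_antiparallel):
    """Tunnel magneto-resistance ratio, (R_AP - R_P) / R_P."""
    return (r_antiparallel - r_parallel) / r_parallel

# Taking the quoted ~1 kOhm junction resistance as R_P (an assumption),
# the reported 13% ratio implies an antiparallel resistance near 1.13 kOhm.
r_p = 1000.0
r_ap = r_p * 1.13
```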

  19. Influence of multi-step heat treatments in creep age forming of 7075 aluminum alloy: Optimization for springback, strength and exfoliation corrosion

    SciTech Connect

    Arabi Jeshvaghani, R.; Zohdi, H.; Shahverdi, H.R.; Bozorg, M.; Hadavi, S.M.M.

    2012-11-15

    Multi-step heat treatments comprising high-temperature forming (150 °C/24 h plus 190 °C for several minutes) and subsequent low-temperature forming (120 °C for 24 h) were developed in creep age forming of 7075 aluminum alloy to decrease springback and exfoliation corrosion susceptibility without reduction in tensile properties. The results show that the multi-step heat treatment gives low springback and the best combination of exfoliation corrosion resistance and tensile strength. The lower springback is attributed to dislocation recovery and greater stress relaxation at higher temperature. Transmission electron microscopy observations show that corrosion resistance is improved due to the enlargement in the size and inter-particle distance of the grain-boundary precipitates. Furthermore, the achievement of high strength is related to the uniform distribution of ultrafine η′ precipitates within grains. - Highlights: • Creep age forming developed for manufacturing of aircraft wing panels from aluminum alloy. • A good combination of properties with minimal springback is required in this component. • This requirement can be improved through appropriate heat treatments. • Multi-step cycles developed in creep age forming of AA7075 for improving springback and properties. • Results indicate simultaneous enhancement of properties and shape accuracy (lower springback).

  20. Genetic Aberrations in Childhood Acute Lymphoblastic Leukaemia: Application of High-Density Single Nucleotide Polymorphism Array

    PubMed Central

    Sulong, Sarina

    2010-01-01

    Screening of the entire human genome using high-density single nucleotide polymorphism array (SNPA) has become a powerful technique used in cancer genetics and population genetics studies. The GeneChip® Mapping Array, introduced by Affymetrix, is one SNPA platform utilised for genotyping studies. This GeneChip system allows researchers to gain a comprehensive view of cancer biology on a single platform for the quantification of chromosomal amplifications, deletions, and loss of heterozygosity or for allelic imbalance studies. Importantly, this array analysis has the potential to reveal novel genetic findings involved in the multistep development of cancer. Given the importance of genetic factors in leukaemogenesis and the usefulness of screening the whole genome, SNPA analysis has been utilised in many studies to characterise genetic aberrations in childhood acute lymphoblastic leukaemia. PMID:22135543

  1. Automated harvesting and 2-step purification of unclarified mammalian cell-culture broths containing antibodies.

    PubMed

    Holenstein, Fabian; Eriksson, Christer; Erlandsson, Ioana; Norrman, Nils; Simon, Jill; Danielsson, Åke; Milicov, Adriana; Schindler, Patrick; Schlaeppi, Jean-Marc

    2015-10-30

    Therapeutic monoclonal antibodies represent one of the fastest growing segments in the pharmaceutical market. The growth of the segment has necessitated development of new efficient and cost saving platforms for the preparation and analysis of early candidates for faster and better antibody selection and characterization. We report on a new integrated platform for automated harvesting of whole unclarified cell-culture broths, followed by in-line tandem affinity-capture, pH neutralization and size-exclusion chromatography of recombinant antibodies expressed transiently in mammalian human embryonic kidney 293T-cells at the 1-L scale. The system consists of two bench-top chromatography instruments connected to a central unit with eight disposable filtration devices used for loading and filtering the cell cultures. The staggered parallel multi-step configuration of the system allows unattended processing of eight samples in less than 24 h. The system was validated with a random panel of 45 whole-cell culture broths containing recombinant antibodies in the early profiling phase. The results showed that the overall performances of the preparative automated system were higher compared to the conventional downstream process including manual harvesting and purification. The mean recovery of purified material from the culture broth was 66.7%, representing a 20% increase compared to that of the manual process. Moreover, the automated process reduced by 3-fold the amount of residual aggregates in the purified antibody fractions, indicating that the automated system allows the cost-efficient and timely preparation of antibodies in the 20-200 mg range, and covers the requirements for early in vitro and in vivo profiling and formulation of these drug candidates. PMID:26431859

  2. Programmable Automated Welding System (PAWS)

    NASA Technical Reports Server (NTRS)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  3. Automated techniques for spacecraft monitoring

    NASA Technical Reports Server (NTRS)

    Segnar, H. R.

    1972-01-01

    The feasibility of implementing automated spacecraft monitoring depends on four factors: sufficient computer resources, suitable monitoring function definitions, adequate spacecraft data, and effective and economical test systems. The advantages of automated monitoring lie in the decision-making speed of the computer and the continuous monitoring coverage provided by an automated monitoring program. Use of these advantages introduces a new concept of spacecraft monitoring in which system specialists, ground based or onboard, freed from routine and tedious monitoring, could devote their expertise to unprogrammed or contingency situations.

  4. Automated Fluid Interface System (AFIS)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Automated remote fluid servicing will be necessary for future space missions, as future satellites will be designed for on-orbit consumable replenishment. In order to develop an on-orbit remote servicing capability, a standard interface between a tanker and the receiving satellite is needed. The objective of the Automated Fluid Interface System (AFIS) program is to design, fabricate, and functionally demonstrate compliance with all design requirements for an automated fluid interface system. A description and documentation of the Fairchild AFIS design is provided.

  5. Genetic Mapping

    MedlinePlus

    ... Genetic Education Resources for Teachers Genomic Careers National DNA Day Online Education Kit Online Genetics Education Resources ... prevalent. Using various laboratory techniques, the scientists isolate DNA from these samples and examine it for unique ...

  6. Genetic counseling

    MedlinePlus

    ... this page: //medlineplus.gov/ency/patientinstructions/000510.htm Genetic counseling To use the sharing features on this ... cystic fibrosis or Down syndrome. Who May Want Genetic Counseling? It is up to you whether or ...

  7. Genetic counseling

    MedlinePlus

    Genetics is the study of heredity, the process of a parent passing certain genes on to their ... certain diseases are also often determined by genes. Genetic counseling is the process where parents can learn ...

  8. Genetic Disorders

    MedlinePlus

    ... This can cause a medical condition called a genetic disorder. You can inherit a gene mutation from ... during your lifetime. There are three types of genetic disorders: Single-gene disorders, where a mutation affects ...

  9. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  10. Genetic modification and genetic determinism

    PubMed Central

    Resnik, David B; Vorhaus, Daniel B

    2006-01-01

    In this article we examine four objections to the genetic modification of human beings: the freedom argument, the giftedness argument, the authenticity argument, and the uniqueness argument. We then demonstrate that each of these arguments against genetic modification assumes a strong version of genetic determinism. Since these strong deterministic assumptions are false, the arguments against genetic modification, which assume and depend upon these assumptions, are therefore unsound. Serious discussion of the morality of genetic modification, and the development of sound science policy, should be driven by arguments that address the actual consequences of genetic modification for individuals and society, not by ones propped up by false or misleading biological assumptions. PMID:16800884

  11. Imaging Genetics

    ERIC Educational Resources Information Center

    Munoz, Karen E.; Hyde, Luke W.; Hariri, Ahmad R.

    2009-01-01

    Imaging genetics is an experimental strategy that integrates molecular genetics and neuroimaging technology to examine biological mechanisms that mediate differences in behavior and the risks for psychiatric disorder. The basic principles in imaging genetics and the development of the field are discussed.

  12. Fuzzy Control/Space Station automation

    NASA Technical Reports Server (NTRS)

    Gersh, Mark

    1990-01-01

    Viewgraphs on fuzzy control/space station automation are presented. Topics covered include: Space Station Freedom (SSF); SSF evolution; factors pointing to automation & robotics (A&R); astronaut office inputs concerning A&R; flight system automation and ground operations applications; transition definition program; and advanced automation software tools.

  13. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  14. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  15. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  16. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  17. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  18. An Automation Survival Guide for Media Centers.

    ERIC Educational Resources Information Center

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  19. Office Automation Boosts University's Productivity.

    ERIC Educational Resources Information Center

    School Business Affairs, 1986

    1986-01-01

    The University of Pittsburgh has a 2-year agreement designating the Xerox Corporation as the primary supplier of word processing and related office automation equipment in order to increase productivity and more efficient use of campus resources. (MLF)

  20. Office Automation at Memphis State.

    ERIC Educational Resources Information Center

    Smith, R. Eugene; And Others

    1986-01-01

    The development of a university-wide office automation plan, beginning with a short-range pilot project and a five-year plan for the entire organization with the potential for modular implementation, is described. (MSE)

  1. Real Automation in the Field

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Mayero, Micaela; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    We provide a package of strategies for automation of non-linear arithmetic in PVS. In particular, we describe a simplification procedure for the field of real numbers and a strategy for cancellation of common terms.
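    The kind of cancellation the abstract refers to can be illustrated outside PVS. The sketch below is a hand-rolled Python illustration (not the PVS strategy itself) that solves a*x + b = c*x + d over the rationals by cancelling the common x-terms.

    ```python
    from fractions import Fraction

    def solve_linear(a, b, c, d):
        """Solve a*x + b = c*x + d over the rationals by cancelling
        the common x-terms: (a - c)*x = d - b."""
        a, b, c, d = map(Fraction, (a, b, c, d))
        if a == c:
            raise ValueError("no unique solution: x-terms cancel completely")
        return (d - b) / (a - c)

    x = solve_linear(3, 5, 1, 9)   # 3x + 5 = x + 9  ->  x = 2
    print(x)
    ```

    A proof assistant's field strategy performs the analogous rewriting symbolically, discharging the side condition that the cancelled factor is nonzero.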

  2. Automation of antimicrobial activity screening.

    PubMed

    Forry, Samuel P; Madonna, Megan C; López-Pérez, Daneli; Lin, Nancy J; Pasco, Madeleine D

    2016-03-01

    Manual and automated methods were compared for routine screening of compounds for antimicrobial activity. Automation generally accelerated assays and required less user intervention while producing comparable results. Automated protocols were validated for planktonic, biofilm, and agar cultures of the oral microbe Streptococcus mutans, which is commonly associated with tooth decay. Toxicity assays for the known antimicrobial compound cetylpyridinium chloride (CPC) were validated against planktonic, biofilm-forming, and 24 h biofilm culture conditions, and several commonly reported toxicity/antimicrobial activity measures were evaluated: the 50% inhibitory concentration (IC50), the minimum inhibitory concentration (MIC), and the minimum bactericidal concentration (MBC). Using automated methods, three halide salts of cetylpyridinium (CPC, CPB, CPI) were rapidly screened, with no detectable effect of the counter ion on antimicrobial activity. PMID:26970766
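    As a rough illustration of how an IC50 can be read off automated plate-reader data, the sketch below log-interpolates the concentration at 50% response from a dose-response series. The data points and units are made up for illustration; they are not CPC measurements from the study.

    ```python
    import math

    def ic50(concs, responses):
        """Interpolate (on a log-concentration scale) the concentration
        giving 50% of the untreated response. Inputs sorted by concentration."""
        for (c0, r0), (c1, r1) in zip(zip(concs, responses),
                                      zip(concs[1:], responses[1:])):
            if r0 >= 50.0 >= r1:  # response falls through 50% in this interval
                f = (r0 - 50.0) / (r0 - r1)
                return 10 ** (math.log10(c0) + f * (math.log10(c1) - math.log10(c0)))
        raise ValueError("response never crosses 50%")

    # Synthetic dose-response data (percent of untreated growth).
    concs = [0.1, 1.0, 10.0, 100.0]   # hypothetical concentrations, ug/mL
    resp  = [95.0, 80.0, 30.0, 5.0]
    print(f"IC50 ~ {ic50(concs, resp):.2f}")
    ```

    Real screening software typically fits a four-parameter logistic curve instead of interpolating, but the interpolation conveys the idea.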

  3. Maintenance of Automated Library Systems.

    ERIC Educational Resources Information Center

    Epstein, Susan Baerg

    1983-01-01

    Discussion of the maintenance of both the software and hardware in an automated library system highlights maintenance by the vendor, contracts and costs, the maintenance log, downtime, and planning for trouble. (EJS)

  4. Automating the Purple Crow Lidar

    NASA Astrophysics Data System (ADS)

    Hicks, Shannon; Sica, R. J.; Argall, P. S.

    2016-06-01

    The Purple Crow LiDAR (PCL) was built to measure short and long term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  5. Cockpit avionics integration and automation

    NASA Technical Reports Server (NTRS)

    Pischke, Keith M.

    1990-01-01

    Information on cockpit avionics integration and automation is given in viewgraph form, with a number of photographs. The benefits of cockpit integration are listed. The MD-11 flight guidance/flight deck system is illustrated.

  6. Towards automated traceability maintenance.

    PubMed

    Mäder, Patrick; Gotel, Orlena

    2012-10-01

    Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided. PMID:23471308
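    The event-to-rule matching the abstract describes can be sketched as follows. The event names, rule structure, and link representation are invented for illustration; they are not the authors' actual rule set.

    ```python
    # Hypothetical sketch of rule-driven traceability maintenance:
    # captured change events are matched against predefined activity rules,
    # and matching rules direct how impacted trace links are updated.
    trace_links = {("REQ-1", "ClassA"), ("REQ-2", "ClassB")}

    RULES = [
        # (event kind, update function applied to the link set)
        ("rename", lambda links, old, new:
            {(req, new if elem == old else elem) for req, elem in links}),
        ("delete", lambda links, old, _:
            {(req, elem) for req, elem in links if elem != old}),
    ]

    def apply_event(kind, old, new=None):
        """Apply the first rule matching this change event to the link set."""
        global trace_links
        for rule_kind, update in RULES:
            if rule_kind == kind:
                trace_links = update(trace_links, old, new)
                return

    apply_event("rename", "ClassA", "ClassA2")  # model element renamed in the UML tool
    print(sorted(trace_links))
    ```

    The paper's approach additionally groups several low-level events into a recognized development activity before matching, which this sketch collapses into a single event for brevity.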

  7. Towards automated traceability maintenance

    PubMed Central

    Mäder, Patrick; Gotel, Orlena

    2012-01-01

    Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided. PMID:23471308

  8. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.

  9. Automated Supernova Discovery (Abstract)

    NASA Astrophysics Data System (ADS)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae, as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken on every cloud-free night, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  10. Multifunction automated crawling system

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph (Inventor); Joffe, Benjamin (Inventor); Backes, Paul Gregory (Inventor)

    1999-01-01

    The present invention is an automated crawling robot system including a platform, a first leg assembly, a second leg assembly, first and second rails attached to the platform, and an onboard electronic computer controller. The first leg assembly has an intermittent coupling device and the second leg assembly has an intermittent coupling device for intermittently coupling the respective first and second leg assemblies to a particular object. The first and second leg assemblies are slidably coupled to the rail assembly and are slidably driven by motors to thereby allow linear movement. In addition, the first leg assembly is rotary driven by a rotary motor to thereby provide rotary motion relative to the platform. To effectuate motion, the intermittent coupling devices of the first and second leg assemblies alternately couple the respective first and second leg assemblies to an object. This motion is done while simultaneously moving one of the leg assemblies linearly in the desired direction and preparing the next step. This arrangement allows the crawler of the present invention to traverse an object in a range of motion covering 360 degrees.
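    The alternating couple/slide gait described above can be sketched as a simple step cycle. The stride length, the assumption that the platform advances half a stride per leg move, and the leg names are all invented for illustration; the patent text does not specify them.

    ```python
    # Hypothetical sketch of the alternating-grip crawl cycle: one leg
    # assembly stays coupled to the surface while the other releases,
    # slides forward along the rails, and re-couples.
    def crawl(position, n_steps, stride=1.0):
        coupled = {"leg1": True, "leg2": True}
        for step in range(n_steps):
            moving = "leg1" if step % 2 == 0 else "leg2"
            coupled[moving] = False      # release the moving leg's grip
            position += stride / 2       # slide it forward (assumed half-stride advance)
            coupled[moving] = True       # re-grip before the other leg moves
        return position

    print(crawl(0.0, 4))  # four leg moves = two full strides
    ```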

  11. Automated document analysis system

    NASA Astrophysics Data System (ADS)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and are optical character recognized (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If the document is not in English, the document is machine translated to English. The documents are searched for keywords and key features in either the native language or translated English. The user can quickly review the document to determine if it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.
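    The processing flow described above (ingest/OCR, language identification, translation, keyword search, triage) can be sketched as a pipeline. The stub detector, stub translator, and watch list below are placeholders standing in for real engines; they are not the actual ADAS components.

    ```python
    # Hypothetical sketch of an ADAS-style processing flow. The language
    # detector and translator are trivial stubs, not real engines.
    KEYWORDS = {"shipment", "meeting", "transfer"}   # placeholder watch list

    def detect_language(text):
        return "en" if text.isascii() else "other"  # crude stand-in detector

    def translate_to_english(text):
        return text  # stub: a real system would call a machine-translation engine

    def analyze(document):
        """Run one document through the detect -> translate -> search flow."""
        lang = detect_language(document)
        english = document if lang == "en" else translate_to_english(document)
        hits = sorted(w for w in KEYWORDS if w in english.lower())
        return {"language": lang, "keyword_hits": hits,
                "needs_human_translation": bool(hits)}

    report = analyze("Notes about the shipment and the meeting schedule.")
    print(report)
    ```

    The triage flag mirrors the abstract's point: automated screening decides which documents merit detailed, verbatim human translation.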

  12. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The Automated Microbial Metabolism Laboratory (AMML) 1971-1972 program involved the investigation of three separate life-detection schemes. The first was the continued development of the labeled-release experiment. The possibility of reusing chambers without intermediate sterilization, to provide comparative biochemical information, was tested. Findings show that individual substrates or concentrations of antimetabolites may be added sequentially to a single test chamber. The second detection system investigated for possible inclusion in the AMML package of assays was nitrogen fixation as detected by acetylene reduction. Thirdly, a series of preliminary steps were taken to investigate the feasibility of detecting biopolymers in soil. A strategy for the safe return to Earth of a Mars sample prior to manned landings on Mars is outlined. The program assumes that the probability of indigenous life on Mars is unity and then broadly presents the procedures for acquisition and analysis of the Mars sample in a manner to satisfy the scientific community and the public that adequate safeguards are being taken.

  13. Automated Gas Distribution System

    NASA Astrophysics Data System (ADS)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few and prized cyclotrons in the country. Behind the scenes of the cyclotron is a confusing and dangerous setup of the ion sources that supply the cyclotron with particles for acceleration. Using this machine involves a time-consuming and wasteful step-by-step process of switching gases, purging, and other important operations that must be done manually to keep the system functioning properly while also maintaining the safety of the working environment. Developing a new gas distribution system for the ion source prevents many of the problems generated by the older manual setup. The new system can be controlled manually more easily than before but, like most of the technology and machines in the cyclotron, is mainly operated through software developed in the graphical programming environment LabVIEW. The automated gas distribution system provides multiple ports for a selection of different gases, to decrease the amount of gas wasted when switching gases, and a port for the vacuum, to decrease the amount of time spent purging the manifold. The LabVIEW software makes the operation of the cyclotron and ion sources easier and safer for anyone to use.

  14. Automated call tracking systems

    SciTech Connect

    Hardesty, C.

    1993-03-01

    User Services groups are on the front line of user support. We are the first to hear about problems. The speed, accuracy, and intelligence with which we respond determine the user's perception of our effectiveness and our commitment to quality and service. To keep pace with the complex changes at our sites, we must have tools to help build a knowledge base of solutions, a history base of our users, and a record of every problem encountered. Recently, I completed a survey of twenty sites similar to the National Energy Research Supercomputer Center (NERSC). This informal survey reveals that 27% of the sites use a paper system to log calls, 60% employ homegrown automated call-tracking systems, and 13% use a vendor-supplied system. Fifty-four percent of those using homegrown systems are exploring the merits of switching to a vendor-supplied system. The purpose of this paper is to provide guidelines for evaluating a call-tracking system. In addition, insights are provided to assist User Services groups in selecting a system that fits their needs.

  15. Automated anomaly detection processor

    NASA Astrophysics Data System (ADS)

    Kraiman, James B.; Arouh, Scott L.; Webb, Michael L.

    2002-07-01

    Robust exploitation of tracking and surveillance data will provide an early warning and cueing capability for military and civilian Law Enforcement Agency operations. This will improve dynamic tasking of limited resources and hence operational efficiency. The challenge is to rapidly identify threat activity within a huge background of noncombatant traffic. We discuss development of an Automated Anomaly Detection Processor (AADP) that exploits multi-INT, multi-sensor tracking and surveillance data to rapidly identify and characterize events and/or objects of military interest, without requiring operators to specify threat behaviors or templates. The AADP has successfully detected an anomaly in traffic patterns in Los Angeles, analyzed ship track data collected during a Fleet Battle Experiment to detect simulated mine laying behavior amongst maritime noncombatants, and is currently under development for surface vessel tracking within the Coast Guard's Vessel Traffic Service to support port security, ship inspection, and harbor traffic control missions, and to monitor medical surveillance databases for early alert of a bioterrorist attack. The AADP can also be integrated into combat simulations to enhance model fidelity of multi-sensor fusion effects in military operations.
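    A minimal flavor of template-free anomaly detection is a statistical baseline test like the z-score sketch below. This is a generic illustration of flagging outliers against learned normal traffic, not the AADP algorithm, and the traffic counts are invented.

    ```python
    import statistics

    def flag_anomalies(counts, threshold=2.5):
        """Flag observations more than `threshold` standard deviations
        from the series mean (a generic baseline, not AADP)."""
        mean = statistics.fmean(counts)
        sd = statistics.stdev(counts)
        return [i for i, c in enumerate(counts)
                if sd > 0 and abs(c - mean) / sd > threshold]

    # Hypothetical hourly vessel counts with one injected spike at index 6.
    traffic = [20, 22, 19, 21, 20, 23, 95, 21, 20, 22]
    print(flag_anomalies(traffic))
    ```

    Operational systems replace the global mean and standard deviation with models of normal behavior learned per track, per sensor, and per context, but the principle of scoring deviation from a learned baseline is the same.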

  16. Evolution paths for advanced automation

    NASA Technical Reports Server (NTRS)

    Healey, Kathleen J.

    1990-01-01

    As Space Station Freedom (SSF) evolves, increased automation and autonomy will be required to meet Space Station Freedom Program (SSFP) objectives. As a precursor to the use of advanced automation within the SSFP, especially if it is to be used on SSF (e.g., to automate the operation of the flight systems), the underlying technologies will need to be elevated to a high level of readiness to ensure safe and effective operations. Ground facilities supporting the development of these flight systems, from research and development laboratories through formal hardware and software development environments, will be responsible for achieving these levels of technology readiness. These facilities will need to evolve to support the general evolution of the SSFP, including support for the increasing use of advanced automation. The SSF Advanced Development Program has funded a study to define evolution paths for advanced automation within the SSFP's ground-based facilities which will enable, promote, and accelerate the appropriate use of advanced automation on board SSF. The current capability of the test beds and facilities, such as the Software Support Environment, with regard to advanced automation has been assessed, and their desired evolutionary capabilities have been defined. Plans and guidelines for achieving this necessary capability have been constructed. The approach taken has combined in-depth interviews of test-bed personnel at all SSF Work Package centers with awareness of relevant state-of-the-art technology and technology-insertion methodologies. Key recommendations from the study include advocating a NASA-wide task force for advanced automation and creating software prototype transition environments to facilitate the incorporation of advanced automation in the SSFP.

  17. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  18. Technology modernization assessment flexible automation

    SciTech Connect

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to revise its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation in industry in general. This report investigates the benefits of and barriers to automation and concludes that, while significant benefits do exist, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on the results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  19. Automate small refinery blending operations

    SciTech Connect

    Bauman, D.E.; Mellott, M.T.

    1981-11-01

    More than ever, small refiners must become more efficient to remain competitive. Increased automation can contribute significantly to improving operations and reducing costs. Presented is a method of automating the blending operation that incorporates unique features and equipment. The system has proven successful in reducing costs and manpower and in improving product quality. The discussion is presented under the headings: basic blending system; digital blender; knock engines and octane computer; programmable logic controller; flowmeters; lead additive systems.

  20. An Automated Microwave-Assisted Synthesis Purification System for Rapid Generation of Compound Libraries.

    PubMed

    Tu, Noah P; Searle, Philip A; Sarris, Kathy

    2016-06-01

    A novel methodology for the synthesis and purification of drug-like compound libraries has been developed through the use of a microwave reactor with an integrated high-performance liquid chromatography-mass spectrometry (HPLC-MS) system. The strategy uses a fully automated synthesizer with a microwave as energy source and robotic components for weighing and dispensing of solid reagents, handling liquid reagents, capper/crimper of microwave reaction tube assemblies, and transportation. Crude reaction products were filtered through solid-phase extraction cartridges and injected directly onto a reverse-phase chromatography column via an injection valve. For multistep synthesis, crude products were passed through scavenger resins and reintroduced for subsequent reactions. All synthetic and purification steps were conducted under full automation with no handling or isolation of intermediates, to afford the desired purified products. This approach opens the way to highly efficient generation of drug-like compounds as part of a lead discovery strategy or within a lead optimization program. PMID:26085482

  1. A Multistep Maturity Model for the Implementation of Electronic and Computable Diagnostic Clinical Prediction Rules (eCPRs)

    PubMed Central

    Corrigan, Derek; McDonnell, Ronan; Zarabzadeh, Atieh; Fahey, Tom

    2015-01-01

    Introduction: The use of Clinical Prediction Rules (CPRs) has been advocated as one way of implementing actionable evidence-based rules in clinical practice. The current highly manual nature of deriving CPRs makes them difficult to use and maintain. Addressing the known limitations of CPRs requires implementing more flexible and dynamic models of CPR development. We describe the application of Information and Communication Technology (ICT) to provide a platform for the derivation and dissemination of CPRs derived through analysis and continual learning from electronic patient data. Model Components: We propose a multistep maturity model for constructing electronic and computable CPRs (eCPRs). The model has six levels, from the lowest level of CPR maturity (literature-based CPRs) to a fully electronic and computable service-oriented model of CPRs that are sensitive to specific demographic patient populations. We describe examples of implementations of the core model components, focusing on CPR representation, interoperability, electronic dissemination, CPR learning, and user interface requirements. Conclusion: The traditional focus on derivation and narrow validation of CPRs has severely limited their wider acceptance. The evolution and maturity model described here outlines a progression toward eCPRs consistent with the vision of a learning health system (LHS), using central repositories of CPR knowledge, accessible open standards, and generalizable models to avoid repetition of previous work. This is useful for developing more ambitious strategies to address limitations of the traditional CPR development life cycle. The model described here is a starting point for promoting discussion about what a more dynamic CPR development process should look like. PMID:26290890
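    At the higher maturity levels, a CPR becomes a machine-evaluable artifact rather than a paper description. The sketch below shows one generic way to encode a points-based rule as data plus an evaluator; the rule name, criteria, weights, and threshold are entirely invented for illustration and do not represent any validated CPR.

    ```python
    # Hypothetical encoding of a computable clinical prediction rule (eCPR):
    # the rule is plain data (criteria + points + decision threshold), so it
    # can be stored in a repository, shared, and evaluated by software.
    RULE = {
        "name": "demo-rule (invented, not validated)",
        "criteria": [
            ("age_over_65", lambda p: p["age"] > 65, 1),
            ("fever",       lambda p: p["temp_c"] >= 38.0, 1),
            ("comorbidity", lambda p: p["comorbid"], 2),
        ],
        "refer_threshold": 2,
    }

    def evaluate(rule, patient):
        """Score a patient record against the rule's criteria."""
        score = sum(points for _, test, points in rule["criteria"] if test(patient))
        return {"score": score, "refer": score >= rule["refer_threshold"]}

    print(evaluate(RULE, {"age": 70, "temp_c": 38.5, "comorbid": False}))
    ```

    A service-oriented eCPR (the model's top level) would expose this evaluator behind an interoperable interface and update the rule's parameters as new patient data accumulate.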

  2. Imaging and spectroscopic comparison of multi-step methods to form DNA arrays based on the biotin-streptavidin system.

    PubMed

    Gajos, Katarzyna; Petrou, Panagiota; Budkowski, Andrzej; Awsiuk, Kamil; Bernasik, Andrzej; Misiakos, Konstantinos; Rysz, Jakub; Raptis, Ioannis; Kakabakos, Sotirios

    2015-02-21

    Three multi-step multi-molecular approaches using the biotin-streptavidin system to contact-print DNA arrays on SiO2 surfaces modified with (3-glycidoxypropyl)trimethoxysilane are examined after each deposition/reaction step by atomic force microscopy, X-ray photoelectron spectroscopy and time of flight secondary ion mass spectrometry. Surface modification involves the spotting of preformed conjugates of biotinylated oligonucleotides with streptavidin onto surfaces coated with biotinylated bovine serum albumin b-BSA (approach I) or the spotting of biotinylated oligonucleotides onto a streptavidin coating, the latter prepared through a reaction with immobilized b-BSA (approach II) or direct adsorption (approach III). AFM micrographs, quantified by autocorrelation and height histogram parameters (e.g. roughness), reveal uniform coverage after each modification step with distinct nanostructures after the reaction of biotinylated BSA with streptavidin or of a streptavidin conjugate with biotinylated oligonucleotides. XPS relates the immobilization of biomolecules with covalent binding to the epoxy-silanized surface. Protein coverage, estimated from photoelectron attenuation, shows that regarding streptavidin the highest and the lowest immobilization efficiency is achieved by following approaches I and III, respectively, as confirmed by TOF-SIMS microanalysis. The size of the DNA spot reflects the contact radius of the printed droplet and increases with protein coverage (and roughness) prior to the spotting, as epoxy-silanized surfaces are hardly hydrophilic. Representative TOF-SIMS images show sub-millimeter spots: uniform for approach I, doughnut-like (with a small non-zero minimum) for approach II, both with coffee-rings or peak-shaped for approach III. Spot features, originating from pinned contact lines and DNA surface binding and revealed by complementary molecular distributions (all material, DNA, streptavidin, BSA, epoxy, SiO2), indicate two modes of droplet
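
    The protein coverage "estimated from photoelectron attenuation" mentioned above rests on the standard exponential attenuation law for photoelectrons traversing an overlayer. A minimal sketch of that calculation (the IMFP value and intensity ratio below are illustrative, not the paper's data):

```python
import math

def overlayer_thickness(i_ratio, imfp_nm, takeoff_deg=0.0):
    """Estimate overlayer (e.g. protein) thickness from attenuation of a
    substrate photoelectron signal: I = I0 * exp(-d / (lambda * cos(theta))).
    i_ratio     -- attenuated/unattenuated substrate intensity (I/I0)
    imfp_nm     -- inelastic mean free path of the photoelectrons (nm)
    takeoff_deg -- emission angle measured from the surface normal
    """
    cos_t = math.cos(math.radians(takeoff_deg))
    return -imfp_nm * cos_t * math.log(i_ratio)

# Illustrative numbers: substrate signal reduced to 37% of the bare-surface
# value, assuming an IMFP of 3 nm and normal emission -> d of roughly 3 nm.
d = overlayer_thickness(0.37, 3.0)
```

    Thicker (denser) protein layers attenuate the substrate signal more strongly, which is why the approach with the highest streptavidin immobilization efficiency also shows the weakest substrate intensity.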

  3. Multi-Step Fibrinogen Binding to the Integrin αIIbβ3 Detected Using Force Spectroscopy

    PubMed Central

    Litvinov, Rustem I.; Bennett, Joel S.; Weisel, John W.; Shuman, Henry

    2005-01-01

    The regulated ability of integrin αIIbβ3 to bind fibrinogen plays a crucial role in platelet aggregation and hemostasis. We have developed a model system based on laser tweezers, enabling us to measure the specific rupture forces needed to separate single receptor-ligand complexes. First, we performed a thorough and statistically representative analysis of nonspecific protein-protein binding versus specific αIIbβ3-fibrinogen interactions, combined with experimental evidence that single-molecule interactions were being measured. The rupture force distribution of purified αIIbβ3 and fibrinogen, covalently attached to underlying surfaces, ranged from ∼20 to 150 pN. This distribution could be fit with the sum of an exponential curve for weak to moderate (20–60 pN) forces and a Gaussian curve for strong (>60 pN) rupture forces that peaked at 80–90 pN. The interactions corresponding to these rupture force regimes differed in their susceptibility to αIIbβ3 antagonists or Mn2+, an αIIbβ3 activator. Varying the surface density of fibrinogen changed the total binding probability linearly >3.5-fold but did not affect the shape of the rupture force distribution, indicating that the measurements represent single-molecule binding. The yield strength of αIIbβ3-fibrinogen interactions was independent of the loading rate (160–16,000 pN/s), whereas their binding probability markedly correlated with the duration of contact. In aggregate, the data provide evidence for complex multi-step binding/unbinding pathways of αIIbβ3 and fibrinogen revealed at the single-molecule level. PMID:16040750
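
    The exponential-plus-Gaussian decomposition of the rupture force distribution can be reproduced on synthetic data. The sketch below (invented histogram values, not the authors' measurements) fits the two-component model with SciPy:

```python
import numpy as np
from scipy.optimize import curve_fit

def rupture_model(f, a, lam, b, mu, sigma):
    """Exponential component (weak/moderate forces) plus Gaussian component
    (strong, specific ruptures), as in the decomposition described above."""
    return a * np.exp(-f / lam) + b * np.exp(-((f - mu) ** 2) / (2 * sigma ** 2))

# Synthetic "histogram": true parameters planted, then noise added.
rng = np.random.default_rng(0)
forces = np.linspace(20, 150, 66)                       # pN
counts = rupture_model(forces, 1.0, 25.0, 0.6, 85.0, 12.0)
counts = counts + rng.normal(0, 0.01, forces.size)

p0 = [1.0, 30.0, 0.5, 80.0, 10.0]                       # initial guess
popt, _ = curve_fit(rupture_model, forces, counts, p0=p0)
a, lam, b, mu, sigma = popt                             # mu recovers ~85 pN
```

    In the actual experiment the Gaussian peak position (here `mu`) is the physically meaningful quantity: it characterizes the strength of the specific αIIbβ3-fibrinogen bond.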

  4. Rapid determination and chemical change tracking of benzoyl peroxide in wheat flour by multi-step IR macro-fingerprinting.

    PubMed

    Guo, Xiao-Xi; Hu, Wei; Liu, Yuan; Sun, Su-Qin; Gu, Dong-Chen; He, Helen; Xu, Chang-Hua; Wang, Xi-Chang

    2016-02-01

    BPO is often added to wheat flour as a flour improver, but its overuse and edibility are of increasing concern. Multi-step IR macro-fingerprinting was employed to identify BPO in wheat flour and to track its changes during storage. BPO in wheat flour (<3.0 mg/kg) was difficult to identify from infrared spectra alone, as the correlation coefficients between wheat flour and wheat flour samples containing BPO were all close to 0.98. By applying second derivative spectroscopy, clear differences between wheat flour and wheat flour containing BPO before and after storage were disclosed in the range of 1500-1400 cm(-1). The peak at 1450 cm(-1), which belonged to BPO, was blue-shifted to 1453 cm(-1) (1455), which belonged to benzoic acid, after one week of storage, indicating that BPO changed into benzoic acid during storage. Moreover, when two-dimensional correlation infrared spectroscopy (2DCOS-IR) was used to track changes of BPO in wheat flour (0.05 mg/g) over one week, the intensities of the auto-peaks at 1781 cm(-1) and 669 cm(-1), which belonged to BPO and benzoic acid, respectively, changed inversely, indicating that BPO decomposed into benzoic acid. Another auto-peak at 1767 cm(-1), which does not belong to benzoic acid, rose simultaneously. Through heating perturbation treatment of BPO in wheat flour based on 2DCOS-IR and spectral subtraction analysis, it was found that BPO in wheat flour not only decomposed into benzoic acid and benzoate but also produced other deleterious substances, e.g., benzene. This study offers a promising, time-saving method requiring minimal pretreatment to identify BPO in wheat flour and its chemical products during storage in a holistic manner. PMID:26519920
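
    The synchronous 2D correlation analysis (2DCOS-IR) used here has a compact standard form (Noda's formalism): mean-center the perturbation-dependent spectra and correlate wavenumber channels, so that auto-peaks appear on the diagonal and anti-correlated bands give negative cross-peaks. A minimal sketch with two toy bands standing in for the decaying 1781 cm(-1) and growing 669 cm(-1) peaks:

```python
import numpy as np

def synchronous_2dcos(spectra):
    """Synchronous 2D correlation spectrum (Noda's formalism).
    spectra: (m, n) array of m perturbation steps (e.g. storage times or
    temperatures) x n wavenumber channels. Auto-peaks lie on the diagonal."""
    dynamic = spectra - spectra.mean(axis=0)   # mean-centered dynamic spectra
    m = spectra.shape[0]
    return dynamic.T @ dynamic / (m - 1)

# Toy example: two bands changing in opposite directions, as when BPO decays
# into benzoic acid, give positive auto-peaks and a negative cross-peak.
t = np.linspace(0, 1, 8)
band_a = 1.0 - t            # decaying band (stand-in for 1781 cm-1)
band_b = t                  # growing band  (stand-in for 669 cm-1)
phi = synchronous_2dcos(np.column_stack([band_a, band_b]))
# phi[0, 0], phi[1, 1] > 0 (auto-peaks); phi[0, 1] < 0 (anti-correlated)
```

    The inversely changing auto-peak intensities reported above correspond exactly to such a negative synchronous cross-peak between the BPO and benzoic acid bands.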

  5. Multistep mass spectrometry methodology for direct characterization of polar lipids in green microalgae using paper spray ionization.

    PubMed

    Oradu, Sheran A; Cooks, R Graham

    2012-12-18

    Paper spray ionization, an ambient ionization method, has been applied for the identification of polar lipids in green microalgae with no sample preparation. A multistep experimental protocol was employed to characterize the lipid species of two microalgae strains, Kyo-Chlorella in tablet form and Nannochloropsis in paste form by mass spectrometry (MS). Tandem mass spectrometry (MS/MS) experiments using collision induced dissociation (CID) were employed for initial characterization of the detected lipid species, which were dominated by polar glycolipids and phospholipids. Product ion scan experiments were performed to determine the lipid head groups and fatty acid composition. Precursor ion scan experiments using fragment ions such as m/z 184, which is characteristic of the phosphocholine headgroup, were then used to confirm the lipid classification. Lipid elemental compositions were determined by exact mass measurements using high resolution mass spectrometry. Finally, the position of unsaturation was determined using reactive paper spray ionization experiments with ozone used as a reagent to cleave double bonds. Ozone was produced in situ using dielectric barrier discharge from a low temperature plasma, and it reacted in ambient air with the spray of ions produced by paper spray ionization. Using the precursor ion scan experiment, the resulting ozone cleavage product ions were used to determine the position of unsaturation for some of these species. By applying this experimental protocol, the molecular formulas and key aspects of the structures of glycerophosphocholines (PCs) such as 9Z-16:1/9Z,12Z-16:2 PC and 6Z,9Z-18:2/6Z,9Z,12Z-18:3PC and monogalactosyldiacylglycerols (MGDGs) such as 18:3/16:3MGDG were identified in the positive ion mode, while glycerophosphoglycerols (PGs) such as 18:3/16:0 PG and sulfoquinovosyldiacylglycerols (SQDGs) such as 18:3/16:0 SQDG were identified in the negative ion mode. PMID:23181824

  6. Rapid determination and chemical change tracking of benzoyl peroxide in wheat flour by multi-step IR macro-fingerprinting

    NASA Astrophysics Data System (ADS)

    Guo, Xiao-Xi; Hu, Wei; Liu, Yuan; Sun, Su-Qin; Gu, Dong-Chen; He, Helen; Xu, Chang-Hua; Wang, Xi-Chang

    2016-02-01

    BPO is often added to wheat flour as a flour improver, but its overuse and edibility are of increasing concern. Multi-step IR macro-fingerprinting was employed to identify BPO in wheat flour and to track its changes during storage. BPO in wheat flour (< 3.0 mg/kg) was difficult to identify from infrared spectra alone, as the correlation coefficients between wheat flour and wheat flour samples containing BPO were all close to 0.98. By applying second derivative spectroscopy, clear differences between wheat flour and wheat flour containing BPO before and after storage were disclosed in the range of 1500-1400 cm-1. The peak at 1450 cm-1, which belonged to BPO, was blue-shifted to 1453 cm-1 (1455), which belonged to benzoic acid, after one week of storage, indicating that BPO changed into benzoic acid during storage. Moreover, when two-dimensional correlation infrared spectroscopy (2DCOS-IR) was used to track changes of BPO in wheat flour (0.05 mg/g) over one week, the intensities of the auto-peaks at 1781 cm-1 and 669 cm-1, which belonged to BPO and benzoic acid, respectively, changed inversely, indicating that BPO decomposed into benzoic acid. Another auto-peak at 1767 cm-1, which does not belong to benzoic acid, rose simultaneously. Through heating perturbation treatment of BPO in wheat flour based on 2DCOS-IR and spectral subtraction analysis, it was found that BPO in wheat flour not only decomposed into benzoic acid and benzoate but also produced other deleterious substances, e.g., benzene. This study offers a promising, time-saving method requiring minimal pretreatment to identify BPO in wheat flour and its chemical products during storage in a holistic manner.

  7. Automated DNA sequencing.

    PubMed

    Wallis, Yvonne; Morrell, Natalie

    2011-01-01

    Fluorescent cycle sequencing of PCR products is a multistage process, and several methodologies are available to perform each stage. This chapter describes the more commonly utilised dye-terminator cycle sequencing approach using BigDye® terminator chemistry (Applied Biosystems), ready for analysis on a 3730 DNA Analyzer. Even though DNA sequencing is one of the most common and robust techniques performed in molecular laboratories, it may not always produce the desired results. The causes of the most common problems are also discussed in this chapter. PMID:20938839

  8. Automated selection of synthetic biology parts for genetic regulatory networks.

    PubMed

    Yaman, Fusun; Bhatia, Swapnil; Adler, Aaron; Densmore, Douglas; Beal, Jacob

    2012-08-17

    Raising the level of abstraction for synthetic biology design requires solving several challenging problems, including mapping abstract designs to DNA sequences. In this paper we present the first formalism and algorithms to address this problem. The key steps of this transformation are feature matching, signal matching, and part matching. Feature matching ensures that the mapping satisfies the regulatory relationships in the abstract design. Signal matching ensures that the expression levels of functional units are compatible. Finally, part matching finds a DNA part sequence that can implement the design. Our software tool MatchMaker implements these three steps. PMID:23651287
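
    The feature-matching step can be sketched as a constraint filter over a part library: only parts whose regulatory relationships satisfy the abstract design survive as candidates. The library contents, part names, and function below are invented for illustration and are not MatchMaker's actual API:

```python
# Hypothetical sketch of feature matching: keep only candidate parts whose
# regulatory relationships are consistent with the abstract design.
LIBRARY = {
    "pLac": {"type": "promoter", "repressed_by": "LacI"},
    "pTet": {"type": "promoter", "repressed_by": "TetR"},
    "lacI": {"type": "cds", "product": "LacI"},
    "tetR": {"type": "cds", "product": "TetR"},
}

def feature_match(design_edges, library):
    """design_edges: list of (repressor_product, promoter_slot) constraints
    from the abstract design. Returns, per promoter slot, the library
    promoters consistent with those constraints."""
    assignments = {}
    for product, slot in design_edges:
        assignments[slot] = [
            name for name, part in library.items()
            if part["type"] == "promoter" and part.get("repressed_by") == product
        ]
    return assignments

# Abstract design: slot P1 must be repressed by LacI, slot P2 by TetR.
m = feature_match([("LacI", "P1"), ("TetR", "P2")], LIBRARY)
# m == {"P1": ["pLac"], "P2": ["pTet"]}
```

    Signal matching and part matching then narrow these candidate sets further, using expression-level compatibility and the availability of concrete DNA sequences.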

  9. Automated ship image acquisition

    NASA Astrophysics Data System (ADS)

    Hammond, T. R.

    2008-04-01

    The experimental Automated Ship Image Acquisition System (ASIA) collects high-resolution ship photographs at a shore-based laboratory, with minimal human intervention. The system uses Automatic Identification System (AIS) data to direct a high-resolution SLR digital camera to ship targets and to identify the ships in the resulting photographs. The photo database is then searchable using the rich data fields from AIS, which include the name, type, call sign and various vessel identification numbers. The high-resolution images from ASIA are intended to provide information that can corroborate AIS reports (e.g., extract identification from the name on the hull) or provide information that has been omitted from the AIS reports (e.g., missing or incorrect hull dimensions, cargo, etc.). Once assembled into a searchable image database, the images can be used for a wide variety of marine safety and security applications. This paper documents the author's experience with the practicality of composing photographs based on AIS reports alone, describing a number of ways in which this can go wrong, from errors in the AIS reports, to fixed and mobile obstructions and multiple ships in the shot. The frequency with which various errors occurred in automatically composed photographs collected in Halifax harbour in wintertime was determined by manual examination of the images. 45% of the images examined were considered of a quality sufficient to read identification markings, numbers and text off the entire ship. One of the main technical challenges for ASIA lies in automatically differentiating good and bad photographs, so that few bad ones are shown to human users. Initial attempts at automatic photo rating showed 75% agreement with manual assessments.
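
    At its core, directing a camera from an AIS position report reduces to computing the bearing from the camera site to the reported position. A sketch of that one building block (the standard great-circle initial-bearing formula; the coordinates are illustrative and this is not ASIA's actual code):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the camera site (lat1, lon1) to a
    ship's AIS-reported position (lat2, lon2), in degrees clockwise from
    true north. Inputs are in decimal degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

# A ship due east of the camera site sits at a bearing of about 90 degrees
# (illustrative Halifax-area coordinates):
b = bearing_deg(44.65, -63.57, 44.65, -63.50)
```

    The error modes catalogued in the paper (stale or wrong AIS positions, obstructions, multiple ships in frame) all act downstream of this pointing computation, which is why manual review of the composed photographs was needed.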

  10. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step, multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose quite accurately up to 4-6 weeks post-exposure, but throughput is still a major issue and automation is essential. Throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOPs) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards critically needed increases in throughput. PMID:26520383

  11. Genetic barcodes

    DOEpatents

    Weier, Heinz -Ulrich G

    2015-08-04

    Herein are described multicolor FISH probe sets termed "genetic barcodes" targeting several cancer or disease-related loci to assess gene rearrangements and copy number changes in tumor cells. Two, three or more different fluorophores are used to detect the genetic barcode sections thus permitting unique labeling and multilocus analysis in individual cell nuclei. Gene specific barcodes can be generated and combined to provide both numerical and structural genetic information for these and other pertinent disease associated genes.

  12. Automation: Decision Aid or Decision Maker?

    NASA Technical Reports Server (NTRS)

    Skitka, Linda J.

    1998-01-01

    This study clarified that automation bias is something unique to automated decision-making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission-error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the reliability level of the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.

  13. Automated protein NMR resonance assignments.

    PubMed

    Wan, Xiang; Xu, Dong; Slupsky, Carolyn M; Lin, Guohui

    2003-01-01

    NMR resonance peak assignment is one of the key steps in solving an NMR protein structure. The assignment process links resonance peaks to individual residues of the target protein sequence, providing the prerequisite for establishing intra- and inter-residue spatial relationships between atoms. The assignment process is tedious and time-consuming, and can take many weeks. Though a number of computer programs exist to assist the assignment process, many NMR labs still do the assignments manually to ensure quality. This paper presents (1) a new scoring system for mapping spin systems to residues, (2) an automated adjacency-information extraction procedure from NMR spectra, and (3) a very fast assignment algorithm, based on our previously proposed greedy filtering method and a maximum matching algorithm, to automate the assignment process. Computational tests on 70 instances of (pseudo) experimental NMR data for 14 proteins demonstrate that the new scoring scheme has much better discerning power when aided by adjacency information between spin systems simulated across various NMR spectra. Typically, with automated extraction of adjacency information, our method achieves nearly complete assignments for most of the proteins. These results suggest that the fast automated assignment algorithm, together with the new scoring scheme and automated adjacency extraction, may be ready for practical use. PMID:16452794
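
    The matching step can be illustrated with a standard assignment solver: given a score for mapping each spin system to each residue, pick the one-to-one mapping that maximizes the total score. The 3x3 score matrix below is invented, and this uses SciPy's generic solver rather than the paper's own greedy-filtering algorithm:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Invented spin-system-to-residue scores (higher = better match).
scores = np.array([
    [0.9, 0.1, 0.2],   # spin system 0 strongly prefers residue 0
    [0.2, 0.8, 0.1],   # spin system 1 prefers residue 1
    [0.1, 0.3, 0.7],   # spin system 2 prefers residue 2
])

# linear_sum_assignment minimizes total cost, so negate to maximize score.
rows, cols = linear_sum_assignment(-scores)
assignment = dict(zip(rows.tolist(), cols.tolist()))
# assignment == {0: 0, 1: 1, 2: 2}
```

    Adjacency information between spin systems effectively sharpens this score matrix, which is why the paper reports much better discerning power when it is included.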

  14. Space power subsystem automation technology

    NASA Technical Reports Server (NTRS)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  15. Detection of carryover in automated milk sampling equipment.

    PubMed

    Løvendahl, P; Bjerring, M A

    2006-09-01

    Equipment for sampling milk in automated milking systems may cause carryover problems if residues from one sample remain and are mixed with the subsequent sample. The degree of carryover can be estimated statistically with linear regression models. This study applied various regression analyses to several real and simulated data sets. The statistical power for detecting carryover milk improved considerably when information about cow identity was included and a mixed model was applied. Carryover may affect variation between animals, including genetic variation, and thereby have an impact on management decisions and diagnostic tools based on the milk content of somatic cells. An extended procedure is needed for approval of sampling equipment for automated milking with acceptable limits on carryover, and this could include the regression approach taken in this study. PMID:16899700
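
    The regression idea can be sketched directly: if a fraction c of the previous sample remains in the sampling line, the measured value is a mixture of the current and previous true values, and c appears as a regression slope. The simulation below plants a known carryover and recovers it; the values are invented, and this simple fixed-effects fit omits the cow-identity random effect that the study found important for statistical power:

```python
import numpy as np

# Carryover model: y_i = (1 - c) * x_i + c * x_{i-1} + noise,
# so  y_i - x_i = c * (x_{i-1} - x_i)  and c is a slope through the origin.
rng = np.random.default_rng(1)
c_true = 0.05                                  # planted carryover fraction
x = rng.uniform(50, 800, 200)                  # true values (e.g. SCC, x1000/mL)
y = (1 - c_true) * x[1:] + c_true * x[:-1]     # measured, with carryover
y = y + rng.normal(0, 1.0, y.size)             # measurement noise

d = x[:-1] - x[1:]                             # previous-minus-current contrast
c_hat = float(np.sum(d * (y - x[1:])) / np.sum(d * d))
# c_hat recovers approximately 0.05
```

    Because the contrast d is large whenever consecutive cows differ strongly, alternating high- and low-level samples in a designed test gives the most precise carryover estimate.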

  16. Design automation for integrated circuits

    NASA Astrophysics Data System (ADS)

    Newell, S. B.; de Geus, A. J.; Rohrer, R. A.

    1983-04-01

    Consideration is given to the development status of computer-based automated integrated circuit design methods, which promise to minimize both design time and the incidence of design errors. Integrated circuit design encompasses two major tasks: logical specification, in which the goal is a logic diagram that accurately represents the desired electronic function, and physical specification, in which the goal is an exact description of the physical locations of all circuit elements and their interconnections on the chip. Design automation not only saves money by reducing design and fabrication time, but also helps the community of systems and logic designers to work more innovatively. Attention is given to established design automation methodologies, programmable logic arrays, and design shortcuts.

  17. Automated power management and control

    NASA Technical Reports Server (NTRS)

    Dolce, James L.

    1991-01-01

    A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. Initial station operation will use ground-based dispatchers to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.

  18. Automated mapping of hammond's landforms

    USGS Publications Warehouse

    Gallant, A.L.; Brown, D.D.; Hoffer, R.M.

    2005-01-01

    We automated a method for mapping Hammond's landforms over large landscapes using digital elevation data. We compared our results against Hammond's published landform maps, derived using manual interpretation procedures. We found general agreement in landform patterns mapped by the manual and the automated approaches, and very close agreement in characterization of local topographic relief. The two approaches produced different interpretations of intermediate landforms, which relied upon quantification of the proportion of the landscape having gently sloping terrain. This type of computation is applied more efficiently and consistently by computer than by a human interpreter. Today's ready access to digital data and computerized geospatial technology provides a good foundation for mapping terrain features, but the mapping criteria guiding manual techniques in the past may not be appropriate for automated approaches. We suggest that future efforts center on the advantages offered by digital advancements in refining an approach to better characterize complex landforms. © 2005 IEEE.
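
    Two of the ingredients a Hammond-style classification needs from a DEM, local relief and the proportion of gently sloping terrain, can be computed with moving-window filters. A sketch on a toy DEM (the window size and slope threshold below are illustrative choices, not Hammond's published criteria):

```python
import numpy as np
from scipy import ndimage

def local_relief(dem, window=5):
    """Local topographic relief: max minus min elevation within a square
    moving window, one input to Hammond-style landform classification."""
    hi = ndimage.maximum_filter(dem, size=window)
    lo = ndimage.minimum_filter(dem, size=window)
    return hi - lo

def gentle_slope_fraction(dem, cellsize, window=5, threshold=0.08):
    """Fraction of cells in the window whose slope is below ~8%: the kind of
    'proportion of gently sloping terrain' criterion discussed above."""
    gy, gx = np.gradient(dem.astype(float), cellsize)
    gentle = (np.hypot(gx, gy) < threshold).astype(float)
    return ndimage.uniform_filter(gentle, size=window)

# Toy 5x10 DEM: a flat plain (left) next to a steep ramp (right).
dem = np.hstack([np.zeros((5, 5)),
                 np.linspace(0, 400, 5)[None, :].repeat(5, 0)])
relief = local_relief(dem)
gs = gentle_slope_fraction(dem, 30.0)   # 30 m cells, illustrative
```

    Both quantities are per-cell rasters, so the subsequent landform classification reduces to thresholding and combining them, which is exactly the step a computer applies more consistently than a human interpreter.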

  19. Genetic Engineering

    ERIC Educational Resources Information Center

    Phillips, John

    1973-01-01

    Presents a review of genetic engineering, in which the genotypes of plants and animals (including human genotypes) may be manipulated for the benefit of the human species. Discusses associated problems and solutions and provides an extensive bibliography of literature relating to genetic engineering. (JR)

  20. ASteCA: Automated Stellar Cluster Analysis

    NASA Astrophysics Data System (ADS)

    Perren, G. I.; Vázquez, R. A.; Piatti, A. E.

    2015-04-01

    We present the Automated Stellar Cluster Analysis package (ASteCA), a suite of tools designed to fully automate the standard tests applied to stellar clusters to determine their basic parameters. The set of functions included in the code make use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing through a statistical estimator its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with its uncertainties. To validate the code we applied it to a large set of over 400 synthetic MASSCLEAN clusters with varying degrees of field star contamination, as well as a smaller set of 20 observed Milky Way open clusters (Berkeley 7, Bochum 11, Czernik 26, Czernik 30, Haffner 11, Haffner 19, NGC 133, NGC 2236, NGC 2264, NGC 2324, NGC 2421, NGC 2627, NGC 6231, NGC 6383, NGC 6705, Ruprecht 1, Tombaugh 1, Trumpler 1, Trumpler 5 and Trumpler 14) studied in the literature. The results show that ASteCA is able to recover cluster parameters with acceptable precision even for clusters affected by substantial field star contamination. ASteCA is written in Python and is made available as open source code which can be downloaded, ready to use, from its official site.
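
    The genetic-algorithm selection of a best-fit synthetic cluster can be caricatured in a few lines: candidate parameter tuples are scored against the data, and the fittest survive, cross over and mutate. The quadratic fitness function below is only a stand-in for ASteCA's actual synthetic-cluster comparison, and the parameter values are invented:

```python
import random

random.seed(3)
TARGET = (9.2, -0.4)   # invented "true" log(age), [Fe/H] of a simulated cluster

def fitness(ind):
    """Stand-in likeness score: negative squared distance to the target.
    In ASteCA this is replaced by comparing a synthetic cluster to the data."""
    return -sum((a - b) ** 2 for a, b in zip(ind, TARGET))

def evolve(pop_size=40, generations=120):
    # Random initial population over broad (log age, metallicity) bounds.
    pop = [(random.uniform(6, 10), random.uniform(-2, 0.5))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection: top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = tuple((x + y) / 2 + random.gauss(0, 0.05)
                          for x, y in zip(a, b))  # crossover + mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()   # converges near TARGET
```

    ASteCA's real fitness evaluation is far more expensive (it synthesizes a cluster per candidate isochrone), which is precisely why a genetic algorithm is attractive: it needs no gradients and samples the parameter space broadly before converging.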

  1. BOA: Framework for automated builds

    SciTech Connect

    N. Ratnikova et al.

    2003-09-30

    Managing large-scale software products is a complex software engineering task. Automation of the software development, release and distribution process is most beneficial in large collaborations, where large numbers of developers, multiple platforms and a distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components, which have been developed in CMS to facilitate software maintenance and improve software quality. The system allows users to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases, and installation of existing versions.

  2. The Automated Planetary Space Station

    NASA Technical Reports Server (NTRS)

    Ivie, C. V.; Friedman, L. D.

    1977-01-01

    Results are presented for a study on mission definition and design to determine broad technology directions and needs for advanced planetary spacecraft and future planetary missions. The discussion covers mission selection, system design, and technology assessment and review for a multicomponent spacecraft exploration facility provided with nuclear power propulsion. As an example, the Automated Planetary Space Station at Jupiter is examined as a generic concept which has the capability of conducting in-depth investigations of different aspects of the entire Jovian system. Mission planning is discussed relative to low-thrust trajectory control, automatic target identification and landing, roving vehicle operation, and automated sample analysis.

  3. Advanced automation for space missions

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr.; Healy, T. J.; Long, J. E.

    1982-01-01

    A NASA/ASEE Summer Study conducted at the University of Santa Clara in 1980 examined the feasibility of using advanced artificial intelligence and automation technologies in future NASA space missions. Four candidate applications missions were considered: (1) An intelligent earth-sensing information system, (2) an autonomous space exploration system, (3) an automated space manufacturing facility, and (4) a self-replicating, growing lunar factory. The study assessed the various artificial intelligence and machine technologies which must be developed if such sophisticated missions are to become feasible by century's end.

  4. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. In some easy-access environments, this device could

  5. Automation U.S.A.: Overcoming Barriers to Automation.

    ERIC Educational Resources Information Center

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  6. Automated Tools for Subject Matter Expert Evaluation of Automated Scoring

    ERIC Educational Resources Information Center

    Williamson, David M.; Bejar, Isaac I.; Sax, Anne

    2004-01-01

    As automated scoring of complex constructed-response examinations reaches operational status, the process of evaluating the quality of resultant scores, particularly in contrast to scores of expert human graders, becomes as complex as the data itself. Using a vignette from the Architectural Registration Examination (ARE), this article explores the…

  7. Truth in Automating: Case Studies in Library Automation.

    ERIC Educational Resources Information Center

    Drabenstott, Jon; And Others

    1989-01-01

    Contributors from five libraries--Bentley College, Boston University, the College of Charleston, the University of Wisconsin at Eau Claire, and the Resource Sharing Alliance of West Central Illinois--describe their automation projects, including staff impact; costs and funding; time and schedules; users; computer support; vendors; and consultants.…

  8. Genetic counseling.

    PubMed

    Fraser, F C

    1974-09-01

A workshop was sponsored by the National Genetics Foundation to evaluate and make recommendations about the status of genetic counseling, its goals, nature, achievements, and needs. The process of genetic workup and counseling is divided into 5 stages: validation of the diagnosis; obtaining family history; estimation of the risk of recurrence; helping the family make a decision and take appropriate action; and extending counseling to other members of the family. Counseling can be directed at individuals or at special groups with the potential of carrying such diseases as sickle cell anemia or Tay-Sachs. No consensus exists on an optimal counseling approach. Genetic counseling is regarded as a team effort, requiring, in addition to the counselor, laboratory facilities and a variety of specialists. The source of payment for genetic counseling services is regarded as a problem of increasing concern. Generally, the fee paid rarely covers the cost of the many procedures, and it is suggested that the cost, like that of other public health services, should be subsidized by the state. Considerable argument exists over whether a genetic counselor must have an M.D. degree or whether a Ph.D. in medical genetics is sufficient. The quality of much genetic counseling, which is often done in the offices of doctors unskilled in the field, would be improved if better training in genetics were offered to medical students and if physicians were informed of the existence of counseling centers. Further, there is a growing feeling that some sort of accreditation of genetic counselors is desirable. PMID:4609197

  9. Automated RNA Extraction and Purification for Multiplexed Pathogen Detection

    SciTech Connect

    Bruzek, Amy K.; Bruckner-Lea, Cindy J.

    2005-01-01

Pathogen detection has become an extremely important part of our nation's defense in this post-9/11 world, where the threat of bioterrorist attacks is a grim reality. When a biological attack takes place, response time is critical. The faster the biothreat is assessed, the faster countermeasures can be put in place to protect the health of the general public. Today some of the most widely used methods for detecting pathogens are either time consuming or not reliable [1]. Therefore, a method for detecting multiple pathogens that is inherently reliable, rapid, automated, and field-portable is needed. To that end, we are developing automated fluidics systems for the recovery, cleanup, and direct labeling of community RNA from suspect environmental samples. The advantage of using RNA for detection is that there are multiple copies of mRNA in a cell, whereas there are normally only one or two copies of DNA [2]. Because there are multiple copies of mRNA in a cell for highly expressed genes, no amplification of the genetic material may be necessary, and thus rapid and direct detection of only a few cells may be possible [3]. This report outlines the development of both manual and automated methods for the extraction and purification of mRNA. The methods were evaluated using cell lysates from Escherichia coli 25922 (nonpathogenic), Salmonella typhimurium (pathogenic), and Shigella spp. (pathogenic). Automated RNA purification was achieved using a custom sequential injection fluidics system consisting of a syringe pump, a multi-port valve, and a magnetic capture cell. mRNA was captured using silica-coated superparamagnetic beads that were trapped in the tubing by a rare earth magnet. RNA was detected by gel electrophoresis and/or by hybridization of the RNA to microarrays. The versatility of the fluidics systems and the ability to automate these systems allow for quick and easy processing of samples and eliminate the need for an experienced operator.

  10. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  11. What Is an Automated External Defibrillator?

    MedlinePlus

What Is an Automated External Defibrillator? An automated external defibrillator (AED) is a lightweight, portable device ... AED? Non-medical personnel such as police, fire service personnel, flight attendants, security guards and other lay ...

  12. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

    OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessment of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis for abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  13. Ask the experts: automation: part I.

    PubMed

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory. PMID:23937129

  14. Safety in the Automated Office.

    ERIC Educational Resources Information Center

    Graves, Pat R.; Greathouse, Lillian R.

    1990-01-01

    Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)

  15. Cognitive Approaches to Automated Instruction.

    ERIC Educational Resources Information Center

    Regian, J. Wesley, Ed.; Shute, Valerie J., Ed.

    This book contains a snapshot of state-of-the-art research on the design of automated instructional systems. Selected cognitive psychologists were asked to describe their approach to instruction and cognitive diagnosis, the theoretical basis of the approach, its utility and applicability, and the knowledge engineering or task analysis methods…

  16. Automated Filtering of Internet Postings.

    ERIC Educational Resources Information Center

    Rosenfeld, Louis B.; Holland, Maurita P.

    1994-01-01

    Discussion of the use of dynamic data resources, such as Internet LISTSERVs or Usenet newsgroups, focuses on an experiment using an automated filtering system with Usenet newsgroups. Highlights include user satisfaction, based on retrieval size, data sources, and user interface and the need for some human mediation. (Contains two references.) (LRW)

  17. Automating the conflict resolution process

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here is how resource conflicts are currently resolved, as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  18. Library Automation: Guidelines to Costing.

    ERIC Educational Resources Information Center

    Ford, Geoffrey

    As with all new programs, the costs associated with library automation must be carefully considered before implementation. This document suggests guidelines to be followed and areas to be considered in the costing of library procedures. An existing system model has been suggested as a standard (Appendix A) and a classification of library tasks…

  19. Office Automation in Student Affairs.

    ERIC Educational Resources Information Center

    Johnson, Sharon L.; Hamrick, Florence A.

    1987-01-01

    Offers recommendations to assist in introducing or expanding computer assistance in student affairs. Describes need for automation and considers areas of choosing hardware and software, funding and competitive bidding, installation and training, and system management. Cites greater efficiency in handling tasks and data and increased levels of…

  20. Automation; The New Industrial Revolution.

    ERIC Educational Resources Information Center

    Arnstein, George E.

    Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…

  1. Automating a High School Restroom.

    ERIC Educational Resources Information Center

    Ritner-Heir, Robbin

    1999-01-01

    Discusses how one high school transformed its restrooms into cleaner and more vandal-resistant environments by automating them. Solutions discussed include installing perforated stainless steel panel ceilings, using epoxy-based paint for walls, selecting china commode fixtures instead of stainless steel, installing electronic faucets and sensors,…

  2. Automated ac galvanomagnetic measurement system

    NASA Technical Reports Server (NTRS)

    Szofran, F. R.; Espy, P. N.

    1985-01-01

    An automated, ac galvanomagnetic measurement system is described. Hall or van der Pauw measurements in the temperature range 10-300 K can be made at a preselected magnetic field without operator attendance. Procedures to validate sample installation and correct operation of other system functions, such as magnetic field and thermometry, are included. Advantages of ac measurements are discussed.

  3. Automated analysis of oxidative metabolites

    NASA Technical Reports Server (NTRS)

    Furner, R. L. (Inventor)

    1974-01-01

An automated system for the study of drug metabolism is described. The system monitors the oxidative metabolites of aromatic amines and of compounds which produce formaldehyde on oxidative dealkylation. It includes color developing compositions suitable for detecting hydroxylated aromatic amines and formaldehyde.

  4. Automation on the Laboratory Bench.

    ERIC Educational Resources Information Center

    Legrand, M.; Foucard, A.

    1978-01-01

    A kit is described for use in automation of routine chemical research procedures. The kit uses sensors to evaluate the state of the system, actuators which modify the adjustable parameters, and an organ of decision which uses the information from the sensors. (BB)

  5. Automation of Space Inventory Management

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W.; Ngo, Phong; Wagner, Raymond; Barton, Richard; Gifford, Kevin

    2009-01-01

    This viewgraph presentation describes the utilization of automated space-based inventory management through handheld RFID readers and BioNet Middleware. The contents include: 1) Space-Based INventory Management; 2) Real-Time RFID Location and Tracking; 3) Surface Acoustic Wave (SAW) RFID; and 4) BioNet Middleware.

  6. Teacherbot: Interventions in Automated Teaching

    ERIC Educational Resources Information Center

    Bayne, Sian

    2015-01-01

    Promises of "teacher-light" tuition and of enhanced "efficiency" via the automation of teaching have been with us since the early days of digital education, sometimes embraced by academics and institutions, and sometimes resisted as a set of moves which are damaging to teacher professionalism and to the humanistic values of…

  7. Library Automation and Library Education.

    ERIC Educational Resources Information Center

    Drabenstott, Jon, Ed.

    1987-01-01

    Several consultants address the issue of competencies required of professional librarians for the effective management of the automation process. Highlights include formal and professional ongoing education and the need for technical training and problem solving skills to enable librarians to evaluate and develop library systems effectively.…

  8. Design automation for integrated optics

    NASA Astrophysics Data System (ADS)

    Condrat, Christopher

    Recent breakthroughs in silicon photonics technology are enabling the integration of optical devices into silicon-based semiconductor processes. Photonics technology enables high-speed, high-bandwidth, and high-fidelity communications on the chip-scale---an important development in an increasingly communications-oriented semiconductor world. Significant developments in silicon photonic manufacturing and integration are also enabling investigations into applications beyond that of traditional telecom: sensing, filtering, signal processing, quantum technology---and even optical computing. In effect, we are now seeing a convergence of communications and computation, where the traditional roles of optics and microelectronics are becoming blurred. As the applications for opto-electronic integrated circuits (OEICs) are developed, and manufacturing capabilities expand, design support is necessary to fully exploit the potential of this optics technology. Such design support for moving beyond custom-design to automated synthesis and optimization is not well developed. Scalability requires abstractions, which in turn enables and requires the use of optimization algorithms and design methodology flows. Design automation represents an opportunity to take OEIC design to a larger scale, facilitating design-space exploration, and laying the foundation for current and future optical applications---thus fully realizing the potential of this technology. This dissertation proposes design automation for integrated optic system design. Using a building-block model for optical devices, we provide an EDA-inspired design flow and methodologies for optical design automation. Underlying these flows and methodologies are new supporting techniques in behavioral and physical synthesis, as well as device-resynthesis techniques for thermal-aware system integration. We also provide modeling for optical devices and determine optimization and constraint parameters that guide the automation

  9. Automated System Marketplace 1987: Maturity and Competition.

    ERIC Educational Resources Information Center

    Walton, Robert A.; Bridge, Frank R.

    1988-01-01

    This annual review of the library automation marketplace presents profiles of 15 major library automation firms and looks at emerging trends. Seventeen charts and tables provide data on market shares, number and size of installations, hardware availability, operating systems, and interfaces. A directory of 49 automation sources is included. (MES)

  10. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  11. Office Automation, Personnel and the New Technology.

    ERIC Educational Resources Information Center

    Magnus, Margaret

    1980-01-01

    At the first annual Office Automation Conference, the consensus was that personnel involvement in the development of office automation is vital if the new technology is to be successfully deployed. This report explores the problems inherent in office automation and provides a broad overview of the subject. (CT)

  12. Does Automated Feedback Improve Writing Quality?

    ERIC Educational Resources Information Center

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  13. Archives and Automation: Issues and Trends.

    ERIC Educational Resources Information Center

    Weiner, Rob

    This paper focuses on archives and automation, and reviews recent literature on various topics concerning archives and automation. Topics include: resistance to technology and the need to educate about automation; the change in archival theory due to the information age; problems with technology use; the history of organizing archival records…

  14. You're a What? Automation Technician

    ERIC Educational Resources Information Center

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  15. Library Automation in the Netherlands and Pica.

    ERIC Educational Resources Information Center

    Bossers, Anton; Van Muyen, Martin

    1984-01-01

    Describes the Pica Library Automation Network (originally the Project for Integrated Catalogue Automation), which is based on a centralized bibliographic database. Highlights include the Pica conception of library automation, online shared cataloging system, circulation control system, acquisition system, and online Dutch union catalog with…

  16. Automation of Resistance Bridge Calibrator

    NASA Astrophysics Data System (ADS)

    Podgornik, Tadej; Bojkovski, Jovan; Batagelj, Valentin; Drnovšek, Janko

    2008-02-01

The article addresses the automation of the resistance bridge calibrator (RBC). The automation of the RBC is performed in order to facilitate the operation of the RBC, improve the reliability, and enable several additional possibilities compared to the tedious manual operation, thereby making the RBC a more practical device for routine use. The RBC is used to calibrate AC and DC resistance bridges, which are mainly used in a primary thermometry laboratory. It consists of a resistor network made up from four main resistors from which 35 different resistance values can be realized using toggle switches. Literature shows that the resistors’ non-zero temperature coefficient can influence the measurements, causing difficulties when calibrating resistance bridges with low uncertainty. Placing the RBC in a thermally stable environment can reduce this, but it does not solve the problem of the time-consuming manual selection of the resistance values. To solve this, an automated means to manipulate the switches, while the device is placed within a thermally stable environment, was created. Computer operation completely substitutes for any manual operation during which an operator would normally have to be present. The computer also acquires measurements from the bridge. In this way, repeated and reproducible calibration measurements inside a temperature-stable environment can be carried out with no active involvement of personnel. The automation process itself was divided into several stages. They included the construction of a servo-manipulator to move the switches, the design of a dedicated electronic controller that also provides a serial interface (RS-232) to the computer, and the development of custom computer software to configure the servo-manipulator and control the calibration process. Measurements show that automation does not affect the long-term stability and mechanical repeatability of the RBC. The repeatability and reproducibility of bridge calibration ratios

  17. Automating symbolic analysis with CLIPS

    NASA Technical Reports Server (NTRS)

    Morris, Keith E.

    1990-01-01

    Symbolic Analysis is a methodology first applied as an aid in selecting and generating test cases for 'white box' type testing of computer software programs. The feasibility of automating this analysis process has recently been demonstrated through the development of a CLIPS-based prototype tool. Symbolic analysis is based on separating the logic flow diagram of a computer program into its basic elements, and then systematically examining those elements and their relationships to provide a detailed static analysis of the process that those diagrams represent. The basic logic flow diagram elements are flow structure (connections), predicates (decisions), and computations (actions). The symbolic analysis approach supplies a disciplined step-by-step process to identify all executable program paths and produce a truth table that defines the input and output domains for each path identified. The resulting truth table is the tool that allows software test cases to be generated in a comprehensive manner to achieve total program path, input domain, and output domain coverage. Since the manual application of symbolic analysis is extremely labor intensive and is itself error prone, automation of the process is highly desirable. Earlier attempts at automation, utilizing conventional software approaches, had only limited success. This paper briefly describes the automation problems, the symbolic analysis expert's problem solving heuristics, and the implementation of those heuristics as a CLIPS based prototype, and the manual augmentation required. A simple application example is also provided for illustration purposes. The paper concludes with a discussion of implementation experiences, automation limitations, usage experiences, and future development suggestions.
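The truth-table idea described above can be sketched in a few lines. The toy program and its hand-derived path inputs below are invented for illustration (this is not the CLIPS prototype from the abstract): each combination of predicate outcomes defines one static path, and a sample input from that path's input domain is recorded alongside the resulting output.

```python
from itertools import product

def program(x):
    """Toy program under analysis, containing two predicates."""
    if x > 10:            # predicate P1
        y = x - 10
    else:
        y = x + 10
    if y % 2 == 0:        # predicate P2
        return "even-branch"
    return "odd-branch"

# Hand-derived sample inputs, one per predicate combination:
# P1 true requires x > 10; P2 (y even) then reduces to x even.
SAMPLES = {(True, True): 12, (True, False): 13,
           (False, True): 2, (False, False): 3}

def truth_table():
    """One row per executable path: predicate outcomes, a sample
    input from the path's input domain, and the observed output."""
    rows = []
    for outcome in product([True, False], repeat=2):
        x = SAMPLES[outcome]
        rows.append({"P1: x > 10": outcome[0],
                     "P2: y even": outcome[1],
                     "sample input": x,
                     "output": program(x)})
    return rows
```

Generating test cases from such a table gives full path, input-domain, and output-domain coverage for the toy program, which is the manual discipline the CLIPS prototype automates.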

  18. Genetic Discrimination

    MedlinePlus

... genetic discrimination. April 25, 2007, Statement of Administration Policy, Office of Management and Budget. Official Statement from the Office of ...

  19. RNA genetics

    SciTech Connect

Domingo, E.; Holland, J.J. (Dept. of Biology); Ahlquist, P. (Dept. of Plant Pathology)

    1988-01-01

    This book contains the proceedings on RNA genetics: Retroviruses, Viroids, and RNA recombination, Volume 2. Topics covered include: Replication of retrovirus genomes, Hepatitis B virus replication, and Evolution of RNA viruses.

  20. Arthropod Genetics.

    ERIC Educational Resources Information Center

    Zumwalde, Sharon

    2000-01-01

    Introduces an activity on arthropod genetics that involves phenotype and genotype identification of the creature and the construction process. Includes a list of required materials and directions to build a model arthropod. (YDS)

  1. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the state-of-the-art trend toward replacing traditional medical treatments with personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turnaround time compatible with clinical decision-making. In this paper we have developed a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.

  2. Genetic Screening

    PubMed Central

    Burke, Wylie; Tarini, Beth; Press, Nancy A.; Evans, James P.

    2011-01-01

    Current approaches to genetic screening include newborn screening to identify infants who would benefit from early treatment, reproductive genetic screening to assist reproductive decision making, and family history assessment to identify individuals who would benefit from additional prevention measures. Although the traditional goal of screening is to identify early disease or risk in order to implement preventive therapy, genetic screening has always included an atypical element—information relevant to reproductive decisions. New technologies offer increasingly comprehensive identification of genetic conditions and susceptibilities. Tests based on these technologies are generating a different approach to screening that seeks to inform individuals about all of their genetic traits and susceptibilities for purposes that incorporate rapid diagnosis, family planning, and expediting of research, as well as the traditional screening goal of improving prevention. Use of these tests in population screening will increase the challenges already encountered in genetic screening programs, including false-positive and ambiguous test results, overdiagnosis, and incidental findings. Whether this approach is desirable requires further empiric research, but it also requires careful deliberation on the part of all concerned, including genomic researchers, clinicians, public health officials, health care payers, and especially those who will be the recipients of this novel screening approach. PMID:21709145

  3. Specimen coordinate automated measuring machine/fiducial automated measuring machine

    SciTech Connect

    Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.

    1991-01-01

The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for: supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.

  4. Allocating Railway Platforms Using A Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Clarke, M.; Hinde, C. J.; Withall, M. S.; Jackson, T. W.; Phillips, I. W.; Brown, S.; Watson, R.

    This paper describes an approach to automating railway station platform allocation. The system uses a Genetic Algorithm (GA) to find how a station’s resources should be allocated. Real data is used which needs to be transformed to be suitable for the automated system. Successful or ‘fit’ allocations provide a solution that meets the needs of the station schedule including platform re-occupation and various other constraints. The system associates the train data to derive the station requirements. The Genetic Algorithm is used to derive platform allocations. Finally, the system may be extended to take into account how further parameters that are external to the station have an effect on how an allocation should be applied. The system successfully allocates around 1000 trains to platforms in around 30 seconds requiring a genome of around 1000 genes to achieve this.
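A minimal sketch of the GA scheme this abstract describes, with an invented toy timetable (the data, fitness function, and operators here are illustrative assumptions, not the authors' implementation): a genome assigns each train a platform, and fitness penalizes two trains occupying the same platform at overlapping times, mirroring the platform re-occupation constraint.

```python
import random

TRAINS = [(0, 10), (5, 15), (12, 20), (18, 30), (25, 35), (8, 22)]  # (arrive, depart)
N_PLATFORMS = 3

def conflicts(genome):
    """Count train pairs sharing a platform with overlapping dwell times."""
    count = 0
    for i in range(len(TRAINS)):
        for j in range(i + 1, len(TRAINS)):
            if genome[i] == genome[j]:
                (a1, d1), (a2, d2) = TRAINS[i], TRAINS[j]
                if a1 < d2 and a2 < d1:   # dwell intervals overlap
                    count += 1
    return count

def evolve(pop_size=60, generations=500, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(N_PLATFORMS) for _ in TRAINS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=conflicts)            # fewer conflicts = fitter
        if conflicts(pop[0]) == 0:         # feasible allocation found
            break
        survivors = pop[: pop_size // 2]   # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(TRAINS))
            child = p1[:cut] + p2[cut:]    # one-point crossover
            if rng.random() < 0.3:         # point mutation
                child[rng.randrange(len(TRAINS))] = rng.randrange(N_PLATFORMS)
            children.append(child)
        pop = survivors + children
    return min(pop, key=conflicts)
```

The full system evolves genomes of around 1000 genes against the derived station requirements; the same select/crossover/mutate loop applies, with additional penalty terms for the other scheduling constraints.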

  5. ALFA: an automated line fitting algorithm

    NASA Astrophysics Data System (ADS)

    Wesson, R.

    2016-03-01

    I present the automated line fitting algorithm, ALFA, a new code which can fit emission line spectra of arbitrary wavelength coverage and resolution, fully automatically. In contrast to traditional emission line fitting methods which require the identification of spectral features suspected to be emission lines, ALFA instead uses a list of lines which are expected to be present to construct a synthetic spectrum. The parameters used to construct the synthetic spectrum are optimized by means of a genetic algorithm. Uncertainties are estimated using the noise structure of the residuals. An emission line spectrum containing several hundred lines can be fitted in a few seconds using a single processor of a typical contemporary desktop or laptop PC. I show that the results are in excellent agreement with those measured manually for a number of spectra. Where discrepancies exist, the manually measured fluxes are found to be less accurate than those returned by ALFA. Together with the code NEAT, ALFA provides a powerful way to rapidly extract physical information from observations, an increasingly vital function in the era of highly multiplexed spectroscopy. The two codes can deliver a reliable and comprehensive analysis of very large data sets in a few hours with little or no user interaction.
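ALFA's core idea can be illustrated with a stripped-down sketch (hypothetical data and a simplified evolutionary optimizer standing in for the paper's genetic algorithm, not the published code): a synthetic spectrum is built from a list of lines expected to be present, and the free parameters (here just the line fluxes) are evolved to minimize residuals against the observation.

```python
import math
import random

LINES = [486.1, 500.7, 656.3]        # expected line centres (nm), assumed known
SIGMA = 0.5                          # fixed Gaussian line width (nm)
GRID = [480.0 + 0.2 * i for i in range(900)]   # wavelength grid

def synth(fluxes):
    """Synthetic spectrum: sum of Gaussians at the expected line centres."""
    return [sum(f * math.exp(-0.5 * ((w - c) / SIGMA) ** 2)
                for f, c in zip(fluxes, LINES)) for w in GRID]

TRUE_FLUX = [1.0, 3.0, 2.0]
OBSERVED = synth(TRUE_FLUX)          # noise-free "observation" for clarity

def rss(fluxes):
    """Residual sum of squares between observation and synthetic spectrum."""
    return sum((o - s) ** 2 for o, s in zip(OBSERVED, synth(fluxes)))

def fit(pop_size=40, generations=200, seed=0):
    """Evolve flux vectors toward the observed spectrum (elitist selection
    plus Gaussian mutation; the real code also fits shifts and widths)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 5.0) for _ in LINES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rss)                        # keep the best half
        survivors = pop[: pop_size // 2]
        pop = survivors + [
            [g + rng.gauss(0.0, 0.1) for g in rng.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
    return min(pop, key=rss)
```

Because only expected lines enter the model, no feature identification step is needed, which is what lets ALFA run unattended on spectra of arbitrary coverage and resolution.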

  6. Automated analysis of failure event data

    SciTech Connect

    HENNESSY,COREY; FREERKS,FRED; CAMPBELL,JAMES E.; THOMPSON,BRUCE M.

    2000-03-27

    This paper focuses on fully automated analysis of failure event data in the concept and early development stage of a semiconductor-manufacturing tool. In addition to presenting a wide range of statistical and machine-specific performance information, algorithms have been developed to examine reliability growth and to identify major contributors to unreliability. These capabilities are being implemented in a new software package called Reliadigm. When coupled with additional input regarding repair times and parts availability, the analysis software also provides spare parts inventory optimization based on genetic optimization methods. The type of question to be answered is: If this tool were placed with a customer for beta testing, what would be the optimal spares kit to meet equipment reliability goals for the lowest cost? The new algorithms are implemented in Windows® software and are easy to apply. This paper presents a preliminary analysis of failure event data from three IDEA machines currently in development. The paper also includes an optimal spare parts kit analysis.

  7. Genetic screening

    PubMed Central

    Andermann, Anne; Blancquaert, Ingeborg

    2010-01-01

    Abstract OBJECTIVE To provide a primer for primary care professionals who are increasingly called upon to discuss the growing number of genetic screening services available and to help patients make informed decisions about whether to participate in genetic screening, how to interpret results, and which interventions are most appropriate. QUALITY OF EVIDENCE As part of a larger research program, a wide literature relating to genetic screening was reviewed. PubMed and Internet searches were conducted using broad search terms. Effort was also made to identify the gray literature. MAIN MESSAGE Genetic screening is a type of public health program that is systematically offered to a specified population of asymptomatic individuals with the aim of providing those identified as high risk with prevention, early treatment, or reproductive options. Ensuring an added benefit from screening, as compared with standard clinical care, and preventing unintended harms, such as undue anxiety or stigmatization, depends on the design and implementation of screening programs, including the recruitment methods, education and counseling provided, timing of screening, predictive value of tests, interventions available, and presence of oversight mechanisms and safeguards. There is therefore growing apprehension that economic interests might lead to a market-driven approach to introducing and expanding screening before program effectiveness, acceptability, and feasibility have been demonstrated. As with any medical intervention, there is a moral imperative for genetic screening to do more good than harm, not only from the perspective of individuals and families, but also for the target population and society as a whole. CONCLUSION Primary care professionals have an important role to play in helping their patients navigate the rapidly changing terrain of genetic screening services by informing them about the benefits and risks of new genetic and genomic technologies and empowering them to

  8. From Crater to Graph: Manual and Automated Crater Counting Techniques

    NASA Astrophysics Data System (ADS)

    Plesko, C. S.; Werner, S. C.; Brumby, S. P.; Foing, B. H.; Asphaug, E.; Neukum, G.; Team, H.; Team, I.

    2005-12-01

    Impact craters are some of the most abundant and most interesting features on Mars. They hold a wealth of information about Martian geology, providing clues to the relative age, local composition and erosional history of the surface. A great deal of effort has been expended to count and understand the nature of planetary crater populations (Hartmann and Neukum, 2001). Highly trained experts have developed personal methods for conducting manual crater surveys. In addition, several efforts are underway to automate this process in order to keep up with the rapid increase in planetary surface image data. These efforts make use of a variety of methods, including the direct application of traditional image processing algorithms such as the Hough transform, and recent developments in genetic programming, an artificial intelligence-based technique, in which manual crater surveys are used as examples to `grow' or `evolve' crater counting algorithms. (Plesko, C. S. et al., LPSC 2005, Kim, J. R. et al., LPSC 2001, Michael, G. G. P&SS 2003, Earl, J. et al, LPSC 2005) In this study we examine automated crater counting techniques, compare them with traditional manual techniques on MOC imagery, and demonstrate capabilities for the analysis of multi-spectral and HRSC Digital Terrain Model data as well. Techniques are compared and discussed to define and develop a robust automated crater detection strategy.
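The Hough-transform approach mentioned above can be illustrated with a minimal circular Hough transform: each edge pixel votes for every candidate center at a fixed radius, and crater centers appear as peaks in the accumulator. The synthetic rim image and single assumed radius below are simplifications; real detectors scan a range of radii and work from gradient images.

```python
import numpy as np

def hough_circle(edges, radius):
    """Accumulate center votes for circles of one fixed radius."""
    h, w = edges.shape
    acc = np.zeros((h, w))
    thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        # Every edge pixel could lie on a circle centered radius away, any angle.
        cx = (x - radius * np.cos(thetas)).round().astype(int)
        cy = (y - radius * np.sin(thetas)).round().astype(int)
        ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
        np.add.at(acc, (cy[ok], cx[ok]), 1)   # unbuffered vote accumulation
    return acc

# Synthetic "crater rim": a circle of edge pixels centered at (40, 50), r = 12.
img = np.zeros((80, 100), dtype=bool)
t = np.linspace(0, 2 * np.pi, 200)
img[np.round(40 + 12 * np.sin(t)).astype(int),
    np.round(50 + 12 * np.cos(t)).astype(int)] = True

acc = hough_circle(img, radius=12)
peak = np.unravel_index(acc.argmax(), acc.shape)
print("detected center (y, x):", peak)
```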

  9. Specific Genetic Disorders

    MedlinePlus

    ... of Genetic Terms Definitions for genetic terms Specific Genetic Disorders Many human diseases have a genetic component. ... Condition in an Adult The Undiagnosed Diseases Program Genetic Disorders Achondroplasia Alpha-1 Antitrypsin Deficiency Antiphospholipid Syndrome ...

  10. Automated Demand Response and Commissioning

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. We refer to this as Auto-DR. The control and communications systems must be properly configured and pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve the connection of demand-responsive building systems to electric grid demand response systems.
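The commissioning sequence above amounts to an ordered checklist that a site must pass stage by stage. A minimal sketch, with the stage names taken from the paper and illustrative pass/fail results:

```python
# Stage names come from the paper; the per-site results are invented examples.
STAGES = [
    "Readiness",
    "Approval",
    "Price Client/Price Server Communication",
    "Internet Gateway/Internet Relay Communication",
    "Control of Equipment",
    "DR Shed Effectiveness",
]

def commission(results):
    """Return the stages passed in order, stopping at the first failure."""
    passed = []
    for stage in STAGES:
        if not results.get(stage, False):
            break
        passed.append(stage)
    return passed

site = {s: True for s in STAGES}
site["Control of Equipment"] = False   # e.g. the relay never actuates the load
print(commission(site))                # stops before the failing stage
```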

  11. Automated nutrient analyses in seawater

    SciTech Connect

    Whitledge, T.E.; Malloy, S.C.; Patton, C.J.; Wirick, C.D.

    1981-02-01

    This manual was assembled for use as a guide for analyzing the nutrient content of seawater samples collected in the marine coastal zone of the Northeast United States and the Bering Sea. Some modifications (changes in dilution or sample pump tube sizes) may be necessary to achieve optimum measurements in very pronounced oligotrophic, eutrophic or brackish areas. Information is presented under the following section headings: theory and mechanics of automated analysis; continuous flow system description; operation of autoanalyzer system; cookbook of current nutrient methods; automated analyzer and data analysis software; computer interfacing and hardware modifications; and troubleshooting. The three appendixes are entitled: references and additional reading; manifold components and chemicals; and software listings.

  12. Automating occupational protection records systems

    SciTech Connect

    Lyon, M.; Martin, J.B.

    1991-10-01

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs.

  13. Automated labeling in document images

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2000-12-01

    The National Library of Medicine (NLM) is developing an automated system to produce bibliographic records for its MEDLINE database. This system, named Medical Article Record System (MARS), employs document image analysis and understanding techniques and optical character recognition (OCR). This paper describes a key module in MARS called the Automated Labeling (AL) module, which labels all zones of interest (title, author, affiliation, and abstract) automatically. The AL algorithm is based on 120 rules that are derived from an analysis of journal page layouts and features extracted from OCR output. Experiments carried out on more than 11,000 articles in over 1,000 biomedical journals show the accuracy of this rule-based algorithm to exceed 96%.
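The rule-based labeling idea can be sketched as a tiny zone labeler: each zone is a dict of features derived from OCR output, and ordered rules assign a label. The feature names, thresholds, and keywords here are invented for illustration; they are not the AL module's actual 120 rules.

```python
# Each zone carries hypothetical OCR-derived features: text, font size, and
# vertical position on the page (0.0 = top, 1.0 = bottom).
def label_zone(zone):
    text = zone["text"].lower()
    if zone["font_size"] >= 14 and zone["y"] < 0.2:
        return "title"          # large type near the top of the page
    if text.startswith("abstract"):
        return "abstract"       # explicit section keyword
    if any(k in text for k in ("@", "university", "institute", "library")):
        return "affiliation"    # institutional keywords
    if zone["y"] < 0.35 and "," in text:
        return "author"         # comma-separated names high on the page
    return "other"

zones = [
    {"text": "Automated Labeling in Document Images", "font_size": 18, "y": 0.05},
    {"text": "Kim J, Le D, Thoma G", "font_size": 11, "y": 0.12},
    {"text": "National Library of Medicine, Bethesda MD", "font_size": 10, "y": 0.16},
    {"text": "Abstract: The National Library of Medicine ...", "font_size": 10, "y": 0.40},
]
labels = [label_zone(z) for z in zones]
print(labels)
```

Rule order matters: the abstract and affiliation checks must fire before the looser author rule, which is one reason a production system like MARS needs a much larger, carefully ordered rule set.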

  14. Small Business Innovations (Automated Information)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bruce G. Jackson & Associates Document Director is an automated tool that combines word processing and database management technologies to offer the flexibility and convenience of text processing with the linking capability of database management. Originally developed for NASA, it provides a means to collect and manage information associated with requirements development. The software system was used by NASA in the design of the Assured Crew Return Vehicle, as well as by other government and commercial organizations including the Southwest Research Institute.

  15. Making the transition to automation

    SciTech Connect

    Christenson, D.J.

    1992-10-01

    By 1995, the Bureau of Reclamation's hydropower plant near Hungry Horse, Montana, will be remotely operated from Grand Coulee dam (about 300 miles away) in Washington State. Automation at Hungry Horse will eliminate the need for four full-time power plant operators. Between now and then, a transition plan that offers employees choices for retraining, transferring, or taking early retirement will smooth the reduction from five operators to one. The transition plan also includes the use of temporary employees to offset the risks of reducing staff too soon. When completed in 1953, the Hungry Horse structure was the world's fourth largest and fourth highest concrete dam. The arch-gravity structure has a crest length of 2,115 feet; its crest stands 3,565 feet above sea level. The four turbine-generator units in the powerhouse total 284 MW, and supply approximately 1 billion kilowatt-hours of electricity annually to the federal power grid managed by the Bonneville Power Administration. In 1988, Reclamation began to automate operations at many of its hydro plants, and to establish centralized control points. The control center concept will increase efficiency. It also will coordinate water movements and power supply throughout the West. In the Pacific Northwest, the Grand Coulee and Black Canyon plants are automated control centers. Several Reclamation-owned facilities in the Columbia River Basin, including Hungry Horse, will be connected to these centers via microwave and telephone lines. When automation is complete, constant monitoring by computer will replace hourly manual readings and equipment checks. Computers also are expected to increase water use efficiency by 1 to 2 percent by ensuring operation at maximum turbine efficiency. Unit efficiency curves for various heads will be programmed into the system.

  16. Home automation in the workplace.

    PubMed

    McCormack, J E; Tello, S F

    1994-01-01

    Environmental control units and home automation devices contribute to the independence and potential of individuals with disabilities, both at work and at home. Devices currently exist that can assist people with physical, cognitive, and sensory disabilities to control lighting, appliances, temperature, security, and telephone communications. This article highlights several possible applications for these technologies and discusses emerging technologies that will increase the benefits these devices offer people with disabilities. PMID:24440955

  17. Convection automated logic oven control

    SciTech Connect

    Boyer, M.A.; Eke, K.I.

    1998-03-01

    For the past few years, there has been a greater push to bring more automation to the cooking process. There have been attempts at automated cooking using a wide range of sensors and procedures, but with limited success. The authors have the answer to the automated cooking process; this patented technology is called Convection AutoLogic (CAL). The beauty of the technology is that it requires no extra hardware for the existing oven system. It uses the existing temperature probe, whether it is an RTD, thermocouple, or thermistor. This means that the manufacturer does not have to be burdened with extra costs associated with automated cooking in comparison to standard ovens. The only change to the oven is the program in the central processing unit (CPU) on the board. As for its operation, when the user places the food into the oven, he or she is required to select a category (e.g., beef, poultry, or casseroles) and then simply press the start button. The CAL program then begins its cooking program. It first looks at the ambient oven temperature to see if it is a cold, warm, or hot start. CAL stores this data and then begins to look at the food's thermal footprint. After CAL has properly detected this thermal footprint, it can calculate the time and temperature at which the food needs to be cooked. CAL then sets up these factors for the cooking stage of the program and, when the food has finished cooking, the oven is turned off automatically. The total time for this entire process is the same as the standard cooking time the user would normally set. The CAL program can also compensate for varying line voltages and detect when the oven door is opened. With all of these varying factors being monitored, CAL can produce a perfectly cooked item with minimal user input.

  18. Algorithms Could Automate Cancer Diagnosis

    NASA Technical Reports Server (NTRS)

    Baky, A. A.; Winkler, D. G.

    1982-01-01

    Five new algorithms are a complete statistical procedure for quantifying cell abnormalities from digitized images. Procedure could be basis for automated detection and diagnosis of cancer. Objective of procedure is to assign each cell an atypia status index (ASI), which quantifies level of abnormality. It is possible that ASI values will be accurate and economical enough to allow diagnoses to be made quickly and accurately by computer processing of laboratory specimens extracted from patients.

  19. Automated Platform Management System Scheduling

    NASA Technical Reports Server (NTRS)

    Hull, Larry G.

    1990-01-01

    The Platform Management System was established to coordinate the operation of platform systems and instruments. The management functions are split between ground and space components. Since platforms are to be out of contact with the ground more than the manned base, the on-board functions are required to be more autonomous than those of the manned base. Under this concept, automated replanning and rescheduling, including on-board real-time schedule maintenance and schedule repair, are required to effectively and efficiently meet Space Station Freedom mission goals. In a FY88 study, we developed several promising alternatives for automated platform planning and scheduling. We recommended both a specific alternative and a phased approach to automated platform resource scheduling. Our recommended alternative was based upon use of exactly the same scheduling engine in both ground and space components of the platform management system. Our phased approach recommendation was based upon evolutionary development of the platform. In the past year, we developed platform scheduler requirements and implemented a rapid prototype of a baseline platform scheduler. Presently we are rehosting this platform scheduler rapid prototype and integrating the scheduler prototype into two Goddard Space Flight Center testbeds, as the ground scheduler in the Scheduling Concepts, Architectures, and Networks Testbed and as the on-board scheduler in the Platform Management System Testbed. Using these testbeds, we will investigate rescheduling issues, evaluate operational performance and enhance the platform scheduler prototype to demonstrate our evolutionary approach to automated platform scheduling. The work described in this paper was performed prior to Space Station Freedom rephasing, transfer of platform responsibility to Code E, and other recently discussed changes. We neither speculate on these changes nor attempt to predict the impact of the final decisions. 
As a consequence some of our

  20. Fully automated urban traffic system

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.

    1977-01-01

    The replacement of the driver with an automatic system which could perform the functions of guiding and routing a vehicle with a human's capability of responding to changing traffic demands was discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were being developed independently of any need for fully automated urban traffic. A guidance system that would meet system requirements was not being developed, but such a system was judged technically feasible.