Sample records for automated multistep genetic

  1. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    PubMed Central

    Shukla, Chinmay A

    2017-01-01

    The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its applications in synthesis, viz. auto-sampling and in-line monitoring, optimization, and process control. Subsequently, we critically review a few multistep flow syntheses and suggest a possible control strategy whose implementation would help to reliably transfer the laboratory-scale synthesis strategy to a pilot scale at its optimum conditions. Given the vast literature on multistep synthesis, we have classified it and identified case studies based on a few criteria, viz. type of reaction, heating method, processes involving in-line separation units, telescopic synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification covers a broad range of the multistep synthesis literature. PMID:28684977

  2. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics.

    PubMed

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-07-21

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format.

  3. Dissolvable fluidic time delays for programming multi-step assays in instrument-free paper diagnostics

    PubMed Central

    Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul

    2013-01-01

    Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format. PMID:23685876

  4. Applying flow chemistry: methods, materials, and multistep synthesis.

    PubMed

    McQuade, D Tyler; Seeberger, Peter H

    2013-07-05

    The synthesis of complex molecules requires control over both chemical reactivity and reaction conditions. While reactivity drives the majority of chemical discovery, advances in reaction condition control have accelerated method development/discovery. Recent tools include automated synthesizers and flow reactors. In this Synopsis, we describe how flow reactors have enabled chemical advances in our groups in the areas of single-stage reactions, materials synthesis, and multistep reactions. In each section, we detail the lessons learned and propose future directions.

  5. Synthetic Genetic Arrays: Automation of Yeast Genetics.

    PubMed

    Kuzmin, Elena; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2016-04-01

    Genome-sequencing efforts have led to great strides in the annotation of protein-coding genes and other genomic elements. The current challenge is to understand the functional role of each gene and how genes work together to modulate cellular processes. Genetic interactions define phenotypic relationships between genes and reveal the functional organization of a cell. Synthetic genetic array (SGA) methodology automates yeast genetics and enables large-scale and systematic mapping of genetic interaction networks in the budding yeast, Saccharomyces cerevisiae. SGA facilitates construction of an output array of double mutants from an input array of single mutants through a series of replica pinning steps. Subsequent analysis of genetic interactions from SGA-derived mutants relies on accurate quantification of colony size, which serves as a proxy for fitness. Since its development, SGA has given rise to a variety of other experimental approaches for functional profiling of the yeast genome and has been applied in a multitude of other contexts, such as genome-wide screens for synthetic dosage lethality and integration with high-content screening for systematic assessment of morphology defects. SGA-like strategies can also be implemented similarly in a number of other cell types and organisms, including Schizosaccharomyces pombe, Escherichia coli, Caenorhabditis elegans, and human cancer cell lines. The genetic networks emerging from these studies not only generate functional wiring diagrams but may also play a key role in our understanding of the complex relationship between genotype and phenotype. © 2016 Cold Spring Harbor Laboratory Press.
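
    In SGA-style analyses, the colony-size fitness proxy is commonly turned into a genetic interaction score by comparing the observed double-mutant fitness with the product of the single-mutant fitnesses. A minimal Python sketch of that multiplicative scoring; the colony sizes and function names are illustrative and are not taken from the SGA software:

      # Multiplicative genetic interaction score: epsilon = f_ab - f_a * f_b.
      # Colony sizes are normalized to a wild-type reference so they act as fitness proxies.

      def fitness(colony_size: float, wild_type_size: float) -> float:
          """Normalize a mutant colony size against the wild-type colony size."""
          return colony_size / wild_type_size

      def interaction_score(f_a: float, f_b: float, f_ab: float) -> float:
          """Negative scores suggest synthetic sick/lethal pairs; positive scores suggest suppression."""
          return f_ab - f_a * f_b

      # Illustrative numbers only.
      f_a = fitness(410, 500)
      f_b = fitness(450, 500)
      f_ab = fitness(250, 500)
      print(round(interaction_score(f_a, f_b, f_ab), 3))  # -0.238 -> candidate negative interaction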

  6. Biomek 3000: the workhorse in an automated accredited forensic genetic laboratory.

    PubMed

    Stangegaard, Michael; Meijer, Per-Johan; Børsting, Claus; Hansen, Anders J; Morling, Niels

    2012-10-01

    We have implemented and validated automated protocols for a wide range of processes such as sample preparation, PCR setup, and capillary electrophoresis setup using small, simple, and inexpensive automated liquid handlers. The flexibility and ease of programming enable the Biomek 3000 to be used in many parts of the laboratory process in a modern forensic genetics laboratory with low to medium sample throughput. In conclusion, we demonstrated that sample processing for accredited forensic genetic DNA typing can be implemented on small automated liquid handlers, leading to the reduction of manual work as well as increased quality and throughput.

  7. Fully chip-embedded automation of a multi-step lab-on-a-chip process using a modularized timer circuit.

    PubMed

    Kang, Junsu; Lee, Donghyeon; Heo, Young Jin; Chung, Wan Kyun

    2017-11-07

    For highly-integrated microfluidic systems, an actuation system is necessary to control the flow; however, the bulk of actuation devices including pumps or valves has impeded the broad application of integrated microfluidic systems. Here, we suggest a microfluidic process control method based on built-in microfluidic circuits. The circuit is composed of a fluidic timer circuit and a pneumatic logic circuit. The fluidic timer circuit is a serial connection of modularized timer units, which sequentially pass high pressure to the pneumatic logic circuit. The pneumatic logic circuit is a NOR gate array designed to control the liquid-controlling process. By using the timer circuit as a built-in signal generator, multi-step processes could be done totally inside the microchip without any external controller. The timer circuit uses only two valves per unit, and the number of process steps can be extended without limitation by adding timer units. As a demonstration, an automation chip has been designed for a six-step droplet treatment, which entails 1) loading, 2) separation, 3) reagent injection, 4) incubation, 5) clearing and 6) unloading. Each process was successfully performed for a pre-defined step-time without any external control device.
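
    The serial timer chain described above can be viewed as each unit passing the high-pressure signal on only after its own delay, so the cumulative delays fix when each process step fires. A toy Python sketch of that timing logic, with hypothetical per-unit delays that are not taken from the paper:

      from itertools import accumulate

      # Hypothetical per-unit delays (seconds) for the six-step droplet protocol.
      steps = ["loading", "separation", "reagent injection", "incubation", "clearing", "unloading"]
      unit_delays = [10, 15, 20, 120, 15, 10]

      # Step k starts once the first k timer units have all released, i.e. at the running sum of delays.
      start_times = list(accumulate(unit_delays, initial=0))[:-1]

      for step, t in zip(steps, start_times):
          print(f"t = {t:4d} s : trigger '{step}'")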

  8. Complex Genetics of Behavior: BXDs in the Automated Home-Cage.

    PubMed

    Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B

    2017-01-01

    This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility of gene-finding studies for distinct behavioral phenotypes; however, such a resource poses a challenge in behavioral phenotyping. To address the specifics of large-scale screening, we describe (1) how to assess mouse behavior systematically in a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.

  9. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization. Copyright © 2016, American Association for the Advancement of Science.
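
    Since Cello's gates are NOR/NOT-based, the Boolean function a candidate circuit implements can be checked against its truth table before any DNA is considered. A generic Python sketch of that kind of check on a hypothetical two-input, NOR-only circuit (not one of the paper's 60 designs):

      from itertools import product

      def nor(a: int, b: int) -> int:
          return int(not (a or b))

      def circuit(in1: int, in2: int) -> int:
          """Hypothetical NOR-only circuit; this wiring happens to realize AND."""
          return nor(nor(in1, in1), nor(in2, in2))

      # Enumerate all input combinations, as a design tool would when scoring output states.
      for in1, in2 in product([0, 1], repeat=2):
          print(in1, in2, "->", circuit(in1, in2))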

  10. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204) ... participate in the acquisition. This process should not be used for multi-step acquisitions where it would ...

  11. Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.

    PubMed

    Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen

    2018-07-20

    Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals that attracts considerable attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) and the cell-killing capacity of highly cytotoxic small molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods have so far scarcely been used in ADC development. In this work an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented by use of a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by utilization of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established including feedback to the process. For conjugate characterization, a high-throughput compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate its efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development efforts and material demands and shortening development timelines for antibody-drug conjugates. Copyright © 2018 Elsevier B.V. All rights reserved.
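
    The intermediate UV/Vis concentration check is essentially a Beer-Lambert calculation whose result is fed back into the liquid handler's pipetting volumes. A minimal sketch of that feedback arithmetic; the extinction coefficient, absorbance, and target values are placeholders, not parameters from the paper:

      # Beer-Lambert: A = epsilon * c * l  =>  c = A / (epsilon * l)

      def concentration_mg_per_ml(a280: float, ext_coeff_ml_per_mg_cm: float = 1.4, path_cm: float = 1.0) -> float:
          """Protein concentration from A280; 1.4 mL/(mg*cm) is an assumed, typical mAb coefficient."""
          return a280 / (ext_coeff_ml_per_mg_cm * path_cm)

      def stock_volume_ul(stock_conc: float, target_conc: float, final_volume_ul: float) -> float:
          """Stock volume needed to reach a target concentration in a fixed final volume (C1*V1 = C2*V2)."""
          return target_conc * final_volume_ul / stock_conc

      c = concentration_mg_per_ml(a280=0.70)                         # ~0.5 mg/mL
      print(round(c, 2), round(stock_volume_ul(c, 0.25, 200.0), 1))  # 0.5 mg/mL, 100.0 uL of stock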

  12. Multistep fluorescence gated proportional counters

    NASA Technical Reports Server (NTRS)

    Ramsey, Brian D.; Weisskopf, Martin C.

    1990-01-01

    A proportional counter is introduced in which the levels of energy and spatial resolutions and background rejection permit the application of the device to X-ray astronomy. A multistep approach is employed in which photons cause a signal that triggers the system and measures the energy of the incident photon. The multistep approach permits good energy resolution from parallel geometry and from the imaging stage due to coupling of the imaging and amplification stages. The design also employs fluorescence gating to reduce background, a method that is compatible with the multistep technique. Use of the proportional counter is reported for NASA's supernova campaign, and the pair background is below 0.0001 counts/sq cm sec keV at the xenon k-edge. Potential improvements and applications are listed including the CASES, POF, and EXOSS mission programs.

  13. Automation of diagnostic genetic testing: mutation detection by cyclic minisequencing.

    PubMed

    Alagrund, Katariina; Orpana, Arto K

    2014-01-01

    The rising role of nucleic acid testing in clinical decision making is creating a need for efficient and automated diagnostic nucleic acid test platforms. Clinical use of nucleic acid testing demands shorter turnaround times (TATs), lower production costs, and robust, reliable methods that can easily adopt new test panels and run rare tests on a random-access basis. Here we present a novel home-brew laboratory automation platform for diagnostic mutation testing. This platform is based on cyclic minisequencing (cMS) and two-color near-infrared (NIR) detection. Pipetting is automated using Tecan Freedom EVO pipetting robots, and all assays are performed in 384-well microplate format. The automation platform includes a data processing system that controls all procedures and automates patient result reporting to the hospital information system. We have found automated cMS to be a reliable, inexpensive, and robust method for a wide variety of diagnostic nucleic acid tests. The platform is currently in clinical use for over 80 mutations or polymorphisms. In addition to tests performed on blood samples, the system also performs an epigenetic test for methylation of the MGMT gene promoter and companion diagnostic tests for analysis of KRAS and BRAF gene mutations in formalin-fixed, paraffin-embedded tumor samples. Automation of genetic test reporting has proven reliable and efficient, decreasing the workload of academic personnel.

  14. Automated design of genetic toggle switches with predetermined bistability.

    PubMed

    Chen, Shuobing; Zhang, Haoqian; Shi, Handuo; Ji, Weiyue; Feng, Jingchen; Gong, Yan; Yang, Zhenglin; Ouyang, Qi

    2012-07-20

    Synthetic biology aims to rationally construct biological devices with required functionalities. Methods that automate the design of genetic devices without post-hoc adjustment are therefore highly desired. Here we provide a method to predictably design genetic toggle switches with predetermined bistability. To accomplish this task, a biophysical model that links ribosome binding site (RBS) DNA sequence to toggle switch bistability was first developed by integrating a stochastic model with RBS design method. Then, to parametrize the model, a library of genetic toggle switch mutants was experimentally built, followed by establishing the equivalence between RBS DNA sequences and switch bistability. To test this equivalence, RBS nucleotide sequences for different specified bistabilities were in silico designed and experimentally verified. Results show that the deciphered equivalence is highly predictive for the toggle switch design with predetermined bistability. This method can be generalized to quantitative design of other probabilistic genetic devices in synthetic biology.
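
    The bistability being designed here is usually rationalized with a two-repressor toggle-switch model in which promoter/RBS strength enters through the synthesis rates. A generic Python sketch of that classic mutual-repression model, assuming NumPy/SciPy are available; the parameters are illustrative and are not the paper's fitted biophysical model:

      import numpy as np
      from scipy.integrate import odeint

      def toggle(y, t, a1, a2, beta, gamma):
          """Mutual repression: repressor u shuts off v and v shuts off u (Gardner-type toggle)."""
          u, v = y
          du = a1 / (1.0 + v**beta) - u
          dv = a2 / (1.0 + u**gamma) - v
          return [du, dv]

      t = np.linspace(0, 50, 500)
      params = (10.0, 10.0, 2.0, 2.0)  # synthesis rates stand in for RBS strength

      # Two initial conditions relax to the two distinct stable states of a bistable switch.
      for y0 in ([5.0, 0.1], [0.1, 5.0]):
          u, v = odeint(toggle, y0, t, args=params).T
          print(f"start u,v = {y0} -> steady u,v ~ ({u[-1]:.2f}, {v[-1]:.2f})")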

  15. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involved extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  16. Finite Adaptation and Multistep Moves in the Metropolis-Hastings Algorithm for Variable Selection in Genome-Wide Association Analysis

    PubMed Central

    Peltola, Tomi; Marttinen, Pekka; Vehtari, Aki

    2012-01-01

    High-dimensional datasets with large amounts of redundant information are nowadays available for hypothesis-free exploration of scientific questions. A particular case is genome-wide association analysis, where variations in the genome are searched for effects on disease or other traits. Bayesian variable selection has been demonstrated as a possible analysis approach, which can account for the multifactorial nature of the genetic effects in a linear regression model. Yet, the computation presents a challenge and application to large-scale data is not routine. Here, we study aspects of the computation using the Metropolis-Hastings algorithm for the variable selection: finite adaptation of the proposal distributions, multistep moves for changing the inclusion state of multiple variables in a single proposal and multistep move size adaptation. We also experiment with a delayed rejection step for the multistep moves. Results on simulated and real data show increase in the sampling efficiency. We also demonstrate that with application specific proposals, the approach can overcome a specific mixing problem in real data with 3822 individuals and 1,051,811 single nucleotide polymorphisms and uncover a variant pair with synergistic effect on the studied trait. Moreover, we illustrate multimodality in the real dataset related to a restrictive prior distribution on the genetic effect sizes and advocate a more flexible alternative. PMID:23166669
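
    The multistep moves discussed above change the inclusion state of several variables in one proposal and accept or reject the whole block with the usual Metropolis-Hastings ratio. A highly simplified Python sketch of such a block move over inclusion indicators; the log_posterior placeholder only penalizes model size and is not the paper's regression model:

      import numpy as np

      rng = np.random.default_rng(0)

      def log_posterior(gamma: np.ndarray) -> float:
          """Placeholder: penalize model size; a real model would also score the regression fit."""
          return -0.5 * gamma.sum()

      def multistep_move(gamma: np.ndarray, n_flips: int = 3) -> np.ndarray:
          """Flip the inclusion state of several variables in a single symmetric proposal."""
          proposal = gamma.copy()
          idx = rng.choice(gamma.size, size=n_flips, replace=False)
          proposal[idx] = 1 - proposal[idx]
          return proposal

      gamma = np.zeros(1000, dtype=int)            # inclusion indicators for 1000 markers
      for _ in range(5000):
          prop = multistep_move(gamma)
          if np.log(rng.random()) < log_posterior(prop) - log_posterior(gamma):
              gamma = prop                         # accept the whole block move
      print("included variables:", int(gamma.sum()))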

  17. Application of automation and information systems to forensic genetic specimen processing.

    PubMed

    Leclair, Benoît; Scholl, Tom

    2005-03-01

    During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.

  18. Identification of genes associated with dissociation of cognitive performance and neuropathological burden: Multistep analysis of genetic, epigenetic, and transcriptional data

    PubMed Central

    Yu, Lei; Chibnik, Lori B.; Dawe, Robert J.; Yang, Jingyun; Klein, Hans-Ulrich; Honer, William G.; Sperling, Reisa A.; Bennett, David A.; De Jager, Philip L.

    2017-01-01

    Introduction: The molecular underpinnings of the dissociation of cognitive performance and neuropathological burden are poorly understood, and there are currently no known genetic or epigenetic determinants of the dissociation. Methods and findings: "Residual cognition" was quantified by regressing out the effects of cerebral pathologies and demographic characteristics on global cognitive performance proximate to death. To identify genes influencing residual cognition, we leveraged neuropathological, genetic, epigenetic, and transcriptional data available for deceased participants of the Religious Orders Study (n = 492) and the Rush Memory and Aging Project (n = 487). Given that our sample size was underpowered to detect genome-wide significance, we applied a multistep approach to identify genes influencing residual cognition, based on our prior observation that independent genetic and epigenetic risk factors can converge on the same locus. In the first step (n = 979), we performed a genome-wide association study with a predefined suggestive p < 10^-5, and nine independent loci met this threshold in eight distinct chromosomal regions. Three of the six genes within 100 kb of the lead SNP are expressed in the dorsolateral prefrontal cortex (DLPFC): UNC5C, ENC1, and TMEM106B. In the second step, in the subset of participants with DLPFC DNA methylation data (n = 648), we found that residual cognition was related to differential DNA methylation of UNC5C and ENC1 (false discovery rate < 0.05). In the third step, in the subset of participants with DLPFC RNA sequencing data (n = 469), brain transcription levels of UNC5C and ENC1 were evaluated for their association with residual cognition: RNA levels of both UNC5C (estimated effect = −0.40, 95% CI −0.69 to −0.10, p = 0.0089) and ENC1 (estimated effect = 0.0064, 95% CI 0.0033 to 0.0096, p = 5.7 × 10^-5) were associated with residual cognition. In secondary analyses, we explored the mechanism of these associations
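
    The "residual cognition" phenotype above is the part of global cognition not explained by measured pathologies and demographics, i.e. the residual of a linear regression. A minimal sketch of that step on synthetic data; the variable names and scikit-learn usage are illustrative, not the study's code:

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      n = 979

      # Synthetic stand-ins for neuropathology indices and demographic covariates.
      covariates = rng.normal(size=(n, 5))
      global_cognition = covariates @ rng.normal(size=5) + rng.normal(scale=0.5, size=n)

      # Residual cognition = observed cognition minus the part predicted by pathology/demographics.
      model = LinearRegression().fit(covariates, global_cognition)
      residual_cognition = global_cognition - model.predict(covariates)

      # This residual then serves as the quantitative trait for the GWAS step.
      print(residual_cognition[:5].round(3))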

  19. Event-triggered logical flow control for comprehensive process integration of multi-step assays on centrifugal microfluidic platforms.

    PubMed

    Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens

    2014-07-07

    The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling with the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to completion of preceding liquid transfer event, i.e. completely independent of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from

  20. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    ERIC Educational Resources Information Center

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  1. Genetic Influences on Cognitive Function Using the Cambridge Neuropsychological Test Automated Battery

    ERIC Educational Resources Information Center

    Singer, Jamie J.; MacGregor, Alex J.; Cherkas, Lynn F.; Spector, Tim D.

    2006-01-01

    The genetic relationship between intelligence and components of cognition remains controversial. Conflicting results may be a function of the limited number of methods used in experimental evaluation. The current study is the first to use CANTAB (The Cambridge Neuropsychological Test Automated Battery). This is a battery of validated computerised…

  2. Contributions of dopamine-related genes and environmental factors to highly sensitive personality: a multi-step neuronal system-level approach.

    PubMed

    Chen, Chunhui; Chen, Chuansheng; Moyzis, Robert; Stern, Hal; He, Qinghua; Li, He; Li, Jin; Zhu, Bi; Dong, Qi

    2011-01-01

    Traditional behavioral genetic studies (e.g., twin, adoption studies) have shown that human personality has moderate to high heritability, but recent molecular behavioral genetic studies have failed to identify quantitative trait loci (QTL) with consistent effects. The current study adopted a multi-step approach (ANOVA followed by multiple regression and permutation) to assess the cumulative effects of multiple QTLs. Using a system-level (dopamine system) genetic approach, we investigated a personality trait deeply rooted in the nervous system (the Highly Sensitive Personality, HSP). A total of 480 healthy Chinese college students were given the HSP scale and genotyped for 98 representative polymorphisms in all major dopamine neurotransmitter genes. In addition, two environmental factors (stressful life events and parental warmth) that have been implicated for their contributions to personality development were included to investigate their relative contributions as compared to genetic factors. In Step 1, using ANOVA, we identified 10 polymorphisms that made statistically significant contributions to HSP. In Step 2, these polymorphisms' main effects and interactions were assessed using multiple regression. This model accounted for 15% of the variance of HSP (p<0.001). Recent stressful life events accounted for an additional 2% of the variance. Finally, permutation analyses ascertained the probability of obtaining these findings by chance to be very low, p ranging from 0.001 to 0.006. Dividing these loci by the subsystems of dopamine synthesis, degradation/transport, receptor and modulation, we found that the modulation and receptor subsystems made the most significant contribution to HSP. The results of this study demonstrate the utility of a multi-step neuronal system-level approach in assessing genetic contributions to individual differences in human behavior. It can potentially bridge the gap between the high heritability estimates based on traditional behavioral genetics

  3. Contributions of Dopamine-Related Genes and Environmental Factors to Highly Sensitive Personality: A Multi-Step Neuronal System-Level Approach

    PubMed Central

    Chen, Chunhui; Chen, Chuansheng; Moyzis, Robert; Stern, Hal; He, Qinghua; Li, He; Li, Jin; Zhu, Bi; Dong, Qi

    2011-01-01

    Traditional behavioral genetic studies (e.g., twin, adoption studies) have shown that human personality has moderate to high heritability, but recent molecular behavioral genetic studies have failed to identify quantitative trait loci (QTL) with consistent effects. The current study adopted a multi-step approach (ANOVA followed by multiple regression and permutation) to assess the cumulative effects of multiple QTLs. Using a system-level (dopamine system) genetic approach, we investigated a personality trait deeply rooted in the nervous system (the Highly Sensitive Personality, HSP). A total of 480 healthy Chinese college students were given the HSP scale and genotyped for 98 representative polymorphisms in all major dopamine neurotransmitter genes. In addition, two environmental factors (stressful life events and parental warmth) that have been implicated for their contributions to personality development were included to investigate their relative contributions as compared to genetic factors. In Step 1, using ANOVA, we identified 10 polymorphisms that made statistically significant contributions to HSP. In Step 2, these polymorphisms' main effects and interactions were assessed using multiple regression. This model accounted for 15% of the variance of HSP (p<0.001). Recent stressful life events accounted for an additional 2% of the variance. Finally, permutation analyses ascertained the probability of obtaining these findings by chance to be very low, p ranging from 0.001 to 0.006. Dividing these loci by the subsystems of dopamine synthesis, degradation/transport, receptor and modulation, we found that the modulation and receptor subsystems made the most significant contribution to HSP. The results of this study demonstrate the utility of a multi-step neuronal system-level approach in assessing genetic contributions to individual differences in human behavior. It can potentially bridge the gap between the high heritability estimates based on traditional behavioral genetics
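
    The permutation step in the two records above asks how often a regression model fit to randomly shuffled phenotypes explains as much variance as the real model. A generic Python sketch of that null-distribution construction on synthetic data (not the study's genotypes or HSP scores):

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(2)
      n, n_snps = 480, 10

      genotypes = rng.integers(0, 3, size=(n, n_snps)).astype(float)   # 0/1/2 minor-allele counts
      hsp_score = genotypes @ rng.normal(scale=0.2, size=n_snps) + rng.normal(size=n)

      observed_r2 = LinearRegression().fit(genotypes, hsp_score).score(genotypes, hsp_score)

      def permuted_r2() -> float:
          """Refit after shuffling the phenotype, which breaks any genotype-phenotype link."""
          y_perm = rng.permutation(hsp_score)
          return LinearRegression().fit(genotypes, y_perm).score(genotypes, y_perm)

      null_r2 = np.array([permuted_r2() for _ in range(1000)])
      p_value = (1 + np.sum(null_r2 >= observed_r2)) / (1 + null_r2.size)
      print(f"observed R^2 = {observed_r2:.3f}, permutation p = {p_value:.4f}")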

  4. The MSFC large-area imaging multistep proportional counter

    NASA Technical Reports Server (NTRS)

    Ramsey, B. D.; Weisskopf, M. C.; Joy, M. K.

    1989-01-01

    A large-area multistep imaging proportional counter currently being developed at the Marshall Space Flight Center is described. The device, known as a multistep fluorescence gated detector, consists of a multiwire proportional counter (MWPC) with a preamplification region. The MWPC features superior spatial resolution with a very high degree of background rejection. It is ideally suited for use in X-ray astronomy in the 20-100 keV energy range. The paper includes the MWPC schematic and a list of instrument specifications.

  5. Combining semi-automated image analysis techniques with machine learning algorithms to accelerate large-scale genetic studies.

    PubMed

    Atkinson, Jonathan A; Lobet, Guillaume; Noll, Manuel; Meyer, Patrick E; Griffiths, Marcus; Wells, Darren M

    2017-10-01

    Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping. © The Authors 2017. Published by Oxford University Press.

  6. Combining semi-automated image analysis techniques with machine learning algorithms to accelerate large-scale genetic studies

    PubMed Central

    Atkinson, Jonathan A.; Lobet, Guillaume; Noll, Manuel; Meyer, Patrick E.; Griffiths, Marcus

    2017-01-01

    Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping. PMID:29020748
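
    The core of the approach in the two records above is a regression forest that maps cheap, automatically extracted image descriptors to the architectural traits a semi-automated tool would have measured. A minimal scikit-learn sketch on synthetic data; the descriptor and trait names are placeholders:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)
      n_images, n_descriptors = 2000, 40

      # Synthetic stand-ins: automatically extracted image descriptors and one architectural trait.
      descriptors = rng.normal(size=(n_images, n_descriptors))
      total_root_length = descriptors @ rng.normal(size=n_descriptors) + rng.normal(size=n_images)

      # Train on a small, manually annotated subset; predict the trait for the rest of the dataset.
      X_train, X_rest, y_train, y_rest = train_test_split(
          descriptors, total_root_length, train_size=0.25, random_state=0
      )
      forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
      print("held-out R^2:", round(forest.score(X_rest, y_rest), 3))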

  7. An automated microfluidic DNA microarray platform for genetic variant detection in inherited arrhythmic diseases.

    PubMed

    Huang, Shu-Hong; Chang, Yu-Shin; Juang, Jyh-Ming Jimmy; Chang, Kai-Wei; Tsai, Mong-Hsun; Lu, Tzu-Pin; Lai, Liang-Chuan; Chuang, Eric Y; Huang, Nien-Tsu

    2018-03-12

    In this study, we developed an automated microfluidic DNA microarray (AMDM) platform for point mutation detection of genetic variants in inherited arrhythmic diseases. The platform allows for automated and programmable reagent sequencing under precise conditions of hybridization flow and temperature control. It is composed of a commercial microfluidic control system, a microfluidic microarray device, and a temperature control unit. The automated and rapid hybridization process can be performed in the AMDM platform using Cy3 labeled oligonucleotide exons of SCN5A genetic DNA, which produces proteins associated with sodium channels abundant in the heart (cardiac) muscle cells. We then introduce a graphene oxide (GO)-assisted DNA microarray hybridization protocol to enable point mutation detection. In this protocol, a GO solution is added after the staining step to quench dyes bound to single-stranded DNA or non-perfectly matched DNA, which can improve point mutation specificity. As proof-of-concept we extracted the wild-type and mutant of exon 12 and exon 17 of SCN5A genetic DNA from patients with long QT syndrome or Brugada syndrome by touchdown PCR and performed a successful point mutation discrimination in the AMDM platform. Overall, the AMDM platform can greatly reduce laborious and time-consuming hybridization steps and prevent potential contamination. Furthermore, by introducing the reciprocating flow into the microchannel during the hybridization process, the total assay time can be reduced to 3 hours, which is 6 times faster than the conventional DNA microarray. Given the automatic assay operation, shorter assay time, and high point mutation discrimination, we believe that the AMDM platform has potential for low-cost, rapid and sensitive genetic testing in a simple and user-friendly manner, which may benefit gene screening in medical practice.

  8. A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.

    PubMed

    Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie

    2018-06-04

    Designing effective dispatching rules for production systems is a difficult and time-consuming task if done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
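
    A dispatching rule is simply a priority function over waiting jobs; genetic programming evolves the expression, but applying an evolved rule is straightforward. A toy Python sketch of dispatching with one hypothetical evolved rule (the rule and job data are invented for illustration):

      from dataclasses import dataclass

      @dataclass
      class Job:
          name: str
          processing_time: float
          due_date: float

      def evolved_rule(job: Job, now: float) -> float:
          """Hypothetical GP-evolved priority mixing processing time and slack; lower is dispatched first."""
          slack = job.due_date - now - job.processing_time
          return 0.7 * job.processing_time + 0.3 * slack

      queue = [Job("A", 5, 20), Job("B", 2, 8), Job("C", 9, 40)]
      now = 4.0
      next_job = min(queue, key=lambda j: evolved_rule(j, now))
      print("dispatch next:", next_job.name)   # -> B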

  9. Personalized multistep cognitive behavioral therapy for obesity

    PubMed Central

    Dalle Grave, Riccardo; Sartirana, Massimiliano; El Ghoch, Marwan; Calugi, Simona

    2017-01-01

    Multistep cognitive behavioral therapy for obesity (CBT-OB) is a treatment that may be delivered at three levels of care (outpatient, day hospital, and residential). In a stepped-care approach, CBT-OB associates the traditional procedures of weight-loss lifestyle modification, ie, physical activity and dietary recommendations, with specific cognitive behavioral strategies that have been indicated by recent research to influence weight loss and maintenance by addressing specific cognitive processes. The treatment program as a whole is delivered in six modules. These are introduced according to the individual patient’s needs in a flexible and personalized fashion. A recent randomized controlled trial has found that 88 patients suffering from morbid obesity treated with multistep residential CBT-OB achieved a mean weight loss of 15% after 12 months, with no tendency to regain weight between months 6 and 12. The treatment has also shown promising long-term results in the management of obesity associated with binge-eating disorder. If these encouraging findings are confirmed by the two ongoing outpatient studies (one delivered individually and one in a group setting), this will provide evidence-based support for the potential of multistep CBT-OB to provide a more effective alternative to standard weight-loss lifestyle-modification programs. PMID:28615960

  10. A Bias and Variance Analysis for Multistep-Ahead Time Series Forecasting.

    PubMed

    Ben Taieb, Souhaib; Atiya, Amir F

    2016-01-01

    Multistep-ahead forecasts can either be produced recursively by iterating a one-step-ahead time series model or directly by estimating a separate model for each forecast horizon. In addition, there are other strategies; some of them combine aspects of both aforementioned concepts. In this paper, we present a comprehensive investigation into the bias and variance behavior of multistep-ahead forecasting strategies. We provide a detailed review of the different multistep-ahead strategies. Subsequently, we perform a theoretical study that derives the bias and variance for a number of forecasting strategies. Finally, we conduct a Monte Carlo experimental study that compares and evaluates the bias and variance performance of the different strategies. From the theoretical and the simulation studies, we analyze the effect of different factors, such as the forecast horizon and the time series length, on the bias and variance components, and on the different multistep-ahead strategies. Several lessons are learned, and recommendations are given concerning the advantages, disadvantages, and best conditions of use of each strategy.
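
    The recursive and direct strategies compared above differ only in how a one-step model is reused: recursive feeds its own predictions back in as inputs, while direct fits a separate model per horizon. A minimal scikit-learn sketch of both on a synthetic AR(1) series; the linear model and parameters are illustrative only:

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(4)
      y = np.zeros(300)
      for t in range(1, 300):                      # synthetic AR(1) series
          y[t] = 0.8 * y[t - 1] + rng.normal(scale=0.3)

      lags, H = 3, 5
      train, test = y[:-H], y[-H:]                 # hold out the last H points

      # Lagged design matrix over the training portion: row t = [y_t, y_{t+1}, y_{t+2}] -> y_{t+3}.
      X = np.column_stack([train[i:len(train) - lags + i] for i in range(lags)])
      target = train[lags:]
      last_window = list(train[-lags:])            # most recent observed values

      # Recursive strategy: one one-step model, iterated H times on its own predictions.
      one_step = LinearRegression().fit(X, target)
      window, recursive = list(last_window), []
      for _ in range(H):
          pred = one_step.predict([window[-lags:]])[0]
          recursive.append(pred)
          window.append(pred)

      # Direct strategy: a separate model per horizon h, trained to predict y_{t+lags+h} directly.
      direct = [
          LinearRegression().fit(X[:len(target) - h], target[h:]).predict([last_window])[0]
          for h in range(H)
      ]

      print("recursive:", np.round(recursive, 2))
      print("direct   :", np.round(direct, 2))
      print("actual   :", np.round(test, 2))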

  11. Automated DNA extraction from genetically modified maize using aminosilane-modified bacterial magnetic particles.

    PubMed

    Ota, Hiroyuki; Lim, Tae-Kyu; Tanaka, Tsuyoshi; Yoshino, Tomoko; Harada, Manabu; Matsunaga, Tadashi

    2006-09-18

    A novel, automated system, PNE-1080, equipped with eight automated pestle units and a spectrophotometer was developed for genomic DNA extraction from maize using aminosilane-modified bacterial magnetic particles (BMPs). The use of aminosilane-modified BMPs allowed highly accurate DNA recovery. The (A260 − A320):(A280 − A320) ratio of the extracted DNA was 1.9 ± 0.1. The DNA quality was sufficiently pure for PCR analysis. The PNE-1080 offered rapid assay completion (30 min) with high accuracy. Furthermore, the results of real-time PCR confirmed that our proposed method permitted the accurate determination of genetically modified DNA composition and correlated well with results obtained by conventional cetyltrimethylammonium bromide (CTAB)-based methods.
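
    The purity figure quoted above is a background-corrected absorbance ratio; a short sketch of the calculation (the absorbance readings are illustrative):

      def purity_ratio(a260: float, a280: float, a320: float) -> float:
          """Background-corrected A260/A280 ratio; roughly 1.8-2.0 indicates protein-free DNA."""
          return (a260 - a320) / (a280 - a320)

      print(round(purity_ratio(a260=0.52, a280=0.29, a320=0.02), 2))  # -> 1.85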

  12. Development of a Novel and Rapid Fully Automated Genetic Testing System.

    PubMed

    Uehara, Masayuki

    2016-01-01

    We have developed a rapid genetic testing system integrating nucleic acid extraction, purification, amplification, and detection in a single cartridge. The system performs real-time polymerase chain reaction (PCR) after nucleic acid purification in a fully automated manner. RNase P, a housekeeping gene, was purified from human nasal epithelial cells using silica-coated magnetic beads and subjected to real-time PCR using a novel droplet-real-time-PCR machine. The process was completed within 13 min. This system will be widely applicable for research and diagnostic uses.

  13. Surrogate-Assisted Genetic Programming With Simplified Models for Automated Design of Dispatching Rules.

    PubMed

    Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen

    2017-09-01

    Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements, accuracy, and interpretability remain its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.

  14. Electroencephalogram Signal Classification for Automated Epileptic Seizure Detection Using Genetic Algorithm

    PubMed Central

    Nanthini, B. Suguna; Santhi, B.

    2017-01-01

    Background: Epilepsy arises when seizures occur repeatedly in the brain. The electroencephalogram (EEG) test provides valuable information about brain function and can be useful for detecting brain disorders, especially epilepsy. In this study, an automated seizure detection model has been introduced successfully. Materials and Methods: The EEG signals are decomposed into sub-bands by the discrete wavelet transform using the db2 (Daubechies) wavelet. Eight statistical features, four gray-level co-occurrence matrix features, and Renyi entropy estimates with four different orders are extracted from the raw EEG and its sub-bands. A genetic algorithm (GA) is used to select eight relevant features from the 16-dimensional feature set. The model has been trained and tested on EEG signals using a support vector machine (SVM) classifier. The performance of the SVM classifier is evaluated on two different databases. Results: The study was evaluated through two different analyses and achieved satisfactory performance for automated seizure detection using the relevant features as input to the SVM classifier. Conclusion: Relevant features selected by the GA give better accuracy for seizure detection. PMID:28781480
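
    The pipeline above is wavelet decomposition, feature extraction, GA-based feature selection, and SVM classification. A pared-down sketch of the decomposition and classification stages only, assuming PyWavelets and scikit-learn are available; the signals are synthetic, the features are simple statistics rather than the paper's full 16-dimensional set, and the GA selection step is omitted:

      import numpy as np
      import pywt
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(5)

      def features(signal: np.ndarray) -> np.ndarray:
          """Simple statistics from the raw signal and its db2 wavelet sub-bands."""
          bands = pywt.wavedec(signal, "db2", level=4)
          return np.concatenate([[b.mean(), b.std(), np.abs(b).max()] for b in [signal, *bands]])

      # Synthetic stand-ins: 'seizure' epochs carry an extra low-frequency oscillation.
      t = np.linspace(0, 1, 256)
      normal = [rng.normal(size=256) for _ in range(100)]
      seizure = [rng.normal(size=256) + 2 * np.sin(2 * np.pi * 3 * t) for _ in range(100)]

      X = np.array([features(s) for s in normal + seizure])
      y = np.array([0] * 100 + [1] * 100)

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))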

  15. Error behavior of multistep methods applied to unstable differential systems

    NASA Technical Reports Server (NTRS)

    Brown, R. L.

    1977-01-01

    The problem of modeling a dynamic system described by a system of ordinary differential equations which has unstable components for limited periods of time is discussed. It is shown that the global error in a multistep numerical method is the solution to a difference equation initial value problem, and the approximate solution is given for several popular multistep integration formulas. Inspection of the solution leads to the formulation of four criteria for integrators appropriate to unstable problems. A sample problem is solved numerically using three popular formulas and two different stepsizes to illustrate the appropriateness of the criteria.
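
    For reference, a general k-step linear multistep formula for y' = f(t, y), of which the popular integration formulas mentioned above are particular coefficient choices, can be written as

      \sum_{j=0}^{k} \alpha_j \, y_{n+j} \;=\; h \sum_{j=0}^{k} \beta_j \, f(t_{n+j},\, y_{n+j}),

    and the abstract's point is that the global error of such a formula itself obeys a linear difference equation driven by the local truncation errors, which is what makes a closed-form error analysis possible.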

  16. Continuous track paths reveal additive evidence integration in multistep decision making.

    PubMed

    Buc Calderon, Cristian; Dewulf, Myrtille; Gevers, Wim; Verguts, Tom

    2017-10-03

    Multistep decision making pervades daily life, but its underlying mechanisms remain obscure. We distinguish four prominent models of multistep decision making, namely serial stage, hierarchical evidence integration, hierarchical leaky competing accumulation (HLCA), and probabilistic evidence integration (PEI). To empirically disentangle these models, we design a two-step reward-based decision paradigm and implement it in a reaching task experiment. In a first step, participants choose between two potential upcoming choices, each associated with two rewards. In a second step, participants choose between the two rewards selected in the first step. Strikingly, as predicted by the HLCA and PEI models, the first-step decision dynamics were initially biased toward the choice representing the highest sum/mean before being redirected toward the choice representing the maximal reward (i.e., initial dip). Only HLCA and PEI predicted this initial dip, suggesting that first-step decision dynamics depend on additive integration of competing second-step choices. Our data suggest that potential future outcomes are progressively unraveled during multistep decision making.

  17. A new theory for multistep discretizations of stiff ordinary differential equations: Stability with large step sizes

    NASA Technical Reports Server (NTRS)

    Majda, G.

    1985-01-01

    A large set of variable coefficient linear systems of ordinary differential equations which possess two different time scales, a slow one and a fast one, is considered. A small parameter epsilon characterizes the stiffness of these systems. A system of o.d.e.s. in this set is approximated by a general class of multistep discretizations which includes both one-leg and linear multistep methods. Sufficient conditions are determined under which each solution of a multistep method is uniformly bounded, with a bound which is independent of the stiffness of the system of o.d.e.s., when the step size resolves the slow time scale, but not the fast one. This property is called stability with large step sizes. The theory presented lets one compare properties of one-leg methods and linear multistep methods when they approximate variable coefficient systems of stiff o.d.e.s. In particular, it is shown that one-leg methods have better stability properties with large step sizes than their linear multistep counterparts. The theory also allows one to relate the concept of D-stability to the usual notions of stability and stability domains and to the propagation of errors for multistep methods which use large step sizes.
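
    The one-leg method associated with a linear multistep formula uses the same coefficients α_j, β_j but evaluates f only once, at averaged arguments; the two classes contrasted in this abstract are, schematically,

      \text{linear multistep:}\quad \sum_{j=0}^{k} \alpha_j y_{n+j} = h \sum_{j=0}^{k} \beta_j f(t_{n+j}, y_{n+j}),
      \qquad
      \text{one-leg:}\quad \sum_{j=0}^{k} \alpha_j y_{n+j} = h\, f\!\Big(\sum_{j=0}^{k} \beta_j t_{n+j},\ \sum_{j=0}^{k} \beta_j y_{n+j}\Big).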

  18. Deadlock-free genetic scheduling algorithm for automated manufacturing systems based on deadlock control policy.

    PubMed

    Xing, KeYi; Han, LiBin; Zhou, MengChu; Wang, Feng

    2012-06-01

    Deadlock-free control and scheduling are vital for optimizing the performance of automated manufacturing systems (AMSs) with shared resources and route flexibility. Based on the Petri net models of AMSs, this paper embeds the optimal deadlock avoidance policy into the genetic algorithm and develops a novel deadlock-free genetic scheduling algorithm for AMSs. A possible solution of the scheduling problem is coded as a chromosome representation that is a permutation with repetition of parts. By using the one-step look-ahead method in the optimal deadlock control policy, the feasibility of a chromosome is checked, and infeasible chromosomes are amended into feasible ones, which can be easily decoded into a feasible deadlock-free schedule. The chromosome representation and polynomial complexity of checking and amending procedures together support the cooperative aspect of genetic search for scheduling problems strongly.

  19. A generalized theory of chromatography and multistep liquid extraction

    NASA Astrophysics Data System (ADS)

    Chizhkov, V. P.; Boitsov, V. N.

    2017-03-01

    A generalized theory of chromatography and multistep liquid extraction is developed. The principles of highly efficient processes for fine preparative separation of binary mixture components on a fixed sorbent layer are discussed.

  20. Improved perovskite phototransistor prepared using multi-step annealing method

    NASA Astrophysics Data System (ADS)

    Cao, Mingxuan; Zhang, Yating; Yu, Yu; Yao, Jianquan

    2018-02-01

    Organic-inorganic hybrid perovskites with good intrinsic physical properties have received substantial interest for solar cell and optoelectronic applications. However, perovskite films often suffer from low carrier mobility due to structural imperfections, including sharp grain boundaries and pinholes, which restrict their device performance and application potential. Here we demonstrate a straightforward strategy based on a multi-step annealing process to improve the performance of a perovskite photodetector. Annealing temperature and duration greatly affect the surface morphology and optoelectrical properties of the perovskites, which in turn determine the performance of the phototransistor. Perovskite films treated with the multi-step annealing method tend to be highly uniform, well crystallized, and of high surface coverage, and they exhibit stronger ultraviolet-visible absorption and photoluminescence compared with perovskites prepared by the conventional one-step annealing process. The field-effect mobility of the perovskite photodetector prepared by the one-step direct annealing method is 0.121 (0.062) cm2 V-1 s-1 for holes (electrons), which increases to 1.01 (0.54) cm2 V-1 s-1 with the multi-step slow annealing method. Moreover, the perovskite phototransistors exhibit a fast photoresponse speed of 78 μs. In general, this work focuses on the influence of annealing methods on perovskite phototransistors rather than on optimizing their parameters. These findings show that multi-step annealing is a feasible route to high-performance perovskite photodetectors.

  1. Comparing Multi-Step IMAC and Multi-Step TiO2 Methods for Phosphopeptide Enrichment

    PubMed Central

    Yue, Xiaoshan; Schunter, Alissa; Hummon, Amanda B.

    2016-01-01

    Phosphopeptide enrichment from complicated peptide mixtures is an essential step for mass spectrometry-based phosphoproteomic studies to reduce sample complexity and ionization suppression effects. Typical methods for enriching phosphopeptides include immobilized metal affinity chromatography (IMAC) or titanium dioxide (TiO2) beads, which have selective affinity and can interact with phosphopeptides. In this study, the IMAC enrichment method was compared with the TiO2 enrichment method, using a multi-step enrichment strategy from whole cell lysate, to evaluate their abilities to enrich for different types of phosphopeptides. The peptide-to-bead ratios were optimized for both IMAC and TiO2 beads. Both IMAC and TiO2 enrichments were performed for three rounds to enable the maximum extraction of phosphopeptides from the whole cell lysates. The phosphopeptides that are unique to IMAC enrichment, unique to TiO2 enrichment, and identified with both IMAC and TiO2 enrichment were analyzed for their characteristics. Both IMAC and TiO2 enriched similar amounts of phosphopeptides with comparable enrichment efficiency. However, phosphopeptides that are unique to IMAC enrichment showed a higher percentage of multi-phosphopeptides, as well as a higher percentage of longer, basic, and hydrophilic phosphopeptides. Also, the IMAC and TiO2 procedures clearly enriched phosphopeptides with different motifs. Finally, further enriching with two rounds of TiO2 from the supernatant after IMAC enrichment, or further enriching with two rounds of IMAC from the supernatant after TiO2 enrichment, does not fully recover the phosphopeptides that are not identified with the corresponding multi-step enrichment. PMID:26237447

  2. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that are guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
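
    The master-slave scheme described above farms out only the fitness evaluations; the master keeps the population and applies the genetic operators. A compact Python sketch of that division of labor using a multiprocessing pool as the "slave" nodes; the bit-counting fitness function is a stand-in for an expensive circuit evaluation:

      import random
      from multiprocessing import Pool

      def fitness(genome):
          """Stand-in for an expensive evaluation; here it just counts 1-bits."""
          return sum(genome)

      def breed(parents, genome_len, pop_size, mutation_rate=0.02):
          """Master-side operators: crossover of selected parents plus random bit-flip mutation."""
          children = []
          while len(children) < pop_size:
              a, b = random.sample(parents, 2)
              cut = random.randrange(1, genome_len)
              child = a[:cut] + b[cut:]
              children.append([g ^ 1 if random.random() < mutation_rate else g for g in child])
          return children

      if __name__ == "__main__":
          genome_len, pop_size = 64, 40
          population = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
          with Pool() as pool:                                   # the "slave" worker processes
              for _ in range(30):
                  scores = pool.map(fitness, population)         # farm out fitness evaluations
                  ranked = [g for _, g in sorted(zip(scores, population), reverse=True)]
                  population = breed(ranked[: pop_size // 4], genome_len, pop_size)
          print("best fitness:", max(map(fitness, population)))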

  3. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

    This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. Evolution of the mechanical properties as well as the microstructural features, such as twins and textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of austenite. For the SIM, an Olson-Cohen-type equation was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured and weakened with increasing deformation.
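    For reference, the Olson-Cohen relation commonly used to describe strain-induced martensite formation has the form below; the parameters are material- and temperature-dependent, and their fitted values are not reproduced from the paper.

```latex
% Olson-Cohen relation: volume fraction of strain-induced martensite
% f_{\alpha'} as a function of true strain \varepsilon. Here \alpha
% governs shear-band formation, \beta the probability that a shear-band
% intersection nucleates martensite, and n is a geometric exponent.
f_{\alpha'}(\varepsilon) \;=\; 1 - \exp\!\left\{ -\beta \left[\, 1 - \exp(-\alpha\,\varepsilon) \,\right]^{n} \right\}
```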

  4. Automation and validation of DNA-banking systems.

    PubMed

    Thornton, Melissa; Gladwin, Amanda; Payne, Robin; Moore, Rachael; Cresswell, Carl; McKechnie, Douglas; Kelly, Steve; March, Ruth

    2005-10-15

    DNA banking is one of the central capabilities on which modern genetic research rests. The DNA-banking system plays an essential role in the flow of genetic data from patients and genetics researchers to the application of genetic research in the clinic. Until relatively recently, large collections of DNA samples were not common in human genetics. Now, collections of hundreds of thousands of samples are common in academic institutions and private companies. Automation of DNA banking can dramatically increase throughput, eliminate manual errors and improve the productivity of genetics research. An increased emphasis on pharmacogenetics and personalized medicine has highlighted the need for genetics laboratories to operate within the principles of a recognized quality system such as good laboratory practice (GLP). Automated systems are suitable for such laboratories but require a level of validation that might be unfamiliar to many genetics researchers. In this article, we use the AstraZeneca automated DNA archive and reformatting system (DART) as a case study of how such a system can be successfully developed and validated within the principles of GLP.

  5. Stability with large step sizes for multistep discretizations of stiff ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Majda, George

    1986-01-01

    One-leg and multistep discretizations of variable-coefficient linear systems of ODEs having both slow and fast time scales are investigated analytically. The stability properties of these discretizations are obtained independent of ODE stiffness and compared. The results of numerical computations are presented in tables, and it is shown that for large step sizes the stability of one-leg methods is better than that of the corresponding linear multistep methods.
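    For context, a generic k-step linear multistep method and its one-leg counterpart for y' = f(t, y) can be written as follows (step size h, with the usual coefficient normalization); these are the general forms, not the particular schemes analyzed in the report.

```latex
% Linear multistep form: f is evaluated separately at each grid point.
\sum_{j=0}^{k} \alpha_j\, y_{n+j} \;=\; h \sum_{j=0}^{k} \beta_j\, f\!\left(t_{n+j},\, y_{n+j}\right)
% One-leg form: f is evaluated once, at coefficient-weighted averages
% of the arguments (assuming \sum_j \beta_j = 1 for consistency).
\sum_{j=0}^{k} \alpha_j\, y_{n+j} \;=\; h\, f\!\left( \sum_{j=0}^{k} \beta_j\, t_{n+j},\; \sum_{j=0}^{k} \beta_j\, y_{n+j} \right)
```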

  6. Multi-step routes of capuchin monkeys in a laser pointer traveling salesman task.

    PubMed

    Howard, Allison M; Fragaszy, Dorothy M

    2014-09-01

    Prior studies have claimed that nonhuman primates plan their routes multiple steps in advance. However, a recent reexamination of multi-step route planning in nonhuman primates indicated that there is no evidence for planning more than one step ahead. We tested multi-step route planning in capuchin monkeys using a pointing device to "travel" to distal targets while stationary. This device enabled us to determine whether capuchins distinguish the spatial relationship between goals and themselves and spatial relationships between goals and the laser dot, allocentrically. In Experiment 1, two subjects were presented with identical food items in Near-Far (one item nearer to subject) and Equidistant (both items equidistant from subject) conditions with a laser dot visible between the items. Subjects moved the laser dot to the items using a joystick. In the Near-Far condition, one subject demonstrated a bias for items closest to self but the other subject chose efficiently. In the second experiment, subjects retrieved three food items in similar Near-Far and Equidistant arrangements. Both subjects preferred food items nearest the laser dot and showed no evidence of multi-step route planning. We conclude that these capuchins do not make choices on the basis of multi-step look ahead strategies. © 2014 Wiley Periodicals, Inc.

  7. Multistep-Ahead Air Passengers Traffic Prediction with Hybrid ARIMA-SVMs Models

    PubMed Central

    Ming, Wei; Xiong, Tao

    2014-01-01

    Hybrid ARIMA-SVMs prediction models, which combine the respective strengths of ARIMA and SVMs in linear and nonlinear modeling, have recently been established. Building on these hybrid models, this study extends them to multistep-ahead prediction of air passenger traffic using the two most commonly used multistep-ahead prediction strategies, the iterated strategy and the direct strategy. Additionally, the effectiveness of data preprocessing approaches, such as deseasonalization and detrending, is investigated for both strategies. Real data sets comprising the monthly series of four selected airlines were collected to verify the effectiveness of the proposed approach. Empirical results demonstrate that the direct strategy performs better for long-term prediction, whereas the iterated strategy performs better for short-term prediction. Furthermore, both deseasonalization and detrending significantly improve the prediction accuracy for both strategies, indicating the necessity of data preprocessing. As such, this study provides a reference for planners in the air transportation industry on how to tackle multistep-ahead prediction tasks with either strategy. PMID:24723814
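    The difference between the two strategies can be sketched as follows; a plain linear autoregression stands in for the hybrid ARIMA-SVMs model, and the data, lag order, and horizon are illustrative assumptions only.

```python
# Sketch of iterated vs. direct multistep-ahead forecasting with a simple
# linear autoregressive stand-in for the hybrid ARIMA-SVMs model.
import numpy as np
from numpy.linalg import lstsq

def make_design(series, lags, horizon=1):
    # Rows: [y_{t-lags+1}, ..., y_t] -> target y_{t+horizon}
    X, y = [], []
    for t in range(lags - 1, len(series) - horizon):
        X.append(series[t - lags + 1 : t + 1])
        y.append(series[t + horizon])
    return np.array(X), np.array(y)

def fit_linear(X, y):
    Xb = np.hstack([X, np.ones((len(X), 1))])       # add intercept column
    coef, *_ = lstsq(Xb, y, rcond=None)
    return coef

def iterated_forecast(series, lags, H):
    # One one-step model applied recursively; predictions feed back in.
    coef = fit_linear(*make_design(series, lags, horizon=1))
    window, preds = list(series[-lags:]), []
    for _ in range(H):
        yhat = np.dot(coef[:-1], window) + coef[-1]
        preds.append(yhat)
        window = window[1:] + [yhat]
    return preds

def direct_forecast(series, lags, H):
    # One dedicated model per horizon h = 1..H; no feedback of predictions.
    window, preds = np.array(series[-lags:]), []
    for h in range(1, H + 1):
        coef = fit_linear(*make_design(series, lags, horizon=h))
        preds.append(np.dot(coef[:-1], window) + coef[-1])
    return preds

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = np.sin(np.arange(200) / 6.0) + 0.1 * rng.standard_normal(200)
    print("iterated:", np.round(iterated_forecast(y, lags=12, H=6), 3))
    print("direct:  ", np.round(direct_forecast(y, lags=12, H=6), 3))
```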

  8. Multistep Methods for Integrating the Solar System

    DTIC Science & Technology

    1988-07-01

    Technical Report 1055: Multistep Methods for Integrating the Solar System. Panayotis A. Skordos, MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA 02139. The report describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology, supported by the Advanced Research Projects Agency.

  9. Auto-FPFA: An Automated Microscope for Characterizing Genetically Encoded Biosensors.

    PubMed

    Nguyen, Tuan A; Puhl, Henry L; Pham, An K; Vogel, Steven S

    2018-05-09

    Genetically encoded biosensors function by linking structural change in a protein construct, typically tagged with one or more fluorescent proteins, to changes in a biological parameter of interest (such as calcium concentration, pH, phosphorylation-state, etc.). Typically, the structural change triggered by alterations in the bio-parameter is monitored as a change in either fluorescence intensity or lifetime. Potentially, other photo-physical properties of fluorophores, such as fluorescence anisotropy, molecular brightness, concentration, and lateral and/or rotational diffusion could also be used. Furthermore, while it is likely that multiple photo-physical attributes of a biosensor might be altered as a function of the bio-parameter, standard measurements monitor only a single photo-physical trait. This limits how biosensors are designed, as well as the accuracy and interpretation of biosensor measurements. Here we describe the design and construction of an automated multimodal microscope. This system can autonomously analyze 96 samples in a micro-titer dish and for each sample simultaneously measure intensity (photon count), fluorescence lifetime, time-resolved anisotropy, molecular brightness, lateral diffusion time, and concentration. We characterize the accuracy and precision of this instrument, and then demonstrate its utility by characterizing three types of genetically encoded calcium sensors as well as a negative control.

  10. Genetic validation of bipolar disorder identified by automated phenotyping using electronic health records.

    PubMed

    Chen, Chia-Yen; Lee, Phil H; Castro, Victor M; Minnier, Jessica; Charney, Alexander W; Stahl, Eli A; Ruderfer, Douglas M; Murphy, Shawn N; Gainer, Vivian; Cai, Tianxi; Jones, Ian; Pato, Carlos N; Pato, Michele T; Landén, Mikael; Sklar, Pamela; Perlis, Roy H; Smoller, Jordan W

    2018-04-18

    Bipolar disorder (BD) is a heritable mood disorder characterized by episodes of mania and depression. Although genomewide association studies (GWAS) have successfully identified genetic loci contributing to BD risk, sample size has become a rate-limiting obstacle to genetic discovery. Electronic health records (EHRs) represent a vast but relatively untapped resource for high-throughput phenotyping. As part of the International Cohort Collection for Bipolar Disorder (ICCBD), we previously validated automated EHR-based phenotyping algorithms for BD against in-person diagnostic interviews (Castro et al. Am J Psychiatry 172:363-372, 2015). Here, we establish the genetic validity of these phenotypes by determining their genetic correlation with traditionally ascertained samples. Case and control algorithms were derived from structured and narrative text in the Partners Healthcare system comprising more than 4.6 million patients over 20 years. Genomewide genotype data for 3330 BD cases and 3952 controls of European ancestry were used to estimate SNP-based heritability (h²g) and genetic correlation (rg) between EHR-based phenotype definitions and traditionally ascertained BD cases in GWAS by the ICCBD and Psychiatric Genomics Consortium (PGC) using LD score regression. We evaluated BD cases identified using 4 EHR-based algorithms: an NLP-based algorithm (95-NLP) and three rule-based algorithms using codified EHR with decreasing levels of stringency: "coded-strict", "coded-broad", and "coded-broad based on a single clinical encounter" (coded-broad-SV). The analytic sample comprised 862 95-NLP, 1968 coded-strict, 2581 coded-broad, 408 coded-broad-SV BD cases, and 3952 controls. The estimated h²g were 0.24 (p = 0.015), 0.09 (p = 0.064), 0.13 (p = 0.003), 0.00 (p = 0.591) for 95-NLP, coded-strict, coded-broad and coded-broad-SV BD, respectively. The h²g for all EHR-based cases combined except coded-broad-SV (excluded due to 0 h²g) was 0.12 (p

  11. Blastocyst microinjection automation.

    PubMed

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cell delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  12. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
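    As a minimal sketch of what the end product of such a workflow looks like, the snippet below reads a generated SBML file with the python-libsbml bindings and lists its species and reactions; the filename is hypothetical, and the snippet does not reproduce the SBOLDesigner/Virtual Parts Repository/iBioSim pipeline itself.

```python
# Minimal inspection of an SBML model such as the one this workflow
# generates (requires the python-libsbml package; "design_model.xml"
# is a hypothetical output filename).
import libsbml

doc = libsbml.readSBML("design_model.xml")
if doc.getNumErrors() > 0:
    doc.printErrors()                 # report any read/validation errors
model = doc.getModel()
if model is not None:
    print("species:")
    for i in range(model.getNumSpecies()):
        print("  ", model.getSpecies(i).getId())
    print("reactions:")
    for i in range(model.getNumReactions()):
        print("  ", model.getReaction(i).getId())
```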

  13. Automated reagent-dispensing system for microfluidic cell biology assays.

    PubMed

    Ly, Jimmy; Masterman-Smith, Michael; Ramakrishnan, Ravichandran; Sun, Jing; Kokubun, Brent; van Dam, R Michael

    2013-12-01

    Microscale systems that enable measurements of oncological phenomena at the single-cell level have a great capacity to improve therapeutic strategies and diagnostics. Such measurements can reveal unprecedented insights into cellular heterogeneity and its implications into the progression and treatment of complicated cellular disease processes such as those found in cancer. We describe a novel fluid-delivery platform to interface with low-cost microfluidic chips containing arrays of microchambers. Using multiple pairs of needles to aspirate and dispense reagents, the platform enables automated coating of chambers, loading of cells, and treatment with growth media or other agents (e.g., drugs, fixatives, membrane permeabilizers, washes, stains, etc.). The chips can be quantitatively assayed using standard fluorescence-based immunocytochemistry, microscopy, and image analysis tools, to determine, for example, drug response based on differences in protein expression and/or activation of cellular targets on an individual-cell level. In general, automation of fluid and cell handling increases repeatability, eliminates human error, and enables increased throughput, especially for sophisticated, multistep assays such as multiparameter quantitative immunocytochemistry. We report the design of the automated platform and compare several aspects of its performance to manually-loaded microfluidic chips.

  14. Multistep integration formulas for the numerical integration of the satellite problem

    NASA Technical Reports Server (NTRS)

    Lundberg, J. B.; Tapley, B. D.

    1981-01-01

    The use of two Class 2/fixed mesh/fixed order/multistep integration packages of the PECE type for the numerical integration of the second order, nonlinear, ordinary differential equation of the satellite orbit problem is described. These two methods are referred to as the general and the second sum formulations. The derivation of the basic equations which characterize each formulation and the role of the basic equations in the PECE algorithm are discussed. Possible starting procedures are examined which may be used to supply the initial set of values required by the fixed mesh/multistep integrators. The results of the general and second sum integrators are compared to the results of various fixed step and variable step integrators.
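    A generic fixed-mesh PECE step can be sketched as below using a second-order Adams-Bashforth predictor with an Adams-Moulton (trapezoidal) corrector; this is an illustrative stand-in, not the report's general or second-sum formulations, and the harmonic-oscillator test problem is a placeholder for the satellite equations.

```python
# Generic fixed-step PECE (Predict-Evaluate-Correct-Evaluate) integrator
# using a 2nd-order Adams-Bashforth predictor and an Adams-Moulton
# (trapezoidal) corrector; illustrative only.
import numpy as np

def pece_ab2_am2(f, t0, y0, h, n_steps):
    t, y = [t0], [np.asarray(y0, dtype=float)]
    fvals = [f(t0, y[0])]
    # Start-up: one explicit Euler step supplies the second back value.
    y.append(y[0] + h * fvals[0]); t.append(t0 + h); fvals.append(f(t[1], y[1]))
    for n in range(1, n_steps):
        # Predict (Adams-Bashforth 2), then Evaluate
        yp = y[n] + h / 2.0 * (3.0 * fvals[n] - fvals[n - 1])
        fp = f(t[n] + h, yp)
        # Correct (Adams-Moulton 2 / trapezoidal), then Evaluate
        yc = y[n] + h / 2.0 * (fvals[n] + fp)
        t.append(t[n] + h); y.append(yc); fvals.append(f(t[-1], yc))
    return np.array(t), np.array(y)

if __name__ == "__main__":
    # Simple harmonic oscillator y'' = -y written as a first-order system.
    f = lambda t, y: np.array([y[1], -y[0]])
    ts, ys = pece_ab2_am2(f, 0.0, [1.0, 0.0], h=0.01, n_steps=1000)
    print("y(10) approx", ys[-1][0], " exact:", np.cos(ts[-1]))
```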

  15. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  16. Multistep estimators of the between-study variance: The relationship with the Paule-Mandel estimator.

    PubMed

    van Aert, Robbie C M; Jackson, Dan

    2018-04-26

    A wide variety of estimators of the between-study variance are available in random-effects meta-analysis. Many, but not all, of these estimators are based on the method of moments. The DerSimonian-Laird estimator is widely used in applications, but the Paule-Mandel estimator is an alternative that is now recommended. Recently, DerSimonian and Kacker have developed two-step moment-based estimators of the between-study variance. We extend these two-step estimators so that multiple (more than two) steps are used. We establish the surprising result that the multistep estimator tends towards the Paule-Mandel estimator as the number of steps becomes large. Hence, the iterative scheme underlying our new multistep estimator provides a hitherto unknown relationship between two-step estimators and Paule-Mandel estimator. Our analysis suggests that two-step estimators are not necessarily distinct estimators in their own right; instead, they are quantities that are closely related to the usual iterative scheme that is used to calculate the Paule-Mandel estimate. The relationship that we establish between the multistep and Paule-Mandel estimator is another justification for the use of the latter estimator. Two-step and multistep estimators are perhaps best conceptualized as approximate Paule-Mandel estimators. © 2018 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
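    The iterative scheme can be sketched as below: the DerSimonian-Laird moment step is simply re-applied with weights updated from the previous estimate, and as the number of steps grows the estimate settles at the Paule-Mandel value; the effect sizes and within-study variances are toy numbers, not data from the paper.

```python
# Sketch of the multistep moment estimator of the between-study variance
# tau^2: the moment step is re-applied with updated weights, and the
# estimate approaches the Paule-Mandel value as the number of steps grows.
import numpy as np

def moment_step(y, v, tau2_prev):
    # Generalized method-of-moments update with weights a_i = 1/(v_i + tau2_prev).
    a = 1.0 / (v + tau2_prev)
    ybar = np.sum(a * y) / np.sum(a)
    num = np.sum(a * (y - ybar) ** 2) - (np.sum(a * v) - np.sum(a**2 * v) / np.sum(a))
    den = np.sum(a) - np.sum(a**2) / np.sum(a)
    return max(0.0, num / den)          # truncate negative estimates at zero

def multistep_tau2(y, v, steps):
    tau2 = 0.0                          # step 1 with tau2 = 0 is DerSimonian-Laird
    for _ in range(steps):
        tau2 = moment_step(y, v, tau2)
    return tau2

if __name__ == "__main__":
    y = np.array([0.10, 0.30, 0.35, 0.65, 0.45, 0.15])      # toy effect sizes
    v = np.array([0.030, 0.020, 0.050, 0.010, 0.040, 0.025])  # toy within-study variances
    for k in (1, 2, 5, 50):
        print(f"{k:3d}-step estimate of tau^2: {multistep_tau2(y, v, k):.5f}")
```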

  17. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices.

    PubMed

    Toley, Bhushan J; Wang, Jessica A; Gupta, Mayuri; Buser, Joshua R; Lafleur, Lisa K; Lutz, Barry R; Fu, Elain; Yager, Paul

    2015-03-21

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically after a) a certain period of time, or b) the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate and the time required to switch from one state to another ranges from ~5 s for short to ~50 s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods - both methods can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device.

  18. INDES User's guide multistep input design with nonlinear rotorcraft modeling

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The INDES computer program, a multistep input design program used as part of a data processing technique for rotorcraft systems identification, is described. Flight test inputs based on INDES improve the accuracy of parameter estimates. The input design algorithm, program input, and program output are presented.

  19. Significantly enhanced memory effect in metallic glass by multistep training

    NASA Astrophysics Data System (ADS)

    Li, M. X.; Luo, P.; Sun, Y. T.; Wen, P.; Bai, H. Y.; Liu, Y. H.; Wang, W. H.

    2017-11-01

    The state of metastable equilibrium glass can carry an imprint of the past and exhibit memory effect. As a hallmark of glassy dynamics, memory effect can affect glassy behavior as it evolves further over time. Even though the physical picture of the memory effect has been well studied, it is unclear whether a glass can recall as many pieces of information as possible, and if so, how the glass will accordingly behave. We report that by fractionizing the temperature interval, inserting multistep aging protocols, and optimizing the time of each temperature step, i.e., by imposing a multistep "training" on a prototypical Pd40Ni10Cu30P20 metallic glass, the memory of the trained glass can be significantly strengthened, marked by a pronounced increase in potential energy. These findings provide a new guide for regulating the energy state of glass by enhancing the nonequilibrium behaviors of the memory effect and offer an opportunity to develop a clearer physical picture of glassy dynamics.

  20. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    PubMed

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and is continually under research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages compared with the two currently dominating strategies, the iterated and the direct strategies. Building on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons from the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.

  1. A Multistep Synthesis for an Advanced Undergraduate Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Chang Ji; Peters, Dennis G.

    2006-01-01

    Multistep syntheses are often important components of the undergraduate organic laboratory experience and a three-step synthesis of 5-(2-sulfhydrylethyl) salicylaldehyde was described. The experiment is useful as a special project for an advanced undergraduate organic chemistry laboratory course and offers opportunities for students to master a…

  2. Induction of Pectinase Hyper Production by Multistep Mutagenesis Using a Fungal Isolate--Aspergillus flavipes.

    PubMed

    Akbar, Sabika; Prasuna, R Gyana; Khanam, Rasheeda

    2014-04-01

    Aspergillus flavipes, a slow-growing, pectinase-producing ascomycete, was isolated from soil and was identified and characterised in previous preliminary studies. Optimisation studies revealed that citrus peel-groundnut oil cake [CG] production medium is the best medium for producing high levels of pectinase, up to 39 U/ml, using the wild strain of A. flavipes. The aim of this project was to improve this isolated strain for enhanced pectinase production using a multistep mutagenesis procedure. For this, the wild strain of A. flavipes was treated with both physical (UV irradiation) and chemical [colchicine, ethidium bromide, H2O2] mutagens to obtain first-generation mutants. The obtained mutants were assayed and differentiated based on pectinase productivity. The better pectinase-producing strains were further subjected to multistep mutagenesis to attain mutant stability. The goal of the project was achieved by obtaining the best pectinase-secreting mutant, UV80, which produced 45 U/ml, compared with the wild strain and sister mutants. This was confirmed by quantitative analysis of third-generation mutants obtained after multistep mutagenesis.

  3. Direct observation of multistep energy transfer in LHCII with fifth-order 3D electronic spectroscopy.

    PubMed

    Zhang, Zhengyang; Lambrev, Petar H; Wells, Kym L; Garab, Győző; Tan, Howe-Siang

    2015-07-31

    During photosynthesis, sunlight is efficiently captured by light-harvesting complexes, and the excitation energy is then funneled towards the reaction centre. These photosynthetic excitation energy transfer (EET) pathways are complex and proceed in a multistep fashion. Ultrafast two-dimensional electronic spectroscopy (2DES) is an important tool to study EET processes in photosynthetic complexes. However, the multistep EET processes can only be indirectly inferred by correlating different cross peaks from a series of 2DES spectra. Here we directly observe multistep EET processes in LHCII using ultrafast fifth-order three-dimensional electronic spectroscopy (3DES). We measure cross peaks in 3DES spectra of LHCII that directly indicate energy transfer from excitons in the chlorophyll b (Chl b) manifold to the low-energy level chlorophyll a (Chl a) via mid-level Chl a energy states. This new spectroscopic technique allows scientists to move a step towards mapping the complete complex EET processes in photosynthetic systems.

  4. Attention and Multistep Problem Solving in 24-Month-Old Children

    ERIC Educational Resources Information Center

    Carrico, Renee L.

    2013-01-01

    The current study examined the role of increased attentional load in 24 month-old children's multistep problem-solving behavior. Children solved an object-based nonspatial working-memory search task, to which a motor component of varying difficulty was added. Significant disruptions in search performance were observed with the introduction of the…

  5. Description of bioremediation of soils using the model of a multistep system of microorganisms

    NASA Astrophysics Data System (ADS)

    Lubysheva, A. I.; Potashev, K. A.; Sofinskaya, O. A.

    2018-01-01

    The paper deals with the development of a mathematical model describing the interaction of a multi-step system of microorganisms in soil polluted with oil products. Each step in this system feeds on the products of the vital activity of the previous step. Six different models of the multi-step system are considered. The model coefficients were determined by minimizing the residual between the calculated and experimental data using an original algorithm based on the Levenberg-Marquardt method combined with the Monte Carlo method for finding the initial approximation.
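    The fitting strategy described above can be sketched generically as follows, combining SciPy's Levenberg-Marquardt least-squares solver with Monte Carlo sampling of the initial guess; the two-parameter decay model and the synthetic data are placeholders, not the authors' multi-step microbial model.

```python
# Generic sketch: fit model coefficients by least squares using the
# Levenberg-Marquardt solver, with Monte Carlo sampling of the initial
# guess to avoid poor local minima. Model and data are placeholders.
import numpy as np
from scipy.optimize import least_squares

def model(params, t):
    a, k = params
    return a * np.exp(-k * t)       # stand-in for the multi-step microbial model

def residuals(params, t, observed):
    return model(params, t) - observed

def fit_with_monte_carlo_starts(t, observed, n_starts=50, seed=0):
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        guess = rng.uniform([0.1, 0.01], [10.0, 2.0])   # random initial approximation
        sol = least_squares(residuals, guess, args=(t, observed), method="lm")
        if best is None or sol.cost < best.cost:
            best = sol
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 10.0, 40)
    observed = 3.0 * np.exp(-0.4 * t) + 0.05 * rng.standard_normal(t.size)
    fit = fit_with_monte_carlo_starts(t, observed)
    print("fitted (a, k):", np.round(fit.x, 3), " residual cost:", round(fit.cost, 4))
```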

  6. Automated segmentation and reconstruction of patient-specific cardiac anatomy and pathology from in vivo MRI*

    NASA Astrophysics Data System (ADS)

    Ringenberg, Jordan; Deo, Makarand; Devabhaktuni, Vijay; Filgueiras-Rama, David; Pizarro, Gonzalo; Ibañez, Borja; Berenfeld, Omer; Boyers, Pamela; Gold, Jeffrey

    2012-12-01

    This paper presents an automated method to segment left ventricle (LV) tissues from functional and delayed-enhancement (DE) cardiac magnetic resonance imaging (MRI) scans using a sequential multi-step approach. First, a region of interest (ROI) is computed to create a subvolume around the LV using morphological operations and image arithmetic. From the subvolume, the myocardial contours are automatically delineated using difference of Gaussians (DoG) filters and GSV snakes. These contours are used as a mask to identify pathological tissues, such as fibrosis or scar, within the DE-MRI. The presented automated technique is able to accurately delineate the myocardium and identify the pathological tissue in patient sets. The results were validated by two expert cardiologists, and in one set the automated results are quantitatively and qualitatively compared with expert manual delineation. Furthermore, the method is patient-specific, performed on an entire patient MRI series. Thus, in addition to providing a quick analysis of individual MRI scans, the fully automated segmentation method is used for effectively tagging regions in order to reconstruct computerized patient-specific 3D cardiac models. These models can then be used in electrophysiological studies and surgical strategy planning.
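    The difference-of-Gaussians step of such a pipeline can be illustrated as below; the filter widths and the synthetic ring-shaped test image are assumptions for demonstration and do not reproduce the full ROI/snake segmentation chain.

```python
# Minimal difference-of-Gaussians (DoG) band-pass step of the kind used
# to emphasize myocardial boundaries before contour extraction; the
# sigmas and the synthetic test image are illustrative only.
import numpy as np
from scipy.ndimage import gaussian_filter

def difference_of_gaussians(image, sigma_small=2.0, sigma_large=6.0):
    # Subtracting a heavily blurred copy from a lightly blurred one
    # suppresses both fine noise and slow intensity gradients.
    return gaussian_filter(image, sigma_small) - gaussian_filter(image, sigma_large)

if __name__ == "__main__":
    # Synthetic "short-axis slice": a bright ring (myocardium-like) plus noise.
    yy, xx = np.mgrid[0:128, 0:128]
    r = np.hypot(yy - 64, xx - 64)
    image = np.exp(-((r - 30) ** 2) / 50.0) + 0.1 * np.random.default_rng(0).standard_normal((128, 128))
    dog = difference_of_gaussians(image)
    print("DoG response range:", round(dog.min(), 3), "to", round(dog.max(), 3))
```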

  7. A versatile valving toolkit for automating fluidic operations in paper microfluidic devices

    PubMed Central

    Toley, Bhushan J.; Wang, Jessica A.; Gupta, Mayuri; Buser, Joshua R.; Lafleur, Lisa K.; Lutz, Barry R.; Fu, Elain; Yager, Paul

    2015-01-01

    Failure to utilize valving and automation techniques has restricted the complexity of fluidic operations that can be performed in paper microfluidic devices. We developed a toolkit of paper microfluidic valves and methods for automatic valve actuation using movable paper strips and fluid-triggered expanding elements. To the best of our knowledge, this is the first functional demonstration of this valving strategy in paper microfluidics. After introduction of fluids on devices, valves can actuate automatically a) after a certain period of time, or b) after the passage of a certain volume of fluid. Timing of valve actuation can be tuned with greater than 8.5% accuracy by changing lengths of timing wicks, and we present timed on-valves, off-valves, and diversion (channel-switching) valves. The actuators require ~30 μl fluid to actuate and the time required to switch from one state to another ranges from ~5 s for short to ~50s for longer wicks. For volume-metered actuation, the size of a metering pad can be adjusted to tune actuation volume, and we present two methods – both methods can achieve greater than 9% accuracy. Finally, we demonstrate the use of these valves in a device that conducts a multi-step assay for the detection of the malaria protein PfHRP2. Although slightly more complex than devices that do not have moving parts, this valving and automation toolkit considerably expands the capabilities of paper microfluidic devices. Components of this toolkit can be used to conduct arbitrarily complex, multi-step fluidic operations on paper-based devices, as demonstrated in the malaria assay device. PMID:25606810

  8. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    PubMed

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was setup in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix of 96 samples within 3.5h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler leading to the reduction of manual work, and increased quality and throughput. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.

  9. Hard-X-Ray-Induced Multistep Ultrafast Dissociation

    NASA Astrophysics Data System (ADS)

    Travnikova, Oksana; Marchenko, Tatiana; Goldsztejn, Gildas; Jänkälä, Kari; Sisourat, Nicolas; Carniato, Stéphane; Guillemin, Renaud; Journel, Loïc; Céolin, Denis; Püttner, Ralph; Iwayama, Hiroshi; Shigemasa, Eiji; Piancastelli, Maria Novella; Simon, Marc

    2016-05-01

    Creation of deep core holes with very short (τ ≤1 fs ) lifetimes triggers a chain of relaxation events leading to extensive nuclear dynamics on a few-femtosecond time scale. Here we demonstrate a general multistep ultrafast dissociation on an example of HCl following Cl 1 s →σ* excitation. Intermediate states with one or multiple holes in the shallower core electron shells are generated in the course of the decay cascades. The repulsive character and large gradients of the potential energy surfaces of these intermediates enable ultrafast fragmentation after the absorption of a hard x-ray photon.

  10. Contaminant source and release history identification in groundwater: A multi-step approach

    NASA Astrophysics Data System (ADS)

    Gzyl, G.; Zanini, A.; Frączek, R.; Kura, K.

    2014-02-01

    The paper presents a new multi-step approach aiming at source identification and release history estimation. The new approach consists of three steps: performing integral pumping tests, identifying sources, and recovering the release history by means of a geostatistical approach. The present paper shows the results obtained from the application of the approach within a complex case study in Poland in which several areal sources were identified. The investigated site is situated in the vicinity of a former chemical plant in southern Poland in the city of Jaworzno in the valley of the Wąwolnica River; the plant has been in operation since the First World War producing various chemicals. From an environmental point of view the most relevant activity was the production of pesticides, especially lindane. The application of the multi-step approach enabled a significant increase in the knowledge of contamination at the site. Some suspected contamination sources have been proven to have a minor effect on the overall contamination. Other suspected sources have been proven to have key significance. Some areas not taken into consideration previously have now been identified as key sources. The method also enabled estimation of the magnitude of the sources, and a list of priority reclamation actions will be drawn up as a result. The multi-step approach has proven to be effective and may be applied to other complicated contamination cases. Moreover, the paper shows the capability of the geostatistical approach to manage a complex real case study.

  11. Shadowing effects on multi-step Langmuir probe array on HL-2A tokamak

    NASA Astrophysics Data System (ADS)

    Ke, R.; Xu, M.; Nie, L.; Gao, Z.; Wu, Y.; Yuan, B.; Chen, J.; Song, X.; Yan, L.; Duan, X.

    2018-05-01

    Multi-step Langmuir probe arrays have been designed and installed on the HL-2A tokamak [1]–[2] to study the turbulent transport in the edge plasma, especially for the measurement of poloidal momentum flux, Reynolds stress Rs. However, except the probe tips on the top step, all other tips on lower steps are shadowed by graphite skeleton. It is necessary to estimate the shadowing effects on equilibrium and fluctuation measurement. In this paper, comparison of shadowed tips to unshadowed ones is presented. The results show that shadowing can strongly reduce the ion and electron effective collection area. However, its effect is negligible for the turbulence intensity and coherence measurement, confirming that the multi-step LP array is proper for the turbulent transport measurement.

  12. Multistep Synthesis of a Terphenyl Derivative Showcasing the Diels-Alder Reaction

    ERIC Educational Resources Information Center

    Davie, Elizabeth A. Colby

    2015-01-01

    An adaptable multistep synthesis project designed for the culmination of a second-year organic chemistry laboratory course is described. The target compound is a terphenyl derivative that is an intermediate in the synthesis of compounds used in organic light-emitting devices. Students react a conjugated diene with dimethylacetylene dicarboxylate…

  13. A Multistep Synthesis Incorporating a Green Bromination of an Aromatic Ring

    ERIC Educational Resources Information Center

    Cardinal, Pascal; Greer, Brandon; Luong, Horace; Tyagunova, Yevgeniya

    2012-01-01

    Electrophilic aromatic substitution is a fundamental topic taught in the undergraduate organic chemistry curriculum. A multistep synthesis that includes a safer and greener method for the bromination of an aromatic ring than traditional bromination methods is described. This experiment is multifaceted and can be used to teach students about…

  14. Multistep cascade annihilations of dark matter and the Galactic Center excess

    DOE PAGES

    Elor, Gilly; Rodd, Nicholas L.; Slatyer, Tracy R.

    2015-05-26

    If dark matter is embedded in a non-trivial dark sector, it may annihilate and decay to lighter dark-sector states which subsequently decay to the Standard Model. Such scenarios - with annihilation followed by cascading dark-sector decays - can explain the apparent excess GeV gamma-rays identified in the central Milky Way, while evading bounds from dark matter direct detection experiments. Each 'step' in the cascade will modify the observable signatures of dark matter annihilation and decay, shifting the resulting photons and other final state particles to lower energies and broadening their spectra. We explore, in a model-independent way, the effect of multi-step dark-sector cascades on the preferred regions of parameter space to explain the GeV excess. We find that the broadening effects of multi-step cascades can admit final states dominated by particles that would usually produce too sharply peaked photon spectra; in general, if the cascades are hierarchical (each particle decays to substantially lighter particles), the preferred mass range for the dark matter is in all cases 20-150 GeV. Decay chains that have nearly-degenerate steps, where the products are close to half the mass of the progenitor, can admit much higher DM masses. We map out the region of mass/cross-section parameter space where cascades (degenerate, hierarchical or a combination) can fit the signal, for a range of final states. In the current paper, we study multi-step cascades in the context of explaining the GeV excess, but many aspects of our results are general and can be extended to other applications.

  15. Powerful Voter Selection for Making Multistep Delegate Ballot Fair

    NASA Astrophysics Data System (ADS)

    Yamakawa, Hiroshi

    In decisions by majority vote, voters often exercise their right by delegating it to other voters they trust. A multi-step delegate rule allows indirect delegation through more than one voter, which helps each voter find delegates. In this paper, we propose a powerful-voter selection method based on the multi-step delegate rule. The method sequentially selects the voters to whom the most votes are delegated, directly or indirectly. Multi-agent simulations demonstrate that the proposed method can achieve highly fair poll results from a small number of votes. Here, fairness is the accuracy with which the sum of all voters' preferences over the choices is predicted. In the simulations, each voter votes on choices arranged along a one-dimensional preference axis. Acquaintance relationships among voters were generated as a random network, and each voter delegates to some of the acquaintances who have similar preferences. We obtained simulation results for various acquaintance networks and averaged them. First, if each voter has enough acquaintances on average, the proposed method helps predict the sum of all voters' preferences over the choices from a small number of votes. Second, if the number of each voter's acquaintances increases with the number of voters, the prediction accuracy (fairness) obtained from a small number of votes can be kept at an appropriate level.
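    One possible reading of the selection rule, sketched below under simplifying assumptions (a single delegate per voter, a uniform random acquaintance network, and illustrative parameters), counts how many voters reach each voter through delegation chains and polls the top-ranked ones.

```python
# Sketch of "most delegated indirectly" voter selection under simplifying
# assumptions: each voter delegates to one acquaintance with a similar
# preference; the network, preferences, and parameters are illustrative.
import random

def build_population(n, seed=0):
    rng = random.Random(seed)
    pref = [rng.uniform(0.0, 1.0) for _ in range(n)]          # 1-D preference axis
    delegate = []
    for i in range(n):
        acquaintances = [a for a in rng.sample(range(n), 8) if a != i]
        delegate.append(min(acquaintances, key=lambda a: abs(pref[a] - pref[i])))
    return pref, delegate

def indirect_delegation_counts(delegate):
    # counts[v] = number of voters whose delegation chain passes through v.
    counts = [0] * len(delegate)
    for start in range(len(delegate)):
        seen, cur = {start}, delegate[start]
        while cur not in seen:                                # follow chain until a cycle
            counts[cur] += 1
            seen.add(cur)
            cur = delegate[cur]
    return counts

if __name__ == "__main__":
    pref, delegate = build_population(1000)
    counts = indirect_delegation_counts(delegate)
    powerful = sorted(range(len(pref)), key=lambda v: counts[v], reverse=True)[:20]
    poll = sum(pref[v] for v in powerful) / len(powerful)
    truth = sum(pref) / len(pref)
    print(f"poll of 20 powerful voters: {poll:.3f}  full-population mean: {truth:.3f}")
```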

  16. Automated design of infrared digital metamaterials by genetic algorithm

    NASA Astrophysics Data System (ADS)

    Sugino, Yuya; Ishikawa, Atsushi; Hayashi, Yasuhiko; Tsuruta, Kenji

    2017-08-01

    We demonstrate automatic design of infrared (IR) metamaterials using a genetic algorithm (GA) and experimentally characterize their IR properties. To implement the automated design scheme of the metamaterial structures, we adopt a digital metamaterial consisting of 7 × 7 Au nano-pixels with an area of 200 nm × 200 nm, and their placements are coded as binary genes in the GA optimization process. The GA combined with three-dimensional (3D) finite element method (FEM) simulation is developed and applied to automatically construct a digital metamaterial to exhibit pronounced plasmonic resonances at the target IR frequencies. Based on the numerical results, the metamaterials are fabricated on a Si substrate over an area of 1 mm × 1 mm by using an EB lithography, Cr/Au (2/20 nm) depositions, and liftoff process. In the FT-IR measurement, pronounced plasmonic responses of each metamaterial are clearly observed near the targeted frequencies, although the synthesized pixel arrangements of the metamaterials are seemingly random. The corresponding numerical simulations reveal the important resonant behavior of each pixel and their hybridized systems. Our approach is fully computer-aided without artificial manipulation, thus paving the way toward the novel device design for next-generation plasmonic device applications.

  17. Region-based multi-step optic disk and cup segmentation from color fundus image

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Lock, Jane; Manresa, Javier Moreno; Vignarajan, Janardhan; Tay-Kearney, Mei-Ling; Kanagasingam, Yogesan

    2013-02-01

    Retinal optic cup-disk ratio (CDR) is one of the important indicators of glaucomatous neuropathy. In this paper, we propose a novel multi-step 4-quadrant thresholding method for optic disk segmentation and a multi-step temporal-nasal segmenting method for optic cup segmentation based on blood vessel inpainted HSL lightness images and green images. The performance of the proposed methods was evaluated on a group of color fundus images and compared with the manual outlining results from two experts. Dice scores of detected disk and cup regions between the auto and manual results were computed and compared. Vertical CDRs were also compared among the three results. The preliminary experiment has demonstrated the robustness of the method for automatic optic disk and cup segmentation and its potential value for clinical application.
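    The two evaluation measures mentioned above, Dice overlap between automatic and manual masks and vertical cup-to-disk ratio from binary masks, can be computed as in the sketch below; the circular toy masks are placeholders for real segmentations.

```python
# Sketch of the evaluation measures: Dice overlap between two masks and
# the vertical cup-to-disk ratio (CDR) from binary segmentation masks.
import numpy as np

def dice_score(mask_a, mask_b):
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

def vertical_cdr(cup_mask, disk_mask):
    # Ratio of the vertical extents (rows spanned) of cup and disk.
    cup_rows = np.where(cup_mask.any(axis=1))[0]
    disk_rows = np.where(disk_mask.any(axis=1))[0]
    return (cup_rows[-1] - cup_rows[0] + 1) / (disk_rows[-1] - disk_rows[0] + 1)

if __name__ == "__main__":
    yy, xx = np.mgrid[0:200, 0:200]
    disk = (yy - 100) ** 2 + (xx - 100) ** 2 < 70 ** 2     # toy disk region
    cup = (yy - 100) ** 2 + (xx - 100) ** 2 < 35 ** 2      # toy cup region
    print("Dice(disk, disk):", dice_score(disk, disk))
    print("vertical CDR:", round(vertical_cdr(cup, disk), 3))
```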

  18. Multi-step splicing of sphingomyelin synthase linear and circular RNAs.

    PubMed

    Filippenkov, Ivan B; Sudarkina, Olga Yu; Limborska, Svetlana A; Dergunova, Lyudmila V

    2018-05-15

    The SGMS1 gene encodes the enzyme sphingomyelin synthase 1 (SMS1), which is involved in the regulation of lipid metabolism, apoptosis, intracellular vesicular transport and other significant processes. The SGMS1 gene is located on chromosome 10 and has a size of 320 kb. Previously, we showed that dozens of alternative transcripts of the SGMS1 gene are present in various human tissues. In addition to mRNAs that provide synthesis of the SMS1 protein, this gene participates in the synthesis of non-coding transcripts, including circular RNAs (circRNAs), which include exons of the 5'-untranslated region (5'-UTR) and are highly represented in the brain. In this study, using the high-throughput technology RNA-CaptureSeq, many new SGMS1 transcripts were identified, including both intronic unspliced RNAs (premature RNAs) and RNAs formed via alternative splicing. Recursive exons (RS-exons) that can participate in the multi-step splicing of long introns of the gene were also identified. These exons participate in the formation of circRNAs. Thus, multi-step splicing may provide a variety of linear and circular RNAs of eukaryotic genes in tissues. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Implementing a Structured Reporting Initiative Using a Collaborative Multistep Approach.

    PubMed

    Goldberg-Stein, Shlomit; Walter, William R; Amis, E Stephen; Scheinfeld, Meir H

    To describe the successful implementation of a structured reporting initiative in a large urban academic radiology department. We describe our process, compromises, and top 10 lessons learned in overhauling traditional reporting practices and comprehensively implementing structured reporting at our institution. To achieve our goals, we took deliberate steps toward consensus building, undertook multistep template refinement, and achieved close collaboration with the technical staff, department coders, and hospital information technologists. Following institutional review board exemption, we audited radiologist compliance by evaluating 100 consecutive cases of 12 common examination types. Fisher exact test was applied to determine significance of association between trainee initial report drafting and template compliance. We produced and implemented structured reporting templates for 95% of all departmental computed tomography, magnetic resonance, and ultrasound examinations. Structured templates include specialized reports adhering to the American College of Radiology's Reporting and Data Systems (ACR's RADS) recommendations (eg, Lung-RADS and Li-RADS). We attained 94% radiologist compliance within 2 years, without any financial incentives. We provide a blueprint of how to successfully achieve structured reporting using a collaborative multistep approach. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Multi-objective optimization of process parameters of multi-step shaft formed with cross wedge rolling based on orthogonal test

    NASA Astrophysics Data System (ADS)

    Han, S. T.; Shu, X. D.; Shchukin, V.; Kozhevnikova, G.

    2018-06-01

    In order to achieve reasonable process parameters for forming a multi-step shaft by cross wedge rolling, the rolling-forming process of the multi-step shaft was studied with the DEFORM-3D finite element software. An interactive orthogonal experiment was used to study the effect of eight parameters, the first section shrinkage rate φ1, the first forming angle α1, the first spreading angle β1, the first spreading length L1, the second section shrinkage rate φ2, the second forming angle α2, the second spreading angle β2 and the second spreading length L2, on the quality of the shaft end and the microstructure uniformity. By using the fuzzy mathematics comprehensive evaluation method and extreme difference (range) analysis, the order of influence of the process parameters on the quality of the multi-step shaft is obtained: β2 > φ2 > L1 > α1 > β1 > φ1 > α2 > L2. The results of the study can provide guidance for obtaining multi-step shafts with high mechanical properties and achieving near net forming without a stub bar in cross wedge rolling.
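    The extreme-difference (range) analysis used to rank factor influence in an orthogonal test can be sketched generically as below; the small orthogonal array and response scores are toy values, not the paper's DEFORM-3D results.

```python
# Generic extreme-difference (range) analysis for an orthogonal test:
# for each factor, average the response at each level and take the range
# (max - min) of those level means; larger ranges indicate stronger
# influence. The array and responses below are toy values only.
import numpy as np

def range_analysis(levels, response):
    # levels: (runs x factors) array of factor-level indices
    # response: comprehensive quality score of each run
    ranges = []
    for f in range(levels.shape[1]):
        level_means = [response[levels[:, f] == lv].mean() for lv in np.unique(levels[:, f])]
        ranges.append(max(level_means) - min(level_means))
    return ranges

if __name__ == "__main__":
    levels = np.array([[0, 0, 0],
                       [0, 1, 1],
                       [1, 0, 1],
                       [1, 1, 0]])              # toy L4(2^3) orthogonal array
    response = np.array([7.2, 6.8, 5.1, 5.5])   # toy comprehensive scores
    ranges = range_analysis(levels, response)
    order = sorted(range(len(ranges)), key=lambda f: ranges[f], reverse=True)
    print("factor ranges:", np.round(ranges, 3), " influence order:", order)
```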

  1. Disposable and removable nucleic acid extraction and purification cartridges for automated flow-through systems

    DOEpatents

    Regan, John Frederick

    2014-09-09

    Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.

  2. Use of Chiral Oxazolidinones for a Multi-Step Synthetic Laboratory Module

    ERIC Educational Resources Information Center

    Betush, Matthew P.; Murphree, S. Shaun

    2009-01-01

    Chiral oxazolidinone chemistry is used as a framework for an advanced multi-step synthesis lab. The cost-effective and robust preparation of chiral starting materials is presented, as well as the use of chiral auxiliaries in a synthesis scheme that is appropriate for students currently in the second semester of the organic sequence. (Contains 1…

  3. Controlled growth of silica-titania hybrid functional nanoparticles through a multistep microfluidic approach.

    PubMed

    Shiba, K; Sugiyama, T; Takei, T; Yoshikawa, G

    2015-11-11

    Silica/titania-based functional nanoparticles were prepared through controlled nucleation of titania and subsequent encapsulation by silica through a multistep microfluidic approach, which was successfully applied to obtaining aminopropyl-functionalized silica/titania nanoparticles for a highly sensitive humidity sensor.

  4. Optimal generalized multistep integration formulae for real-time digital simulation

    NASA Technical Reports Server (NTRS)

    Moerder, D. D.; Halyo, N.

    1985-01-01

    The problem of discretizing a dynamical system for real-time digital simulation is considered. Treating the system and its simulation as stochastic processes leads to a statistical characterization of simulator fidelity. A plant discretization procedure based on an efficient matrix generalization of explicit linear multistep discrete integration formulae is introduced, which minimizes a weighted sum of the mean squared steady-state and transient error between the system and simulator outputs.

  5. Surface Modified Particles By Multi-Step Michael-Type Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2005-05-03

    A new class of surface modified particles and a multi-step Michael-type addition surface modification process for the preparation of the same is provided. The multi-step Michael-type addition surface modification process involves two or more reactions to compatibilize particles with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through reactive organic linking groups. Specifically, these reactive groups are activated carbon—carbon pi bonds and carbon and non-carbon nucleophiles that react via Michael or Michael-type additions.

  6. Biocatalyzed Regioselective Synthesis in Undergraduate Organic Laboratories: Multistep Synthesis of 2-Arachidonoylglycerol

    ERIC Educational Resources Information Center

    Johnston, Meghan R.; Makriyannis, Alexandros; Whitten, Kyle M.; Drew, Olivia C.; Best, Fiona A.

    2016-01-01

    In order to introduce the concepts of biocatalysis and its utility in synthesis to organic chemistry students, a multistep synthesis of endogenous cannabinergic ligand 2-arachidonoylglycerol (2-AG) was tailored for use as a laboratory exercise. Over four weeks, students successfully produced 2-AG, purifying and characterizing products at each…

  7. Controlled droplet microfluidic systems for multistep chemical and biological assays.

    PubMed

    Kaminski, T S; Garstecki, P

    2017-10-16

    Droplet microfluidics is a relatively new and rapidly evolving field of science focused on studying the hydrodynamics and properties of biphasic flows at the microscale, and on the development of systems for practical applications in chemistry, biology and materials science. Microdroplets present several unique characteristics of interest to a broader research community. The main distinguishing features include (i) large numbers of isolated compartments of tiny volumes that are ideal for single cell or single molecule assays, (ii) rapid mixing and negligible thermal inertia that all provide excellent control over reaction conditions, and (iii) the presence of two immiscible liquids and the interface between them that enables new or exotic processes (the synthesis of new functional materials and structures that are otherwise difficult to obtain, studies of the functions and properties of lipid and polymer membranes and execution of reactions at liquid-liquid interfaces). The most frequent application of droplet microfluidics relies on the generation of large numbers of compartments either for ultrahigh throughput screens or for the synthesis of functional materials composed of millions of droplets or particles. Droplet microfluidics has already evolved into a complex field. In this review we focus on 'controlled droplet microfluidics' - a portfolio of techniques that provide convenient platforms for multistep complex reaction protocols and that take advantage of automated and passive methods of fluid handling on a chip. 'Controlled droplet microfluidics' can be regarded as a group of methods capable of addressing and manipulating droplets in series. The functionality and complexity of controlled droplet microfluidic systems can be positioned between digital microfluidics (DMF) addressing each droplet individually using 2D arrays of electrodes and ultrahigh throughput droplet microfluidics focused on the generation of hundreds of thousands or even millions of

  8. Method to Improve Indium Bump Bonding via Indium Oxide Removal Using a Multi-Step Plasma Process

    NASA Technical Reports Server (NTRS)

    Dickie, Matthew R. (Inventor); Nikzad, Shouleh (Inventor); Greer, H. Frank (Inventor); Jones, Todd J. (Inventor); Vasquez, Richard P. (Inventor); Hoenk, Michael E. (Inventor)

    2012-01-01

    A process for removing indium oxide from indium bumps in a flip-chip structure to reduce contact resistance, by a multi-step plasma treatment. A first plasma treatment of the indium bumps with an argon, methane and hydrogen plasma reduces indium oxide, and a second plasma treatment with an argon and hydrogen plasma removes residual organics. The multi-step plasma process for removing indium oxide from the indium bumps is more effective in reducing the oxide, and yet does not require the use of halogens, does not change the bump morphology, does not attack the bond pad material or under-bump metallization layers, and creates no new mechanisms for open circuits.

  9. Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.

    PubMed

    Ouyang, Yicun; Yin, Hujun

    2018-05-01

    Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently one-step predictors, that is, they predict one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either apply a one-step model recursively or treat each prediction horizon with an independent model. They generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, the varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented to various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in their component AR models of various predicting horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over the existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.
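    The VLM model itself is not reproduced here, but the two baseline strategies the abstract contrasts are easy to illustrate. The following Python sketch (an illustrative toy, not the authors' code) fits a plain least-squares AR model and produces multi-step forecasts either by applying the one-step model recursively ("iterative") or by fitting a separate model per horizon ("independent"/direct):

```python
# Toy contrast of the two baseline multi-step strategies named in the abstract:
# "iterative" (recursive one-step AR) vs "independent/direct" (one model per horizon).
import numpy as np

def fit_ar(series, order):
    """Least-squares fit of an AR(order) model; returns the coefficient vector."""
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def iterated_forecast(series, order, horizon):
    """One-step model applied recursively; errors can accumulate with the horizon."""
    coef = fit_ar(series, order)
    window = list(series[-order:])
    preds = []
    for _ in range(horizon):
        nxt = float(np.dot(coef, window))
        preds.append(nxt)
        window = window[1:] + [nxt]
    return preds

def direct_forecast(series, order, horizon):
    """Independent model per horizon h: regress y[t+h] directly on the last `order` values."""
    preds = []
    for h in range(1, horizon + 1):
        X = np.column_stack([series[i:len(series) - order - h + 1 + i] for i in range(order)])
        y = series[order + h - 1:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        preds.append(float(np.dot(coef, series[-order:])))
    return preds

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(400)
    series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(t.size)
    print("iterated:", np.round(iterated_forecast(series, order=10, horizon=5), 3))
    print("direct:  ", np.round(direct_forecast(series, order=10, horizon=5), 3))
```

    The VLM approach described above goes further by training mixture models on segments of varying length, so dependencies across the whole prediction horizon are retained rather than handled one step or one horizon at a time.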

  10. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    PubMed

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  11. Microfluidic large-scale integration: the evolution of design rules for biological automation.

    PubMed

    Melin, Jessica; Quake, Stephen R

    2007-01-01

    Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.

  12. Model predictive control design for polytopic uncertain systems by synthesising multi-step prediction scenarios

    NASA Astrophysics Data System (ADS)

    Lu, Jianbo; Xi, Yugeng; Li, Dewei; Xu, Yuli; Gan, Zhongxue

    2018-01-01

    Common objectives of model predictive control (MPC) design are a large initial feasible region, a low online computational burden and satisfactory control performance of the resulting algorithm. It is well known that interpolation-based MPC can achieve a favourable trade-off among these different aspects. However, the existing results are usually based on fixed prediction scenarios, which inevitably limits the performance of the obtained algorithms. By replacing the fixed prediction scenarios with time-varying multi-step prediction scenarios, this paper provides new insight into improving the existing MPC designs. The adopted control law is a combination of predetermined multi-step feedback control laws, based on which two MPC algorithms with guaranteed recursive feasibility and asymptotic stability are presented. The efficacy of the proposed algorithms is illustrated by a numerical example.

  13. Impact of user influence on information multi-step communication in a micro-blog

    NASA Astrophysics Data System (ADS)

    Wu, Yue; Hu, Yong; He, Xiao-Hai; Deng, Ken

    2014-06-01

    User influence is generally considered one of the most critical factors affecting information cascade spreading. Based on this common assumption, this paper proposes a theoretical model to examine user influence on multi-step information communication in a micro-blog. The steps of information communication are divided into first-step and non-first-step, and user influence is classified into five dimensions. Actual data from the Sina micro-blog are collected to construct the model by means of an approach based on structural equations that uses the Partial Least Squares (PLS) technique. Our experimental results indicate that the number of fans and their authority significantly impact first-step information communication. Leader rank has a positive impact on both first-step and non-first-step communication. Moreover, global centrality and weight of friends are positively related to non-first-step information communication, but authority is found to have much less relation to it.

  14. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.

  15. A Multistep Synthesis Featuring Classic Carbonyl Chemistry for the Advanced Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Duff, David B.; Abbe, Tyler G.; Goess, Brian C.

    2012-01-01

    A multistep synthesis of 5-isopropyl-1,3-cyclohexanedione is carried out from three commodity chemicals. The sequence involves an aldol condensation, Dieckmann-type annulation, ester hydrolysis, and decarboxylation. No purification is required until after the final step, at which point gravity column chromatography provides the desired product in…

  16. Automated recognition of the pericardium contour on processed CT images using genetic algorithms.

    PubMed

    Rodrigues, É O; Rodrigues, L O; Oliveira, L S N; Conci, A; Liatsis, P

    2017-08-01

    This work proposes the use of Genetic Algorithms (GA) for tracing and recognizing the pericardium contour of the human heart in Computed Tomography (CT) images. We assume that each slice of the pericardium can be modelled by an ellipse, the parameters of which need to be optimally determined. An optimal ellipse would be one that closely follows the pericardium contour and, consequently, appropriately separates the epicardial and mediastinal fats of the human heart. Tracing and automatically identifying the pericardium contour aids in medical diagnosis. Usually, this process is done manually or not done at all due to the effort required. Moreover, detecting the pericardium may improve previously proposed automated methodologies that separate the two types of fat associated with the human heart. Quantification of these fats provides important health risk marker information, as they are associated with the development of certain cardiovascular pathologies. Finally, we conclude that GA offers satisfactory solutions in a feasible amount of processing time. Copyright © 2017 Elsevier Ltd. All rights reserved.
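    As a rough illustration of the underlying idea (the encoding, fitness function and GA settings below are assumptions for demonstration, not the authors' implementation), a real-valued GA can search for the ellipse parameters (centre, semi-axes, rotation) that minimize the deviation of candidate contour points from the implicit ellipse equation:

```python
# Illustrative sketch: a simple real-valued GA fitting an ellipse (cx, cy, a, b, theta)
# to candidate contour points, in the spirit of the pericardium-fitting idea above.
import numpy as np

rng = np.random.default_rng(1)

def ellipse_residual(params, pts):
    """Mean deviation of the points from the implicit ellipse equation (0 for a perfect fit)."""
    cx, cy, a, b, theta = params
    c, s = np.cos(theta), np.sin(theta)
    x, y = pts[:, 0] - cx, pts[:, 1] - cy
    u = (x * c + y * s) / max(a, 1e-6)
    v = (-x * s + y * c) / max(b, 1e-6)
    return float(np.mean(np.abs(u ** 2 + v ** 2 - 1.0)))

def genetic_ellipse_fit(pts, pop_size=80, generations=200):
    lo = np.array([pts[:, 0].min(), pts[:, 1].min(), 1.0, 1.0, 0.0])
    hi = np.array([pts[:, 0].max(), pts[:, 1].max(), 200.0, 200.0, np.pi])
    pop = rng.uniform(lo, hi, size=(pop_size, 5))
    for _ in range(generations):
        fit = np.array([ellipse_residual(ind, pts) for ind in pop])
        pop = pop[np.argsort(fit)]                 # lower residual = better
        elite = pop[: pop_size // 4]
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = elite[rng.integers(len(elite), size=2)]
            child = 0.5 * (p1 + p2)                # blend crossover
            child += rng.normal(0, 0.05, size=5) * (hi - lo) * (rng.random(5) < 0.3)
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([elite, children])
    fit = np.array([ellipse_residual(ind, pts) for ind in pop])
    return pop[np.argmin(fit)]

if __name__ == "__main__":
    # Synthetic "contour": noisy samples of a known ellipse stand in for edge points.
    t = np.linspace(0, 2 * np.pi, 120)
    cx, cy, a, b, th = 100.0, 80.0, 60.0, 35.0, 0.4
    c, s = np.cos(th), np.sin(th)
    px = cx + a * np.cos(t) * c - b * np.sin(t) * s
    py = cy + a * np.cos(t) * s + b * np.sin(t) * c
    pts = np.column_stack([px, py]) + rng.normal(0, 1.0, size=(t.size, 2))
    print("estimated (cx, cy, a, b, theta):", np.round(genetic_ellipse_fit(pts), 2))
```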

  17. On the Development of Multi-Step Inverse FEM with Shell Model

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Du, R.

    2005-08-01

    The inverse or one-step finite element approach is increasingly used in the sheet metal stamping industry to predict strain distribution and the initial blank shape in the preliminary design stage. Based on the existing theory, there are two types of methods: one is based on the principle of virtual work and the other on the principle of extreme work. Much research has been conducted to improve the accuracy of simulation results. For example, based on the virtual work principle, Batoz et al. developed a new method using triangular DKT shell elements. In this new method, the bending and unbending effects are considered. Based on the principle of extreme work, Majlessi et al. proposed the multi-step inverse approach with membrane elements and applied it to an axis-symmetric part. Lee et al. presented an axis-symmetric shell element model to solve a similar problem. In this paper, a new multi-step inverse method is introduced with no limitation on the workpiece shape. It is a shell element model based on the virtual work principle. The new method is validated by comparison with a commercial software system (PAMSTAMP®). The comparison results indicate that the accuracy is good.

  18. Dynamical genetic programming in XCSF.

    PubMed

    Preen, Richard J; Bull, Larry

    2013-01-01

    A number of representation schemes have been presented for use within learning classifier systems, ranging from binary encodings to artificial neural networks. This paper presents results from an investigation into using a temporally dynamic symbolic representation within the XCSF learning classifier system. In particular, dynamical arithmetic networks are used to represent the traditional condition-action production system rules to solve continuous-valued reinforcement learning problems and to perform symbolic regression, finding competitive performance with traditional genetic programming on a number of composite polynomial tasks. In addition, the network outputs are later repeatedly sampled at varying temporal intervals to perform multistep-ahead predictions of a financial time series.

  19. Formation of Stone-Wales edge: Multistep reconstruction and growth mechanisms of zigzag nanographene.

    PubMed

    Dang, Jing-Shuang; Wang, Wei-Wei; Zheng, Jia-Jia; Nagase, Shigeru; Zhao, Xiang

    2017-10-05

    Although the existence of the Stone-Wales (5-7) defect at the graphene edge has been clarified experimentally, theoretical study of the formation mechanism is still incomplete. In particular, the regioselectivity of multistep reactions at the edge (self-reconstruction and growth with a foreign carbon feedstock) is essential to understand the kinetic behavior of reactive boundaries, but investigations are still lacking. Herein, by using finite-sized models, multistep reconstructions and carbon dimer additions of a bared zigzag edge are introduced using density functional theory calculations. The zigzag to 5-7 transformation is proved to be a site-selective process that generates alternating 5-7 pairs sequentially, and the first step, with the largest barrier, is suggested to be the rate-determining step. Conversely, successive C2 insertions on the active edge are calculated to elucidate the formation of the 5-7 edge during graphene growth. A metastable intermediate with a fragment of three sequentially fused pentagons is proved to be the key structure for 5-7 edge formation. © 2017 Wiley Periodicals, Inc.

  20. Adaptation to Vocal Expressions Reveals Multistep Perception of Auditory Emotion

    PubMed Central

    Maurage, Pierre; Rouger, Julien; Latinus, Marianne; Belin, Pascal

    2014-01-01

    The human voice carries speech as well as important nonlinguistic signals that influence our social interactions. Among these cues that impact our behavior and communication with other people is the perceived emotional state of the speaker. A theoretical framework for the neural processing stages of emotional prosody has suggested that auditory emotion is perceived in multiple steps (Schirmer and Kotz, 2006) involving low-level auditory analysis and integration of the acoustic information followed by higher-level cognition. Empirical evidence for this multistep processing chain, however, is still sparse. We examined this question using functional magnetic resonance imaging and a continuous carry-over design (Aguirre, 2007) to measure brain activity while volunteers listened to non-speech-affective vocalizations morphed on a continuum between anger and fear. Analyses dissociated neuronal adaptation effects induced by similarity in perceived emotional content between consecutive stimuli from those induced by their acoustic similarity. We found that bilateral voice-sensitive auditory regions as well as right amygdala coded the physical difference between consecutive stimuli. In contrast, activity in bilateral anterior insulae, medial superior frontal cortex, precuneus, and subcortical regions such as bilateral hippocampi depended predominantly on the perceptual difference between morphs. Our results suggest that the processing of vocal affect recognition is a multistep process involving largely distinct neural networks. Amygdala and auditory areas predominantly code emotion-related acoustic information while more anterior insular and prefrontal regions respond to the abstract, cognitive representation of vocal affect. PMID:24920615

  1. Adaptation to vocal expressions reveals multistep perception of auditory emotion.

    PubMed

    Bestelmeyer, Patricia E G; Maurage, Pierre; Rouger, Julien; Latinus, Marianne; Belin, Pascal

    2014-06-11

    The human voice carries speech as well as important nonlinguistic signals that influence our social interactions. Among these cues that impact our behavior and communication with other people is the perceived emotional state of the speaker. A theoretical framework for the neural processing stages of emotional prosody has suggested that auditory emotion is perceived in multiple steps (Schirmer and Kotz, 2006) involving low-level auditory analysis and integration of the acoustic information followed by higher-level cognition. Empirical evidence for this multistep processing chain, however, is still sparse. We examined this question using functional magnetic resonance imaging and a continuous carry-over design (Aguirre, 2007) to measure brain activity while volunteers listened to non-speech-affective vocalizations morphed on a continuum between anger and fear. Analyses dissociated neuronal adaptation effects induced by similarity in perceived emotional content between consecutive stimuli from those induced by their acoustic similarity. We found that bilateral voice-sensitive auditory regions as well as right amygdala coded the physical difference between consecutive stimuli. In contrast, activity in bilateral anterior insulae, medial superior frontal cortex, precuneus, and subcortical regions such as bilateral hippocampi depended predominantly on the perceptual difference between morphs. Our results suggest that the processing of vocal affect recognition is a multistep process involving largely distinct neural networks. Amygdala and auditory areas predominantly code emotion-related acoustic information while more anterior insular and prefrontal regions respond to the abstract, cognitive representation of vocal affect. Copyright © 2014 Bestelmeyer et al.

  2. Automated multiplex genome-scale engineering in yeast

    PubMed Central

    Si, Tong; Chao, Ran; Min, Yuhao; Wu, Yuying; Ren, Wen; Zhao, Huimin

    2017-01-01

    Genome-scale engineering is indispensable in understanding and engineering microorganisms, but the current tools are mainly limited to bacterial systems. Here we report an automated platform for multiplex genome-scale engineering in Saccharomyces cerevisiae, an important eukaryotic model and widely used microbial cell factory. Standardized genetic parts encoding overexpression and knockdown mutations of >90% yeast genes are created in a single step from a full-length cDNA library. With the aid of CRISPR-Cas, these genetic parts are iteratively integrated into the repetitive genomic sequences in a modular manner using robotic automation. This system allows functional mapping and multiplex optimization on a genome scale for diverse phenotypes including cellulase expression, isobutanol production, glycerol utilization and acetic acid tolerance, and may greatly accelerate future genome-scale engineering endeavours in yeast. PMID:28469255

  3. Round-off error in long-term orbital integrations using multistep methods

    NASA Technical Reports Server (NTRS)

    Quinlan, Gerald D.

    1994-01-01

    Techniques for reducing round-off error are compared by testing them on high-order Stormer and symmetric multistep methods. The best technique for most applications is to write the equation in summed, function-evaluation form and to store the coefficients as rational numbers. A larger error reduction can be achieved by writing the equation in backward-difference form and performing some of the additions in extended precision, but this entails a larger central processing unit (CPU) cost.
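    The paper's high-order Stormer coefficients are not reproduced here, but the effect of accumulating the solution in a compensated, summed form can be sketched on the simplest second-order Stormer/Verlet recurrence. In the toy example below, the same recurrence run in extended precision serves as the reference, so the reported differences reflect round-off rather than truncation error (all numerical choices are illustrative):

```python
# Toy illustration of the "summed form" idea: accumulate per-step increments with
# Kahan compensation so round-off does not build up over many steps.
import numpy as np

def stormer(n_steps, h, dtype, compensated=False):
    """x'' = -x via the Stormer/Verlet recurrence, carried out in the given precision."""
    h = dtype(h)
    x_prev, x = np.cos(h), dtype(1.0)        # x(-h) and x(0) for x(0)=1, x'(0)=0 (cos is even)
    comp = dtype(0.0)
    for _ in range(n_steps):
        incr = (x - x_prev) - h * h * x      # x_new - x = (x - x_prev) + h^2 * f(x), with f(x) = -x
        if compensated:
            y = incr - comp                  # Kahan-compensated accumulation of the increments
            t = x + y
            comp = (t - x) - y               # low-order bits lost when adding y to x
            x_prev, x = x, t
        else:
            x_prev, x = x, x + incr
    return x

h, n = 1e-3, 200_000
reference = stormer(n, h, np.longdouble)     # same recurrence in extended precision
for compensated in (False, True):
    diff = float(abs(np.longdouble(stormer(n, h, np.float64, compensated)) - reference))
    print(f"compensated={compensated}: round-off difference vs extended precision = {diff:.3e}")
```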

  4. PaR-PaR Laboratory Automation Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, G; Stawski, N; Poust, S

    2013-05-01

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  5. PaR-PaR laboratory automation platform.

    PubMed

    Linshiz, Gregory; Stawski, Nina; Poust, Sean; Bi, Changhao; Keasling, Jay D; Hillson, Nathan J

    2013-05-17

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  6. Evaluation of accuracy in implant site preparation performed in single- or multi-step drilling procedures.

    PubMed

    Marheineke, Nadine; Scherer, Uta; Rücker, Martin; von See, Constantin; Rahlf, Björn; Gellrich, Nils-Claudius; Stoetzer, Marcus

    2018-06-01

    Dental implant failure and insufficient osseointegration are proven results of mechanical and thermal damage during the surgery process. We herein performed a comparative study of a less invasive single-step drilling preparation protocol and a conventional multiple drilling sequence. The accuracy of the drilling holes was precisely analyzed, and the influence of the operators' level of expertise and of additional drill template guidance was evaluated. Six experimental groups, deployed in an osseous study model, represented template-guided and freehand drilling actions in a stepwise drilling procedure in comparison to a single-drill protocol. Each experimental condition was studied with three persons without surgical experience as well as three highly experienced oral surgeons. Drilling actions were performed and diameters were recorded with a precision measuring instrument. Less experienced operators were able to significantly increase drilling accuracy using a guiding template, especially when multi-step preparations were performed. Without template guidance, experienced operators achieved higher accuracy with the single-step technique than with the multi-step technique. Single-step drilling protocols have been shown to produce more accurate results than multi-step procedures. The outcome of any protocol can be further improved by the use of guiding templates. Operator experience can be a contributing factor. Single-step preparations are less invasive and promote osseointegration. Even highly experienced surgeons achieve higher levels of accuracy by combining this technique with template guidance. Template guidance thereby enables a reduction of hands-on time and side effects during surgery and leads to a more predictable clinical diameter.

  7. Multistep Model of Cervical Cancer: Participation of miRNAs and Coding Genes

    PubMed Central

    López, Angelica Judith Granados; López, Jesús Adrián

    2014-01-01

    Aberrant miRNA expression is well recognized as an important step in the development of cancer. Close to 70 microRNAs (miRNAs) have been implicated in cervical cancer to date; nevertheless, it is unknown whether aberrant miRNA expression causes the onset of cervical cancer. One of the best ways to address this issue is through a multistep model of carcinogenesis. In the progression to cervical cancer there are three well-established steps, which we use in the model proposed here. The first step of the model comprises the gene changes that occur as normal cells are transformed into immortal cells (CIN 1), the second comprises the changes from immortal cells to tumorigenic cells (CIN 2), the third step includes cell changes that increase tumorigenic capacity (CIN 3), and the final step covers the changes from tumorigenic to carcinogenic cells. Altered miRNAs and their target genes are located in each of the four steps of the multistep model of carcinogenesis. Reported miRNA expression shows discrepancies between studies; therefore, in this model we include only miRNAs with similar results in at least two studies. The present model is a useful insight into studying potential prognostic, diagnostic, and therapeutic miRNAs. PMID:25192291

  8. Microcomputer-Based Genetics Office Database System

    PubMed Central

    Cutts, James H.; Mitchell, Joyce A.

    1985-01-01

    A database management system (Genetics Office Automation System, GOAS) has been developed for the Medical Genetics Unit of the University of Missouri. The system, which records patients' visits to the Unit's genetic and prenatal clinics, has been implemented on an IBM PC/XT microcomputer. A description of the system, the reasons for implementation, its databases, and uses are presented.

  9. Automated tetraploid genotype calling by hierarchical clustering

    USDA-ARS?s Scientific Manuscript database

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
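    The record above gives no implementation details, but the general idea can be sketched: for an autotetraploid, a biallelic marker admits five allele dosages (0-4), so per-sample signal fractions can be grouped by hierarchical clustering into up to five groups and the groups mapped to dosages by their mean intensity. The toy below uses simulated data and is not the published method:

```python
# Rough illustration: hierarchical clustering of per-sample B-allele signal fractions
# into up to five dosage groups for one tetraploid marker (assumes all dosages occur).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def call_tetraploid_dosage(b_fraction, max_groups=5, merge_dist=0.08):
    """b_fraction: per-sample B-allele fraction for one marker (values in [0, 1])."""
    x = np.asarray(b_fraction, dtype=float).reshape(-1, 1)
    tree = linkage(x, method="average")
    labels = fcluster(tree, t=merge_dist, criterion="distance")
    if labels.max() > max_groups:                  # fall back to a fixed cluster count
        labels = fcluster(tree, t=max_groups, criterion="maxclust")
    # Order clusters by mean signal fraction and map them to dosages 0..k-1.
    means = {lab: x[labels == lab].mean() for lab in np.unique(labels)}
    rank = {lab: r for r, lab in enumerate(sorted(means, key=means.get))}
    return np.array([rank[lab] for lab in labels])

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    true_dosage = rng.integers(0, 5, size=300)
    signal = true_dosage / 4 + rng.normal(0, 0.02, size=300)   # ideal ratio plus noise
    called = call_tetraploid_dosage(signal)
    print("agreement with simulated dosages:", np.mean(called == true_dosage))
```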

  10. A transition from using multi-step procedures to a fully integrated system for performing extracorporeal photopheresis: A comparison of costs and efficiencies.

    PubMed

    Azar, Nabih; Leblond, Veronique; Ouzegdouh, Maya; Button, Paul

    2017-12-01

    The Pitié Salpêtrière Hospital Hemobiotherapy Department, Paris, France, has been providing extracorporeal photopheresis (ECP) since November 2011, and started using the Therakos® CELLEX® fully integrated system in 2012. This report summarizes our single-center experience of transitioning from the use of multi-step ECP procedures to the fully integrated ECP system, considering the capacity and cost implications. The total number of ECP procedures performed 2011-2015 was derived from department records. The time taken to complete a single ECP treatment using a multi-step technique and the fully integrated system at our department was assessed. Resource costs (2014 €) were obtained for materials and calculated for personnel time required. Time-driven activity-based costing methods were applied to provide a cost comparison. The number of ECP treatments per year increased from 225 (2012) to 727 (2015). A single multi-step procedure took 270 min, compared to 120 min for the fully integrated system. The total calculated per-session cost of performing ECP using the multi-step procedure was greater than with the CELLEX® system (€1,429.37 and €1,264.70 per treatment, respectively). For hospitals considering a transition from multi-step procedures to fully integrated methods for ECP where cost may be a barrier, time-driven activity-based costing should be utilized to gain a more comprehensive understanding of the full benefit that such a transition offers. The example from our department confirmed that there were not just cost and time savings, but that the time efficiencies gained with CELLEX® allow for more patient treatments per year. © 2017 The Authors Journal of Clinical Apheresis Published by Wiley Periodicals, Inc.
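    A time-driven activity-based cost comparison has a simple arithmetic shape: per-session cost = material cost + hands-on minutes × personnel cost rate. The toy calculation below uses the session durations quoted above (270 vs 120 min) but placeholder material costs and staff rates, so it illustrates the method rather than reproducing the study's figures:

```python
# Toy time-driven activity-based costing comparison; all monetary inputs are placeholders.
def per_session_cost(material_cost_eur, minutes, staff_rate_eur_per_min):
    return material_cost_eur + minutes * staff_rate_eur_per_min

procedures = {
    "multi-step ECP":      {"materials": 900.0, "minutes": 270},   # placeholder materials cost
    "integrated (CELLEX)": {"materials": 1000.0, "minutes": 120},  # placeholder materials cost
}
STAFF_RATE = 1.0  # EUR per personnel-minute, placeholder

for name, p in procedures.items():
    cost = per_session_cost(p["materials"], p["minutes"], STAFF_RATE)
    print(f"{name:22s} {cost:8.2f} EUR/session, {p['minutes']} min hands-on")
```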

  11. Automated measurement of zebrafish larval movement

    PubMed Central

    Cario, Clinton L; Farrell, Thomas C; Milanese, Chiara; Burton, Edward A

    2011-01-01

    Abstract The zebrafish is a powerful vertebrate model that is readily amenable to genetic, pharmacological and environmental manipulations to elucidate the molecular and cellular basis of movement and behaviour. We report software enabling automated analysis of zebrafish movement from video recordings captured with cameras ranging from a basic camcorder to more specialized equipment. The software, which is provided as open-source MATLAB functions, can be freely modified and distributed, and is compatible with multiwell plates under a wide range of experimental conditions. Automated measurement of zebrafish movement using this technique will be useful for multiple applications in neuroscience, pharmacology and neuropsychiatry. PMID:21646414

  12. Automated measurement of zebrafish larval movement.

    PubMed

    Cario, Clinton L; Farrell, Thomas C; Milanese, Chiara; Burton, Edward A

    2011-08-01

    The zebrafish is a powerful vertebrate model that is readily amenable to genetic, pharmacological and environmental manipulations to elucidate the molecular and cellular basis of movement and behaviour. We report software enabling automated analysis of zebrafish movement from video recordings captured with cameras ranging from a basic camcorder to more specialized equipment. The software, which is provided as open-source MATLAB functions, can be freely modified and distributed, and is compatible with multiwell plates under a wide range of experimental conditions. Automated measurement of zebrafish movement using this technique will be useful for multiple applications in neuroscience, pharmacology and neuropsychiatry.
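    The published tool is a set of open-source MATLAB functions; as a rough, language-agnostic sketch of the underlying idea, per-well movement can be quantified by counting pixels that change between consecutive video frames inside each well's region of interest (the frame source, ROI layout and threshold below are assumptions, not the authors' code):

```python
# Rough sketch: thresholded frame differencing per well as a movement measure.
import numpy as np

def movement_trace(frames, roi_masks, diff_threshold=15):
    """frames: iterable of 2-D uint8 grayscale arrays; roi_masks: dict well -> boolean array."""
    traces = {well: [] for well in roi_masks}
    prev = None
    for frame in frames:
        frame = frame.astype(np.int16)                  # avoid uint8 wrap-around on subtraction
        if prev is not None:
            changed = np.abs(frame - prev) > diff_threshold
            for well, mask in roi_masks.items():
                traces[well].append(int(np.count_nonzero(changed & mask)))
        prev = frame
    return traces

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    h, w = 120, 160
    masks = {"A1": np.zeros((h, w), bool), "A2": np.zeros((h, w), bool)}
    masks["A1"][:, :80], masks["A2"][:, 80:] = True, True
    # Synthetic video: noise everywhere, plus a moving bright blob confined to well A1.
    frames = []
    for t in range(50):
        img = rng.integers(0, 10, size=(h, w), dtype=np.uint8)
        img[40:50, 10 + t : 20 + t] = 200
        frames.append(img)
    traces = movement_trace(frames, masks)
    print("mean changed pixels per frame:",
          {well: round(float(np.mean(v)), 1) for well, v in traces.items()})
```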

  13. Grouped and Multistep Nanoheteroepitaxy: Toward High-Quality GaN on Quasi-Periodic Nano-Mask.

    PubMed

    Feng, Xiaohui; Yu, Tongjun; Wei, Yang; Ji, Cheng; Cheng, Yutian; Zong, Hua; Wang, Kun; Yang, Zhijian; Kang, Xiangning; Zhang, Guoyi; Fan, Shoushan

    2016-07-20

    A novel nanoheteroepitaxy method, namely, the grouped and multistep nanoheteroepitaxy (GM-NHE), is proposed to attain a high-quality gallium nitride (GaN) epilayer by metal-organic vapor phase epitaxy. This method combines the effects of sub-100 nm nucleation and multistep lateral growth by using a low-cost but unique carbon nanotube mask, which consists of nanoscale growth windows with a quasi-periodic 2D fill factor. It is found that GM-NHE can facilely reduce threading dislocation density (TDD) and modulate residual stress on foreign substrate without any regrowth. As a result, high-quality GaN epilayer is produced with homogeneously low TDD of 4.51 × 10(7) cm(-2) and 2D-modulated stress, and the performance of the subsequent 410 nm near-ultraviolet light-emitting diode is greatly boosted. In this way, with the facile fabrication of nanomask and the one-off epitaxy procedure, GaN epilayer is prominently improved with the assistance of nanotechnology, which demonstrates great application potential for high-efficiency TDD-sensitive optoelectronic and electronic devices.

  14. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

    We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining the genotypes of multiple genetic markers in individuals. It plays an important role in the ongoing shift from traditional medical treatments toward personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around timeline compatible with clinical decision-making. In this paper we have developed a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.
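    The authors' pipeline is not reproduced here, but the final calling step of a four-channel (one dye per base) genotyping array can be illustrated with a minimal rule: call a homozygote when a single channel dominates, a heterozygote when two channels are comparably strong, and no-call when the spot is too weak. The thresholds below are illustrative assumptions:

```python
# Minimal genotype-calling rule for one SNP spot on a four-channel array (illustrative only).
def call_genotype(channel_intensity, het_ratio=0.35, min_signal=200.0):
    """channel_intensity: dict like {'A': 1520.0, 'C': 40.0, 'G': 35.0, 'T': 610.0}."""
    ranked = sorted(channel_intensity.items(), key=lambda kv: kv[1], reverse=True)
    (b1, i1), (b2, i2) = ranked[0], ranked[1]
    if i1 < min_signal:
        return "no-call"                    # spot too weak to score
    if i2 >= het_ratio * i1:
        return "".join(sorted(b1 + b2))     # two strong channels -> heterozygote
    return b1 + b1                          # single dominant channel -> homozygote

if __name__ == "__main__":
    print(call_genotype({"A": 1520.0, "C": 40.0, "G": 35.0, "T": 610.0}))   # AT
    print(call_genotype({"A": 1490.0, "C": 55.0, "G": 60.0, "T": 70.0}))    # AA
    print(call_genotype({"A": 90.0, "C": 55.0, "G": 60.0, "T": 70.0}))      # no-call
```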

  15. Molecular genetics of chronic neutrophilic leukemia, chronic myelomonocytic leukemia and atypical chronic myeloid leukemia.

    PubMed

    Li, Bing; Gale, Robert Peter; Xiao, Zhijian

    2014-12-12

    According to the 2008 World Health Organization classification, chronic neutrophilic leukemia, chronic myelomonocytic leukemia and atypical chronic myeloid leukemia are rare diseases. The remarkable progress in our understanding of the molecular genetics of myeloproliferative neoplasms and myelodysplastic/myeloproliferative neoplasms has made it clear that there are some specific genetic abnormalities in these 3 rare diseases. At the same time, there is considerable overlap among these disorders at the molecular level. The various combinations of genetic abnormalities indicate a multi-step pathogenesis, which likely contributes to the marked clinical heterogeneity of these disorders. This review focuses on the current knowledge and challenges related to the molecular pathogenesis of chronic neutrophilic leukemia, chronic myelomonocytic leukemia and atypical chronic myeloid leukemia and relationships between molecular findings, clinical features and prognosis.

  16. Adaptive multi-step Full Waveform Inversion based on Waveform Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Hu, Yong; Han, Liguo; Xu, Zhuo; Zhang, Fengjiao; Zeng, Jingwen

    2017-04-01

    Full Waveform Inversion (FWI) can be used to build high-resolution velocity models, but there are still many challenges in processing seismic field data. The most difficult problem is how to recover the long-wavelength components of subsurface velocity models when the seismic data lack low-frequency information and long offsets. To solve this problem, we propose to use the Waveform Mode Decomposition (WMD) method to reconstruct low-frequency information for FWI and obtain a smooth model, so that the initial-model dependence of FWI can be reduced. In this paper, we use the adjoint-state method to calculate the gradient for Waveform Mode Decomposition Full Waveform Inversion (WMDFWI). Through illustrative numerical examples, we show that the low-frequency information reconstructed by the WMD method is very reliable. WMDFWI, in combination with an adaptive multi-step inversion strategy, can obtain more faithful and accurate final inversion results. Numerical examples show that even if the initial velocity model is far from the true model and lacks low-frequency information, we can still obtain good inversion results with the WMD method. Numerical examples of an anti-noise test show that the adaptive multi-step inversion strategy for WMDFWI has a strong ability to resist Gaussian noise. The WMD method is promising for land seismic FWI, because it can reconstruct the low-frequency information, lower the dominant frequency in the adjoint source, and strongly resist noise.
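    WMD itself is not reproduced here, but the two ingredients named in the abstract, an adjoint-state gradient and a multi-step (coarse-to-fine) inversion schedule, can be sketched on a toy 1-D linear problem where the forward model is convolution of a reflectivity model with a wavelet and the early steps fit smoothed, low-frequency-like data. All modelling choices below are illustrative assumptions:

```python
# Toy 1-D inversion: adjoint-state gradient g = F^T (F m - d) for a linear forward model,
# driven by a coarse-to-fine (multi-step) data-fitting schedule.
import numpy as np

def conv_matrix(wavelet, n):
    """Dense Toeplitz matrix so that F @ m equals a 'same'-length convolution with the wavelet."""
    half = len(wavelet) // 2
    F = np.zeros((n, n))
    for i in range(n):
        for k, w in enumerate(wavelet):
            j = i + half - k
            if 0 <= j < n:
                F[i, j] = w
    return F

def smooth(x, width):
    return np.convolve(x, np.ones(width) / width, mode="same")

def multistep_inversion(F, data, m0, smooth_widths=(15, 7, 1), n_iter=300):
    m = m0.copy()
    step = 0.9 / np.linalg.norm(F, 2) ** 2             # safe gradient-descent step size
    for width in smooth_widths:                        # coarse-to-fine schedule
        d_target = smooth(data, width) if width > 1 else data
        for _ in range(n_iter):
            residual = F @ m - d_target
            m -= step * (F.T @ residual)               # adjoint-state gradient for a linear model
    return m

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    n = 200
    m_true = np.zeros(n)
    m_true[[40, 90, 150]] = [1.0, -0.7, 0.5]           # sparse reflectivity
    t = np.arange(-20, 21) * 0.05
    wavelet = (1 - 2 * (np.pi * 5 * t) ** 2) * np.exp(-(np.pi * 5 * t) ** 2)   # Ricker wavelet
    F = conv_matrix(wavelet, n)
    data = F @ m_true + 0.01 * rng.standard_normal(n)
    m_est = multistep_inversion(F, data, m0=np.zeros(n))
    print("relative model error:", np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true))
```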

  17. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for the medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. Throughput is limited both in terms of sample preparation and in the analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate the analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards the critically needed increase in throughput. Published by Elsevier B.V.

  18. Analysis of Time Filters in Multistep Methods

    NASA Astrophysics Data System (ADS)

    Hurl, Nicholas

    Geophysical flow simulations have evolved sophisticated implicit-explicit time stepping methods (based on fast-slow wave splittings) followed by time filters to control any unstable modes that result. Time filters are modular and parallel. Their effect on the stability of the overall process has been tested in numerous simulations, but never analyzed. Stability is proven herein, by energy methods, for the Crank-Nicolson Leapfrog (CNLF) method with the Robert-Asselin (RA) time filter and for the Crank-Nicolson Leapfrog method with the Robert-Asselin-Williams (RAW) time filter applied to systems. We derive an equivalent multistep method for CNLF+RA and CNLF+RAW, and stability regions are obtained. The time step restriction for energy stability of CNLF+RA is smaller than that of CNLF, and the CNLF+RAW time step restriction is smaller still. Numerical tests find that RA and RAW add numerical dissipation. This thesis also shows that all modes of the CNLF method are asymptotically stable under the standard time step condition.
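    The thesis analyzes the filters coupled with Crank-Nicolson Leapfrog; as a minimal runnable illustration of the Robert-Asselin filter itself, the sketch below applies plain leapfrog to the oscillation test equation u' = iωu with a slightly inconsistent start and measures the residual amplitude of the spurious 2Δt computational mode with and without the filter (the filter coefficient and diagnostic are illustrative choices, not the thesis setup):

```python
# Leapfrog with and without the Robert-Asselin (RA) time filter on u' = i*omega*u.
import numpy as np

def leapfrog(omega, dt, n_steps, ra_gamma=0.0, perturbation=0.05):
    """Returns the last two time levels; the RA filter is applied to the middle level."""
    f = lambda u: 1j * omega * u
    u_prev = 1.0 + 0.0j                                     # exact u(0)
    u = np.exp(1j * omega * dt) * (1.0 + perturbation)      # inconsistent start excites the 2*dt mode
    for _ in range(n_steps - 1):
        u_next = u_prev + 2.0 * dt * f(u)                   # leapfrog step
        if ra_gamma > 0.0:
            u = u + ra_gamma * (u_next - 2.0 * u + u_prev)  # Robert-Asselin filter
        u_prev, u = u, u_next
    return u_prev, u

omega, dt, n = 1.0, 0.01, 2000
for gamma in (0.0, 0.1):
    u_m1, u_n = leapfrog(omega, dt, n, ra_gamma=gamma)
    spurious = abs(u_n - np.exp(1j * omega * dt) * u_m1)    # residual 2*dt computational mode
    print(f"RA gamma = {gamma:4.2f}: computational-mode amplitude ~ {spurious:.2e}")
```

    Running this shows the unfiltered computational mode persisting at roughly the size of the initial perturbation, while the filtered run damps it by several orders of magnitude, which is the qualitative behavior (added numerical dissipation) the thesis quantifies for the CNLF combinations.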

  19. Design Automation in Synthetic Biology.

    PubMed

    Appleton, Evan; Madsen, Curtis; Roehner, Nicholas; Densmore, Douglas

    2017-04-03

    Design automation refers to a category of software tools for designing systems that work together in a workflow for designing, building, testing, and analyzing systems with a target behavior. In synthetic biology, these tools are called bio-design automation (BDA) tools. In this review, we discuss the BDA tool areas (specify, design, build, test, and learn) and introduce the existing software tools designed to solve problems in these areas. We then detail the functionality of some of these tools and show how they can be used together to create the desired behavior of two types of modern synthetic genetic regulatory networks. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  20. Baculovirus expression system and method for high throughput expression of genetic material

    DOEpatents

    Clark, Robin; Davies, Anthony

    2001-01-01

    The present invention provides novel recombinant baculovirus expression systems for expressing foreign genetic material in a host cell. Such expression systems are readily adapted to an automated method for expressing foreign genetic material in a high-throughput manner. In other aspects, the present invention features a novel automated method for determining the function of foreign genetic material by transfecting the same into a host by way of the recombinant baculovirus expression systems according to the present invention.

  1. Differential genetic regulation of motor activity and anxiety-related behaviors in mice using an automated home cage task.

    PubMed

    Kas, Martien J H; de Mooij-van Malsen, Annetrude J G; Olivier, Berend; Spruijt, Berry M; van Ree, Jan M

    2008-08-01

    Traditional behavioral tests, such as the open field test, measure an animal's responsiveness to a novel environment. However, it is generally difficult to assess whether the behavioral response obtained from these tests relates to the expression level of motor activity and/or to avoidance of anxiogenic areas. Here, an automated home cage environment for mice was designed to obtain independent measures of motor activity levels and of sheltered feeding preference during three consecutive days. Chronic treatment with the anxiolytic drug chlordiazepoxide (5 and 10 mg/kg/day) in C57BL/6J mice reduced sheltered feeding preference without altering motor activity levels. Furthermore, two distinct chromosome substitution strains, derived from C57BL/6J (host strain) and A/J (donor strain) inbred strains, expressed either increased sheltering preference in females (chromosome 15) or reduced motor activity levels in females and males (chromosome 1) when compared to C57BL/6J. Longitudinal behavioral monitoring revealed that these phenotypic differences maintained after adaptation to the home cage. Thus, by using new automated behavioral phenotyping approaches, behavior can be dissociated into distinct behavioral domains (e.g., anxiety-related and motor activity domains) with different underlying genetic origin and pharmacological responsiveness.

  2. Analyzing multistep homogeneous nucleation in vapor-to-solid transitions using molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Tanaka, Kyoko K.; Diemand, Jürg; Tanaka, Hidekazu; Angélil, Raymond

    2017-08-01

    In this paper, we present multistep homogeneous nucleation in vapor-to-solid transitions as revealed by molecular dynamics simulations of Lennard-Jones molecules, in which liquidlike clusters are created and crystallized. During long, direct NVE (constant volume, energy, and number of molecules) simulations involving the integration of (1.9-15) × 10⁶ molecules over up to 200 million steps (= 4.3 μs), crystallization in many large, supercooled nanoclusters is observed once the liquid clusters grow to a certain size (~800 molecules for the case of T ≃ 0.5 ε/k). In the simulations, we discovered an interesting process associated with crystallization: the solid clusters lost 2-5% of their mass during crystallization at low temperatures below their melting temperatures. Although the crystallized clusters were heated by latent heat, they were stabilized by cooling due to evaporation. The clusters crystallized quickly and completely except at surface layers. However, they did not have stable crystal structures; rather, they had metastable structures such as icosahedral, decahedral, face-centered-cubic-rich (fcc-rich), and hexagonal-close-packed-rich (hcp-rich) structures. Several kinds of cluster structures coexisted in the same size range of ~1000-5000 molecules. Our results imply that multistep nucleation is a common first stage of condensation from vapor to solid.

  3. An automated field phenotyping pipeline for application in grapevine research.

    PubMed

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-02-26

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.

  4. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  5. Genetic algorithms in teaching artificial intelligence (automated generation of specific algebras)

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-11-01

    The problem of teaching essential Artificial Intelligence (AI) methods is an important task for an educator in the branch of soft computing. The key focus is usually on a proper understanding of the principles of AI methods in two essential respects: why we use soft-computing methods at all, and how we apply these methods to generate reasonable results in a sensible time. We present an interesting problem solved in non-educational research concerning the automated generation of specific algebras in a huge search space, and we use it, with emphasis on the above points, as an educational case study.
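    As a toy stand-in for the kind of search described above (not the authors' system or their target algebras), a genetic algorithm can hunt through the space of Cayley tables on a small carrier set for an operation that is associative and has a fixed identity element, i.e. a monoid; fitness counts violated associativity triples:

```python
# Toy GA over Cayley tables on {0,..,N-1} with 0 fixed as identity; fitness = associativity violations.
import numpy as np

rng = np.random.default_rng(11)
N = 4  # carrier size; the identity row/column are fixed by construction

def complete(table_core):
    """Embed the evolved (N-1)x(N-1) core into a full table with 0 acting as identity."""
    t = np.empty((N, N), dtype=int)
    t[0, :] = np.arange(N)
    t[:, 0] = np.arange(N)
    t[1:, 1:] = table_core
    return t

def violations(table_core):
    t = complete(table_core).tolist()          # plain lists are faster to index here
    count = 0
    for a in range(N):
        for b in range(N):
            for c in range(N):
                if t[t[a][b]][c] != t[a][t[b][c]]:
                    count += 1
    return count

def evolve(pop_size=60, generations=200, mutation_rate=0.15):
    pop = rng.integers(0, N, size=(pop_size, N - 1, N - 1))
    for _ in range(generations):
        fit = np.array([violations(ind) for ind in pop])
        if fit.min() == 0:
            break
        parents = pop[np.argsort(fit)[: pop_size // 2]]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            mask = rng.random((N - 1, N - 1)) < 0.5           # uniform crossover
            child = np.where(mask, p1, p2)
            mut = rng.random((N - 1, N - 1)) < mutation_rate  # point mutation
            child[mut] = rng.integers(0, N, size=mut.sum())
            children.append(child)
        pop = np.concatenate([parents, np.array(children)])
    fit = np.array([violations(ind) for ind in pop])
    best = pop[np.argmin(fit)]
    return complete(best), int(fit.min())

table, remaining = evolve()
print(f"best table ({remaining} associativity violations):\n{table}")
```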

  6. Articulating Identities and Analyzing Belonging: A Multistep Intervention That Affirms and Informs a Diversity of Students

    ERIC Educational Resources Information Center

    Cook-Sather, Alison; Des-Ogugua, Crystal; Bahti, Melanie

    2018-01-01

    This article describes a multistep intervention developed for an undergraduate course called 'Advocating Diversity in Higher Education.' The goal of the intervention was to affirm diversity and foster a sense of inclusion among students within and beyond the course. We contextualize the intervention in student protests during 2015 and 2016…

  7. Low-loss ultracompact optical power splitter using a multistep structure.

    PubMed

    Huang, Zhe; Chan, Hau Ping; Afsar Uddin, Mohammad

    2010-04-01

    We propose a low-loss ultracompact optical power splitter for broadband passive optical network applications. The design is based on a multistep structure involving a two-material (core/cladding) system. The performance of the proposed device was evaluated through the three-dimensional finite-difference beam propagation method. By using the proposed design, an excess loss of 0.4 dB was achieved at a full branching angle of 24 degrees. The wavelength-dependent loss was found to be less than 0.3 dB, and the polarization-dependent loss was less than 0.05 dB from O to L bands. The device offers the potential of being mass-produced using low-cost polymer-based embossing techniques.

  8. Multi-Step Deep Reactive Ion Etching Fabrication Process for Silicon-Based Terahertz Components

    NASA Technical Reports Server (NTRS)

    Reck, Theodore (Inventor); Perez, Jose Vicente Siles (Inventor); Lee, Choonsup (Inventor); Cooper, Ken B. (Inventor); Jung-Kubiak, Cecile (Inventor); Mehdi, Imran (Inventor); Chattopadhyay, Goutam (Inventor); Lin, Robert H. (Inventor); Peralta, Alejandro (Inventor)

    2016-01-01

    A multi-step silicon etching process has been developed to fabricate silicon-based terahertz (THz) waveguide components. This technique provides precise dimensional control across multiple etch depths with batch processing capabilities. Nonlinear and passive components such as mixers and multipliers, waveguides, hybrids, OMTs and twists have been fabricated and integrated into a small silicon package. This fabrication technique enables a wafer-stacking architecture to provide ultra-compact multi-pixel receiver front-ends in the THz range.

  9. Multistep translation and cultural adaptation of the Penn acoustic neuroma quality-of-life scale for German-speaking patients.

    PubMed

    Kristin, Julia; Glaas, Marcel Fabian; Stenin, Igor; Albrecht, Angelika; Klenzner, Thomas; Schipper, Jörg; Eysel-Gosepath, Katrin

    2017-11-01

    Monitoring the health-related quality of life (HRQOL) for patients with vestibular schwannoma (VS) has garnered increasing interest. In German-speaking countries, there is no disease-specific questionnaire available similar to the "Penn Acoustic Neuroma Quality-of-life Scale" (PANQOL). We translated the PANQOL for German-speaking patients based on a multistep protocol that included not only a forward-backward translation but also linguistic and sociocultural adaptations. The process consists of translation, synthesis, back translation, review by an expert committee, administration of the prefinal version to our patients, submission and appraisal of all written documents by our research team. The required multidisciplinary team for translation comprised head and neck surgeons, language professionals (German and English), a professional translator, and bilingual participants. A total of 123 patients with VS underwent microsurgical procedures via different approaches at our clinic between January 2007 and January 2017. Among these, 72 patients who underwent the translabyrinthine approach participated in the testing of the German-translated PANQOL. The first German version of the PANQOL questionnaire was created by a multistep translation process. The responses indicate that the questionnaire is simple to administer and applicable to our patients. The use of a multistep process to translate quality-of-life questionnaires is complex and time-consuming. However, this process was performed properly and resulted in a version of the PANQOL for assessing the quality of life of German-speaking patients with VS.

  10. Genetic control of Drosophila nerve cord development

    NASA Technical Reports Server (NTRS)

    Skeath, James B.; Thor, Stefan

    2003-01-01

    The Drosophila ventral nerve cord has been a central model system for studying the molecular genetic mechanisms that control CNS development. Studies show that the generation of neural diversity is a multistep process initiated by the patterning and segmentation of the neuroectoderm. These events act together with the process of lateral inhibition to generate precursor cells (neuroblasts) with specific identities, distinguished by the expression of unique combinations of regulatory genes. The expression of these genes in a given neuroblast restricts the fate of its progeny, by activating specific combinations of downstream genes. These genes in turn specify the identity of any given postmitotic cell, which is evident by its cellular morphology and choice of neurotransmitter.

  11. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.

  12. Guest Programmable Multistep Spin Crossover in a Porous 2-D Hofmann-Type Material.

    PubMed

    Murphy, Michael J; Zenere, Katrina A; Ragon, Florence; Southon, Peter D; Kepert, Cameron J; Neville, Suzanne M

    2017-01-25

    The spin crossover (SCO) phenomenon defines an elegant class of switchable materials that can show cooperative transitions when long-range elastic interactions are present. Such materials can show multistepped transitions, targeted both fundamentally and for expanded data storage applications, when antagonistic interactions (i.e., competing ferro- and antiferro-elastic interactions) drive concerted lattice distortions. To this end, a new SCO framework scaffold, [Fe(II)(bztrz)2(Pd(II)(CN)4)]·n(guest) (bztrz = (E)-1-phenyl-N-(1,2,4-triazol-4-yl)methanimine, 1·n(guest)), has been prepared that supports a variety of antagonistic solid-state interactions alongside a distinct dual-guest pore system. In this 2-D Hofmann-type material we find that inbuilt competition between ferro- and antiferro-elastic interactions produces SCO behavior that is intrinsically frustrated. This frustration is harnessed by guest exchange to yield a very broad array of spin transition characters in the one framework lattice: one-step (1·(H2O,EtOH)), two-step (1·3H2O) and three-step (1·~2H2O) transitions and SCO deactivation (1). This variety of behaviors illustrates that the degree of elastic frustration can be manipulated by molecular guests, which suggests that the structural features that contribute to multistep switching may be more subtle than previously anticipated.

  13. Multi-Step Lithiation of Tin Sulfide: An Investigation Using In Situ Electron Microscopy

    DOE PAGES

    Hwang, Sooyeon; Yao, Zhenpeng; Zhang, Lei; ...

    2018-04-03

    Two-dimensional metal sulfides have been widely explored as promising electrodes for lithium ion batteries since their two-dimensional layered structure allows lithium ions to intercalate between layers. For tin disulfide, the lithiation process proceeds via a sequence of three different types of reactions: intercalation, conversion, and alloying, but the full scenario of reaction dynamics remains nebulous. In this paper, we investigate the dynamical process of the multi-step reactions using in situ electron microscopy and discover an intermediate rock-salt phase with disordering of Li and Sn cations after the initial 2-dimensional intercalation. The disordered cations occupy all the octahedral sites and block the channels for intercalation, which alters the reaction pathways during further lithiation. Our first principles calculations of the non-equilibrium lithiation of SnS2 corroborate the energetic preference of the disordered rock-salt structure over known layered polymorphs. The in situ observations and calculations suggest a two-phase reaction nature for the intercalation, disordering, and subsequent conversion reactions. In addition, in situ de-lithiation observations confirm that the alloying reaction is reversible while the conversion reaction is not, which is consistent with the ex situ analysis. This work reveals the full lithiation characteristics of SnS2 and sheds light on the understanding of complex multistep reactions in two-dimensional materials.

  14. Multi-Step Lithiation of Tin Sulfide: An Investigation Using In Situ Electron Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, Sooyeon; Yao, Zhenpeng; Zhang, Lei

    Two-dimensional metal sulfides have been widely explored as promising electrodes for lithium ion batteries since their two-dimensional layered structure allows lithium ions to intercalate between layers. For tin disulfide, the lithiation process proceeds via a sequence of three different types of reactions: intercalation, conversion, and alloying, but the full scenario of reaction dynamics remains nebulous. In this paper, we investigate the dynamical process of the multi-step reactions using in situ electron microscopy and discover an intermediate rock-salt phase with disordering of Li and Sn cations after the initial 2-dimensional intercalation. The disordered cations occupy all the octahedral sites and block the channels for intercalation, which alters the reaction pathways during further lithiation. Our first principles calculations of the non-equilibrium lithiation of SnS2 corroborate the energetic preference of the disordered rock-salt structure over known layered polymorphs. The in situ observations and calculations suggest a two-phase reaction nature for the intercalation, disordering, and subsequent conversion reactions. In addition, in situ de-lithiation observations confirm that the alloying reaction is reversible while the conversion reaction is not, which is consistent with the ex situ analysis. This work reveals the full lithiation characteristics of SnS2 and sheds light on the understanding of complex multistep reactions in two-dimensional materials.

  15. A Droplet Microfluidic Platform for Automating Genetic Engineering.

    PubMed

    Gach, Philip C; Shih, Steve C C; Sustarich, Jess; Keasling, Jay D; Hillson, Nathan J; Adams, Paul D; Singh, Anup K

    2016-05-20

    We present a water-in-oil droplet microfluidic platform for transformation, culture and expression of recombinant proteins in multiple host organisms including bacteria, yeast and fungi. The platform consists of a hybrid digital microfluidic/channel-based droplet chip with integrated temperature control to allow complete automation and integration of plasmid addition, heat-shock transformation, addition of selection medium, culture, and protein expression. The microfluidic format permitted significant reduction in consumption (100-fold) of expensive reagents such as DNA and enzymes compared to the benchtop method. The chip contains a channel to continuously replenish oil to the culture chamber to provide a fresh supply of oxygen to the cells for long-term (∼5 days) cell culture. The flow channel also replenished oil lost to evaporation and increased the number of droplets that could be processed and cultured. The platform was validated by transforming several plasmids into Escherichia coli including plasmids containing genes for fluorescent proteins GFP, BFP and RFP; plasmids with selectable markers for ampicillin or kanamycin resistance; and a Golden Gate DNA assembly reaction. We also demonstrate the applicability of this platform for transformation in widely used eukaryotic organisms such as Saccharomyces cerevisiae and Aspergillus niger. Duration and temperatures of the microfluidic heat-shock procedures were optimized to yield transformation efficiencies comparable to those obtained by benchtop methods with a throughput up to 6 droplets/min. The proposed platform offers potential for automation of molecular biology experiments significantly reducing cost, time and variability while improving throughput.
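
    The automated sequence described above can be pictured as an ordered list of timed, temperature-controlled droplet operations. The sketch below is illustrative only: the step names follow the abstract, but the durations and temperatures are hypothetical placeholders, not the optimized values reported in the paper.

```python
# Illustrative only: the droplet-chip workflow represented as an ordered list of
# timed, temperature-controlled steps. Durations and temperatures are placeholders.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    temperature_c: float
    duration_s: float

protocol = [
    Step("merge plasmid droplet with cell droplet", 4, 60),
    Step("heat-shock transformation", 42, 45),
    Step("recovery", 37, 3600),
    Step("add selection medium", 37, 60),
    Step("culture / protein expression", 30, 5 * 24 * 3600),
]

def run(protocol):
    """Pretty-print the schedule a droplet controller would execute."""
    t = 0.0
    for step in protocol:
        print(f"t={t / 3600:8.2f} h  {step.name:40s} {step.temperature_c:5.1f} °C")
        t += step.duration_s

run(protocol)
```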

  16. [Automated parturition control in primi- and multiparous cows of a Simmental and Holstein crossbred herd].

    PubMed

    Dippon, Matthias; Petzl, Wolfram; Lange, Dorothee; Zerbe, Holm

    2017-02-09

    Perinatal calf mortality is a current problem in dairy farming with regards to ethics and economic losses. Optimizing calving management by frequent monitoring helps increase the survival rate. The objective of this study was to evaluate the breed- and parity-dependent applicability of a recently introduced automated parturition control system with regard to its reliability in the field. Seven days prior to the calculated calving date, the automated parturition control system was applied intravaginally in 23 primiparous and 31 multiparous cows in a Holstein-Friesian (HF) and Simmental (FV) crossbred herd. In the case of three consecutive false alarms, the animal was removed from the study and rated as false positive (FP). The statistical significance of the interdependence between FP alarms and the genetic proportion of HF was calculated using the Mann-Whitney U test. The automated parturition control system could successfully be applied in all animals with a genetic HF proportion > 66%. Animals with a predominant FV proportion (> 66%) frequently showed FP alarms (31.6%). Furthermore, multiparous cows lost the intravaginal sender more frequently than primiparous cows (29.0% vs. 8.7%). In 72.2% of the heavily pregnant cows, purulent vaginal discharge was observed. The automated parturition control system can successfully be applied in HF cows. Due to frequent losses of the intravaginal sender, we cannot recommend its use in cows with a genetic FV proportion > 66%. Future developments of intravaginal automated parturition control systems should incorporate the influence of different breeds on their applicability.
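
    A hedged re-creation of the statistical comparison named above (false-positive alarms versus genetic HF proportion, Mann-Whitney U test) is sketched below with invented placeholder counts rather than the study data.

```python
# False-positive alarm counts compared between cows grouped by genetic proportion
# using the Mann-Whitney U test. The counts below are made-up placeholders.
from scipy.stats import mannwhitneyu

fp_alarms_hf_dominant = [0, 0, 1, 0, 2, 0, 1]   # hypothetical cows with HF > 66%
fp_alarms_fv_dominant = [2, 3, 1, 4, 2, 3]      # hypothetical cows with FV > 66%

stat, p_value = mannwhitneyu(fp_alarms_hf_dominant, fp_alarms_fv_dominant,
                             alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```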

  17. Model of multistep electron transfer in a single-mode polar medium

    NASA Astrophysics Data System (ADS)

    Feskov, S. V.; Yudanov, V. V.

    2017-09-01

    A mathematical model of multistep photoinduced electron transfer (PET) in a polar medium with a single relaxation time (Debye solvent) is developed. The model includes the nonequilibrium polarization formed in the vicinity of the donor-acceptor molecular system at the initial steps of the photoreaction and its influence on the subsequent steps of PET. It is established that the results from numerical simulation of transient luminescence spectra of photoexcited donor-acceptor complexes (DAC) conform to calculated data obtained on the basis of the familiar experimental technique used to measure the relaxation function of solvent polarization in the vicinity of the DAC in the picosecond and subpicosecond ranges.
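
    For reference, the standard single-exponential relaxation of the solvation coordinate in a Debye solvent, which underlies the comparison with measured polarization relaxation, can be written as follows; this is a textbook background relation, not an equation taken from the paper.

```latex
% Background relation (not from the paper): in a Debye solvent the solvation
% (polarization) coordinate relaxes single-exponentially with the longitudinal
% relaxation time \tau_L, obtained from the Debye relaxation time \tau_D.
\[
  S(t) \;=\; \frac{\Delta E(t) - \Delta E(\infty)}{\Delta E(0) - \Delta E(\infty)}
        \;=\; \exp\!\left(-t/\tau_L\right),
  \qquad
  \tau_L \;=\; \frac{\varepsilon_\infty}{\varepsilon_s}\,\tau_D .
\]
```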

  18. Automated production of [18F]FTHA according to GMP.

    PubMed

    Savisto, Nina; Viljanen, Tapio; Kokkomäki, Esa; Bergman, Jörgen; Solin, Olof

    2018-02-01

    14-(R,S)-[18F]fluoro-6-thia-heptadecanoic acid is a tracer for fatty acid imaging by positron emission tomography. High demand for this tracer required us to replace semiautomatic synthesis with a fully automated procedure. An automated synthesis device was constructed in-house for multistep nucleophilic 18F-fluorination and a control system was developed. The synthesis device was combined with a sterile filtration unit and both were qualified. 14-(R,S)-[18F]fluoro-6-thia-heptadecanoic acid was produced according to good manufacturing practice guidelines set by the European Union. The synthesis includes an initial nucleophilic labelling reaction, deprotection, preparative HPLC separation, purification of the final product, and formulation for injection. The duration and temperature of the reaction and hydrolysis were optimized, and the radiochemical stability of the formulated product was determined. The rotary evaporator used to evaporate the solvent after HPLC purification was replaced with solid phase extraction purification. We also replaced the human serum albumin used in the earlier procedure with a phosphate buffer-ascorbic acid mixture in the final formulation solution. From 2011 to 2016, we performed 219 synthesis procedures, 94% of which were successful. The radiochemical yield of 14-(R,S)-[18F]fluoro-6-thia-heptadecanoic acid, decay-corrected to the end of bombardment, was 13% ± 6.3%. The total amount of formulated end product was 1.7 ± 0.8 GBq at end of synthesis. Copyright © 2017 John Wiley & Sons, Ltd.
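
    A short worked example of what a radiochemical yield "decay-corrected to the end of bombardment" means for an 18F tracer is given below; the activities and synthesis time are illustrative numbers, not the values reported above, and only the 18F half-life is a physical constant.

```python
# Decay-corrected radiochemical yield for an 18F tracer. Activities and synthesis
# time are illustrative placeholders; the 18F half-life (~109.8 min) is a constant.
import math

T_HALF_F18_MIN = 109.8          # half-life of fluorine-18 in minutes
LAMBDA = math.log(2) / T_HALF_F18_MIN

start_activity_gbq = 50.0       # hypothetical [18F]fluoride activity at end of bombardment (EOB)
product_activity_gbq = 4.2      # hypothetical formulated product at end of synthesis (EOS)
synthesis_time_min = 62.0       # duration from EOB to EOS

# Correct the product activity back to end of bombardment, then take the ratio.
decay_corrected_product = product_activity_gbq * math.exp(LAMBDA * synthesis_time_min)
rcy_decay_corrected = decay_corrected_product / start_activity_gbq
print(f"decay-corrected radiochemical yield = {rcy_decay_corrected:.1%}")
```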

  19. A Multistep Organocatalysis Experiment for the Undergraduate Organic Laboratory: An Enantioselective Aldol Reaction Catalyzed by Methyl Prolinamide

    ERIC Educational Resources Information Center

    Wade, Edmir O.; Walsh, Kenneth E.

    2011-01-01

    In recent years, there has been an explosion of research concerning the area of organocatalysis. A multistep capstone laboratory project that combines traditional reactions frequently found in organic laboratory curriculums with this new field of research is described. In this experiment, the students synthesize a prolinamide-based organocatalyst…

  20. Synthesis of Two Local Anesthetics from Toluene: An Organic Multistep Synthesis in a Project-Oriented Laboratory Course

    ERIC Educational Resources Information Center

    Demare, Patricia; Regla, Ignacio

    2012-01-01

    This article describes one of the projects in the advanced undergraduate organic chemistry laboratory course concerning the synthesis of two local anesthetic drugs, prilocaine and benzocaine, with a common three-step sequence starting from toluene. Students undertake, in a several-week independent project, the multistep synthesis of a…

  1. Self-Regulated Strategy Development Instruction for Teaching Multi-Step Equations to Middle School Students Struggling in Math

    ERIC Educational Resources Information Center

    Cuenca-Carlino, Yojanna; Freeman-Green, Shaqwana; Stephenson, Grant W.; Hauth, Clara

    2016-01-01

    Six middle school students identified as having a specific learning disability or at risk for mathematical difficulties were taught how to solve multi-step equations by using the self-regulated strategy development (SRSD) model of instruction. A multiple-probe-across-pairs design was used to evaluate instructional effects. Instruction was provided…

  2. Synthesis of 10-Ethyl Flavin: A Multistep Synthesis Organic Chemistry Laboratory Experiment for Upper-Division Undergraduate Students

    ERIC Educational Resources Information Center

    Sichula, Vincent A.

    2015-01-01

    A multistep synthesis of 10-ethyl flavin was developed as an organic chemistry laboratory experiment for upper-division undergraduate students. Students synthesize 10-ethyl flavin as a bright yellow solid via a five-step sequence. The experiment introduces students to various hands-on experimental organic synthetic techniques, such as column…

  3. Propagators for the Time-Dependent Kohn-Sham Equations: Multistep, Runge-Kutta, Exponential Runge-Kutta, and Commutator Free Magnus Methods.

    PubMed

    Gómez Pueyo, Adrián; Marques, Miguel A L; Rubio, Angel; Castro, Alberto

    2018-05-09

    We examine various integration schemes for the time-dependent Kohn-Sham equations. Unlike the time-dependent Schrödinger equation, this set of equations is nonlinear, owing to the dependence of the Hamiltonian on the electronic density. We discuss some of their exact properties, and in particular their symplectic structure. Four different families of propagators are considered, specifically the linear multistep, Runge-Kutta, exponential Runge-Kutta, and commutator-free Magnus schemes. These have been chosen because they have been largely ignored in the past for time-dependent electronic structure calculations. The performance is analyzed in terms of cost versus accuracy. The clear winner, in terms of robustness, simplicity, and efficiency, is a simplified version of a fourth-order commutator-free Magnus integrator. However, in some specific cases, other propagators, such as some implicit versions of the multistep methods, may be useful.
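
    As a generic illustration of the exponential-type propagators compared above, the sketch below implements the second-order exponential midpoint rule (a one-exponential Magnus step) for a linear time-dependent Hamiltonian; the fourth-order commutator-free Magnus scheme favoured in the paper composes two such exponentials per step with Gauss-node Hamiltonians. This is a generic sketch, not the authors' TDKS implementation.

```python
# Exponential-midpoint (second-order Magnus) propagator for a generic linear
# problem i dpsi/dt = H(t) psi. A simplified stand-in for the commutator-free
# Magnus family discussed above.
import numpy as np
from scipy.linalg import expm

def propagate(psi0, hamiltonian, t0, t1, n_steps):
    """hamiltonian(t) must return a Hermitian matrix; psi0 is the initial state."""
    dt = (t1 - t0) / n_steps
    psi = psi0.astype(complex).copy()
    for k in range(n_steps):
        t_mid = t0 + (k + 0.5) * dt              # midpoint evaluation of H(t)
        psi = expm(-1j * dt * hamiltonian(t_mid)) @ psi
    return psi

# Toy usage: a driven two-level system.
def h(t):
    return np.array([[0.0, 0.3 * np.cos(t)], [0.3 * np.cos(t), 1.0]])

psi = propagate(np.array([1.0, 0.0]), h, 0.0, 10.0, 200)
print("norm preserved:", np.round(np.vdot(psi, psi).real, 6))
```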

  4. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  5. Synthesis of Frontalin, the Aggregation Pheromone of the Southern Pine Beetle: A Multistep Organic Synthesis for Undergraduate Students.

    ERIC Educational Resources Information Center

    Bartlett, Paul A.; And Others

    1984-01-01

    Background information and experimental procedures are provided for the multistep synthesis of frontalin. The experiment exposes students to a range of practical laboratory problems and important synthetic reactions and provides experiences in working on a medium-size, as well as a relatively small-size scale. (JN)

  6. Automation of route identification and optimisation based on data-mining and chemical intuition.

    PubMed

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
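
    The route-identification idea above can be illustrated with a toy reaction network in which compounds are nodes, reactions are edges, and candidate routes are ranked by a simple overall-yield score; the intermediates and yields below are invented, not results from the Reaxys mining.

```python
# Toy route identification on a reaction network: enumerate simple paths from
# feedstock to target and rank them by the product of assumed step yields.
import networkx as nx

g = nx.DiGraph()
reactions = [  # (substrate, product, assumed yield) - all values invented
    ("limonene", "intermediate_A", 0.80),
    ("intermediate_A", "intermediate_B", 0.65),
    ("intermediate_B", "paracetamol", 0.70),
    ("limonene", "intermediate_C", 0.55),
    ("intermediate_C", "paracetamol", 0.60),
]
for substrate, product, y in reactions:
    g.add_edge(substrate, product, yield_=y)

best_route, best_yield = None, 0.0
for path in nx.all_simple_paths(g, "limonene", "paracetamol"):
    overall = 1.0
    for u, v in zip(path, path[1:]):
        overall *= g[u][v]["yield_"]
    if overall > best_yield:
        best_route, best_yield = path, overall

print("best route:", " -> ".join(best_route), f"(overall yield {best_yield:.0%})")
```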

  7. Genetic Design Automation: engineering fantasy or scientific renewal?

    PubMed Central

    Lux, Matthew W.; Bramlett, Brian W.; Ball, David A.; Peccoud, Jean

    2013-01-01

    Synthetic biology aims to make genetic systems more amenable to engineering, which has naturally led to the development of Computer-Aided Design (CAD) tools. Experimentalists still primarily rely on project-specific ad-hoc workflows instead of domain-specific tools, suggesting that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. PMID:22001068

  8. Purification of crude glycerol from transesterification reaction of palm oil using direct method and multistep method

    NASA Astrophysics Data System (ADS)

    Nasir, N. F.; Mirus, M. F.; Ismail, M.

    2017-09-01

    Crude glycerol produced from the transesterification reaction has limited usage if it does not undergo a purification process, as it contains excess methanol, catalyst and soap. Conventionally, purification of crude glycerol involves high cost and complex processes. This study aimed to determine the effects of two different purification methods: the direct method (comprising ion exchange and methanol removal steps) and the multistep method (comprising neutralization, filtration, ion exchange and methanol removal steps). Two crude glycerol samples were investigated: a self-produced sample from the transesterification of palm oil and a sample obtained from a biodiesel plant. Samples were analysed using Fourier Transform Infrared Spectroscopy, Gas Chromatography and High Performance Liquid Chromatography. The results for both samples after purification showed that pure glycerol was successfully produced and fatty acid salts were eliminated. The results also indicated the absence of methanol in both samples after the purification process. In short, the combination of four purification steps yielded higher-quality glycerol. The multistep purification method gave a better result than the direct method, as the neutralization and filtration steps helped remove most of the excess salt, fatty acid and catalyst.

  9. Genetic design automation: engineering fantasy or scientific renewal?

    PubMed

    Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean

    2012-02-01

    The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Genetic variation of growth dynamics in maize (Zea mays L.) revealed through automated non-invasive phenotyping.

    PubMed

    Muraya, Moses M; Chu, Jianting; Zhao, Yusheng; Junker, Astrid; Klukas, Christian; Reif, Jochen C; Altmann, Thomas

    2017-01-01

    Hitherto, most quantitative trait loci of maize growth and biomass yield have been identified for a single time point, usually the final harvest stage. Through this approach cumulative effects are detected, without considering genetic factors causing phase-specific differences in growth rates. To assess the genetics of growth dynamics, we employed automated non-invasive phenotyping to monitor the plant sizes of 252 diverse maize inbred lines at 11 different developmental time points; 50 k SNP array genotype data were used for genome-wide association mapping and genomic selection. The heritability of biomass was estimated to be over 71%, and the average prediction accuracy amounted to 0.39. Using the individual time point data, 12 main effect marker-trait associations (MTAs) and six pairs of epistatic interactions were detected that displayed different patterns of expression at various developmental time points. A subset of them also showed significant effects on relative growth rates in different intervals. The detected MTAs jointly explained up to 12% of the total phenotypic variation, decreasing with developmental progression. Using non-parametric functional mapping and multivariate mapping approaches, four additional marker loci affecting growth dynamics were detected. Our results demonstrate that plant biomass accumulation is a complex trait governed by many small effect loci, most of which act at certain restricted developmental phases. This highlights the need for investigation of stage-specific growth affecting genes to elucidate important processes operating at different developmental phases. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.
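
    A minimal sketch of a single-marker association scan of the kind used to detect marker-trait associations is shown below; it uses randomly generated placeholder data, codes genotypes as 0/1/2 allele dosages, and omits the kinship and population-structure corrections a real maize GWAS would include.

```python
# Single-marker association scan sketch: regress a biomass phenotype on each SNP
# dosage and flag markers passing a Bonferroni threshold. Data are random placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_lines, n_snps = 252, 1000
genotypes = rng.integers(0, 3, size=(n_lines, n_snps))       # 0/1/2 allele dosage
biomass = rng.normal(size=n_lines) + 0.4 * genotypes[:, 10]   # plant one synthetic effect

p_values = np.empty(n_snps)
for j in range(n_snps):
    slope, intercept, r, p, se = stats.linregress(genotypes[:, j], biomass)
    p_values[j] = p

bonferroni = 0.05 / n_snps
hits = np.where(p_values < bonferroni)[0]
print("markers passing the Bonferroni threshold:", hits)
```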

  11. Automated production at the curie level of no-carrier-added 6-[(18)F]fluoro-L-dopa and 2-[(18)F]fluoro-L-tyrosine on a FASTlab synthesizer.

    PubMed

    Lemaire, C; Libert, L; Franci, X; Genon, J-L; Kuci, S; Giacomelli, F; Luxen, A

    2015-06-15

    An efficient, fully automated, enantioselective multi-step synthesis of no-carrier-added (nca) 6-[(18)F]fluoro-L-dopa ([(18)F]FDOPA) and 2-[(18)F]fluoro-L-tyrosine ([(18)F]FTYR) on a GE FASTlab synthesizer in conjunction with an additional high-performance liquid chromatography (HPLC) purification has been developed. A phase-transfer catalyst (PTC) strategy was used to synthesize these two important radiopharmaceuticals. Building on recent chemistry improvements, automation of the whole process was implemented in a commercially available GE FASTlab module, with slight hardware modification, using single-use cassettes and a stand-alone HPLC. [(18)F]FDOPA and [(18)F]FTYR were produced in 36.3 ± 3.0% (n = 8) and 50.5 ± 2.7% (n = 10) FASTlab radiochemical yield (decay corrected). The automated radiosynthesis on the FASTlab module requires about 52 min. The total synthesis time including HPLC purification and formulation was about 62 min. Enantiomeric excesses for these two aromatic amino acids were always >95%, and the specific activity was >740 GBq/µmol. This automated synthesis provides high amounts of [(18)F]FDOPA and [(18)F]FTYR (>37 GBq at end of synthesis (EOS)). The process, fully adaptable for reliable production across multiple PET sites, could be readily implemented into a clinical good manufacturing practice (GMP) environment. Copyright © 2015 John Wiley & Sons, Ltd.

  12. Controlled multistep synthesis in a three-phase droplet reactor

    PubMed Central

    Nightingale, Adrian M.; Phillips, Thomas W.; Bannock, James H.; de Mello, John C.

    2014-01-01

    Channel-fouling is a pervasive problem in continuous flow chemistry, causing poor product control and reactor failure. Droplet chemistry, in which the reaction mixture flows as discrete droplets inside an immiscible carrier liquid, prevents fouling by isolating the reaction from the channel walls. Unfortunately, the difficulty of controllably adding new reagents to an existing droplet stream has largely restricted droplet chemistry to simple reactions in which all reagents are supplied at the time of droplet formation. Here we describe an effective method for repeatedly adding controlled quantities of reagents to droplets. The reagents are injected into a multiphase fluid stream, comprising the carrier liquid, droplets of the reaction mixture and an inert gas that maintains a uniform droplet spacing and suppresses new droplet formation. The method, which is suited to many multistep reactions, is applied to a five-stage quantum dot synthesis wherein particle growth is sustained by repeatedly adding fresh feedstock. PMID:24797034

  13. Automated Image Analysis of HER2 Fluorescence In Situ Hybridization to Refine Definitions of Genetic Heterogeneity in Breast Cancer Tissue

    PubMed Central

    Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard

    2017-01-01

    Human epidermal growth factor receptor 2 gene- (HER2-) targeted therapy for breast cancer relies primarily on HER2 overexpression established by immunohistochemistry (IHC) with borderline cases being further tested for amplification by fluorescence in situ hybridization (FISH). Manual interpretation of HER2 FISH is based on a limited number of cells and rather complex definitions of equivocal, polysomic, and genetically heterogeneous (GH) cases. Image analysis (IA) can extract high-capacity data and potentially improve HER2 testing in borderline cases. We investigated statistically derived indicators of HER2 heterogeneity in HER2 FISH data obtained by automated IA of 50 IHC borderline (2+) cases of invasive ductal breast carcinoma. Overall, IA significantly underestimated the conventional HER2, CEP17 counts, and HER2/CEP17 ratio; however, it collected more amplified cells in some cases below the lower limit of GH definition by manual procedure. Indicators for amplification, polysomy, and bimodality were extracted by factor analysis and allowed clustering of the tumors into amplified, nonamplified, and equivocal/polysomy categories. The bimodality indicator provided independent cell diversity characteristics for all clusters. Tumors classified as bimodal only partially coincided with the conventional GH heterogeneity category. We conclude that automated high-capacity nonselective tumor cell assay can generate evidence-based HER2 intratumor heterogeneity indicators to refine GH definitions. PMID:28752092

  14. Automated Image Analysis of HER2 Fluorescence In Situ Hybridization to Refine Definitions of Genetic Heterogeneity in Breast Cancer Tissue.

    PubMed

    Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard; Laurinavicius, Arvydas

    2017-01-01

    Human epidermal growth factor receptor 2 gene- (HER2-) targeted therapy for breast cancer relies primarily on HER2 overexpression established by immunohistochemistry (IHC) with borderline cases being further tested for amplification by fluorescence in situ hybridization (FISH). Manual interpretation of HER2 FISH is based on a limited number of cells and rather complex definitions of equivocal, polysomic, and genetically heterogeneous (GH) cases. Image analysis (IA) can extract high-capacity data and potentially improve HER2 testing in borderline cases. We investigated statistically derived indicators of HER2 heterogeneity in HER2 FISH data obtained by automated IA of 50 IHC borderline (2+) cases of invasive ductal breast carcinoma. Overall, IA significantly underestimated the conventional HER2, CEP17 counts, and HER2/CEP17 ratio; however, it collected more amplified cells in some cases below the lower limit of GH definition by manual procedure. Indicators for amplification, polysomy, and bimodality were extracted by factor analysis and allowed clustering of the tumors into amplified, nonamplified, and equivocal/polysomy categories. The bimodality indicator provided independent cell diversity characteristics for all clusters. Tumors classified as bimodal only partially coincided with the conventional GH heterogeneity category. We conclude that automated high-capacity nonselective tumor cell assay can generate evidence-based HER2 intratumor heterogeneity indicators to refine GH definitions.
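
    A simplified per-cell summary of the kind of data such an image-analysis pipeline produces is sketched below; the signal counts are invented, the 2.0 ratio cut-off follows common HER2 testing guidance, and the per-cell "amplified" criterion is a simplification rather than the paper's GH definition.

```python
# Simplified per-cell HER2 FISH summary: mean HER2 and CEP17 signals, their ratio,
# and the fraction of cells whose individual ratio exceeds 2.0. Counts are invented.
import numpy as np

her2 = np.array([2, 3, 2, 6, 8, 2, 3, 10, 2, 4])    # hypothetical signals per nucleus
cep17 = np.array([2, 2, 2, 2, 2, 3, 2, 2, 2, 2])

ratio_overall = her2.mean() / cep17.mean()
per_cell_ratio = her2 / cep17
fraction_amplified_cells = np.mean(per_cell_ratio > 2.0)

print(f"mean HER2/cell = {her2.mean():.2f}, mean CEP17/cell = {cep17.mean():.2f}")
print(f"overall HER2/CEP17 ratio = {ratio_overall:.2f}")
print(f"fraction of cells with ratio > 2.0 = {fraction_amplified_cells:.0%}")
```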

  15. Microarc oxidation coating covered Ti implants with micro-scale gouges formed by a multi-step treatment for improving osseointegration.

    PubMed

    Bai, Yixin; Zhou, Rui; Cao, Jianyun; Wei, Daqing; Du, Qing; Li, Baoqiang; Wang, Yaming; Jia, Dechang; Zhou, Yu

    2017-07-01

    A sub-microporous microarc oxidation (MAO) coating-covered Ti implant with micro-scale gouges has been fabricated via a multi-step MAO process to overcome compromised bone-implant integration. The as-prepared implant was further modified by post-heat treatment to compare the effects of the -OH functional group and the nano-scale orange peel-like morphology on osseointegration. The bone regeneration, bone-implant contact interface, and biomechanical push-out force of the modified Ti implant are discussed thoroughly in this work. The greatly improved push-out force for the MAO-coated Ti implants with micro-scale gouges can be attributed to the excellent mechanical interlocking between the implants and biologically meshed bone tissue. Owing to the -OH functional group, which promotes synostosis between the biologically meshed bone and the gouge surface of the implant, the multi-step MAO process could be an effective strategy to improve the osseointegration of Ti implants. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Long-term memory-based control of attention in multi-step tasks requires working memory: evidence from domain-specific interference

    PubMed Central

    Foerster, Rebecca M.; Carbone, Elena; Schneider, Werner X.

    2014-01-01

    Evidence for long-term memory (LTM)-based control of attention has been found during the execution of highly practiced multi-step tasks. However, does LTM directly control attention or are working memory (WM) processes involved? In the present study, this question was investigated with a dual-task paradigm. Participants executed either a highly practiced visuospatial sensorimotor task (speed stacking) or a verbal task (high-speed poem reciting), while maintaining visuospatial or verbal information in WM. Results revealed unidirectional and domain-specific interference. Neither speed stacking nor high-speed poem reciting was influenced by WM retention. Stacking disrupted the retention of visuospatial locations, but did not modify memory performance for verbal material (letters). Reciting reduced the retention of verbal material substantially, whereas it affected memory performance for visuospatial locations to a smaller degree. We suggest that the selection of task-relevant information from LTM for the execution of overlearned multi-step tasks recruits domain-specific WM. PMID:24847304

  17. Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction

    PubMed Central

    Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-01-01

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be

  18. Automated, quantitative cognitive/behavioral screening of mice: for genetics, pharmacology, animal cognition and undergraduate instruction.

    PubMed

    Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-02-26

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be
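
    As a rough illustration of the kind of analysis such a system automates, the sketch below harvests time-stamped event records and summarizes daily event counts per mouse; the record format is hypothetical, not the system's actual MATLAB-based data structure.

```python
# Harvest time-stamped behavioral event records and summarize daily counts per mouse.
# The (mouse id, unix time, event code) schema is a hypothetical stand-in.
from collections import defaultdict
from datetime import datetime, timezone

records = [  # (mouse_id, unix_time, event)
    ("m01", 1_700_000_000, "hopper1_head_entry"),
    ("m01", 1_700_003_600, "pellet_delivered"),
    ("m02", 1_700_007_200, "hopper2_head_entry"),
]

daily_counts = defaultdict(int)
for mouse, ts, event in records:
    day = datetime.fromtimestamp(ts, tz=timezone.utc).date()
    daily_counts[(mouse, day, event)] += 1

for (mouse, day, event), n in sorted(daily_counts.items()):
    print(f"{mouse}  {day}  {event}: {n}")
```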

  19. Error analysis on squareness of multi-sensor integrated CMM for the multistep registration method

    NASA Astrophysics Data System (ADS)

    Zhao, Yan; Wang, Yiwen; Ye, Xiuling; Wang, Zhong; Fu, Luhua

    2018-01-01

    The multistep registration (MSR) method in [1] registers two different classes of sensors deployed on the z-arm of a CMM (coordinate measuring machine): a video camera and a tactile probe sensor. In general, it is difficult to obtain a very precise registration result with a single common standard; instead, this method is achieved by measuring two different standards, fixed on a steel plate with a constant distance between them. Although many factors have been considered, such as the measuring ability of the sensors, the uncertainty of the machine and the number of data pairs, there has been no exact analysis of the squareness between the x-axis and the y-axis in the xy plane. For this reason, an error analysis of the squareness of the multi-sensor integrated CMM for the multistep registration method is made to examine the validity of the MSR method. Synthetic experiments on the squareness in the xy plane for the simplified MSR with an inclination rotation were simulated, which leads to a regular result. Experiments were carried out with the multi-standard device also designed in [1], and inspections of the xy plane were carried out with the help of a laser interferometer. The final results conform to the simulations, and the squareness errors of the MSR method are similar to the results of the interferometer. In other words, the MSR method can also be used to verify the squareness of a CMM.
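
    The effect of a squareness error on a measured constant distance can be illustrated with a small skew model, sketched below; the two points and the angle values are arbitrary, and the model is a simplification rather than the analysis from the paper.

```python
# Toy skew model: when the y axis deviates from 90 degrees by alpha, a nominal
# point (x, y) maps to (x + y*sin(alpha), y*cos(alpha)), distorting measured distances.
import numpy as np

def measured(point, alpha_rad):
    """Map nominal machine coordinates to actual position under a small-angle skew."""
    x, y = point
    return np.array([x + y * np.sin(alpha_rad), y * np.cos(alpha_rad)])

p1, p2 = np.array([0.0, 0.0]), np.array([30.0, 40.0])   # nominally 50 mm apart
for alpha_urad in (0, 50, 200, 1000):                    # squareness error in microradians
    a = alpha_urad * 1e-6
    d = np.linalg.norm(measured(p2, a) - measured(p1, a))
    print(f"alpha = {alpha_urad:5d} µrad -> measured distance = {d:.6f} mm")
```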

  20. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    PubMed Central

    2014-01-01

    Background Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting, including translating, existing instruments is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was utilized, followed by a committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger. In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population
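
    The item-level content validity index (I-CVI) referred to above can be computed as the proportion of expert raters scoring an item 3 or 4 on a 4-point relevance scale, as in the sketch below; the ratings are invented, and the 0.78 acceptability threshold is a commonly cited convention, not a value from this study.

```python
# Item-level content validity index (I-CVI): proportion of experts rating an item
# 3 or 4 on a 4-point relevance scale. Ratings below are invented placeholders.
def i_cvi(ratings, relevant_threshold=3):
    """ratings: one 1-4 relevance score per expert for a single item."""
    return sum(r >= relevant_threshold for r in ratings) / len(ratings)

items = {
    "item_01": [4, 4, 3, 4, 3, 4],
    "item_02": [2, 3, 4, 2, 3, 3],
}
for name, ratings in items.items():
    score = i_cvi(ratings)
    # 0.78 is a commonly cited acceptability threshold for panels of six or more experts.
    print(f"{name}: I-CVI = {score:.2f}  ({'retain' if score >= 0.78 else 'revise'})")
```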

  1. Progressive Enrichment of Stemness Features and Tumor Stromal Alterations in Multistep Hepatocarcinogenesis.

    PubMed

    Yoo, Jeong Eun; Kim, Young-Joo; Rhee, Hyungjin; Kim, Haeryoung; Ahn, Ei Yong; Choi, Jin Sub; Roncalli, Massimo; Park, Young Nyun

    2017-01-01

    Cancer stem cells (CSCs), a subset of tumor cells, contribute to an aggressive biological behavior, which is also affected by the tumor stroma. Despite the role of CSCs and the tumor stroma in hepatocellular carcinoma (HCC), features of stemness have not yet been studied in relation to tumor stromal alterations in multistep hepatocarcinogenesis. We investigated the expression status of stemness markers and tumor stromal changes in B viral carcinogenesis, which is the main etiology of HCC in Asia. Stemness features of tumoral hepatocytes (EpCAM, K19, Oct3/4, c-KIT, c-MET, and CD133), and tumor stromal cells expressing α-smooth muscle actin (α-SMA), CD68, CD163, and IL-6 were analyzed in 36 low grade dysplastic nodules (DNs), 48 high grade DNs, 30 early HCCs (eHCCs), and 51 progressed HCCs (pHCCs) by immunohistochemistry or real-time PCR. Stemness features (EpCAM and K19 in particular) were progressively acquired during hepatocarcinogenesis in combination with enrichment of stromal cells (CAFs, TAMs, IL-6+ cells). Stemness features were seen sporadically in DNs, more consistent in eHCCs, and peaked in pHCCs. Likewise, stromal cells were discernable in DNs, showed up as consistent cell densities in eHCCs and peaked in pHCCs. The stemness features and tumor stromal alterations also peaked in less differentiated or larger HCCs. In conclusion, progression of B viral multistep hepatocarcinogenesis is characterized by an enrichment of stemness features of neoplastic hepatocytes and a parallel alteration of the tumor stroma. The modulation of neoplastic hepatocytes and stromal cells was at low levels in precancerous lesions (DNs), consistently increased in incipient cancer (eHCCs) and peaked in pHCCs. Thus, in B viral hepatocarcinogenesis, interactions between CSCs and the tumor stroma, although starting early, seem to play a major role in tumor progression.

  2. Automated 3D bioassembly of micro-tissues for biofabrication of hybrid tissue engineered constructs.

    PubMed

    Mekhileri, N V; Lim, K S; Brown, G C J; Mutreja, I; Schon, B S; Hooper, G J; Woodfield, T B F

    2018-01-12

    Bottom-up biofabrication approaches combining micro-tissue fabrication techniques with extrusion-based 3D printing of thermoplastic polymer scaffolds are emerging strategies in tissue engineering. These biofabrication strategies support native self-assembly mechanisms observed in developmental stages of tissue or organoid growth as well as promoting cell-cell interactions and cell differentiation capacity. Few technologies have been developed to automate the precise assembly of micro-tissues or tissue modules into structural scaffolds. We describe an automated 3D bioassembly platform capable of fabricating simple hybrid constructs via a two-step bottom-up bioassembly strategy, as well as complex hybrid hierarchical constructs via a multistep bottom-up bioassembly strategy. The bioassembly system consisted of a fluidic-based singularisation and injection module incorporated into a commercial 3D bioprinter. The singularisation module delivers individual micro-tissues to an injection module, for insertion into precise locations within a 3D plotted scaffold. To demonstrate applicability for cartilage tissue engineering, human chondrocytes were isolated and micro-tissues of 1 mm diameter were generated utilising a high throughput 96-well plate format. Micro-tissues were singularised with an efficiency of 96.0 ± 5.1%. There was no significant difference in size, shape or viability of micro-tissues before and after automated singularisation and injection. A layer-by-layer approach or aforementioned bottom-up bioassembly strategy was employed to fabricate a bilayered construct by alternatively 3D plotting a thermoplastic (PEGT/PBT) polymer scaffold and inserting pre-differentiated chondrogenic micro-tissues or cell-laden gelatin-based (GelMA) hydrogel micro-spheres, both formed via high-throughput fabrication techniques. No significant difference in viability between the construct assembled utilising the automated bioassembly system and manually assembled construct was

  3. [The genetic fingerprints file in France: between security and freedom].

    PubMed

    Manaouil, C; Gignon, M; Werbrouck, A; Jarde, O

    2008-01-01

    In France, the national automated file of genetic fingerprints (FNAEG) is an automated bank of genetic data used in the criminal justice system. It facilitates the identification of the perpetrators of offences and of missing persons. Since 1998, it has helped resolve numerous criminal cases, and its field of application has been progressively extended. It is a confidential register subject to numerous controls. Nevertheless, the private nature of the data and aspects of its operation (the criminal sanction for refusing to provide a sample, response delays, and questions of data retention) explain the strong opposition from associations concerned about the protection of personal freedoms.

  4. Characterization and multi-step transketolase-ω-transaminase bioconversions in an immobilized enzyme microreactor (IEMR) with packed tube.

    PubMed

    Halim, Amanatuzzakiah Abdul; Szita, Nicolas; Baganz, Frank

    2013-12-01

    The concept of de novo metabolic engineering through novel synthetic pathways offers new directions for multi-step enzymatic synthesis of complex molecules. This has been complemented by recent progress in performing enzymatic reactions using immobilized enzyme microreactors (IEMR). This work is concerned with the construction of de novo designed enzyme pathways in a microreactor for synthesizing chiral molecules. An interesting compound, commonly used as a building block in several pharmaceutical syntheses, is a single diastereoisomer of 2-amino-1,3,4-butanetriol (ABT). This chiral amino alcohol can be synthesized from simple achiral substrates using two enzymes, transketolase (TK) and transaminase (TAm). Here we describe the development of an IEMR using His6-tagged TK and TAm immobilized onto Ni-NTA agarose beads and packed into tubes to enable multi-step enzyme reactions. The kinetic parameters of both enzymes were first determined using single IEMRs evaluated with a kinetic model developed for packed bed reactors. The Km(app) for both enzymes appeared to be flow rate dependent, while the turnover number kcat was reduced 3-fold compared to solution-phase TK and TAm reactions. For the multi-step enzyme reaction, single IEMRs were cascaded in series, whereby the first enzyme, TK, catalyzed a model reaction of lithium hydroxypyruvate (HPA) and glycolaldehyde (GA) to L-erythrulose (ERY), and the second IEMR unit with immobilized TAm converted ERY into ABT using (S)-α-methylbenzylamine (MBA) as the amine donor. With an initial substrate mixture of 60 mM HPA and GA (each) and 6 mM MBA, the coupled reaction reached approximately 83% conversion in 20 min at the lowest flow rate. The ability to synthesize the chiral pharmaceutical intermediate ABT in a relatively short time demonstrates that this IEMR system is a powerful tool for the construction and evaluation of de novo pathways as well as for the determination of enzyme kinetics. Copyright © 2013 The Authors. Published by
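
    The packed-bed (plug-flow) Michaelis-Menten picture used to evaluate single-IEMR kinetics can be sketched as below, integrating dS/dτ = -Vmax·S/(Km + S) over the residence time; the parameter values are placeholders, not the fitted apparent constants from the study.

```python
# Plug-flow packed-bed Michaelis-Menten sketch: substrate decays along the residence
# time tau according to dS/dtau = -Vmax*S/(Km + S). Parameter values are placeholders.
from scipy.integrate import solve_ivp

V_MAX = 5.0    # mM/min, assumed apparent maximum rate of the immobilized enzyme
K_M = 8.0      # mM, assumed apparent Michaelis constant
S0 = 60.0      # mM, inlet substrate concentration (as in the HPA/GA feed above)

def rate(_tau, s):
    return [-V_MAX * s[0] / (K_M + s[0])]

for residence_time_min in (2, 5, 10, 20):
    sol = solve_ivp(rate, (0, residence_time_min), [S0], rtol=1e-8)
    conversion = 1 - sol.y[0, -1] / S0
    print(f"residence time {residence_time_min:5.1f} min -> conversion {conversion:.1%}")
```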

  5. Arduino-based automation of a DNA extraction system.

    PubMed

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies on detecting infectious diseases with molecular genetic methods. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. The DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators carry out the sequence of steps in the DNA extraction process, such as transporting, mixing, and washing the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G-code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delays, and input-output manipulation. It drives the stepper motors with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish each motor's reference position, and hard limit checking to prevent over-travel. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile.
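
    A minimal sketch of the host-side idea, compiling a plain-text motion sequence and streaming it to the controller, is shown below; the command names and the hand-shaking are hypothetical, not the authors' G-code dialect or firmware protocol.

```python
# Compile a plain-text motion sequence and stream it command by command. The
# opcode names and the serial hand-shake are hypothetical stand-ins.
import time

SEQUENCE = """
HOME axes=all
MOVE axis=1 steps=1200
DWELL ms=500
MOVE axis=4 steps=-300
VALVE pin=7 state=open
"""

def compile_sequence(text):
    """Drop blanks/comments and split each command into (opcode, keyword args)."""
    program = []
    for line in text.strip().splitlines():
        line = line.split("#", 1)[0].strip()
        if line:
            opcode, *args = line.split()
            program.append((opcode, dict(a.split("=") for a in args)))
    return program

def execute(program, send=print):
    """Stream one command at a time; a real host would wait for an acknowledgement."""
    for opcode, args in program:
        send(f"{opcode} {args}")
        time.sleep(0.01)  # placeholder for the controller's reply

execute(compile_sequence(SEQUENCE))
```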

  6. Automation and Optimization of Multipulse Laser Zona Drilling of Mouse Embryos During Embryo Biopsy.

    PubMed

    Wong, Christopher Yee; Mills, James K

    2017-03-01

    Laser zona drilling (LZD) is a required step in many embryonic surgical procedures, for example, assisted hatching and preimplantation genetic diagnosis. LZD involves the ablation of the zona pellucida (ZP) using a laser while minimizing potentially harmful thermal effects on critical internal cell structures. The objective was to develop a method for the automation and optimization of multipulse LZD, applied to cleavage-stage embryos. A two-stage optimization is used. The first stage uses computer vision algorithms to identify embryonic structures and determine the optimal ablation zone farthest away from critical structures such as blastomeres. The second stage combines a genetic algorithm with a previously reported thermal analysis of LZD to optimize the combination of laser pulse locations and pulse durations. The goal is to minimize the peak temperature experienced by the blastomeres while creating the desired opening in the ZP. A proof of concept of the proposed LZD automation and optimization method is demonstrated through experiments on mouse embryos with positive results, as adequately sized openings were created. Automation of LZD is feasible and is a viable step toward the automation of embryo biopsy procedures. LZD is a common but delicate procedure performed by human operators using subjective methods to gauge the proper LZD procedure. Automation of LZD removes human error and increases the success rate of LZD. Although the proposed methods were developed for cleavage-stage embryos, the same methods may be applied to most types of LZD procedures, embryos at different developmental stages, or nonembryonic cells.
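
    A toy version of the second optimization stage is sketched below: a small genetic algorithm searches over pulse positions and durations to keep a surrogate blastomere heating term low while producing a large enough opening; the thermal surrogate, penalty weights and GA settings are all invented placeholders, not the reported thermal analysis.

```python
# Toy genetic algorithm over laser pulse positions (along the ZP) and durations.
# The cost combines an invented heating surrogate near the blastomere with a
# penalty for an insufficient opening. All numbers are placeholders.
import random

N_PULSES = 4
OPENING_NEEDED = 20.0          # µm of ZP to ablate (assumed requirement)
BLASTOMERE_POS = 35.0          # µm, nearest critical structure (assumed)

def cost(individual):
    positions, durations = individual[:N_PULSES], individual[N_PULSES:]
    opening = sum(0.4 * d for d in durations)                 # assumed µm ablated per ms
    heat = sum(d / (1.0 + abs(p - BLASTOMERE_POS)) for p, d in zip(positions, durations))
    penalty = max(0.0, OPENING_NEEDED - opening) * 10.0
    return heat + penalty

def random_individual():
    return [random.uniform(0, 30) for _ in range(N_PULSES)] + \
           [random.uniform(5, 20) for _ in range(N_PULSES)]

def mutate(ind, sigma=1.0):
    return [max(0.0, g + random.gauss(0, sigma)) for g in ind]

population = [random_individual() for _ in range(40)]
for generation in range(100):
    population.sort(key=cost)
    parents = population[:10]                                  # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(30)]

best = min(population, key=cost)
print("best cost:", round(cost(best), 3))
```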

  7. Altering users' acceptance of automation through prior automation exposure.

    PubMed

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  8. Continuous multistep synthesis of perillic acid from limonene by catalytic biofilms under segmented flow.

    PubMed

    Willrodt, Christian; Halan, Babu; Karthaus, Lisa; Rehdorf, Jessica; Julsing, Mattijs K; Buehler, Katja; Schmid, Andreas

    2017-02-01

    The efficiency of biocatalytic reactions involving industrially interesting reactants is often constrained by toxification of the applied biocatalyst. Here, we evaluated the combination of biologically and technologically inspired strategies to overcome toxicity-related issues during the multistep oxyfunctionalization of (R)-(+)-limonene to (R)-(+)-perillic acid. Pseudomonas putida GS1, catalyzing selective limonene oxidation via the p-cymene degradation pathway, and recombinant Pseudomonas taiwanensis VLB120 were evaluated for continuous perillic acid production. A tubular segmented-flow biofilm reactor was used in order to relieve oxygen limitations and to enable membrane-mediated substrate supply as well as efficient in situ product removal. Both P. putida GS1 and P. taiwanensis VLB120 developed a catalytic biofilm in this system. The productivity of wild-type P. putida GS1 encoding the enzymes for limonene bioconversion was highly dependent on the carbon source and reached 34 g per liter of tube volume per day when glycerol was supplied. More than 10-fold lower productivities were reached, irrespective of the applied carbon source, when the recombinant P. taiwanensis VLB120 harboring p-cymene monooxygenase and p-cumic alcohol dehydrogenase was used as the biocatalyst. The technical applicability for preparative perillic acid synthesis in the applied system was verified by purification of perillic acid from the outlet stream using an anion exchange resin. This concept enabled the multistep production of perillic acid and might be transferred to other reactions involving volatile reactants and toxic end-products. Biotechnol. Bioeng. 2017;114: 281-290. © 2016 Wiley Periodicals, Inc.

  9. Complacency and Automation Bias in the Use of Imperfect Automation.

    PubMed

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  10. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.
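
    The fitness evaluation at the heart of such a search can be illustrated as below: a candidate gate sequence is multiplied into a unitary and scored against the target operation; the toy gate set and Bell-state target are illustrative, not the teleportation circuit from the paper, and the evolutionary search loop itself is omitted.

```python
# Score a candidate gate sequence against a target unitary via the normalized
# trace overlap. Toy two-qubit gate set; the search over sequences is omitted.
import numpy as np

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
GATES = {"H0": np.kron(H, I2), "H1": np.kron(I2, H), "CNOT01": CNOT}

def circuit_unitary(sequence):
    u = np.eye(4)
    for name in sequence:           # gates applied left to right in time
        u = GATES[name] @ u
    return u

def fitness(sequence, target):
    """Normalized overlap |Tr(target^dagger U)| / dim; 1.0 means an exact match."""
    u = circuit_unitary(sequence)
    return abs(np.trace(target.conj().T @ u)) / target.shape[0]

target = GATES["CNOT01"] @ GATES["H0"]          # prepares a Bell state from |00>
print(fitness(["H0", "CNOT01"], target))        # 1.0
print(fitness(["H1", "CNOT01"], target))        # lower score
```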

  11. Multistep Lattice-Voxel method utilizing lattice function for Monte-Carlo treatment planning with pixel based voxel model.

    PubMed

    Kumada, H; Saito, K; Nakamura, T; Sakae, T; Sakurai, H; Matsumura, A; Ono, K

    2011-12-01

    Treatment planning for boron neutron capture therapy generally utilizes Monte-Carlo methods for calculation of the dose distribution. The new treatment planning system JCDS-FX employs the multi-purpose Monte-Carlo code PHITS to calculate the dose distribution. JCDS-FX allows the construction of a precise voxel model consisting of pixel-based voxel cells as small as 0.4×0.4×2.0 mm^3 in order to perform high-accuracy dose estimation, e.g. for the purpose of calculating the dose distribution in a human body. However, such miniaturization of the voxel size increases the calculation time considerably. The aim of this study is to investigate sophisticated modeling methods which can perform Monte-Carlo calculations for human geometry efficiently. Thus, we devised a new voxel modeling method, the "Multistep Lattice-Voxel method," which can configure a voxel model that combines different voxel sizes by applying the lattice function repeatedly. To verify the performance of calculations with this modeling method, several calculations for human geometry were carried out. The results demonstrated that the Multistep Lattice-Voxel method enabled the precise voxel model to reduce the calculation time substantially while maintaining high-accuracy dose estimation. Copyright © 2011 Elsevier Ltd. All rights reserved.
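
    The core idea of mixing voxel resolutions can be illustrated outside of PHITS with a toy two-level model: keep coarse voxels for most of the geometry and nest fine voxels only in a region of interest. The grid sizes and region below are assumptions for illustration, not the lattice input actually used with PHITS.

```python
# Hedged sketch of mixing voxel sizes: a coarse lattice for most of the geometry
# with a finer sub-lattice nested inside a region of interest (ROI).
import numpy as np

def build_two_level_model(ct, coarse=8, fine=2,
                          roi=(slice(32, 64), slice(32, 64), slice(10, 20))):
    """Downsample a CT-like array to coarse voxels, keeping fine voxels in the ROI."""
    def downsample(a, f):
        nx, ny, nz = (s // f for s in a.shape)
        a = a[: nx * f, : ny * f, : nz * f]
        return a.reshape(nx, f, ny, f, nz, f).mean(axis=(1, 3, 5))

    coarse_model = downsample(ct, coarse)          # few, large voxels
    fine_model = downsample(ct[roi], fine)         # many, small voxels, ROI only
    n_single = ct.size                             # cells at the original pixel resolution
    n_multi = coarse_model.size + fine_model.size  # cells in the two-level model
    return coarse_model, fine_model, n_single, n_multi

if __name__ == "__main__":
    ct = np.random.rand(128, 128, 40)              # stand-in for pixel-based CT data
    _, _, n1, n2 = build_two_level_model(ct)
    print(f"single-resolution cells: {n1}, two-level cells: {n2}")
```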

  12. Multistep synthesis on SU-8: combining microfabrication and solid-phase chemistry on a single material.

    PubMed

    Cavalli, Gabriel; Banu, Shahanara; Ranasinghe, Rohan T; Broder, Graham R; Martins, Hugo F P; Neylon, Cameron; Morgan, Hywel; Bradley, Mark; Roach, Peter L

    2007-01-01

    SU-8 is an epoxy-novolac resin and a well-established negative photoresist for microfabrication and microengineering. The photopolymerized resist is a highly crosslinked polymer showing outstanding chemical and physical robustness, with residual surface epoxy groups amenable to chemical functionalization. In this paper we describe, for the first time, the preparation and surface modification of SU-8 particles shaped as microbars, the attachment of appropriate linkers, and the successful application of these particles to multistep solid-phase synthesis leading to oligonucleotides and peptides attached in an unambiguous manner to the support surface.

  13. Engineering biological systems using automated biofoundries

    PubMed Central

    Chao, Ran; Mishra, Shekhar; Si, Tong; Zhao, Huimin

    2017-01-01

    Engineered biological systems such as genetic circuits and microbial cell factories have promised to solve many challenges in the modern society. However, the artisanal processes of research and development are slow, expensive, and inconsistent, representing a major obstacle in biotechnology and bioengineering. In recent years, biological foundries or biofoundries have been developed to automate design-build-test engineering cycles in an effort to accelerate these processes. This review summarizes the enabling technologies for such biofoundries as well as their early successes and remaining challenges. PMID:28602523

  14. A Tet-on and Cre-loxP Based Genetic Engineering System for Convenient Recycling of Selection Markers in Penicillium oxalicum

    PubMed Central

    Jiang, Baojie; Zhang, Ruiqin; Feng, Dan; Wang, Fangzhong; Liu, Kuimei; Jiang, Yi; Niu, Kangle; Yuan, Quanquan; Wang, Mingyu; Wang, Hailong; Zhang, Youming; Fang, Xu

    2016-01-01

    The lack of selectable markers has been a key problem preventing multistep genetic engineering in filamentous fungi, particularly for industrial species such as the lignocellulose-degrading Penicillium oxalicum JUA10-1 (formerly named Penicillium decumbens). To resolve this problem, we constructed an efficient and convenient genetic manipulation system in P. oxalicum JUA10-1 that takes advantage of two established genetic systems: the Cre-loxP system and the Tet-on system. The expression of Cre recombinase was placed under the control of the Tet-on system and was therefore activated by doxycycline. Using this system, two genes, ligD and bglI, were sequentially disrupted with a loxP-flanked ptrA marker. The successful application of this procedure provides a useful tool for genetic engineering in filamentous fungi. This system should also play an important role in improving the productivity of products of interest and minimizing by-products in fermentations with filamentous fungi. PMID:27148179

  15. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis

    PubMed Central

    Liu, Jingxian; Wu, Kefeng

    2017-01-01

    The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance, and data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety; the capacities for navigation safety and maritime traffic monitoring could thus be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex than traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Principal Component Analysis (PCA), a widely used dimensionality reduction method, is then exploited to decompose the obtained distance matrix. In particular, the top k principal components with a cumulative contribution rate above 95% are extracted by PCA, and the number of cluster centers k is chosen. The k centers are found by the improved automatic center selection algorithm. In the last step, the improved center clustering algorithm with k clusters is applied to the distance matrix to achieve the final AIS trajectory clustering results. In order to improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. Numerous experiments on realistic AIS trajectory datasets in the bridge area waterway and Mississippi River have been implemented to compare our proposed method with
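
    A minimal sketch of the pipeline described above (pairwise DTW distances, PCA on the distance matrix, then clustering) is given below. scikit-learn's KMeans stands in for the paper's improved center-selection and clustering algorithms, and the toy trajectories are invented.

```python
# Hedged sketch of the multi-step pipeline: DTW distance matrix -> PCA -> clustering.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def dtw(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between 2-D tracks."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def cluster_trajectories(trajs, var_target=0.95):
    # Steps 1-2: pairwise DTW distances -> distance matrix.
    n = len(trajs)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = dtw(trajs[i], trajs[j])
    # Step 3: PCA on the distance matrix; keep components covering ~95% variance,
    # and (as an assumption here) let that count suggest the number of clusters k.
    embedded = PCA(n_components=var_target).fit_transform(dist)
    k = max(2, embedded.shape[1])
    # Step 4: cluster in the reduced space (KMeans as a stand-in method).
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedded)
    return labels, k

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy AIS-like tracks: two groups of noisy straight lines with different headings.
    base = np.linspace(0, 1, 30)
    trajs = [np.c_[base, base * s] + rng.normal(0, 0.02, (30, 2)) for s in (1, 1, 1, -1, -1, -1)]
    labels, k = cluster_trajectories(trajs)
    print("k =", k, "labels =", labels)
```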

  16. A universal method for automated gene mapping

    PubMed Central

    Zipperlen, Peder; Nairz, Knud; Rimann, Ivo; Basler, Konrad; Hafen, Ernst; Hengartner, Michael; Hajnal, Alex

    2005-01-01

    Small insertions or deletions (InDels) constitute a ubiquitous class of sequence polymorphisms found in eukaryotic genomes. Here, we present an automated high-throughput genotyping method that relies on the detection of fragment-length polymorphisms (FLPs) caused by InDels. The protocol utilizes standard sequencers and genotyping software. We have established genome-wide FLP maps for both Caenorhabditis elegans and Drosophila melanogaster that facilitate genetic mapping with a minimum of manual input and at comparatively low cost. PMID:15693948

  17. Continuous Video Modeling to Assist with Completion of Multi-Step Home Living Tasks by Young Adults with Moderate Intellectual Disability

    ERIC Educational Resources Information Center

    Mechling, Linda C.; Ayres, Kevin M.; Bryant, Kathryn J.; Foster, Ashley L.

    2014-01-01

    The current study evaluated a relatively new video-based procedure, continuous video modeling (CVM), to teach multi-step cleaning tasks to high school students with moderate intellectual disability. CVM, in contrast to video modeling and video prompting, allows repetition of the video model (looping) as many times as needed while the user completes…

  18. Controllable 3D architectures of aligned carbon nanotube arrays by multi-step processes

    NASA Astrophysics Data System (ADS)

    Huang, Shaoming

    2003-06-01

    An effective way to fabricate large-area three-dimensional (3D) aligned carbon nanotube (CNT) patterns based on pyrolysis of iron(II) phthalocyanine (FePc) in two-step processes is reported. The controllable generation of different lengths and the selective growth of aligned CNT arrays on metal-patterned (e.g., Ag and Au) substrates are the basis for generating such 3D aligned CNT architectures. By controlling the experimental conditions, 3D aligned CNT arrays with different lengths/densities and morphologies/structures, as well as multi-layered architectures, can be fabricated on a large scale by multi-step pyrolysis of FePc. These 3D architectures could have interesting properties and be applied to the development of novel nanotube-based devices.

  19. Model annotation for synthetic biology: automating model to nucleotide sequence conversion

    PubMed Central

    Misirli, Goksel; Hallinan, Jennifer S.; Yu, Tommy; Lawson, James R.; Wimalaratne, Sarala M.; Cooling, Michael T.; Wipat, Anil

    2011-01-01

    Motivation: The need for the automated computational design of genetic circuits is becoming increasingly apparent with the advent of ever more complex and ambitious synthetic biology projects. Currently, most circuits are designed through the assembly of models of individual parts such as promoters, ribosome binding sites and coding sequences. These low level models are combined to produce a dynamic model of a larger device that exhibits a desired behaviour. The larger model then acts as a blueprint for physical implementation at the DNA level. However, the conversion of models of complex genetic circuits into DNA sequences is a non-trivial undertaking due to the complexity of mapping the model parts to their physical manifestation. Automating this process is further hampered by the lack of computationally tractable information in most models. Results: We describe a method for automatically generating DNA sequences from dynamic models implemented in CellML and Systems Biology Markup Language (SBML). We also identify the metadata needed to annotate models to facilitate automated conversion, and propose and demonstrate a method for the markup of these models using RDF. Our algorithm has been implemented in a software tool called MoSeC. Availability: The software is available from the authors' web site http://research.ncl.ac.uk/synthetic_biology/downloads.html. Contact: anil.wipat@ncl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21296753

  20. Improving the driver-automation interaction: an approach using automation uncertainty.

    PubMed

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  1. Immobilised enzyme microreactor for screening of multi-step bioconversions: characterisation of a de novo transketolase-ω-transaminase pathway to synthesise chiral amino alcohols.

    PubMed

    Matosevic, S; Lye, G J; Baganz, F

    2011-09-20

    Complex molecules are synthesised via a number of multi-step reactions in living cells. In this work, we describe the development of a continuous-flow immobilized enzyme microreactor platform for use in the evaluation of multi-step bioconversion pathways, demonstrating a de novo transketolase/ω-transaminase-linked asymmetric amino alcohol synthesis. The prototype dual microreactor is based on the reversible attachment of His₆-tagged enzymes via Ni-NTA linkage to two surface-derivatised capillaries connected in series. Kinetic parameters established for the model transketolase (TK)-catalysed conversion of lithium hydroxypyruvate (Li-HPA) and glycolaldehyde (GA) to L-erythrulose, using a continuous flow system with online monitoring of the reaction output, were in good agreement with the kinetic parameters determined for TK in stop-flow mode. By coupling the transketolase-catalysed chiral ketone-forming reaction with the biocatalytic addition of an amine to the TK product using a transaminase (ω-TAm), it is possible to generate chiral amino alcohols from achiral starting compounds. We demonstrated this in a two-step configuration, where the TK reaction was followed by the ω-TAm-catalysed amination of L-erythrulose to synthesise 2-amino-1,3,4-butanetriol (ABT). Synthesis of the ABT product via the dual reaction and the on-line monitoring of each component provided a full profile of the de novo two-step bioconversion and demonstrated the utility of this microreactor system for in vitro multi-step pathway evaluation. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Development of Multistep and Degenerate Variational Integrators for Applications in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Ellison, Charles Leland

    Geometric integrators yield high-fidelity numerical results by retaining conservation laws in the time advance. A particularly powerful class of geometric integrators is symplectic integrators, which are widely used in orbital mechanics and accelerator physics. An important application presently lacking symplectic integrators is the guiding center motion of magnetized particles represented by non-canonical coordinates. Because guiding center trajectories are foundational to many simulations of magnetically confined plasmas, geometric guiding center algorithms have high potential for impact. The motivation is compounded by the need to simulate long-pulse fusion devices, including ITER, and opportunities in high performance computing, including the use of petascale resources and beyond. This dissertation uses a systematic procedure for constructing geometric integrators, known as variational integration, to deliver new algorithms for guiding center trajectories and other plasma-relevant dynamical systems. These variational integrators are non-trivial because the Lagrangians of interest are degenerate: the Euler-Lagrange equations are first-order differential equations and the Legendre transform is not invertible. The first contribution of this dissertation is that variational integrators for degenerate Lagrangian systems are typically multistep methods. Multistep methods admit parasitic mode instabilities that can ruin the numerical results. These instabilities motivate the second major contribution: degenerate variational integrators. By replicating the degeneracy of the continuous system, degenerate variational integrators avoid parasitic mode instabilities. The new methods are therefore robust geometric integrators for degenerate Lagrangian systems. These developments in variational integration theory culminate in one-step degenerate variational integrators for non-canonical magnetic field line flow and guiding center dynamics. The guiding center integrator
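
    For orientation, the generic textbook relations behind variational integration are collected below; they are not the dissertation's specific degenerate schemes, only the construction those schemes start from.

```latex
% Generic variational-integrator construction (textbook form): a discrete
% Lagrangian L_d approximates the action over one step, and extremizing the
% discrete action sum gives the update rule.
\[
  L_d(q_k, q_{k+1}) \approx \int_{t_k}^{t_{k+1}} L\bigl(q(t), \dot q(t)\bigr)\, dt ,
\]
\[
  D_2 L_d(q_{k-1}, q_k) + D_1 L_d(q_k, q_{k+1}) = 0
  \qquad \text{(discrete Euler--Lagrange equations).}
\]
% For a degenerate L, the discrete momentum p_k = -D_1 L_d(q_k, q_{k+1}) cannot be
% inverted to obtain q_{k+1} from (q_k, p_k) alone, so the update generally couples
% q_{k-1}, q_k and q_{k+1}: a multistep method, as the abstract notes.
```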

  3. Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy.

    PubMed

    Welikala, R A; Fraz, M M; Dehmeshki, J; Hoppe, A; Tah, V; Mann, S; Williamson, T H; Barman, S A

    2015-07-01

    Proliferative diabetic retinopathy (PDR) is a condition that carries a high risk of severe visual impairment. The hallmark of PDR is the growth of abnormal new vessels. In this paper, an automated method for the detection of new vessels from retinal images is presented. This method is based on a dual classification approach. Two vessel segmentation approaches are applied to create two separate binary vessel maps, each of which holds vital information. Local morphology features are measured from each binary vessel map to produce two separate 4-D feature vectors. Independent classification is performed for each feature vector using a support vector machine (SVM) classifier. The system then combines these individual outcomes to produce a final decision. This is followed by the creation of additional features to generate 21-D feature vectors, which feed into a genetic algorithm-based feature selection approach with the objective of finding feature subsets that improve the performance of the classification. Sensitivity and specificity results using a dataset of 60 images are 0.9138 and 0.9600, respectively, on a per-patch basis and 1.000 and 0.975, respectively, on a per-image basis. Copyright © 2015 Elsevier Ltd. All rights reserved.
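
    As a rough illustration of wrapper-style genetic-algorithm feature selection around an SVM, the sketch below evolves binary feature masks scored by cross-validated accuracy. The synthetic dataset, GA settings, and the single-classifier fitness (rather than the paper's dual-classification scheme) are assumptions.

```python
# Hedged sketch of GA-based feature selection wrapped around an SVM classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=21, n_informative=6, random_state=0)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    # Wrapper fitness: cross-validated accuracy of an SVM on the selected features.
    return cross_val_score(SVC(kernel="rbf"), X[:, mask.astype(bool)], y, cv=3).mean()

def ga_select(n_feat=21, pop_size=30, generations=25, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, n_feat))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(-scores)[: pop_size // 2]]    # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)                       # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_feat) < p_mut                   # bit-flip mutation
            child[flip] ^= 1
            children.append(child)
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    mask, score = ga_select()
    print("selected features:", np.flatnonzero(mask), "cv accuracy:", round(score, 3))
```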

  4. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods.

    PubMed

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J; Morling, Niels

    2013-05-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were processed. The highest DNA recovery was obtained with the PrepFiler Express kit on an AutoMate Express, while the lowest DNA recovery was obtained using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen. Extraction using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen resulted in the lowest percentage of PCR inhibition (0%), while extraction using manual Chelex resulted in the highest percentage of PCR inhibition (51%). The largest number of reportable STR-profiles was obtained with DNA from samples extracted with the PrepFiler Express kit (75%), while the lowest number was obtained with DNA from samples extracted using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen (41%). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Multi-Step Ka/Ka Dichroic Plate with Rounded Corners for NASA's 34m Beam Waveguide Antenna

    NASA Technical Reports Server (NTRS)

    Veruttipong, Watt; Khayatian, Behrouz; Hoppe, Daniel; Long, Ezra

    2013-01-01

    A multi-step Ka/Ka dichroic plate frequency selective surface (FSS) structure is designed, manufactured, and tested for use in NASA's Deep Space Network (DSN) 34-m Beam Waveguide (BWG) antennas. The proposed design allows ease of manufacturing and the ability to handle the increased transmit power (reflected off the FSS) of the DSN BWG antennas, from 20 kW to 100 kW. The dichroic is designed using HFSS, and the results agree well with measured data, considering the manufacturing tolerances that could be achieved on the dichroic.

  6. Quantum molecular dynamics and multistep-direct analyses of multiple preequilibrium emission

    NASA Astrophysics Data System (ADS)

    Chadwick, M. B.; Chiba, S.; Niita, K.; Maruyama, T.; Iwamoto, A.

    1995-11-01

    We study multiple preequilibrium emission in nucleon induced reactions at intermediate energies, and compare quantum molecular dynamics (QMD) calculations with multistep-direct Feshbach-Kerman-Koonin results [M. B. Chadwick, P. G. Young, D. C. George, and Y. Watanabe, Phys. Rev. C 50, 996 (1994)]. When the theoretical expressions of this reference are reformulated so that the definitions of primary and multiple emission correspond to those used in QMD, the two theories yield similar results for primary and multiple preequilibrium emission. We use QMD as a tool to determine the multiplicities of fast preequilibrium nucleons as a function of incident energy. For fast particle cross sections to exceed 5% of the inclusive preequilibrium emission cross sections we find that two particles should be included in reactions above 50 MeV, three above about 180 MeV, and four are only needed when the incident energy exceeds about 400 MeV.

  7. Multistep modeling (MSM) of biomolecular structure application to the A-G mispair in the B-DNA environment

    NASA Technical Reports Server (NTRS)

    Srinivasan, S.; Raghunathan, G.; Shibata, M.; Rein, R.

    1986-01-01

    A multistep modeling procedure has been evolved to study the structural changes introduced by lesions in DNA. We report here the change in the structure of regular B-DNA geometry due to the incorporation of a G(anti)-A(anti) mispair in place of a regular G-C pair, preserving the helix continuity. The energetics of the structure so obtained is compared with the G(anti)-A(syn) configuration under similar constrained conditions. We present the methodology adopted and discuss the results.

  8. An automated Genomes-to-Natural Products platform (GNP) for the discovery of modular natural products.

    PubMed

    Johnston, Chad W; Skinnider, Michael A; Wyatt, Morgan A; Li, Xiang; Ranieri, Michael R M; Yang, Lian; Zechel, David L; Ma, Bin; Magarvey, Nathan A

    2015-09-28

    Bacterial natural products are a diverse and valuable group of small molecules, and genome sequencing indicates that the vast majority remain undiscovered. The prediction of natural product structures from biosynthetic assembly lines can facilitate their discovery, but highly automated, accurate, and integrated systems are required to mine the broad spectrum of sequenced bacterial genomes. Here we present a genome-guided natural products discovery tool to automatically predict, combinatorialize and identify polyketides and nonribosomal peptides from biosynthetic assembly lines using LC-MS/MS data of crude extracts in a high-throughput manner. We detail the directed identification and isolation of six genetically predicted polyketides and nonribosomal peptides using our Genome-to-Natural Products platform. This highly automated, user-friendly programme provides a means of realizing the potential of genetically encoded natural products.

  9. Coping Strategies Applied to Comprehend Multistep Arithmetic Word Problems by Students with Above-Average Numeracy Skills and Below-Average Reading Skills

    ERIC Educational Resources Information Center

    Nortvedt, Guri A.

    2011-01-01

    This article discusses how 13-year-old students with above-average numeracy skills and below-average reading skills cope with comprehending word problems. Compared to other students who are proficient in numeracy and are skilled readers, these students are more disadvantaged when solving single-step and multistep arithmetic word problems. The…

  10. Data-based control of a multi-step forming process

    NASA Astrophysics Data System (ADS)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain; in the field of forming technology, however, it has so far arrived only gradually. In order to make a valuable contribution to the digital factory, the control of a multi-stage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how an interface between the different forming machines can be designed in a tangible way, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes, which are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for improved process control of the subsequent process. On the basis of the scientific knowledge gained, it is possible to make forming operations more robust and at the same time more flexible, and thus create the foundation for linking various production processes in an efficient way.
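
    A toy sketch of the underlying idea, measuring an outcome of the first forming step and using it to set a parameter of the next step, is shown below. The linear process models, the thinning-to-press-force relation, and all numerical values are invented for illustration.

```python
# Hedged sketch of data-based control across two forming steps.
import numpy as np

rng = np.random.default_rng(2)

def step_one(blank_thickness_mm):
    """First forming step: returns the true thinning, with process scatter."""
    return 0.1 * blank_thickness_mm + rng.normal(0, 0.005)

def step_two(thinning_mm, press_force_kN):
    """Second step: the final dimension depends on incoming thinning and press force."""
    return 10.0 - 5.0 * thinning_mm + 0.002 * press_force_kN

def controller(measured_thinning_mm, target_dim=10.0, nominal_force_kN=200.0):
    # Feed-forward correction: pick the force that compensates the measured thinning
    # according to the (assumed) linear model of the second step.
    deficit = target_dim - step_two(measured_thinning_mm, nominal_force_kN)
    return nominal_force_kN + deficit / 0.002

open_loop, controlled = [], []
for _ in range(1000):
    t = step_one(blank_thickness_mm=1.0 + rng.normal(0, 0.05))
    t_measured = t + rng.normal(0, 0.002)            # in-line measurement noise
    open_loop.append(step_two(t, 200.0))
    controlled.append(step_two(t, controller(t_measured)))

print("dimension scatter without control:", round(np.std(open_loop), 4))
print("dimension scatter with data-based control:", round(np.std(controlled), 4))
```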

  11. Automated detection of retinal nerve fiber layer defects on fundus images: false positive reduction based on vessel likelihood

    NASA Astrophysics Data System (ADS)

    Muramatsu, Chisako; Ishida, Kyoko; Sawada, Akira; Hatanaka, Yuji; Yamamoto, Tetsuya; Fujita, Hiroshi

    2016-03-01

    Early detection of glaucoma is important to slow down or halt progression of the disease and to prevent total blindness. We have previously proposed an automated scheme for detection of retinal nerve fiber layer defects (NFLDs), which are one of the early signs of glaucoma observed on retinal fundus images. In this study, a new multi-step detection scheme was included to improve detection of subtle and narrow NFLDs. In addition, new features were added to distinguish between NFLDs and blood vessels, which are frequent sites of false positives (FPs). The result was evaluated with a new test dataset consisting of 261 cases, including 130 cases with NFLDs. Using the proposed method, the initial detection rate was improved from 82% to 98%. At a sensitivity of 80%, the number of FPs per image was reduced from 4.25 to 1.36. The result indicates the potential usefulness of the proposed method for early detection of glaucoma.

  12. Cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1988-01-01

    The aims and methods of aircraft cockpit automation are reviewed from a human-factors perspective. Consideration is given to the mixed pilot reception of increased automation, government concern with the safety and reliability of highly automated aircraft, the formal definition of automation, and the ground-proximity warning system and accidents involving controlled flight into terrain. The factors motivating automation include technology availability; safety; economy, reliability, and maintenance; workload reduction and two-pilot certification; more accurate maneuvering and navigation; display flexibility; economy of cockpit space; and military requirements.

  13. Abundance and composition of indigenous bacterial communities in a multi-step biofiltration-based drinking water treatment plant.

    PubMed

    Lautenschlager, Karin; Hwang, Chiachi; Ling, Fangqiong; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Egli, Thomas; Hammes, Frederik

    2014-10-01

    Indigenous bacterial communities are essential for biofiltration processes in drinking water treatment systems. In this study, we examined the microbial community composition and abundance of three different biofilter types (rapid sand, granular activated carbon, and slow sand filters) and their respective effluents in a full-scale, multi-step treatment plant (Zürich, CH). Detailed analysis of organic carbon degradation underpinned biodegradation as the primary function of the biofilter biomass. The biomass was present at concentrations ranging between 2-5 × 10^15 cells/m^3 in all filters but was phylogenetically, enzymatically and metabolically diverse. Based on 16S rRNA gene-based 454 pyrosequencing analysis of microbial community composition, similar microbial taxa (predominantly Proteobacteria, Planctomycetes, Acidobacteria, Bacteroidetes, Nitrospira and Chloroflexi) were present in all biofilters and in their respective effluents, but the ratio of microbial taxa was different in each filter type. This change was also reflected in the cluster analysis, which revealed a change of 50-60% in microbial community composition between the different filter types. This study documents the direct influence of the filter biomass on the microbial community composition of the final drinking water, particularly when the water is distributed without post-disinfection. The results provide new insights on the complexity of indigenous bacteria colonizing drinking water systems, especially in different biofilters of a multi-step treatment plant. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Variation of nanopore diameter along porous anodic alumina channels by multi-step anodization.

    PubMed

    Lee, Kwang Hong; Lim, Xin Yuan; Wai, Kah Wing; Romanato, Filippo; Wong, Chee Cheong

    2011-02-01

    In order to form tapered nanocapillaries, we investigated a method to vary the nanopore diameter along the porous anodic alumina (PAA) channels using multi-step anodization. By anodizing the aluminum in either single acid (H3PO4) or multi-acid (H2SO4, oxalic acid and H3PO4) with increasing or decreasing voltage, the diameter of the nanopore along the PAA channel can be varied systematically corresponding to the applied voltages. The pore size along the channel can be enlarged or shrunken in the range of 20 nm to 200 nm. Structural engineering of the template along the film growth direction can be achieved by deliberately designing a suitable voltage and electrolyte together with anodization time.

  15. Solvent recyclability in a multistep direct liquefaction process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hetland, M.D.; Rindt, J.R.

    1995-12-31

    Direct liquefaction research at the Energy & Environmental Research Center (EERC) has, for a number of years, concentrated on developing a direct liquefaction process specifically for low-rank coals (LRCs) through the use of hydrogen-donating solvents and solvents similar to coal-derived liquids, the water/gas shift reaction, and lower-severity reaction conditions. The underlying assumption of all of the research was that advantage could be taken of the reactivity and specific qualities of LRCs to produce a tetrahydrofuran (THF)-soluble material that might be easier to upgrade than the soluble residuum produced during direct liquefaction of high-rank coals. A multistep approach was taken to produce the THF-soluble material, consisting of (1) preconversion treatment to prepare the coal for solubilization, (2) solubilization of the coal in the solvent, and (3) polishing to complete solubilization of the remaining material. The product of these three steps can then be upgraded during a traditional hydrotreatment step. The results of the EERC's research indicated that additional studies to develop this process more fully were justified. Two areas were targeted for further research: (1) determination of the recyclability of the solvent used during solubilization and (2) determination of the minimum severity required for hydrotreatment of the liquid product. The current project was funded to investigate these two areas.

  16. ANDSystem: an Associative Network Discovery System for automated literature mining in the field of biology

    PubMed Central

    2015-01-01

    Background: Sufficient knowledge of molecular and genetic interactions, which comprise the entire basis of the functioning of living systems, is one of the necessary requirements for successfully answering almost any research question in the field of biology and medicine. To date, more than 24 million scientific papers can be found in PubMed, with many of them containing descriptions of a wide range of biological processes. The analysis of such tremendous amounts of data requires the use of automated text-mining approaches. Although a handful of tools have recently been developed to meet this need, none of them provide error-free extraction of highly detailed information. Results: The ANDSystem package was developed for the reconstruction and analysis of molecular genetic networks based on an automated text-mining technique. It provides a detailed description of the various types of interactions between genes, proteins, microRNAs, metabolites, cellular components, pathways and diseases, taking into account the specificity of cell lines and organisms. Although the accuracy of ANDSystem is comparable to other well-known text-mining tools, such as Pathway Studio and STRING, it outperforms them in having the ability to identify an increased number of interaction types. Conclusion: The use of ANDSystem, in combination with Pathway Studio and STRING, can improve the quality of the automated reconstruction of molecular and genetic networks. ANDSystem should provide a useful tool for researchers working in a number of different fields, including biology, biotechnology, pharmacology and medicine. PMID:25881313

  17. Nuclear transport adapts to varying heat stress in a multistep mechanism.

    PubMed

    Ogawa, Yutaka; Imamoto, Naoko

    2018-05-10

    Appropriate cell growth conditions are limited to a narrow temperature range. Once the temperature is out of this range, cells respond to protect themselves, but temperature thresholds at which various intracellular responses occur, including nuclear transport systems, remain unclear. Using a newly developed precise temperature shift assay, we found that individual transport pathways have different sensitivities to a rise in temperature. Nuclear translocations of molecular chaperone HSP70s occur at a much lower temperature than the inhibition of Ran-dependent transport. Subsequently, importin (Imp) α/β-dependent import ceases at a lower temperature than other Ran-dependent transport, suggesting that these are controlled by independent mechanisms. In vitro research revealed that the inhibition of Imp α/β-dependent import is caused by the dysfunction of Imp α1 specifically at lower temperature. Thus, the thermosensitivity of Imp α1 modulates transport balances and enables the multistep shutdown of Ran-dependent transport systems according to the degree of heat stress. © 2018 Ogawa and Imamoto.

  18. An Enzyme-Catalyzed Multistep DNA Refolding Mechanism in Hairpin Telomere Formation

    PubMed Central

    Shi, Ke; Huang, Wai Mun; Aihara, Hideki

    2013-01-01

    Hairpin telomeres of bacterial linear chromosomes are generated by a DNA cutting–rejoining enzyme protelomerase. Protelomerase resolves a concatenated dimer of chromosomes as the last step of chromosome replication, converting a palindromic DNA sequence at the junctions between chromosomes into covalently closed hairpins. The mechanism by which protelomerase transforms a duplex DNA substrate into the hairpin telomeres remains largely unknown. We report here a series of crystal structures of the protelomerase TelA bound to DNA that represent distinct stages along the reaction pathway. The structures suggest that TelA converts a linear duplex substrate into hairpin turns via a transient strand-refolding intermediate that involves DNA-base flipping and wobble base-pairs. The extremely compact di-nucleotide hairpin structure of the product is fully stabilized by TelA prior to strand ligation, which drives the reaction to completion. The enzyme-catalyzed, multistep strand refolding is a novel mechanism in DNA rearrangement reactions. PMID:23382649

  19. Multistep Cylindrical Structure Analysis at Normal Incidence Based on Water-Substrate Broadband Metamaterial Absorbers

    NASA Astrophysics Data System (ADS)

    Fang, Chonghua

    2018-01-01

    A new multistep cylindrical structure based on water-substrate broadband metamaterial absorbers is designed to reduce the radar cross-section (RCS) of a conventional rod-shaped object. The proposed configuration consists of two distinct parts: one is a four-step cylindrical metal structure, and the other is a new water-substrate broadband metamaterial absorber. The designed structure reduces the radar cross-section by more than 10 dB from 4.58 to 18.42 GHz, which covers 86.5% of the band from C-band to 20 GHz. The measured results show reasonably good agreement with the simulated ones, which verifies the capability and effectiveness of the proposed design.

  20. The route from problem to solution in multistep continuous flow synthesis of pharmaceutical compounds.

    PubMed

    Bana, Péter; Örkényi, Róbert; Lövei, Klára; Lakó, Ágnes; Túrós, György István; Éles, János; Faigl, Ferenc; Greiner, István

    2017-12-01

    Recent advances in the field of continuous flow chemistry allow the multistep preparation of complex molecules such as APIs (Active Pharmaceutical Ingredients) in a telescoped manner. Numerous examples of laboratory-scale applications are described, which point towards novel manufacturing processes for pharmaceutical compounds, in accordance with recent regulatory, economic and quality guidance. The chemical and technical knowledge gained during these studies is considerable; nevertheless, connecting several individual chemical transformations and the attached analytics and purification holds hidden traps. In this review, we summarize innovative solutions for these challenges, in order to benefit chemists aiming to exploit flow chemistry systems for the synthesis of biologically active molecules. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Automated saccharification assay for determination of digestibility in plant materials.

    PubMed

    Gomez, Leonardo D; Whitehead, Caragh; Barakate, Abdellah; Halpin, Claire; McQueen-Mason, Simon J

    2010-10-27

    Cell wall resistance represents the main barrier to the production of second-generation biofuels. The deconstruction of lignocellulose can provide sugars for the production of fuels or other industrial products through fermentation. Understanding the biochemical basis of the recalcitrance of cell walls to digestion will allow the development of more effective and cost-efficient ways to produce sugars from biomass. One approach is to identify plant genes that play a role in biomass recalcitrance, using association genetics. Such an approach requires a robust and reliable high-throughput (HT) assay for biomass digestibility, which can be used to screen the large numbers of samples involved in such studies. We developed a HT saccharification assay based on a robotic platform that can carry out the enzymatic digestion and quantification of the released sugars in a 96-well plate format. The handling of the biomass powder for weighing and formatting into 96-well plates is performed by a robotic station, where the plant material is ground, delivered to the desired well in the plates and weighed with a precision of 0.1 mg. Once the plates are loaded, an automated liquid handling platform delivers an optional mild pretreatment (< 100°C) followed by enzymatic hydrolysis of the biomass. Aliquots from the hydrolysis are then analyzed for the release of reducing sugar equivalents. The same platform can be used for the comparative evaluation of different enzymes and enzyme cocktails. The sensitivity and reliability of the platform were evaluated by measuring the saccharification of stems from lignin-modified tobacco plants, and the results of automated and manual analyses were compared. The automated assay systems are sensitive, robust and reliable. The system can reliably detect differences in the saccharification of plant tissues, and is able to process large numbers of samples with a minimum amount of human intervention. The automated system uncovered significant increases in the
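
    The data-reduction step behind such a plate assay can be sketched as fitting a glucose standard curve and converting well absorbances into reducing-sugar equivalents per milligram of biomass; the standard concentrations, readings, and masses below are invented illustration values, not data from this study.

```python
# Hedged sketch of the calibration and conversion step for a plate-based assay.
import numpy as np

# Glucose standards (nmol per well) and their plate-reader absorbances (invented).
std_conc = np.array([0, 25, 50, 100, 150, 200], dtype=float)
std_abs = np.array([0.05, 0.18, 0.31, 0.58, 0.84, 1.10])

# Inverse calibration: concentration as a linear function of absorbance.
slope, intercept = np.polyfit(std_abs, std_conc, 1)

def reducing_sugar_nmol(absorbance):
    """Glucose-equivalent nmol per well for a given absorbance reading."""
    return slope * np.asarray(absorbance) + intercept

sample_abs = np.array([0.42, 0.77, 0.12])      # three sample wells
biomass_mg = np.array([4.0, 3.8, 4.1])         # robot-weighed biomass per well
release = reducing_sugar_nmol(sample_abs) / biomass_mg
print("reducing sugar release (nmol/mg):", np.round(release, 1))
```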

  2. Optimization of the Production of Extracellular Polysaccharide from the Shiitake Medicinal Mushroom Lentinus edodes (Agaricomycetes) Using Mutation and a Genetic Algorithm-Coupled Artificial Neural Network (GA-ANN).

    PubMed

    Adeeyo, Adeyemi Ojutalayo; Lateef, Agbaje; Gueguim-Kana, Evariste Bosco

    2016-01-01

    The effects of treatment with ultraviolet (UV) irradiation and acridine orange on exopolysaccharide (EPS) production by a strain of Lentinus edodes were studied. Furthermore, optimization of EPS production in submerged fermentation was studied using a genetic algorithm coupled with an artificial neural network. Exposure to irradiation and acridine orange resulted in improved EPS production (2.783 and 5.548 g/L, respectively) when compared with the wild strain (1.044 g/L), whereas optimization led to further improved productivity (23.21 g/L). The EPS produced by the various strains also demonstrated good DPPH scavenging activities of 45.40-88.90%, and inhibited the growth of Escherichia coli and Klebsiella pneumoniae. This study shows that multistep optimization schemes involving physical-chemical mutation and media optimization can be an attractive strategy for improving the yield of bioactives from medicinal mushrooms. To the best of our knowledge, this report presents the first reference of a multistep approach to optimizing EPS production in L. edodes.
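
    The GA-ANN coupling can be sketched as follows: a neural network is fitted to (medium composition, titre) data and an evolutionary loop then searches the composition space for the predicted optimum. The toy data, variable bounds, and the mutation-only evolutionary loop standing in for the GA are assumptions for illustration.

```python
# Hedged sketch of an ANN yield model searched by an evolutionary loop (GA stand-in).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Toy training data: 3 medium variables (e.g. carbon, nitrogen, pH surrogate) -> titre.
X = rng.uniform(0, 1, size=(80, 3))
y = 10 - 20 * (X[:, 0] - 0.6) ** 2 - 15 * (X[:, 1] - 0.3) ** 2 - 5 * (X[:, 2] - 0.5) ** 2
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0).fit(X, y)

def ga_maximize(predict, n_var=3, pop_size=40, generations=60, p_mut=0.2):
    pop = rng.uniform(0, 1, size=(pop_size, n_var))
    for _ in range(generations):
        scores = predict(pop)
        parents = pop[np.argsort(-scores)[: pop_size // 2]]   # keep the fittest half
        # Mutation-only variation step (a simplification of a full GA).
        children = parents + rng.normal(0, 0.05, parents.shape) * (rng.random(parents.shape) < p_mut)
        pop = np.clip(np.vstack([parents, children]), 0, 1)
    best = pop[np.argmax(predict(pop))]
    return best, predict(best[None, :])[0]

best_medium, best_titre = ga_maximize(model.predict)
print("predicted optimum composition:", np.round(best_medium, 2),
      "predicted titre:", round(float(best_titre), 2))
```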

  3. Engineering biological systems using automated biofoundries.

    PubMed

    Chao, Ran; Mishra, Shekhar; Si, Tong; Zhao, Huimin

    2017-07-01

    Engineered biological systems such as genetic circuits and microbial cell factories have promised to solve many challenges in the modern society. However, the artisanal processes of research and development are slow, expensive, and inconsistent, representing a major obstacle in biotechnology and bioengineering. In recent years, biological foundries or biofoundries have been developed to automate design-build-test engineering cycles in an effort to accelerate these processes. This review summarizes the enabling technologies for such biofoundries as well as their early successes and remaining challenges. Copyright © 2017 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  4. Autonomy and Automation

    NASA Technical Reports Server (NTRS)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  5. Star sub-pixel centroid calculation based on multi-step minimum energy difference method

    NASA Astrophysics Data System (ADS)

    Wang, Duo; Han, YanLi; Sun, Tengfei

    2013-09-01

    The star centroid plays a vital role in celestial navigation. Star images acquired during daytime have a low SNR due to the strong sky background, and the star targets are nearly submerged in the background, which makes centroid localization difficult. Traditional methods, such as the moment method and the weighted centroid calculation method, are simple but have large errors, especially at low SNR; the Gaussian fitting method has high positioning accuracy but is computationally complex. Based on an analysis of the energy distribution in star images, a localization method for star target centroids based on a multi-step minimum energy difference is proposed. The method uses linear superposition to narrow down the centroid area and, within that narrow area, applies a fixed number of interpolations to segment the pixels. It then exploits the symmetry of the stellar energy distribution to locate the centroid tentatively: the current pixel is assumed to be the star centroid, and the difference between the sums of energy in symmetric directions (here the transverse and longitudinal directions) at an equal step length from the current pixel (chosen according to the conditions; a step length of 9 is used in this paper) is calculated; the centroid position in that direction is taken where the minimum difference appears, and the other directions are treated in the same way. Validation comparisons on simulated star images against several traditional methods show that the positioning accuracy of the method reaches 0.001 pixel and that it performs well for computing centroids under low-SNR conditions. At the same time, the method was applied to a star map acquired at a fixed observation site during daytime in the near-infrared band; comparing the results of this method with the known positions of the star shows that the multi-step minimum energy difference method achieves a better
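
    A simplified sketch of the symmetry idea is given below: starting from a coarse centroid estimate, candidate sub-pixel centres are scanned and the one minimising the difference between the summed energy on its two sides (done separately for rows and columns) is kept. The synthetic star, window size, and scan step are assumptions, and simple linear interpolation stands in for the paper's pixel-segmentation scheme.

```python
# Hedged sketch of a minimum-energy-difference sub-pixel centroid estimate.
import numpy as np

def synthetic_star(size=33, cx=16.3, cy=15.7, sigma=1.5, noise=0.02):
    yy, xx = np.mgrid[0:size, 0:size]
    star = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))
    return star + np.random.default_rng(4).normal(0, noise, star.shape)

def min_energy_difference_centroid(img, half=6, step=0.05):
    # Coarse estimate: brightest pixel.
    y0, x0 = np.unravel_index(np.argmax(img), img.shape)

    def axis_centre(profile, c0):
        best_c, best_diff = float(c0), np.inf
        grid = np.arange(len(profile))
        idx = np.arange(1, half + 1)
        for c in np.arange(c0 - 1, c0 + 1 + step, step):       # sub-pixel candidates
            left = np.interp(c - idx, grid, profile).sum()
            right = np.interp(c + idx, grid, profile).sum()
            if abs(left - right) < best_diff:                   # minimum energy difference
                best_diff, best_c = abs(left - right), float(c)
        return best_c

    col_profile = img.sum(axis=0)   # energy summed over rows -> function of x
    row_profile = img.sum(axis=1)   # energy summed over columns -> function of y
    return axis_centre(row_profile, y0), axis_centre(col_profile, x0)

img = synthetic_star()
cy_est, cx_est = min_energy_difference_centroid(img)
print(f"estimated centroid (y, x) = ({cy_est:.2f}, {cx_est:.2f}); true = (15.70, 16.30)")
```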

  6. Multistep modeling of protein structure: application towards refinement of tyr-tRNA synthetase

    NASA Technical Reports Server (NTRS)

    Srinivasan, S.; Shibata, M.; Roychoudhury, M.; Rein, R.

    1987-01-01

    The scope of multistep modeling (MSM) is expanded by adding a least-squares minimization step to the procedure, used to fit a backbone reconstruction consistent with a set of C-alpha coordinates. The analytical solution for the Phi and Psi angles that fits the C-alpha X-ray coordinates is used for tyr-tRNA synthetase. Phi and Psi angles for the regions where the above-mentioned method fails are obtained by minimizing the differences in C-alpha distances between the computed model and the crystal structure in a least-squares sense. We present a stepwise application of this part of MSM to the determination of the complete backbone geometry of the 321 N-terminal residues of tyrosine tRNA synthetase, to a root-mean-square deviation of 0.47 angstroms from the crystallographic C-alpha coordinates.

  7. 149 HCV AND lymphoma: Genetic and epigenetic factors

    PubMed Central

    Zignego, AL; Gragnani, L; Fognani, E; Piluso, A

    2014-01-01

    Over 180 million people worldwide are chronically infected with the hepatitis C virus (HCV). HCV infection is a major cause of hepatocellular carcinoma (HCC); moreover, its association with B-cell lymphoproliferative disorders (LPDs) such as mixed cryoglobulinemia (MC) or B-cell non-Hodgkin lymphoma (B-NHL) is undisputed. The mechanisms by which HCV contributes to LPD development are still poorly understood. Available data suggest that the viral infection may induce LPDs through a multifactorial and multistep process that involves the sustained activation of B cells, abnormal and prolonged B-cell survival, and genetic and/or epigenetic factors. Concerning genetic factors, different authors have reported an association between specific HLA clusters or B-cell activating factor promoter genotypes and a higher risk of developing MC and lymphoma. In addition, the results of a large, ongoing genome-wide association study (GWAS) will probably allow the identification of a specific genetic profile of HCV patients with LPDs. Furthermore, microRNAs (miRNAs) make a major contribution to the pathogenesis of several neoplastic lymphoproliferative diseases, and their involvement in the pathogenesis of HCV-related LPDs is conceivable. We recently showed that specific miRNAs were differently modulated in PBMCs from HCV patients who developed MC and/or NHL. In addition, HCV patients who developed HCC showed differential miRNA regulation. In conclusion, available data suggest that the genetic/epigenetic analysis of HCV-related cancerogenesis is of great usefulness in both the pathogenetic and clinical/translational areas, possibly allowing the definition of diagnostic/prognostic markers for early detection of lymphatic or hepatic cancer.

  8. Genetics Home Reference: tyrosinemia

    MedlinePlus

    ... in the multistep process that breaks down the amino acid tyrosine, a building block of most proteins. If ...

  9. The light spot test: Measuring anxiety in mice in an automated home-cage environment.

    PubMed

    Aarts, Emmeke; Maroteaux, Gregoire; Loos, Maarten; Koopmans, Bastijn; Kovačević, Jovana; Smit, August B; Verhage, Matthijs; Sluis, Sophie van der

    2015-11-01

    Behavioral tests of animals in a controlled experimental setting provide a valuable tool to advance understanding of genotype-phenotype relations, and to study the effects of genetic and environmental manipulations. To optimally benefit from the increasing numbers of genetically engineered mice, reliable high-throughput methods for comprehensive behavioral phenotyping of mice lines have become a necessity. Here, we describe the development and validation of an anxiety test, the light spot test, that allows for unsupervised, automated, high-throughput testing of mice in a home-cage system. This automated behavioral test circumvents bias introduced by pretest handling, and enables recording both baseline behavior and the behavioral test response over a prolonged period of time. We demonstrate that the light spot test induces a behavioral response in C57BL/6J mice. This behavior reverts to baseline when the aversive stimulus is switched off, and is blunted by treatment with the anxiolytic drug Diazepam, demonstrating predictive validity of the assay, and indicating that the observed behavioral response has a significant anxiety component. Also, we investigated the effectiveness of the light spot test as part of sequential testing for different behavioral aspects in the home-cage. Two learning tests, administered prior to the light spot test, affected the light spot test parameters. The light spot test is a novel, automated assay for anxiety-related high-throughput testing of mice in an automated home-cage environment, allowing for both comprehensive behavioral phenotyping of mice, and rapid screening of pharmacological compounds. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Validation of shortened 2-day sterility testing of mesenchymal stem cell-based therapeutic preparation on an automated culture system.

    PubMed

    Lysák, Daniel; Holubová, Monika; Bergerová, Tamara; Vávrová, Monika; Cangemi, Giuseppina Cristina; Ciccocioppo, Rachele; Kruzliak, Peter; Jindra, Pavel

    2016-03-01

    Cell therapy products represent a new trend of treatment in the field of immunotherapy and regenerative medicine. Their biological nature and multistep preparation procedure require the application of complex release criteria and quality control. Microbial contamination of cell therapy products is a potential source of morbidity in recipients. Automated blood culture systems are widely used for the detection of microorganisms in cell therapy products; however, the standard 2-week cultivation period is too long for some cell-based treatments, and alternative methods have to be devised. We tried to verify whether a shortened cultivation of the supernatant from the mesenchymal stem cell (MSC) culture obtained 2 days before the cell harvest could sufficiently detect microbial growth and allow the release of MSC for clinical application. We compared the standard Ph. Eur. cultivation method and the automated blood culture system BACTEC (Becton Dickinson). The time to detection (TTD) and the detection limit were analyzed for three bacterial and two fungal strains. Staphylococcus aureus and Pseudomonas aeruginosa were recognized within 24 h with both methods (detection limit ~10 CFU). The time required for the detection of Bacillus subtilis was shorter with the automated method (TTD 10.3 vs. 60 h for 10-100 CFU). The BACTEC system reached significantly shorter times to detection of Candida albicans and Aspergillus brasiliensis growth compared to the classical method (15.5 vs. 48 and 31.5 vs. 48 h, respectively; 10-100 CFU). Positivity was demonstrated within 48 h in all bottles, regardless of the size of the inoculum. This study validated the automated cultivation system as a method able to detect all tested microorganisms within a 48-h period with a detection limit of ~10 CFU. Only in the case of B. subtilis was the lowest inoculum (~10 CFU) not recognized. The 2-day cultivation technique is then capable of confirming the microbiological safety of MSC and

  11. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up the SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.
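
    For orientation, the generic CADIS relations that such hybrid schemes build on are reproduced below; the MS-CADIS-specific step of deriving the neutron importance function from the final shutdown dose rate response is described in the cited work and is not reproduced here.

```latex
% Generic CADIS relations: R is the response of interest, q the true source,
% phi^dagger the deterministic adjoint flux for that response, q-hat the biased
% source, and w-bar the weight-window target used in the Monte Carlo run.
\[
  R = \int dV \int dE \; q(\vec r, E)\,\phi^{\dagger}(\vec r, E), \qquad
  \hat q(\vec r, E) = \frac{q(\vec r, E)\,\phi^{\dagger}(\vec r, E)}{R}, \qquad
  \bar w(\vec r, E) = \frac{R}{\phi^{\dagger}(\vec r, E)} .
\]
```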

  12. Computational comparison of quantum-mechanical models for multistep direct reactions

    NASA Astrophysics Data System (ADS)

    Koning, A. J.; Akkermans, J. M.

    1993-02-01

    We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmüller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, 90Zr(p,p') at 80 MeV, 209Bi(p,p') at 62 MeV, and 93Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data.
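
    As a schematic reminder of the structure these quantum MSD models share (with the model-dependent kinematic factors and matrix elements suppressed), the n-step cross section is built by folding one-step cross sections over the intermediate energy and angle:

```latex
% Schematic convolution structure common to multistep-direct models; the exact
% prefactors and the definition of the one-step term differ between the
% FKK, TUL and NWY formulations compared in the paper.
\[
  \frac{d^{2}\sigma^{(n)}}{d\Omega\,dE}\bigl(E,\Omega \leftarrow E_{0},\Omega_{0}\bigr)
  \;\propto\;
  \int dE_{1}\!\int d\Omega_{1}\;
  \frac{d^{2}\sigma^{(1)}}{d\Omega\,dE}\bigl(E,\Omega \leftarrow E_{1},\Omega_{1}\bigr)\,
  \frac{d^{2}\sigma^{(n-1)}}{d\Omega_{1}\,dE_{1}}\bigl(E_{1},\Omega_{1} \leftarrow E_{0},\Omega_{0}\bigr).
\]
```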

  13. The fusion of biology, computer science, and engineering: towards efficient and successful synthetic biology.

    PubMed

    Linshiz, Gregory; Goldberg, Alex; Konry, Tania; Hillson, Nathan J

    2012-01-01

    Synthetic biology is a nascent field that emerged in earnest only around the turn of the millennium. It aims to engineer new biological systems and impart new biological functionality, often through genetic modifications. The design and construction of new biological systems is a complex, multistep process, requiring multidisciplinary collaborative efforts from "fusion" scientists who have formal training in computer science or engineering, as well as hands-on biological expertise. The public has high expectations for synthetic biology and eagerly anticipates the development of solutions to the major challenges facing humanity. This article discusses laboratory practices and the conduct of research in synthetic biology. It argues that the fusion science approach, which integrates biology with computer science and engineering best practices, including standardization, process optimization, computer-aided design and laboratory automation, miniaturization, and systematic management, will increase the predictability and reproducibility of experiments and lead to breakthroughs in the construction of new biological systems. The article also discusses several successful fusion projects, including the development of software tools for DNA construction design automation, recursive DNA construction, and the development of integrated microfluidics systems.

  14. A Fully Automated Drosophila Olfactory Classical Conditioning and Testing System for Behavioral Learning and Memory Assessment

    PubMed Central

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal

    2016-01-01

    Background: Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method: The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results: The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods: The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions: This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays.

  15. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    PubMed

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Accurate identification of microseismic P- and S-phase arrivals using the multi-step AIC algorithm

    NASA Astrophysics Data System (ADS)

    Zhu, Mengbo; Wang, Liguan; Liu, Xiaoming; Zhao, Jiaxuan; Peng, Ping'an

    2018-03-01

    Identification of P- and S-phase arrivals is the primary task in microseismic monitoring. In this study, a new multi-step AIC algorithm is proposed. This algorithm consists of P- and S-phase arrival pickers (P-picker and S-picker). The P-picker contains three steps: in step 1, a preliminary P-phase arrival window is determined from the waveform peak. Then a preliminary P-pick is identified using the AIC algorithm. Finally, the P-phase arrival window is narrowed based on the above P-pick, so that the P-phase arrival can be identified accurately by applying the AIC algorithm again. The S-picker contains five steps: in step 1, a narrow S-phase arrival window is determined based on the P-pick and the AIC curve of the amplitude biquadratic time series. In step 2, the S-picker automatically judges whether the S-phase arrival is clear enough to identify. In steps 3 and 4, the AIC extreme points are extracted, and the relationship between the local minima and the S-phase arrival is investigated. In step 5, the S-phase arrival is picked based on the maximum probability criterion. To evaluate the proposed algorithm, a P- and S-pick classification criterion is also established based on a source location numerical simulation. The field data tests show a considerable improvement of the multi-step AIC algorithm in comparison with manual picks and the original AIC algorithm. Furthermore, the technique is robust across SNR levels: even in the poor-quality signal group, in which the SNRs are below 5, the effective picking rates (corresponding to a location error of <15 m) of P- and S-phase arrivals are still up to 80.9% and 76.4%, respectively.
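    The core of both pickers is a window-based AIC picker. The sketch below, a minimal Python/NumPy illustration rather than the authors' implementation, shows that building block together with the two-pass narrowing idea; function names and window sizes are assumptions.

    ```python
    import numpy as np

    def aic_pick(x):
        """Return the index of the AIC minimum within a waveform window.

        Commonly used window-based formulation:
            AIC(k) = k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]))
        The minimum of the curve is taken as the phase-arrival estimate.
        """
        x = np.asarray(x, dtype=float)
        n = len(x)
        aic = np.full(n, np.inf)
        for k in range(2, n - 2):            # avoid one-sample variances
            v1, v2 = np.var(x[:k]), np.var(x[k:])
            if v1 > 0 and v2 > 0:
                aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
        return int(np.argmin(aic))

    def two_step_pick(trace, coarse_slice, half_width=200):
        """Pick on a coarse window, then re-pick on a narrowed window around it."""
        first = coarse_slice.start + aic_pick(trace[coarse_slice])
        lo = max(first - half_width, 0)
        hi = min(first + half_width, len(trace))
        return lo + aic_pick(trace[lo:hi])
    ```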

  17. Rapid discrimination of sea buckthorn berries from different H. rhamnoides subspecies by multi-step IR spectroscopy coupled with multivariate data analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yue; Zhang, Ying; Zhang, Jing; Fan, Gang; Tu, Ya; Sun, Suqin; Shen, Xudong; Li, Qingzhu; Zhang, Yi

    2018-03-01

    As an important ethnic medicine, sea buckthorn has been widely used to prevent and treat various diseases owing to its nutritional and medicinal properties. According to the Chinese Pharmacopoeia, sea buckthorn originates from H. rhamnoides, which comprises five subspecies distributed in China. Confusion and misidentification often occur because of their similar morphology, especially in dried and powdered forms. Additionally, these five subspecies differ markedly in quality and physiological efficacy. This paper focuses on a rapid classification and identification method for sea buckthorn berry powders from five H. rhamnoides subspecies using multi-step IR spectroscopy coupled with multivariate data analysis. The holistic chemical compositions revealed by the FT-IR spectra demonstrated that flavonoids, fatty acids and sugars were the main chemical components. Further, differences in the peaks, positions and intensities of the FT-IR spectra were used to identify the H. rhamnoides subspecies samples. The discrimination was achieved using principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA). The results showed that the combination of multi-step IR spectroscopy and chemometric analysis offers a simple, fast and reliable method for the classification and identification of sea buckthorn berry powders from different H. rhamnoides subspecies.
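    The chemometric step described here (PCA for an unsupervised overview, PLS-DA for class assignment) can be prototyped in a few lines. The sketch below assumes preprocessed spectra in a sample-by-wavenumber matrix and uses scikit-learn; the scaling, component counts and train/test split are illustrative choices, not the authors' protocol.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import LabelBinarizer, StandardScaler

    def pca_plsda(X, labels, n_components=5):
        """PCA overview plus a PLS-DA classifier for spectral class assignment.

        X:      (n_samples, n_wavenumbers) matrix of preprocessed IR spectra
        labels: subspecies name for each spectrum (assumes more than two classes,
                so the binarized target Y is one-hot)
        """
        Xs = StandardScaler().fit_transform(X)

        # Unsupervised overview: scores on the first two principal components
        pca_scores = PCA(n_components=2).fit_transform(Xs)

        # PLS-DA = PLS regression against one-hot class membership
        lb = LabelBinarizer()
        Y = lb.fit_transform(labels)
        X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(
            Xs, Y, labels, test_size=0.3, random_state=0, stratify=labels)
        pls = PLSRegression(n_components=n_components).fit(X_tr, Y_tr)
        pred = lb.classes_[np.argmax(pls.predict(X_te), axis=1)]
        accuracy = float(np.mean(pred == np.asarray(y_te)))
        return pca_scores, pred, accuracy
    ```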

  18. Multistep divergent synthesis of benzimidazole linked benzoxazole/benzothiazole via copper catalyzed domino annulation.

    PubMed

    Liao, Jen-Yu; Selvaraju, Manikandan; Chen, Chih-Hau; Sun, Chung-Ming

    2013-04-21

    An efficient, facile synthesis of structurally diverse benzimidazole-integrated benzoxazoles and benzothiazoles has been developed. In a multi-step synthetic sequence, 4-fluoro-3-nitrobenzoic acid was converted into benzimidazole bis-heterocycles via the intermediacy of benzimidazole-linked ortho-chloro amines. The amphiphilic reactivity of this intermediate was exploited to access the title compounds by the reaction of various acid chlorides and isothiocyanates in a single step, through the in situ formation of ortho-chloro anilides and thioureas under microwave irradiation. A versatile one-pot domino annulation reaction was developed involving the reaction of benzimidazole-linked ortho-chloro amines with acid chlorides and isothiocyanates. The initial acylation and urea formation, followed by copper-catalyzed intramolecular C-O and C-S cross-coupling reactions, furnished the angularly oriented bis-heterocycles, which bear a close resemblance to the Streptomyces antibiotic UK-1.

  19. ActionMap: A web-based software that automates loci assignments to framework maps.

    PubMed

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms was designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  20. ActionMap: a web-based software that automates loci assignments to framework maps

    PubMed Central

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-01-01

    Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms was designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/). PMID:12824426

  1. Genetic approaches of the Fe-S cluster biogenesis process in bacteria: Historical account, methodological aspects and future challenges.

    PubMed

    Py, Béatrice; Barras, Frédéric

    2015-06-01

    Since their discovery in the 1950s, Fe-S cluster proteins have attracted much attention from chemists, biophysicists and biochemists. However, in the 1980s they were joined by geneticists, who helped to realize that in vivo maturation of Fe-S cluster-bound proteins requires the assistance of a large number of factors defining complex multi-step pathways. The question of how clusters are formed and distributed in vivo has since been the focus of much effort. Here we review how genetics, in discovering genes and investigating processes as they unfold in vivo, has driven seminal advances in our understanding of Fe-S cluster biogenesis. The power and limitations of genetic approaches are discussed. As a final comment, we argue that the marriage of classic strategies and new high-throughput technologies should allow the genetics of Fe-S cluster biology to be even more insightful in the future. This article is part of a Special Issue entitled: Fe/S proteins: Analysis, structure, function, biogenesis and diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Spatial mapping reveals multi-step pattern of wound healing in Physarum polycephalum

    NASA Astrophysics Data System (ADS)

    Bäuerle, Felix K.; Kramar, Mirna; Alim, Karen

    2017-11-01

    Wounding is a severe impairment of function, especially for an exposed organism like the network-forming true slime mould Physarum polycephalum. The tubular network making up the organism’s body plan is entirely interconnected and shares a common cytoplasm. Oscillatory contractions of the enclosing tube walls drive the shuttle streaming of the cytoplasm. Cytoplasmic flows underlie the reorganization of the network, for example by movement toward attractive stimuli or away from repellants. Here, we follow the reorganization of P. polycephalum networks after severe wounding. Spatial mapping of the contraction changes in response to wounding reveals a multi-step pattern. Phases of increased activity alternate with cessation of contractions and stalling of flows, giving rise to coordinated transport and growth at the severing site. Overall, severing surprisingly acts like an attractive stimulus enabling healing of severed tubes. The reproducible cessation of contractions arising during this wound-healing response may open up new avenues to investigate the biochemical wiring underlying P. polycephalum’s complex behaviours.

  3. Facilitating Students' Review of the Chemistry of Nitrogen-Containing Heterocyclic Compounds and Their Characterization through Multistep Synthesis of Thieno[2,3-"b"]Pyridine Derivatives

    ERIC Educational Resources Information Center

    Liu, Hanlin; Zaplishnyy, Vladimir; Mikhaylichenko, Lana

    2016-01-01

    A multistep synthesis of thieno[2,3-"b"]pyridine derivatives is described that is suitable for the upper-level undergraduate organic laboratory. This experiment exposes students to various hands-on experimental techniques as well as methods of product characterization such as IR and ¹H NMR spectroscopy, and…

  4. Single- and multistep resistance selection studies on the activity of retapamulin compared to other agents against Staphylococcus aureus and Streptococcus pyogenes.

    PubMed

    Kosowska-Shick, Klaudia; Clark, Catherine; Credito, Kim; McGhee, Pamela; Dewasse, Bonifacio; Bogdanovich, Tatiana; Appelbaum, Peter C

    2006-02-01

    Retapamulin had the lowest rate of spontaneous mutations by single-step passaging and the lowest parent and selected mutant MICs by multistep passaging among all drugs tested for all Staphylococcus aureus strains and three Streptococcus pyogenes strains which yielded resistant clones. Retapamulin has a low potential for resistance selection in S. pyogenes, with a slow and gradual propensity for resistance development in S. aureus.

  5. Single- and Multistep Resistance Selection Studies on the Activity of Retapamulin Compared to Other Agents against Staphylococcus aureus and Streptococcus pyogenes

    PubMed Central

    Kosowska-Shick, Klaudia; Clark, Catherine; Credito, Kim; McGhee, Pamela; Dewasse, Bonifacio; Bogdanovich, Tatiana; Appelbaum, Peter C.

    2006-01-01

    Retapamulin had the lowest rate of spontaneous mutations by single-step passaging and the lowest parent and selected mutant MICs by multistep passaging among all drugs tested for all Staphylococcus aureus strains and three Streptococcus pyogenes strains which yielded resistant clones. Retapamulin has a low potential for resistance selection in S. pyogenes, with a slow and gradual propensity for resistance development in S. aureus. PMID:16436741

  6. Reaction and catalyst engineering to exploit kinetically controlled whole-cell multistep biocatalysis for terminal FAME oxyfunctionalization.

    PubMed

    Schrewe, Manfred; Julsing, Mattijs K; Lange, Kerstin; Czarnotta, Eik; Schmid, Andreas; Bühler, Bruno

    2014-09-01

    The oxyfunctionalization of unactivated C−H bonds can selectively and efficiently be catalyzed by oxygenase-containing whole-cell biocatalysts. Recombinant Escherichia coli W3110 containing the alkane monooxygenase AlkBGT and the outer membrane protein AlkL from Pseudomonas putida GPo1 have been shown to efficiently catalyze the terminal oxyfunctionalization of renewable fatty acid methyl esters yielding bifunctional products of interest for polymer synthesis. In this study, AlkBGTL-containing E. coli W3110 is shown to catalyze the multistep conversion of dodecanoic acid methyl ester (DAME) via terminal alcohol and aldehyde to the acid, exhibiting Michaelis-Menten-type kinetics for each reaction step. In two-liquid phase biotransformations, the product formation pattern was found to be controlled by DAME availability. Supplying DAME as bulk organic phase led to accumulation of the terminal alcohol as the predominant product. Limiting DAME availability via application of bis(2-ethylhexyl)phthalate (BEHP) as organic carrier solvent enabled almost exclusive acid accumulation. Furthermore, utilization of BEHP enhanced catalyst stability by reducing toxic effects of substrate and products. A further shift towards the overoxidized products was achieved by co-expression of the gene encoding the alcohol dehydrogenase AlkJ, which was shown to catalyze efficient and irreversible alcohol to aldehyde oxidation in vivo. With DAME as organic phase, the aldehyde accumulated as main product using resting cells containing AlkBGT, AlkL, as well as AlkJ. This study highlights the versatility of whole-cell biocatalysis for synthesis of industrially relevant bifunctional building blocks and demonstrates how integrated reaction and catalyst engineering can be implemented to control product formation patterns in biocatalytic multistep reactions. © 2014 Wiley Periodicals, Inc.
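    To make the kinetic picture concrete, the sketch below integrates a three-step ester -> alcohol -> aldehyde -> acid cascade in which each step follows Michaelis-Menten kinetics, as described for DAME above. The parameters and time span are placeholders, not values from the study; lowering the first-step rate (i.e., restricting substrate supply, as the carrier solvent does) shifts the simulated product spectrum toward the acid.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative parameters (mM, mM/h) -- NOT values from the study.
    VMAX = {"ester_to_alcohol": 2.0, "alcohol_to_aldehyde": 1.5, "aldehyde_to_acid": 1.0}
    KM   = {"ester_to_alcohol": 0.5, "alcohol_to_aldehyde": 0.3, "aldehyde_to_acid": 0.4}

    def mm(vmax, km, s):
        """Michaelis-Menten rate for a single oxidation step."""
        return vmax * s / (km + s)

    def rhs(t, y):
        ester, alcohol, aldehyde, acid = y
        r1 = mm(VMAX["ester_to_alcohol"], KM["ester_to_alcohol"], ester)
        r2 = mm(VMAX["alcohol_to_aldehyde"], KM["alcohol_to_aldehyde"], alcohol)
        r3 = mm(VMAX["aldehyde_to_acid"], KM["aldehyde_to_acid"], aldehyde)
        return [-r1, r1 - r2, r2 - r3, r3]

    sol = solve_ivp(rhs, (0.0, 10.0), [5.0, 0.0, 0.0, 0.0])   # 5 mM ester, 10 h
    ester, alcohol, aldehyde, acid = sol.y[:, -1]
    print(f"after 10 h: alcohol={alcohol:.2f}, aldehyde={aldehyde:.2f}, acid={acid:.2f} mM")
    ```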

  7. A multi-step approach for testing non-toxic amphiphilic antifouling coatings against marine microfouling at different levels of biological complexity.

    PubMed

    Zecher, Karsten; Aitha, Vishwa Prasad; Heuer, Kirsten; Ahlers, Herbert; Roland, Katrin; Fiedel, Michael; Philipp, Bodo

    2018-03-01

    Marine biofouling on artificial surfaces such as ship hulls or fish farming nets causes enormous economic damage. The time for the development of antifouling coatings can be shortened by reliable laboratory assays. For designing such test systems, it is important that toxic effects can be excluded, that multiple parameters can be addressed simultaneously and that mechanistic aspects can be included. In this study, a multi-step approach for testing antifouling coatings was established employing photoautotrophic biofilm formation of marine microorganisms in micro- and mesocosms. The degree and pattern of biofilm formation were determined by quantification of chlorophyll fluorescence. For the microcosms, co-cultures of diatoms and a heterotrophic bacterium were exposed to fouling-release coatings. For the mesocosms, a novel device was developed that permits parallel quantification of a multitude of coatings under defined conditions with varying degrees of shear stress. Additionally, the antifouling coatings were tested for leaching of potential compounds and finally tested in sea trials. This multi-step approach revealed that the individual steps led to consistent results regarding the antifouling activity of the coatings. Furthermore, the novel mesocosm system can be employed for advanced antifouling analysis, including metagenomic approaches for determining the microbial diversity attaching to different coatings under changing shear forces. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
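    As a toy illustration of the evolutionary-search idea, restricted to device values only (not the circuit-construction language or topology evolution described above), the sketch below uses a simple genetic algorithm to tune the R and C of a first-order low-pass filter toward a target cutoff frequency. The population size, operators and value ranges are arbitrary assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    TARGET_FC = 1.0e3   # desired low-pass cutoff (Hz), illustrative target

    def cutoff(r, c):
        """First-order RC low-pass cutoff frequency."""
        return 1.0 / (2.0 * np.pi * r * c)

    def fitness(genome):
        r, c = genome
        return -abs(np.log10(cutoff(r, c) / TARGET_FC))   # 0 is a perfect match

    def evolve(pop_size=60, generations=200, mut_sigma=0.1):
        # Genomes are (R in ohms, C in farads), sampled log-uniformly.
        pop = 10.0 ** rng.uniform([1.0, -9.0], [6.0, -5.0], size=(pop_size, 2))
        for _ in range(generations):
            scores = np.array([fitness(g) for g in pop])
            # Binary tournament selection
            idx = rng.integers(0, pop_size, size=(pop_size, 2))
            winners = np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])
            # Log-normal mutation of the selected parents' component values
            pop = pop[winners] * 10.0 ** rng.normal(0.0, mut_sigma, size=(pop_size, 2))
        best = pop[np.argmax([fitness(g) for g in pop])]
        return best, cutoff(*best)

    (r_best, c_best), fc = evolve()
    print(f"R = {r_best:.1f} ohm, C = {c_best:.2e} F, cutoff = {fc:.1f} Hz")
    ```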

  9. Multi-step aberrant CpG island hyper-methylation is associated with the progression of adult T-cell leukemia/lymphoma.

    PubMed

    Sato, Hiaki; Oka, Takashi; Shinnou, Yoko; Kondo, Takami; Washio, Kana; Takano, Masayuki; Takata, Katsuyoshi; Morito, Toshiaki; Huang, Xingang; Tamura, Maiko; Kitamura, Yuta; Ohara, Nobuya; Ouchida, Mamoru; Ohshima, Koichi; Shimizu, Kenji; Tanimoto, Mitsune; Takahashi, Kiyoshi; Matsuoka, Masao; Utsunomiya, Atae; Yoshino, Tadashi

    2010-01-01

    Aberrant CpG island methylation contributes to the pathogenesis of various malignancies. However, little is known about the association of epigenetic abnormalities with multistep tumorigenic events in adult T-cell leukemia/lymphoma (ATLL). To determine whether epigenetic abnormalities induce the progression of ATLL, we analyzed the methylation profiles of the SHP1, p15, p16, p73, HCAD, DAPK, hMLH-1, and MGMT genes by methylation-specific PCR assay in 65 ATLL patients. The number of CpG island-methylated genes increased with disease progression, and aberrant hypermethylation in specific genes was detected even in HTLV-1 carriers and correlated with progression to ATLL. The CpG island methylator phenotype (CIMP) was observed most frequently in lymphoma-type ATLL and was also closely associated with the progression and crisis of ATLL. A high number of methylated genes and an increased incidence of CIMP were shown to be unfavorable prognostic factors and correlated with shorter overall survival in Kaplan-Meier analysis. The present findings strongly suggest that the multistep accumulation of aberrant CpG methylation in specific target genes and the presence of CIMP are deeply involved in the crisis, progression, and prognosis of ATLL, and they indicate the value of CpG methylation and CIMP as new diagnostic and prognostic biomarkers.

  10. Multi-Step Aberrant CpG Island Hyper-Methylation Is Associated with the Progression of Adult T–Cell Leukemia/Lymphoma

    PubMed Central

    Sato, Hiaki; Oka, Takashi; Shinnou, Yoko; Kondo, Takami; Washio, Kana; Takano, Masayuki; Takata, Katsuyoshi; Morito, Toshiaki; Huang, Xingang; Tamura, Maiko; Kitamura, Yuta; Ohara, Nobuya; Ouchida, Mamoru; Ohshima, Koichi; Shimizu, Kenji; Tanimoto, Mitsune; Takahashi, Kiyoshi; Matsuoka, Masao; Utsunomiya, Atae; Yoshino, Tadashi

    2010-01-01

    Aberrant CpG island methylation contributes to the pathogenesis of various malignancies. However, little is known about the association of epigenetic abnormalities with multistep tumorigenic events in adult T-cell leukemia/lymphoma (ATLL). To determine whether epigenetic abnormalities induce the progression of ATLL, we analyzed the methylation profiles of the SHP1, p15, p16, p73, HCAD, DAPK, hMLH-1, and MGMT genes by methylation-specific PCR assay in 65 ATLL patients. The number of CpG island-methylated genes increased with disease progression, and aberrant hypermethylation in specific genes was detected even in HTLV-1 carriers and correlated with progression to ATLL. The CpG island methylator phenotype (CIMP) was observed most frequently in lymphoma-type ATLL and was also closely associated with the progression and crisis of ATLL. A high number of methylated genes and an increased incidence of CIMP were shown to be unfavorable prognostic factors and correlated with shorter overall survival in Kaplan-Meier analysis. The present findings strongly suggest that the multistep accumulation of aberrant CpG methylation in specific target genes and the presence of CIMP are deeply involved in the crisis, progression, and prognosis of ATLL, and they indicate the value of CpG methylation and CIMP as new diagnostic and prognostic biomarkers. PMID:20019193

  11. Medical hypothesis: bifunctional genetic-hormonal pathways to breast cancer.

    PubMed

    Davis, D L; Telang, N T; Osborne, M P; Bradlow, H L

    1997-04-01

    As inherited germ line mutations, such as loss of BRCA1 or AT, account for less than 5% of all breast cancer, most cases involve acquired somatic perturbations. Cumulative lifetime exposure to bioavailable estradiol links most known risk factors (except radiation) for breast cancer. Based on a series of recent experimental and epidemiologic findings, we hypothesize that the multistep process of breast carcinogenesis results from exposure to endogenous or exogenous hormones, including phytoestrogens that directly or indirectly alter estrogen metabolism. Xenohormones are defined as xenobiotic materials that modify hormonal production; they can work bifunctionally, through genetic or hormonal paths, depending on the periods and extent of exposure. As for genetic paths, xenohormones can modify DNA structure or function. As for hormonal paths, two distinct mechanisms can influence the potential for aberrant cell growth: compounds can bind directly to endogenous hormone or growth factor receptors, affecting cell proliferation, or they can modify breast cell proliferation by altering the formation of hormone metabolites that influence epithelial-stromal interaction and growth regulation. Beneficial xenohormones, such as indole-3-carbinol, genistein, and other bioflavonoids, may reduce aberrant breast cell proliferation, influence the rate of DNA repair or apoptosis, and thereby influence the genetic or hormonal microenvironments. Upon validation with appropriate in vitro and in vivo studies, biologic markers of the risk for breast cancer, such as hormone metabolites, total bioavailable estradiol, and free radical generators, can enhance cancer detection and prevention.

  12. Genetic Divergence Disclosing a Rapid Prehistorical Dispersion of Native Americans in Central and South America

    PubMed Central

    He, Yungang; Wang, Wei R.; Li, Ran; Wang, Sijia; Jin, Li

    2012-01-01

    An accurate estimate of the divergence time between Native American populations is important for understanding the initial entry and early dispersion of human beings in the New World. Current methods for estimating the genetic divergence time of populations can depart seriously from a linear relationship with the true divergence time when multiple populations differ in size and have undergone significant expansion. Here, to address this problem, we propose a novel measure to estimate the genetic divergence time of populations. Computer simulation revealed that the new measure maintained an excellent linear correlation with the population divergence time in complicated multi-population scenarios with population expansion. Utilizing the new measure and microsatellite data from 21 Native American populations, we investigated the genetic divergences of the Native American populations. The results indicated that genetic divergences between North American populations are greater than those between Central and South American populations. None of the divergences, however, were large enough to constitute convincing evidence supporting the two-wave or multi-wave migration model for the initial entry of human beings into America. The genetic affinity of the Native American populations was further explored using Neighbor-Net, and the genetic divergences suggested that these populations could be categorized into four genetic groups living in four different ecologic zones. The divergence of the population groups suggests that the early dispersion of human beings in America was a multi-step process. Further, the divergences suggest a rapid dispersion of Native Americans in Central and South America after a long standstill period in North America. PMID:22970308

  13. Discovery of novel mGluR1 antagonists: a multistep virtual screening approach based on an SVM model and a pharmacophore hypothesis significantly increases the hit rate and enrichment factor.

    PubMed

    Li, Guo-Bo; Yang, Ling-Ling; Feng, Shan; Zhou, Jian-Ping; Huang, Qi; Xie, Huan-Zhang; Li, Lin-Li; Yang, Sheng-Yong

    2011-03-15

    Development of glutamate non-competitive antagonists of mGluR1 (metabotropic glutamate receptor subtype 1) has attracted increasing attention in recent years due to their potential therapeutic application for various nervous disorders. Since there is no crystal structure reported for mGluR1, ligand-based virtual screening (VS) methods, typically pharmacophore-based VS (PB-VS), are often used for the discovery of mGluR1 antagonists. Nevertheless, PB-VS usually suffers from a relatively low hit rate and enrichment factor. In this investigation, we established a multistep ligand-based VS approach that is based on a support vector machine (SVM) classification model and a pharmacophore model. Performance evaluation of these methods in virtual screening against a large independent test set, M-MDDR, shows that the multistep VS approach significantly increases the hit rate and enrichment factor compared with the individual SB-VS and PB-VS methods. The multistep VS approach was then used to screen several large chemical libraries, including PubChem, Specs, and Enamine. Finally, a total of 20 compounds were selected from the top-ranking compounds and advanced to subsequent in vitro and in vivo studies, the results of which will be reported in the near future. Copyright © 2011 Elsevier Ltd. All rights reserved.
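    A minimal sketch of such a two-stage filter is given below: an SVM trained on fingerprints of known actives and decoys prunes the library first, and the survivors are then ranked by a pharmacophore-fit score. The fingerprints are assumed to be precomputed, and pharmacophore_match is a hypothetical stand-in for the pharmacophore model, which is not described here in enough detail to implement.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def multistep_screen(train_fps, train_labels, library_fps, pharmacophore_match,
                         top_n=20):
        """Two-stage ligand-based virtual screen (SVM filter, then pharmacophore rank).

        train_fps / library_fps : (n, n_bits) arrays of molecular fingerprints
        train_labels            : 1 for known antagonists, 0 for inactives/decoys
        pharmacophore_match     : callable taking a library index and returning a
                                  0-1 pharmacophore fit score (hypothetical stand-in)
        """
        # Stage 1: SVM classification prunes the library to predicted actives.
        clf = SVC(kernel="rbf", probability=True).fit(train_fps, train_labels)
        p_active = clf.predict_proba(library_fps)[:, 1]
        stage1_idx = np.where(p_active > 0.5)[0]

        # Stage 2: rank the survivors by pharmacophore fit, keep the top hits.
        fit = np.array([pharmacophore_match(i) for i in stage1_idx])
        ranked = stage1_idx[np.argsort(-fit)]
        return ranked[:top_n]
    ```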

  14. Automating quantum dot barcode assays using microfluidics and magnetism for the development of a point-of-care device.

    PubMed

    Gao, Yali; Lam, Albert W Y; Chan, Warren C W

    2013-04-24

    The impact of detecting multiple infectious diseases simultaneously at point-of-care with good sensitivity, specificity, and reproducibility would be enormous for containing the spread of diseases in both resource-limited and rich countries. Many barcoding technologies have been introduced for addressing this need as barcodes can be applied to detecting thousands of genetic and protein biomarkers simultaneously. However, the assay process is not automated and is tedious and requires skilled technicians. Barcoding technology is currently limited to use in resource-rich settings. Here we used magnetism and microfluidics technology to automate the multiple steps in a quantum dot barcode assay. The quantum dot-barcoded microbeads are sequentially (a) introduced into the chip, (b) magnetically moved to a stream containing target molecules, (c) moved back to the original stream containing secondary probes, (d) washed, and (e) finally aligned for detection. The assay requires 20 min, has a limit of detection of 1.2 nM, and can detect genetic targets for HIV, hepatitis B, and syphilis. This study provides a simple strategy to automate the entire barcode assay process and moves barcoding technologies one step closer to point-of-care applications.

  15. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, John L.

    1990-01-01

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed of.

  16. Multi-step process for concentrating magnetic particles in waste sludges

    DOEpatents

    Watson, J.L.

    1990-07-10

    This invention involves a multi-step, multi-force process for dewatering sludges which have high concentrations of magnetic particles, such as waste sludges generated during steelmaking. This series of processing steps involves (1) mixing a chemical flocculating agent with the sludge; (2) allowing the particles to aggregate under non-turbulent conditions; (3) subjecting the mixture to a magnetic field which will pull the magnetic aggregates in a selected direction, causing them to form a compacted sludge; (4) preferably, decanting the clarified liquid from the compacted sludge; and (5) using filtration to convert the compacted sludge into a cake having a very high solids content. Steps 2 and 3 should be performed simultaneously. This reduces the treatment time and increases the extent of flocculation and the effectiveness of the process. As partially formed aggregates with active flocculating groups are pulled through the mixture by the magnetic field, they will contact other particles and form larger aggregates. This process can increase the solids concentration of steelmaking sludges in an efficient and economic manner, thereby accomplishing either of two goals: (a) it can convert hazardous wastes into economic resources for recycling as furnace feed material, or (b) it can dramatically reduce the volume of waste material which must be disposed of. 7 figs.

  17. Preparation of molecularly imprinted polymers for strychnine by precipitation polymerization and multistep swelling and polymerization and their application for the selective extraction of strychnine from nux-vomica extract powder.

    PubMed

    Nakamura, Yukari; Matsunaga, Hisami; Haginaka, Jun

    2016-04-01

    Monodisperse molecularly imprinted polymers for strychnine were prepared by precipitation polymerization and multistep swelling and polymerization, respectively. In precipitation polymerization, methacrylic acid and divinylbenzene were used as a functional monomer and crosslinker, respectively, while in multistep swelling and polymerization, methacrylic acid and ethylene glycol dimethacrylate were used as a functional monomer and crosslinker, respectively. The retention and molecular recognition properties of the molecularly imprinted polymers prepared by both methods for strychnine were evaluated using a mixture of sodium phosphate buffer and acetonitrile as a mobile phase by liquid chromatography. In addition to shape recognition, ionic and hydrophobic interactions could affect the retention of strychnine in low acetonitrile content. Furthermore, molecularly imprinted polymers prepared by both methods could selectively recognize strychnine among solutes tested. The retention factors and imprinting factors of strychnine on the molecularly imprinted polymer prepared by precipitation polymerization were 220 and 58, respectively, using 20 mM sodium phosphate buffer (pH 6.0)/acetonitrile (50:50, v/v) as a mobile phase, and those on the molecularly imprinted polymer prepared by multistep swelling and polymerization were 73 and 4.5. These results indicate that precipitation polymerization is suitable for the preparation of a molecularly imprinted polymer for strychnine. Furthermore, the molecularly imprinted polymer could be successfully applied for selective extraction of strychnine in nux-vomica extract powder. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly... Simplified Entry functionality in the Automated Commercial Environment (ACE). Originally, the test was known...) test concerning Automated Commercial Environment (ACE) Simplified Entry (SE test) functionality is...

  19. Optimal installation locations for automated external defibrillators in Taipei 7-Eleven stores: using GIS and a genetic algorithm with a new stirring operator.

    PubMed

    Huang, Chung-Yuan; Wen, Tzai-Hung

    2014-01-01

    Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.

  20. Optimal Installation Locations for Automated External Defibrillators in Taipei 7-Eleven Stores: Using GIS and a Genetic Algorithm with a New Stirring Operator

    PubMed Central

    Wen, Tzai-Hung

    2014-01-01

    Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei. PMID:25045396

  1. Role of home automation in distribution automation and automated meter reading. Topical report, December 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, K.W.

    1994-12-01

    This is one of a series of topical reports dealing with the strategic, technical, and market development of home automation. Particular emphasis is placed upon identifying those aspects of home automation that will impact the gas industry and gas products. Communication standards, market drivers, key organizations, technical implementation, product opportunities, and market growth projections will all be addressed in this or subsequent reports. These reports will also discuss how the gas industry and gas-fired equipment can use home automation technology to benefit the consumer.

  2. Practical interpretation of CYP2D6 haplotypes: Comparison and integration of automated and expert calling.

    PubMed

    Ruaño, Gualberto; Kocherla, Mohan; Graydon, James S; Holford, Theodore R; Makowski, Gregory S; Goethe, John W

    2016-05-01

    We describe a population genetic approach to compare samples interpreted with expert calling (EC) versus automated calling (AC) for CYP2D6 haplotyping. The analysis represents 4812 haplotype calls based on signal data generated by the Luminex xMap analyzers from 2406 patients referred to a high-complexity molecular diagnostics laboratory for CYP450 testing. DNA was extracted from buccal swabs. We compared the results of expert calls (EC) and automated calls (AC) with regard to haplotype number and frequency. The ratio of EC to AC was 1:3. Haplotype frequencies from EC and AC samples were convergent across haplotypes, and their distribution was not statistically different between the groups. Most duplications required EC, as only expansions with homozygous or hemizygous haplotypes could be automatically called. High-complexity laboratories can offer equivalent interpretation to automated calling for non-expanded CYP2D6 loci, and superior interpretation for duplications. We have validated scientific expert calling specified by scoring rules as a standard operating procedure integrated with an automated calling algorithm. The integration of EC with AC is a practical strategy for CYP2D6 clinical haplotyping. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving.

    PubMed

    Hergeth, Sebastian; Lorenz, Lutz; Vilimek, Roman; Krems, Josef F

    2016-05-01

    The feasibility of measuring drivers' automation trust via gaze behavior during highly automated driving was assessed with eye tracking and validated with self-reported automation trust in a driving simulator study. Earlier research from other domains indicates that drivers' automation trust might be inferred from gaze behavior, such as monitoring frequency. The gaze behavior and self-reported automation trust of 35 participants attending to a visually demanding non-driving-related task (NDRT) during highly automated driving was evaluated. The relationship between dispositional, situational, and learned automation trust with gaze behavior was compared. Overall, there was a consistent relationship between drivers' automation trust and gaze behavior. Participants reporting higher automation trust tended to monitor the automation less frequently. Further analyses revealed that higher automation trust was associated with lower monitoring frequency of the automation during NDRTs, and an increase in trust over the experimental session was connected with a decrease in monitoring frequency. We suggest that (a) the current results indicate a negative relationship between drivers' self-reported automation trust and monitoring frequency, (b) gaze behavior provides a more direct measure of automation trust than other behavioral measures, and (c) with further refinement, drivers' automation trust during highly automated driving might be inferred from gaze behavior. Potential applications of this research include the estimation of drivers' automation trust and reliance during highly automated driving. © 2016, Human Factors and Ergonomics Society.

  4. Methods of automated absence seizure detection, interference by stimulation, and possibilities for prediction in genetic absence models.

    PubMed

    van Luijtelaar, Gilles; Lüttjohann, Annika; Makarov, Vladimir V; Maksimenko, Vladimir A; Koronovskii, Alexei A; Hramov, Alexander E

    2016-02-15

    Genetic rat models of childhood absence epilepsy have become instrumental in developing theories on the origin of absence epilepsy, in evaluating new and experimental treatments, and in developing new methods for automatic seizure detection, prediction, and/or interference with seizures. Various methods for automated off-line and on-line analyses of ECoG in rodent models are reviewed, as well as data on how to interfere with spike-wave discharges by different types of invasive and non-invasive electrical, magnetic, and optical brain stimulation. A new method for seizure prediction is also proposed. Many selective and specific methods for off- and on-line spike-wave discharge detection perform excellently, with possibilities to overcome the issue of individual differences. Moreover, electrical deep brain stimulation is rather effective in interrupting ongoing spike-wave discharges at low stimulation intensity. A network-based method is proposed for absence seizure prediction with a high sensitivity but a low selectivity. Solutions that prevent false alarms, integrated in a closed-loop brain stimulation system, open the way for experimental seizure control. The presence of preictal precursor activity detected with state-of-the-art time-frequency and network analyses shows that spike-wave discharges are not caused by sudden and abrupt transitions but that there are detectable dynamic events. Their changes in time-space-frequency characteristics might yield new options for seizure prediction and seizure control. Copyright © 2015 Elsevier B.V. All rights reserved.
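    As a concrete, deliberately simplified illustration of automated off-line spike-wave discharge detection, the sketch below flags intervals of elevated band-limited ECoG power. The frequency band, threshold and minimum duration are illustrative defaults rather than values from the reviewed studies, and practical detectors usually add spike-morphology or network criteria on top of this.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def detect_swd(ecog, fs, band=(7.0, 12.0), thresh_sd=3.0, min_dur=1.0):
        """Flag candidate spike-wave discharges as stretches of elevated band power.

        Band-pass filter, compute a smoothed power envelope, and mark stretches
        exceeding mean + thresh_sd * SD for at least min_dur seconds.
        Returns a list of (onset_s, offset_s) tuples.
        """
        sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
        power = sosfiltfilt(sos, np.asarray(ecog, dtype=float)) ** 2
        win = max(int(0.5 * fs), 1)                            # 0.5 s smoothing window
        envelope = np.convolve(power, np.ones(win) / win, mode="same")
        limit = envelope.mean() + thresh_sd * envelope.std()
        above = envelope > limit

        events, start = [], None
        for i, flag in enumerate(np.append(above, False)):     # sentinel closes last run
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if (i - start) / fs >= min_dur:
                    events.append((start / fs, i / fs))
                start = None
        return events
    ```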

  5. Influence of multi-step washing using Na2EDTA, oxalic acid and phosphoric acid on metal fractionation and spectroscopy characteristics from contaminated soil.

    PubMed

    Wei, Meng; Chen, Jiajun

    2016-11-01

    A multi-step soil washing test using a typical chelating agent (Na2EDTA), an organic acid (oxalic acid), and an inorganic weak acid (phosphoric acid) was conducted to remediate soil contaminated with heavy metals near an arsenic mining area. The aim of the test was to improve the heavy metal removal efficiency and to investigate its influence on metal fractionation and the spectroscopic characteristics of the contaminated soil. The results indicated that the order of the washing steps was critical for the removal efficiencies of the metal fractions, their bioavailability, and their potential mobility, owing to the different dissolution levels of the mineral fractions and the inter-transformation of metal fractions revealed by XRD and FT-IR spectral analyses. The optimal soil washing options were identified as the Na2EDTA-phosphoric-oxalic acid (EPO) and phosphoric-oxalic acid-Na2EDTA (POE) sequences because of their high removal efficiencies (approximately 45% for arsenic and 88% for cadmium) and the minimal harmful effects, as determined from the mobility and bioavailability of the remaining heavy metals based on the metal stability index (IR) and the modified redistribution index.

  6. Robotic Automation of In Vivo Two-Photon Targeted Whole-Cell Patch-Clamp Electrophysiology.

    PubMed

    Annecchino, Luca A; Morris, Alexander R; Copeland, Caroline S; Agabi, Oshiorenoya E; Chadderton, Paul; Schultz, Simon R

    2017-08-30

    Whole-cell patch-clamp electrophysiological recording is a powerful technique for studying cellular function. While in vivo patch-clamp recording has recently benefited from automation, it is normally performed "blind," meaning that throughput for sampling some genetically or morphologically defined cell types is unacceptably low. One solution to this problem is to use two-photon microscopy to target fluorescently labeled neurons. Combining this with robotic automation is difficult, however, as micropipette penetration induces tissue deformation, moving target cells from their initial location. Here we describe a platform for automated two-photon targeted patch-clamp recording, which solves this problem by making use of a closed-loop visual servo algorithm. Our system keeps the target cell in focus while iteratively adjusting the pipette approach trajectory to compensate for tissue motion. We demonstrate platform validation with patch-clamp recordings from a variety of cells in the mouse neocortex and cerebellum. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
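    The closed-loop visual servo idea can be illustrated with a generic proportional correction loop such as the one below. The platform's actual control law, imaging pipeline and manipulator interface are not described in this record, so get_cell_offset and move_pipette are hypothetical stand-ins and the gains and step sizes are arbitrary.

    ```python
    import numpy as np

    def approach_with_visual_servo(get_cell_offset, move_pipette, step_um=2.0,
                                   gain=0.5, tol_um=1.0, max_iters=500):
        """Advance a pipette toward a tracked cell while compensating for drift.

        get_cell_offset(): returns the cell's XYZ displacement (um) from the planned
                           trajectory, e.g. from two-photon image tracking
                           (hypothetical callable, not the authors' API).
        move_pipette(d):   commands a relative XYZ move of the manipulator (um).
        """
        for _ in range(max_iters):
            offset = np.asarray(get_cell_offset(), dtype=float)
            if np.linalg.norm(offset) < tol_um:
                return True                                # tip is on target
            # Proportional correction toward the (possibly moving) cell,
            # plus a small constant advance along the approach axis.
            correction = gain * offset
            advance = np.array([0.0, 0.0, -step_um])
            move_pipette(correction + advance)
        return False
    ```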

  7. Simulation Approach for Timing Analysis of Genetic Logic Circuits.

    PubMed

    Baig, Hasan; Madsen, Jan

    2017-07-21

    Constructing genetic logic circuits is an application of synthetic biology in which parts of the DNA of a living cell are engineered to perform a dedicated Boolean function triggered by an appropriate concentration of certain proteins or by different genetic components. These logic circuits work in a manner similar to electronic logic circuits, but they are much more stochastic and hence much harder to characterize. In this article, we introduce an approach to analyze the threshold value and timing of genetic logic circuits. We show how this approach can be used to analyze the timing behavior of single and cascaded genetic logic circuits. We further analyze the timing sensitivity of circuits by varying the degradation rates and concentrations. Our approach can be used not only to characterize the timing behavior but also to analyze the timing constraints of cascaded genetic logic circuits, a capability that we believe will be important for design automation in synthetic biology.
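    A deterministic toy version of such a timing analysis is sketched below: a single output protein is produced at a constant rate once the input switches on, and the propagation delay is read off as the time at which its concentration crosses a threshold. The rates and threshold are illustrative, and the approach described above additionally handles stochastic behaviour and cascaded gates.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def switching_time(k_syn=5.0, k_deg=0.1, threshold=25.0, t_max=200.0):
        """Time for the output protein of a simple genetic buffer to cross a threshold.

        dP/dt = k_syn - k_deg * P  once the input switches on at t = 0.
        Parameters are illustrative, not characterised genetic parts.
        """
        sol = solve_ivp(lambda t, p: [k_syn - k_deg * p[0]],
                        (0.0, t_max), [0.0], dense_output=True)
        t = np.linspace(0.0, t_max, 5000)
        p = sol.sol(t)[0]
        above = np.nonzero(p >= threshold)[0]
        return t[above[0]] if above.size else None

    print(f"propagation delay ~ {switching_time():.1f} time units")
    ```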

  8. Comparison of manual and automated AmpliSeq™ workflows in the typing of a Somali population with the Precision ID Identity Panel.

    PubMed

    van der Heijden, Suzanne; de Oliveira, Susanne Juel; Kampmann, Marie-Louise; Børsting, Claus; Morling, Niels

    2017-11-01

    The Precision ID Identity Panel was used to type 109 Somali individuals in order to obtain allele frequencies for the Somali population. These frequencies were used to establish a Somali HID-SNP database, which will be used for biostatistical calculations in family and immigration cases. Genotypes obtained with the Precision ID Identity Panel were almost in complete concordance with genotypes obtained with the SNPforID PCR-SBE-CE assay. In seven SNP loci, silent alleles were identified, most of which were previously described in the literature. The project also set out to compare different AmpliSeq™ workflows to investigate the possibility of using automated library building in forensic genetic casework. To do so, the SNP typing of the Somalis was performed using three different workflows: 1) manual library building and sequencing on the Ion PGM™, 2) automated library building using the Biomek® 3000 and sequencing on the Ion PGM™, and 3) automated library building using the Ion Chef™ and sequencing on the Ion S5™. The AmpliSeq™ workflows were compared based on coverage, locus balance, noise, and heterozygote balance. Overall, the Ion Chef™/Ion S5™ workflow gave the best results and required the least hands-on time in the laboratory. However, the Ion Chef™/Ion S5™ workflow was also the most expensive. The number of libraries that may be constructed in one Ion Chef™ library-building run is limited to eight, which is too few for high-throughput workflows. The Biomek® 3000/Ion PGM™ workflow was found to perform similarly to the manual/Ion PGM™ workflow. This argues for the use of automated library building in forensic genetic casework. Automated library building decreases the workload of the laboratory staff, decreases the risk of pipetting errors, and simplifies the daily workflow in forensic genetic laboratories. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Kinetic Analysis for the Multistep Profiles of Organic Reactions: Significance of the Conformational Entropy on the Rate Constants of the Claisen Rearrangement.

    PubMed

    Sumiya, Yosuke; Nagahata, Yutaka; Komatsuzaki, Tamiki; Taketsugu, Tetsuya; Maeda, Satoshi

    2015-12-03

    The significance of kinetic analysis as a tool for understanding the reactivity and selectivity of organic reactions has recently been recognized. However, conventional simulation approaches that solve rate equations numerically are not amenable to multistep reaction profiles consisting of fast and slow elementary steps. Herein, we present an efficient and robust approach for evaluating the overall rate constants of multistep reactions via the recursive contraction of the rate equations to give the overall rate constants for the products and byproducts. This new method was applied to the Claisen rearrangement of allyl vinyl ether, as well as a substituted allyl vinyl ether. Notably, the profiles of these reactions contained 23 and 84 local minima, and 66 and 278 transition states, respectively. The overall rate constant for the Claisen rearrangement of allyl vinyl ether was consistent with the experimental value. The selectivity of the Claisen rearrangement reaction has also been assessed using a substituted allyl vinyl ether. The results of this study showed that the conformational entropy in these flexible chain molecules had a substantial impact on the overall rate constants. This new method could therefore be used to estimate the overall rate constants of various other organic reactions involving flexible molecules.
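    The idea of collapsing a network of elementary steps into one effective constant can be illustrated without the recursive-contraction algorithm itself: treating the network of elementary rate constants as a continuous-time Markov chain, a simple surrogate for the overall rate constant is the inverse of the mean first-passage time from reactant to product, obtained from a small linear solve. The sketch below, with an arbitrary A <-> B -> C example, shows that route; it is not the authors' method and it does not separate products from byproducts.

    ```python
    import numpy as np

    def overall_rate_constant(K, source, sinks):
        """Effective rate constant for reaching the product states.

        K[i, j] is the elementary rate constant for the step j -> i (s^-1).
        The value returned is 1 / (mean first-passage time from `source` to the
        absorbing `sinks`), an illustrative reduction of the multistep network.
        """
        n = K.shape[0]
        transient = [i for i in range(n) if i not in sinks]
        Q = K.astype(float).copy()
        np.fill_diagonal(Q, 0.0)
        np.fill_diagonal(Q, -Q.sum(axis=0))        # columns of a generator sum to zero
        Qt = Q[np.ix_(transient, transient)]
        # Mean first-passage times tau solve Qt^T tau = -1
        tau = np.linalg.solve(Qt.T, -np.ones(len(transient)))
        return 1.0 / tau[transient.index(source)]

    # Example: A <-> B -> C with k1 = 10, k-1 = 2, k2 = 1 (all s^-1)
    K = np.zeros((3, 3))
    K[1, 0], K[0, 1], K[2, 1] = 10.0, 2.0, 1.0
    print(f"k_overall ~ {overall_rate_constant(K, source=0, sinks=[2]):.3f} s^-1")
    ```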

  10. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  11. A semi-automated region of interest detection method in the scintigraphic glomerular filtration rate determination for patients with abnormal low renal function.

    PubMed

    Tian, Cancan; Zheng, Xiujuan; Han, Yuan; Sun, Xiaoguang; Chen, Kewei; Huang, Qiu

    2013-11-01

    This work presents a novel semi-automated renal region-of-interest (ROI) determination method that is user-friendly, time-saving, and yet provides a robust glomerular filtration rate (GFR) estimation highly consistent with the reference method. We reviewed data from 57 patients who underwent 99mTc-diethylenetriaminepentaacetic acid renal scintigraphy and were diagnosed with abnormal renal function. The renal and background ROIs were delineated by the proposed multi-step, semi-automated method, which integrates temporal/morphologic information via visual inspection and computer-aided calculations. The total GFR was estimated using the proposed method (sGFR) performed by 2 junior clinicians (A and B) with 1 and 3 years of experience, respectively (sGFR_a, sGFR_b), and compared with the reference total GFR (rGFR) estimated by a senior clinician with 20 years of experience who manually delineated the kidney and background ROIs. All GFR calculations herein were conducted using the Gates method. Data from 10 patients with unilateral or non-functioning kidneys were excluded from the analysis. For the remaining patients, sGFR correlated well with rGFR (sGFR_a vs. rGFR: r = 0.957, P < 0.001; sGFR_b vs. rGFR: r = 0.951, P < 0.001), and sGFR_a correlated well with sGFR_b (r = 0.997, P < 0.001). Moreover, the Bland-Altman plots for sGFR_a and sGFR_b confirm the high reproducibility of the proposed method between different operators. Finally, the proposed procedure is almost 3 times faster than the routinely used procedure in clinical practice. The results suggest that this method is easy to use, highly reproducible, and accurate in measuring the GFR of patients with low renal function. The method is being further extended to a fully automated procedure.
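
    The abstract cites the Gates method without reproducing it. The Python sketch below shows one commonly quoted form of a Gates-style calculation (background-subtracted renal counts, depth-corrected with Tonnesen-type regressions, expressed as a fraction of the injected dose and converted to GFR by the Gates regression); the attenuation coefficient, depth formulas, regression constants and all count values are placeholder assumptions for illustration and should be checked against a site's own protocol.

    import math

    MU = 0.153  # assumed linear attenuation coefficient for 99mTc in soft tissue (cm^-1)

    def kidney_depths(weight_kg, height_cm):
        """Approximate kidney depths (cm) from Tonnesen-type regressions (assumed form)."""
        right = 13.3 * weight_kg / height_cm + 0.7
        left = 13.2 * weight_kg / height_cm + 0.7
        return right, left

    def gates_gfr(right_counts, left_counts, right_bg, left_bg,
                  pre_syringe, post_syringe, weight_kg, height_cm):
        """Total GFR (mL/min) from background-subtracted, depth-corrected renal counts."""
        d_right, d_left = kidney_depths(weight_kg, height_cm)
        injected = pre_syringe - post_syringe
        uptake = ((right_counts - right_bg) / math.exp(-MU * d_right) +
                  (left_counts - left_bg) / math.exp(-MU * d_left)) / injected
        return 100.0 * uptake * 9.8127 - 6.82519  # Gates regression (assumed constants)

    # Placeholder counts and patient data, for illustration only.
    print(round(gates_gfr(32000, 30000, 6000, 5500,
                          1_500_000, 150_000, weight_kg=70, height_cm=175), 1))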

  12. Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Sindern, Sven; Meyer, F. Michael

    2016-09-01

    Increasing industrial demand for rare earth elements (REEs) stems from the central role they play in advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical and textural complexity of the ores, and there is a need to better understand their salient properties. This is essential not only for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy- and cost-efficient processing of REE ores depends heavily on information about REE deportment that can be made available through automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to the compositional and textural properties of mineral matter. Scanning electron microscopy (SEM), combined with a suitable software package for the acquisition of backscatter electron and X-ray signals, phase assignment and image analysis, is one of the most efficient tools for quantitative mineralogy. The four commercially available SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates the capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce the performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become

  13. Management Planning for Workplace Automation.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  14. Automation in astronomy.

    NASA Technical Reports Server (NTRS)

    Wampler, E. J.

    1972-01-01

    Description and evaluation of the remotely operated Lick Observatory Cassegrain focus of the 120-inch telescope. The experience with this instrument has revealed that an automated system can profoundly change the observer's approach to his work. This makes it difficult to evaluate the 'advantage' of an automated telescope over a conventional instrument. Some of the problems arising with automation in astronomy are discussed.

  15. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  16. Virtual automation.

    PubMed

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow management (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7% to 0.4% of the tubes) and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).

  17. Laboratory automation: total and subtotal.

    PubMed

    Hawker, Charles D

    2007-12-01

    Worldwide, perhaps 2000 or more clinical laboratories have implemented some form of laboratory automation, either a modular automation system, such as for front-end processing, or a total laboratory automation system. This article provides descriptions and examples of these various types of automation. It also presents an outline of how a clinical laboratory that is contemplating automation should approach its decision and the steps it should follow to ensure a successful implementation. Finally, the role of standards in automation is reviewed.

  18. Assessment of an automated capillary system for Plasmodium vivax microsatellite genotyping.

    PubMed

    Manrique, Paulo; Hoshi, Mari; Fasabi, Manuel; Nolasco, Oscar; Yori, Pablo; Calderón, Martiza; Gilman, Robert H; Kosek, Margaret N; Vinetz, Joseph M; Gamboa, Dionicia

    2015-08-21

    Several platforms have been used to generate the primary data for microsatellite analysis of malaria parasite genotypes. Each has relative advantages, but all share the limitation of being time- and cost-intensive. A commercially available automated capillary gel cartridge system was assessed for microsatellite analysis of Plasmodium vivax diversity in the Peruvian Amazon. The reproducibility and accuracy of this system, the QIAxcel, were assessed using a sequenced PCR product of 227 base pairs. This product was measured 42 times; then 27 P. vivax samples from Peruvian Amazon subjects were analyzed with this instrument using five informative microsatellites. Results from the QIAxcel system were compared with a Sanger-type sequencing machine, the ABI PRISM® 3100 Genetic Analyzer. Significant differences were seen between the sequenced amplicons and the results from the QIAxcel instrument. Different runs, plates and cartridges yielded significantly different results. Additionally, allele size decreased with each run by 0.045 bp, or about 1 bp every three plates. The QIAxcel system gave different values from those obtained by the ABI PRISM, and too many (i.e. inaccurate) alleles per locus were also seen with the automated instrument. While P. vivax diversity could generally be estimated using an automated capillary gel cartridge system, the data demonstrate that this system is not sufficiently precise for reliably identifying parasite strains via microsatellite analysis. This conclusion, reached after systematic analysis, rests on both the inadequate precision and the poor reproducibility of the instrument in measuring PCR product size.

  19. Modeling dynamics for oncogenesis encompassing mutations and genetic instability.

    PubMed

    Fassoni, Artur C; Yang, Hyun M

    2018-06-27

    Tumorigenesis has been described as a multistep process, in which each step is associated with a genetic alteration that progressively transforms a normal cell and its descendants into a malignant tumour. In this work, we propose a mathematical model for cancer onset and development, considering three populations: normal, premalignant and cancer cells. The model takes into account three hallmarks of cancer: self-sufficiency in growth signals, insensitivity to anti-growth signals and evasion of apoptosis. By using a nonlinear expression to describe the mutation from premalignant to cancer cells, the model includes genetic instability as an enabling characteristic of tumour progression. A detailed mathematical analysis was performed. The results indicate that apoptosis and the tissue repair system are the first barriers against tumour progression, and that one of these mechanisms must be corrupted for cancer to develop from a single mutant cell. The results also show that the presence of aggressive cancer cells opens the way for the survival of less-adapted premalignant cells. Numerical simulations were performed with parameter values based on experimental data of breast cancer, and the time needed for cancer to reach a detectable size from a single mutant cell was estimated with respect to several parameters. We find that the rates of apoptosis and mutation have a large influence on the pace of tumour progression and on the time it takes for the tumour to become clinically detectable.
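
    The abstract does not state the model equations. As a hedged illustration of the general structure described (three interacting populations with logistic growth, an apoptosis-like loss term and a nonlinear premalignant-to-cancer mutation term standing in for genetic instability), the sketch below integrates a toy system with SciPy; all functional forms and parameter values are assumptions for illustration and are not the authors' model.

    from scipy.integrate import solve_ivp

    r_n, r_p, r_c = 0.10, 0.12, 0.20   # assumed growth rates (1/day)
    K = 1e6                            # shared carrying capacity (cells)
    mu_np = 1e-4                       # normal -> premalignant mutation rate
    mu_pc = 5e-4                       # baseline premalignant -> cancer rate
    a = 2.0                            # nonlinearity standing in for genetic instability
    apoptosis = 0.05                   # extra death rate acting on mutant cells

    def rhs(t, y):
        n, p, c = y
        logistic = 1.0 - (n + p + c) / K
        # Nonlinear mutation term: instability grows with the premalignant burden.
        mutation = mu_pc * p * (1.0 + a * p / K)
        dn = r_n * n * logistic - mu_np * n
        dp = r_p * p * logistic + mu_np * n - mutation - apoptosis * p
        dc = r_c * c * logistic + mutation - apoptosis * c
        return [dn, dp, dc]

    sol = solve_ivp(rhs, (0.0, 3650.0), [1e5, 0.0, 1.0])  # ten years from one cancer cell
    n, p, c = sol.y[:, -1]
    print(f"after 10 years: normal={n:.3g}, premalignant={p:.3g}, cancer={c:.3g}")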

  20. Fast dye salts provide fast access to azidoarene synthons in multi-step one-pot tandem click transformations

    PubMed Central

    Fletcher, James T.; Reilly, Jacquelline E.

    2012-01-01

    This study examined whether commercially available diazonium salts could be used as efficient aromatic azide precursors in one-pot multi-step click transformations. Seven different diazonium salts, including Fast Red RC, Fast Blue B, Fast Corinth V and Variamine Blue B were surveyed under aqueous click reaction conditions of CuSO4/Na ascorbate catalyst with 1:1 t-BuOH:H2O solvent. Two-step tandem reactions with terminal alkyne and diyne co-reactants led to 1,2,3-triazole products in 66%-88% yields, while three-step tandem reactions with trimethylsilyl-protected alkyne and diyne co-reactants led to 1,2,3-triazole products in 61%-78% yields. PMID:22368306

  1. Automated electrotransformation of Escherichia coli on a digital microfluidic platform using bioactivated magnetic beads.

    PubMed

    Moore, J A; Nemat-Gorgani, M; Madison, A C; Sandahl, M A; Punnamaraju, S; Eckhardt, A E; Pollack, M G; Vigneault, F; Church, G M; Fair, R B; Horowitz, M A; Griffin, P B

    2017-01-01

    This paper reports on the use of a digital microfluidic platform to perform multiplex automated genetic engineering (MAGE) cycles on droplets containing Escherichia coli cells. Bioactivated magnetic beads were employed for cell binding, washing, and media exchange in the preparation of electrocompetent cells in the electrowetting-on-dielectric (EWoD) platform. On-cartridge electroporation was used to deliver oligonucleotides into the cells. In addition to the optimization of a magnetic bead-based benchtop protocol for generating and transforming electrocompetent E. coli cells, we report on the implementation of this protocol in a fully automated digital microfluidic platform. Bead-based media exchange and electroporation pulse conditions were optimized on benchtop for transformation frequency to provide initial parameters for microfluidic device trials. Benchtop experiments comparing electrotransformation of free and bead-bound cells are presented. Our results suggest that dielectric shielding intrinsic to bead-bound cells significantly reduces electroporation field exposure efficiency. However, high transformation frequency can be maintained in the presence of magnetic beads through the application of more intense electroporation pulses. As a proof of concept, MAGE cycles were successfully performed on a commercial EWoD cartridge using variations of the optimal magnetic bead-based preparation procedure and pulse conditions determined by the benchtop results. Transformation frequencies up to 22% were achieved on benchtop; this frequency was matched within 1% (21%) by MAGE cycles on the microfluidic device. However, typical frequencies on the device remain lower, averaging 9% with a standard deviation of 9%. The presented results demonstrate the potential of digital microfluidics to perform complex and automated genetic engineering protocols.

  2. Automated electrotransformation of Escherichia coli on a digital microfluidic platform using bioactivated magnetic beads

    PubMed Central

    Moore, J. A.; Nemat-Gorgani, M.; Madison, A. C.; Punnamaraju, S.; Eckhardt, A. E.; Pollack, M. G.; Church, G. M.; Fair, R. B.; Horowitz, M. A.; Griffin, P. B.

    2017-01-01

    This paper reports on the use of a digital microfluidic platform to perform multiplex automated genetic engineering (MAGE) cycles on droplets containing Escherichia coli cells. Bioactivated magnetic beads were employed for cell binding, washing, and media exchange in the preparation of electrocompetent cells in the electrowetting-on-dielectric (EWoD) platform. On-cartridge electroporation was used to deliver oligonucleotides into the cells. In addition to the optimization of a magnetic bead-based benchtop protocol for generating and transforming electrocompetent E. coli cells, we report on the implementation of this protocol in a fully automated digital microfluidic platform. Bead-based media exchange and electroporation pulse conditions were optimized on benchtop for transformation frequency to provide initial parameters for microfluidic device trials. Benchtop experiments comparing electrotransformation of free and bead-bound cells are presented. Our results suggest that dielectric shielding intrinsic to bead-bound cells significantly reduces electroporation field exposure efficiency. However, high transformation frequency can be maintained in the presence of magnetic beads through the application of more intense electroporation pulses. As a proof of concept, MAGE cycles were successfully performed on a commercial EWoD cartridge using variations of the optimal magnetic bead-based preparation procedure and pulse conditions determined by the benchtop results. Transformation frequencies up to 22% were achieved on benchtop; this frequency was matched within 1% (21%) by MAGE cycles on the microfluidic device. However, typical frequencies on the device remain lower, averaging 9% with a standard deviation of 9%. The presented results demonstrate the potential of digital microfluidics to perform complex and automated genetic engineering protocols. PMID:28191268

  3. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    PubMed

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the

  4. Automation of Oklahoma School Library Media Centers: Automation at the Local Level.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City. Library and Learning Resources Section.

    This document outlines a workshop for media specialists--"School Library Automation: Solving the Puzzle"--that is designed to reduce automation anxiety and give a broad overview of the concerns confronting school library media centers planning for or involved in automation. Issues are addressed under the following headings: (1) Levels of School…

  5. Ultra-fast consensus of discrete-time multi-agent systems with multi-step predictive output feedback

    NASA Astrophysics Data System (ADS)

    Zhang, Wenle; Liu, Jianchang

    2016-04-01

    This article addresses the ultra-fast consensus problem of high-order discrete-time multi-agent systems based on a unified consensus framework. A novel multi-step predictive output mechanism is proposed under a directed communication topology containing a spanning tree. By predicting the outputs of a network several steps ahead and adding this information into the consensus protocol, it is shown that the asymptotic convergence factor is improved by a power of q + 1 compared to the routine consensus. The difficult problem of selecting the optimal control gain is solved well by introducing a variable called convergence step. In addition, the ultra-fast formation achievement is studied on the basis of this new consensus protocol. Finally, the ultra-fast consensus with respect to a reference model and robust consensus is discussed. Some simulations are performed to illustrate the effectiveness of the theoretical results.

  6. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    PubMed

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  7. Automation: triumph or trap?

    PubMed

    Smythe, M H

    1997-01-01

    Automation, a hot topic in the laboratory world today, can be a very expensive option. Those who are considering implementing automation can save time and money by examining the issues from the standpoint of an industrial/manufacturing engineer. The engineer not only asks what problems will be solved by automation, but what problems will be created. This article discusses questions that must be asked and answered to ensure that automation efforts will yield real and substantial payoffs.

  8. A multi-step chromatographic strategy to purify three fungal endo-β-glucanases.

    PubMed

    McCarthy, Tracey; Tuohy, Maria G

    2011-01-01

    Fungi and fungal enzymes have traditionally occupied a central role in biotechnology. Understanding the biochemical properties of the variety of enzymes produced by these eukaryotes has been an area of research interest for decades and again more recently due to global interest in greener bio-production technologies. Purification of an individual enzyme allows its unique biochemical and functional properties to be determined, can provide key information as to the role of individual biocatalysts within a complex enzyme system, and can inform both protein engineering and enzyme production strategies in the development of novel green technologies based on fungal biocatalysts. Many enzymes of current biotechnological interest are secreted by fungi into the extracellular culture medium. These crude enzyme mixtures are typically complex, multi-component, and generally also contain other non-enzymatic proteins and secondary metabolites. In this chapter, we describe a multi-step chromatographic strategy required to isolate three new endo-β-glucanases (denoted EG V, EG VI, and EG VII) with activity against cereal mixed-linkage β-glucans from the thermophilic fungus Talaromyces emersonii. This work also illustrates the challenges frequently involved in isolating individual extracellular fungal proteins in general.

  9. Routine human-competitive machine intelligence by means of genetic programming

    NASA Astrophysics Data System (ADS)

    Koza, John R.; Streeter, Matthew J.; Keane, Martin

    2004-01-01

    Genetic programming is a systematic method for getting computers to automatically solve a problem. Genetic programming starts from a high-level statement of what needs to be done and automatically creates a computer program to solve the problem. The paper demonstrates that genetic programming (1) now routinely delivers high-return human-competitive machine intelligence; (2) is an automated invention machine; (3) can automatically create a general solution to a problem in the form of a parameterized topology; and (4) has delivered a progression of qualitatively more substantial results in synchrony with five approximately order-of-magnitude increases in the expenditure of computer time. Recent results involving the automatic synthesis of the topology and sizing of analog electrical circuits and controllers demonstrate these points.

  10. Fuzzy Control/Space Station automation

    NASA Technical Reports Server (NTRS)

    Gersh, Mark

    1990-01-01

    Viewgraphs on fuzzy control/space station automation are presented. Topics covered include: Space Station Freedom (SSF); SSF evolution; factors pointing to automation & robotics (A&R); astronaut office inputs concerning A&R; flight system automation and ground operations applications; transition definition program; and advanced automation software tools.

  11. Complex network analysis of brain functional connectivity under a multi-step cognitive task

    NASA Astrophysics Data System (ADS)

    Cai, Shi-Min; Chen, Wei; Liu, Dong-Bai; Tang, Ming; Chen, Xun

    2017-01-01

    Functional brain networks have been widely studied to understand the relationship between brain organization and behavior. In this paper, we explore the functional connectivity of the brain network under a multi-step cognitive task involving consecutive behaviors, to further understand the effect of behaviors on brain organization. The functional brain networks are constructed from a high spatial and temporal resolution fMRI dataset and analyzed with complex-network approaches. We find that, at the voxel level, the functional brain network shows robust small-worldness and scale-free characteristics, while its assortativity and rich-club organization depend only slightly on the order in which the behaviors are performed. More interestingly, the functional connectivity of the brain network in activated ROIs correlates strongly with the behaviors and clearly depends on the order in which they are performed. These empirical results suggest that brain organization has the generic properties of small-worldness and scale-free characteristics, and that the diverse functional connectivity emerging from activated ROIs is strongly driven by these behavioral activities via the plasticity of the brain.
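
    Small-worldness is commonly quantified by comparing clustering and characteristic path length against a random reference graph. The sketch below runs that comparison with networkx on a synthetic thresholded connectivity matrix; the matrix, the 0.7 threshold and the sigma-style index are illustrative assumptions, not the voxel-level pipeline used in the paper.

    import networkx as nx
    import numpy as np

    # Synthetic symmetric "connectivity" matrix standing in for fMRI correlations.
    rng = np.random.default_rng(0)
    n_nodes = 60
    corr = rng.uniform(0.0, 1.0, size=(n_nodes, n_nodes))
    corr = (corr + corr.T) / 2.0
    np.fill_diagonal(corr, 0.0)

    # Binarise at an assumed threshold and keep the largest connected component.
    G = nx.from_numpy_array((corr > 0.7).astype(int))
    G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
    C = nx.average_clustering(G)
    L = nx.average_shortest_path_length(G)

    # Density-matched random reference graph for the small-world comparison.
    R = nx.erdos_renyi_graph(G.number_of_nodes(), nx.density(G), seed=1)
    R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
    sigma = (C / nx.average_clustering(R)) / (L / nx.average_shortest_path_length(R))
    print(f"clustering C={C:.3f}, path length L={L:.3f}, small-world index ~{sigma:.2f}")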

  12. Multi-step prediction for influenza outbreak by an adjusted long short-term memory.

    PubMed

    Zhang, J; Nawata, K

    2018-05-01

    Influenza results in approximately 3-5 million cases of severe illness and 250 000-500 000 deaths annually. An accurate multi-step-ahead time-series forecasting model is urgently needed to help hospitals dynamically assign beds to influenza patients during each year's varying influenza season, and to aid pharmaceutical companies in formulating flexible manufacturing plans for the influenza vaccine, which differs from year to year. In this study, we utilised four different multi-step prediction algorithms within the long short-term memory (LSTM) framework. The results showed that implementing multiple single-output predictions in a six-layer LSTM structure achieved the best accuracy. The mean absolute percentage errors for two- to 13-step-ahead predictions of the US influenza-like illness rates were all below 15%, averaging 12.930%. To the best of our knowledge, this is the first time that LSTM has been applied and refined to perform multi-step-ahead prediction for influenza outbreaks. This modelling methodology can hopefully be applied in other countries and thereby help prevent and control influenza worldwide.
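
    The six-layer architecture and the "multiple single-output" strategy are only named in the abstract. The sketch below shows one common way to obtain multi-step forecasts from an LSTM in PyTorch: train on single-step targets and roll the prediction forward recursively over a 13-step horizon. The layer sizes, window length and synthetic weekly series are assumptions, not the authors' configuration or data.

    import math
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Synthetic weekly "ILI-rate" series with annual seasonality, scaled to [0, 1].
    t = torch.arange(0, 520, dtype=torch.float32)
    series = 0.5 + 0.4 * torch.sin(2 * math.pi * t / 52) + 0.02 * torch.randn_like(t)

    WINDOW, HORIZON = 26, 13
    X = torch.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
    y = series[WINDOW:]

    class Forecaster(nn.Module):
        def __init__(self, hidden=32, layers=2):
            super().__init__()
            self.lstm = nn.LSTM(1, hidden, num_layers=layers, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                    # x: (batch, window)
            out, _ = self.lstm(x.unsqueeze(-1))  # (batch, window, hidden)
            return self.head(out[:, -1]).squeeze(-1)

    model = Forecaster()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()
    for _ in range(200):                         # short full-batch training loop
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()

    # Recursive multi-step forecast from the last observed window.
    window, preds = series[-WINDOW:].clone(), []
    with torch.no_grad():
        for _ in range(HORIZON):
            nxt = model(window.unsqueeze(0)).item()
            preds.append(nxt)
            window = torch.cat([window[1:], torch.tensor([nxt])])
    print([round(p, 3) for p in preds])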

  13. Automated detection system of single nucleotide polymorphisms using two kinds of functional magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Liu, Hongna; Li, Song; Wang, Zhifei; Li, Zhiyang; Deng, Yan; Wang, Hua; Shi, Zhiyang; He, Nongyue

    2008-11-01

    Single nucleotide polymorphisms (SNPs) comprise the most abundant source of genetic variation in the human genome. Large-scale, genome-wide identification of codominant SNPs, especially those associated with complex diseases, has therefore created the need for a completely high-throughput and automated SNP genotyping method. Herein, we present an automated SNP detection system based on two kinds of functional magnetic nanoparticles (MNPs) and dual-color hybridization. The amino-modified MNPs (NH2-MNPs), functionalized with APTES, were used for direct DNA extraction from whole blood by electrostatic interaction, and PCR was then performed successfully. Furthermore, biotinylated PCR products were captured on streptavidin-coated MNPs (SA-MNPs) and interrogated by hybridization with a pair of dual-color probes to determine the SNP; the genotype of each sample could then be identified simultaneously by scanning the microarray printed with the denatured fluorescent probes. This system provides a rapid, sensitive and highly versatile automated procedure that will greatly facilitate the analysis of different known SNPs in the human genome.

  14. Automated measurement of mouse social behaviors using depth sensing, video tracking, and machine learning.

    PubMed

    Hong, Weizhe; Kennedy, Ann; Burgos-Artizzu, Xavier P; Zelikowsky, Moriel; Navonne, Santiago G; Perona, Pietro; Anderson, David J

    2015-09-22

    A lack of automated, quantitative, and accurate assessment of social behaviors in mammalian animal models has limited progress toward understanding mechanisms underlying social interactions and their disorders such as autism. Here we present a new integrated hardware and software system that combines video tracking, depth sensing, and machine learning for automatic detection and quantification of social behaviors involving close and dynamic interactions between two mice of different coat colors in their home cage. We designed a hardware setup that integrates traditional video cameras with a depth camera, developed computer vision tools to extract the body "pose" of individual animals in a social context, and used a supervised learning algorithm to classify several well-described social behaviors. We validated the robustness of the automated classifiers in various experimental settings and used them to examine how genetic background, such as that of Black and Tan Brachyury (BTBR) mice (a previously reported autism model), influences social behavior. Our integrated approach allows for rapid, automated measurement of social behaviors across diverse experimental designs and also affords the ability to develop new, objective behavioral metrics.

  15. Automated measurement of mouse social behaviors using depth sensing, video tracking, and machine learning

    PubMed Central

    Hong, Weizhe; Kennedy, Ann; Burgos-Artizzu, Xavier P.; Zelikowsky, Moriel; Navonne, Santiago G.; Perona, Pietro; Anderson, David J.

    2015-01-01

    A lack of automated, quantitative, and accurate assessment of social behaviors in mammalian animal models has limited progress toward understanding mechanisms underlying social interactions and their disorders such as autism. Here we present a new integrated hardware and software system that combines video tracking, depth sensing, and machine learning for automatic detection and quantification of social behaviors involving close and dynamic interactions between two mice of different coat colors in their home cage. We designed a hardware setup that integrates traditional video cameras with a depth camera, developed computer vision tools to extract the body “pose” of individual animals in a social context, and used a supervised learning algorithm to classify several well-described social behaviors. We validated the robustness of the automated classifiers in various experimental settings and used them to examine how genetic background, such as that of Black and Tan Brachyury (BTBR) mice (a previously reported autism model), influences social behavior. Our integrated approach allows for rapid, automated measurement of social behaviors across diverse experimental designs and also affords the ability to develop new, objective behavioral metrics. PMID:26354123

  16. View-Invariant Gait Recognition Through Genetic Template Segmentation

    NASA Astrophysics Data System (ADS)

    Isaac, Ebenezer R. H. P.; Elias, Susan; Rajagopalan, Srinivasan; Easwarakumar, K. S.

    2017-08-01

    The template-based, model-free approach provides by far the most successful solution to the gait recognition problem in the literature. Recent work discusses how isolating the head and leg portions of the template increases the performance of a gait recognition system, making it robust against covariates such as clothing and carrying conditions. However, most methods involve a manual definition of the boundaries. The method we propose, genetic template segmentation (GTS), employs the genetic algorithm to automate the boundary selection process. This method was tested on the GEI, GEnI and AEI templates. GEI exhibits the best result when segmented with our approach. Experimental results show that our approach significantly outperforms existing implementations of view-invariant gait recognition.

  17. A next-generation dual-recombinase system for time and host specific targeting of pancreatic cancer

    PubMed Central

    Schachtler, Christina; Zukowska, Magdalena; Eser, Stefan; Feyerabend, Thorsten B.; Paul, Mariel C.; Eser, Philipp; Klein, Sabine; Lowy, Andrew M.; Banerjee, Ruby; Yang, Fangtang; Lee, Chang-Lung; Moding, Everett J.; Kirsch, David G.; Scheideler, Angelika; Alessi, Dario R.; Varela, Ignacio; Bradley, Allan; Kind, Alexander; Schnieke, Angelika E.; Rodewald, Hans-Reimer; Rad, Roland; Schmid, Roland M.; Schneider, Günter; Saur, Dieter

    2014-01-01

    Genetically engineered mouse models (GEMMs) have dramatically improved our understanding of tumor evolution and therapeutic resistance. However, sequential genetic manipulation of gene expression and targeting of the host is almost impossible using conventional Cre-loxP–based models. We have developed an inducible dual-recombinase system by combining flippase-FRT (Flp-FRT) and Cre-loxP recombination technologies to improve GEMMs of pancreatic cancer. This enables investigation of multistep carcinogenesis, genetic manipulation of tumor subpopulations (such as cancer stem cells), selective targeting of the tumor microenvironment and genetic validation of therapeutic targets in autochthonous tumors on a genome-wide scale. As a proof of concept, we performed tumor cell–autonomous and nonautonomous targeting, recapitulated hallmarks of human multistep carcinogenesis, validated genetic therapy by 3-phosphoinositide-dependent protein kinase inactivation as well as cancer cell depletion and show that mast cells in the tumor microenvironment, which had been thought to be key oncogenic players, are dispensable for tumor formation. PMID:25326799

  18. A next-generation dual-recombinase system for time- and host-specific targeting of pancreatic cancer.

    PubMed

    Schönhuber, Nina; Seidler, Barbara; Schuck, Kathleen; Veltkamp, Christian; Schachtler, Christina; Zukowska, Magdalena; Eser, Stefan; Feyerabend, Thorsten B; Paul, Mariel C; Eser, Philipp; Klein, Sabine; Lowy, Andrew M; Banerjee, Ruby; Yang, Fangtang; Lee, Chang-Lung; Moding, Everett J; Kirsch, David G; Scheideler, Angelika; Alessi, Dario R; Varela, Ignacio; Bradley, Allan; Kind, Alexander; Schnieke, Angelika E; Rodewald, Hans-Reimer; Rad, Roland; Schmid, Roland M; Schneider, Günter; Saur, Dieter

    2014-11-01

    Genetically engineered mouse models (GEMMs) have dramatically improved our understanding of tumor evolution and therapeutic resistance. However, sequential genetic manipulation of gene expression and targeting of the host is almost impossible using conventional Cre-loxP-based models. We have developed an inducible dual-recombinase system by combining flippase-FRT (Flp-FRT) and Cre-loxP recombination technologies to improve GEMMs of pancreatic cancer. This enables investigation of multistep carcinogenesis, genetic manipulation of tumor subpopulations (such as cancer stem cells), selective targeting of the tumor microenvironment and genetic validation of therapeutic targets in autochthonous tumors on a genome-wide scale. As a proof of concept, we performed tumor cell-autonomous and nonautonomous targeting, recapitulated hallmarks of human multistep carcinogenesis, validated genetic therapy by 3-phosphoinositide-dependent protein kinase inactivation as well as cancer cell depletion and show that mast cells in the tumor microenvironment, which had been thought to be key oncogenic players, are dispensable for tumor formation.

  19. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    PubMed

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep related or mainly task related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving with automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2 s; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4 s; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  20. Elements of EAF automation processes

    NASA Astrophysics Data System (ADS)

    Ioana, A.; Constantin, N.; Dragna, E. C.

    2017-01-01

    Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the electrical EAF automation system and the thermal EAF automation system. The application of these automation schemes results in a significant reduction in the specific consumption of electrical energy by the Electric Arc Furnace, increased productivity of the Electric Arc Furnace, improved quality of the produced steel, and increased durability of the structural elements of the Electric Arc Furnace.

  1. Thermally Stable Ni-rich Austenite Formed Utilizing Multistep Intercritical Heat Treatment in a Low-Carbon 10 Wt Pct Ni Martensitic Steel

    NASA Astrophysics Data System (ADS)

    Jain, Divya; Isheim, Dieter; Zhang, Xian J.; Ghosh, Gautam; Seidman, David N.

    2017-08-01

    Austenite reversion and the thermal stability attained during the transformation are key to enhanced toughness and blast resistance in transformation-induced-plasticity martensitic steels. We demonstrate that the thermal stability of Ni-stabilized austenite and the kinetics of the transformation can be controlled by forming Ni-rich regions in proximity to pre-existing (retained) austenite. Atom probe tomography (APT), in conjunction with thermodynamic and kinetic modeling, elucidates the role of Ni-rich regions in enhancing the growth kinetics of thermally stable austenite formed utilizing a multistep intercritical (Quench-Lamellarization-Tempering (QLT)-type) heat treatment for a low-carbon 10 wt pct Ni steel. Direct evidence of austenite formation is provided by dilatometry, and the volume fraction is quantified by synchrotron X-ray diffraction. The results indicate the growth of nm-thick austenite layers during the second intercritical tempering treatment (T-step) at 863 K (590 °C), with austenite retained from the first intercritical treatment (L-step) at 923 K (650 °C) acting as a nucleation template. For the first time, the thermal stability of austenite is quantified with respect to its compositional evolution during the multistep intercritical treatment of these steels. Austenite compositions measured by APT are used in combination with the thermodynamic and kinetic approach formulated by Ghosh and Olson to assess thermal stability and predict the martensite-start temperature. This approach is particularly useful because empirical relations cannot be extrapolated to the highly Ni-enriched austenite investigated in the present study.

  2. Automated analysis of food-borne pathogens using a novel microbial cell culture, sensing and classification system.

    PubMed

    Xiang, Kun; Li, Yinglei; Ford, William; Land, Walker; Schaffer, J David; Congdon, Robert; Zhang, Jing; Sadik, Omowunmi

    2016-02-21

    We hereby report the design and implementation of an Autonomous Microbial Cell Culture and Classification (AMC(3)) system for rapid detection of food pathogens. Traditional food testing methods require multistep procedures and long incubation periods, and are thus prone to human error. AMC(3) introduces a "one-click" approach to the detection and classification of pathogenic bacteria: once the cultured materials are prepared, all operations are automatic. AMC(3) is an integrated sensor array platform in a microbial fuel cell system composed of a multi-potentiostat, an automated data collection system (a Python program with a Yocto Maxi-coupler electromechanical relay module) and a powerful classification program. The classification scheme consists of a Probabilistic Neural Network (PNN), Support Vector Machines (SVM) and a General Regression Neural Network (GRNN) oracle-based system. Differential Pulse Voltammetry (DPV) is performed on standard or unknown samples. Then, using preset feature extraction and quality control, accepted data are analyzed by the intelligent classification system. In a typical use, thirty-two extracted features were analyzed to correctly classify the following pathogens: Escherichia coli ATCC#25922, Escherichia coli ATCC#11775, and Staphylococcus epidermidis ATCC#12228. An accuracy of 85.4% was recorded for unknown samples, within a shorter time period than the industry standard of 24 hours.
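
    The PNN/SVM/GRNN oracle stack is described only at a high level, so the sketch below uses a reduced stand-in: a soft-voting ensemble over synthetic 32-feature voltammetry vectors in scikit-learn, with a k-nearest-neighbours model substituting for the PNN/GRNN members. The feature data, class structure and voting scheme are illustrative assumptions, not the AMC(3) implementation.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import VotingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic stand-in for 32-feature DPV vectors from three bacterial classes.
    X, y = make_classification(n_samples=300, n_features=32, n_informative=10,
                               n_classes=3, n_clusters_per_class=1, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # Probability-calibrated SVM plus a kNN member, combined by soft voting.
    ensemble = VotingClassifier(
        estimators=[
            ("svm", make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))),
            ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))),
        ],
        voting="soft",
    )
    ensemble.fit(X_tr, y_tr)
    print(f"held-out accuracy: {ensemble.score(X_te, y_te):.3f}")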

  3. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of... Automated Commercial Environment (ACE). The test's participant selection criteria are modified to reflect... (NCAP) test concerning Automated Commercial Environment (ACE) Simplified Entry functionality (Simplified...

  4. Global proteomic profiling in multistep hepatocarcinogenesis and identification of PARP1 as a novel molecular marker in hepatocellular carcinoma

    PubMed Central

    Wang, Jianguo; Xie, Haiyang; Li, Jie; Cao, Jili; Zhou, Lin; Zheng, Shusen

    2016-01-01

    More accurate biomarkers have long been desired for hepatocellular carcinoma (HCC). Here, we characterized the global large-scale proteomics of multistep hepatocarcinogenesis in an attempt to identify novel biomarkers for HCC. Quantitative data on 37,874 sequences and 3,017 proteins during hepatocarcinogenesis were obtained in cohort 1 of 75 samples (5 pooled groups: normal livers, hepatitis livers, cirrhotic livers, peritumoral livers, and HCC tissues) by iTRAQ 2D LC-MS/MS. The diagnostic performance of the top six most upregulated proteins in the HCC group, with HSP70 as a reference, was subsequently validated in cohort 2 of 114 samples (spanning hepatocarcinogenesis from normal livers to HCC) using immunohistochemistry. Of the seven candidate protein markers, PARP1, GS and NDRG1 showed the best diagnostic performance for HCC. PARP1, as a novel marker, showed diagnostic performance comparable to that of the classic markers GS and NDRG1 in HCC (AUCs = 0.872, 0.856 and 0.792, respectively). A significantly higher AUC of 0.945 was achieved when the three markers were combined. For the diagnosis of HCC, the sensitivity and specificity were 88.2% and 81.0% when at least two of the markers were positive. Similar diagnostic values of PARP1, GS and NDRG1 were confirmed by immunohistochemistry in cohort 3 of 180 HCC patients. Further analysis indicated that PARP1 and NDRG1 were associated with some clinicopathological features and were independent prognostic factors for HCC patients. Overall, a global large-scale proteomic profile spanning multistep hepatocarcinogenesis was obtained. PARP1 is a novel, promising diagnostic/prognostic marker for HCC, and a three-marker panel (PARP1, GS and NDRG1) with excellent diagnostic performance for HCC was established. PMID:26883192
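
    The abstract reports single-marker and combined AUCs without stating the combination rule. One standard approach is to fuse the marker scores with logistic regression and evaluate the fused score by ROC AUC; the sketch below does this on entirely synthetic marker data (in-sample, for illustration only), so the numbers it prints bear no relation to the study's results.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 200
    labels = rng.integers(0, 2, size=n)  # 1 = HCC, 0 = non-tumour liver (synthetic)

    # Synthetic stand-ins for PARP1, GS and NDRG1 staining scores: HCC cases are
    # shifted upwards to mimic markers of moderate individual accuracy.
    markers = rng.normal(size=(n, 3)) + labels[:, None] * np.array([1.2, 1.1, 0.8])

    for name, scores in zip(["PARP1", "GS", "NDRG1"], markers.T):
        print(name, "AUC:", round(roc_auc_score(labels, scores), 3))

    # Fuse the three scores with logistic regression and score the fused output.
    combined = LogisticRegression().fit(markers, labels).predict_proba(markers)[:, 1]
    print("combined AUC:", round(roc_auc_score(labels, combined), 3))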

  5. Automation in College Libraries.

    ERIC Educational Resources Information Center

    Werking, Richard Hume

    1991-01-01

    Reports the results of a survey of the "Bowdoin List" group of liberal arts colleges. The survey obtained information about (1) automation modules in place and when they had been installed; (2) financing of automation and its impacts on the library budgets; and (3) library director's views on library automation and the nature of the…

  6. Automation: Decision Aid or Decision Maker?

    NASA Technical Reports Server (NTRS)

    Skitka, Linda J.

    1998-01-01

    This study clarified that automation bias is something unique to automated decision-making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events in the same tasks with and without an automated decision aid, we were able to determine that at least the omission-error component of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid led to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate: people performed almost exactly at the level of reliability of the automation (which, across events, was 88% reliable). However, it is also clear that the presence of less-than-100%-accurate automated decision aids creates a context in which new kinds of decision-making errors can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.

  7. Semi-automated quantitative Drosophila wings measurements.

    PubMed

    Loh, Sheng Yang Michael; Ogawa, Yoshitaka; Kawana, Sara; Tamura, Koichiro; Lee, Hwee Kuan

    2017-06-28

    Drosophila melanogaster is an important organism used in many fields of biological research such as genetics and developmental biology. Drosophila wings have been widely used to study the genetics of development, morphometrics and evolution. Therefore there is much interest in quantifying wing structures of Drosophila. Advancement in technology has increased the ease in which images of Drosophila can be acquired. However such studies have been limited by the slow and tedious process of acquiring phenotypic data. We have developed a system that automatically detects and measures key points and vein segments on a Drosophila wing. Key points are detected by performing image transformations and template matching on Drosophila wing images while vein segments are detected using an Active Contour algorithm. The accuracy of our key point detection was compared against key point annotations of users. We also performed key point detection using different training data sets of Drosophila wing images. We compared our software with an existing automated image analysis system for Drosophila wings and showed that our system performs better than the state of the art. Vein segments were manually measured and compared against the measurements obtained from our system. Our system was able to detect specific key points and vein segments from Drosophila wing images with high accuracy.
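
    Key-point detection by template matching can be illustrated compactly with OpenCV. The snippet below matches a small landmark template against a wing image and reports the best-scoring location; the file names, score threshold and single-scale matching are placeholder assumptions and do not reproduce the authors' full pipeline, which also applies image transformations and an Active Contour step for vein segments.

    import cv2

    # Placeholder inputs: a greyscale wing image and a small landmark template.
    wing = cv2.imread("wing.png", cv2.IMREAD_GRAYSCALE)
    template = cv2.imread("landmark_template.png", cv2.IMREAD_GRAYSCALE)
    if wing is None or template is None:
        raise SystemExit("provide wing.png and landmark_template.png")

    # Normalised cross-correlation; the peak gives the most likely key-point location.
    response = cv2.matchTemplate(wing, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(response)

    h, w = template.shape
    keypoint = (max_loc[0] + w // 2, max_loc[1] + h // 2)  # centre of the best match
    print(f"key point at {keypoint}, match score {max_val:.2f}")

    if max_val < 0.6:  # assumed threshold; weak matches may need a multi-scale search
        print("warning: weak match; consider rescaling the template")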

  8. Human genetic resistance to malaria.

    PubMed

    Williams, Thomas N

    2009-01-01

    This brief chapter highlights the need for caution when designing and interpreting studies aimed at seeking new genes that may be associated with malaria protection, or investigating the potential mechanisms for protection in promising candidates. Judging genetic effects on the basis of the wrong clinical phenotype and missing true protective genes because their protective effects are masked by unpredictable epistatic effects are major potential pitfalls. These issues are by no means unique to malaria: in recent years, the importance of larger sample sizes and careful phenotypic definitions have become appreciated increasingly, particularly for genome-wide studies of complex diseases (Cordell and Clayton, 2005; Burton, Tobin and Hopper, 2005). Until recently, research in the field of malaria genetics has not enjoyed the sort of funding afforded to similar work investigating diseases of importance to the developed world. However, in the last few years, coupled with advances in genetic diagnostics that have led to massive automation and falling costs per gene explored, momentum has grown towards more generous funding that brings with it the opportunity for much larger, multisite cohesive studies. The stage is set for a giant leap forward in the coming years.

  9. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation need are discussed. PMID:18925018

  11. Managing laboratory automation.

    PubMed

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation need are discussed.

  12. The Science of Home Automation

    NASA Astrophysics Data System (ADS)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
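
    One of the unsupervised alternatives mentioned is a modified DBSCAN that uses Dynamic Time Warping (DTW) as its distance metric. The sketch below shows the unmodified version of that idea: a plain DTW distance matrix over short synthetic sensor sequences of unequal length, fed to scikit-learn's DBSCAN with a precomputed metric. The DTW implementation, the synthetic sequences and the eps/min_samples values are illustrative assumptions, not the CARL pipeline.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def dtw(a, b):
        """Classic O(len(a) * len(b)) dynamic time warping distance."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m]

    # Synthetic stand-ins for per-activity sensor traces of unequal length.
    rng = np.random.default_rng(1)
    sequences = (
        [np.sin(np.linspace(0, 3, rng.integers(20, 30))) + 0.02 * rng.normal(size=1)
         for _ in range(5)] +
        [np.cos(np.linspace(0, 3, rng.integers(20, 30))) + 0.02 * rng.normal(size=1)
         for _ in range(5)]
    )

    # Pairwise DTW distances, then DBSCAN on the precomputed matrix.
    n = len(sequences)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = dtw(sequences[i], sequences[j])

    labels = DBSCAN(eps=5.0, min_samples=2, metric="precomputed").fit_predict(dist)
    print("cluster labels:", labels)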

  13. Extracting genetic alteration information for personalized cancer therapy from ClinicalTrials.gov

    PubMed Central

    Xu, Jun; Lee, Hee-Jin; Zeng, Jia; Wu, Yonghui; Zhang, Yaoyun; Huang, Liang-Chin; Johnson, Amber; Holla, Vijaykumar; Bailey, Ann M; Cohen, Trevor; Meric-Bernstam, Funda; Bernstam, Elmer V

    2016-01-01

    Objective: Clinical trials investigating drugs that target specific genetic alterations in tumors are important for promoting personalized cancer therapy. The goal of this project is to create a knowledge base of cancer treatment trials with annotations about genetic alterations from ClinicalTrials.gov. Methods: We developed a semi-automatic framework that combines advanced text-processing techniques with manual review to curate genetic alteration information in cancer trials. The framework consists of a document classification system to identify cancer treatment trials from ClinicalTrials.gov and an information extraction system to extract gene and alteration pairs from the Title and Eligibility Criteria sections of clinical trials. By applying the framework to trials at ClinicalTrials.gov, we created a knowledge base of cancer treatment trials with genetic alteration annotations. We then evaluated each component of the framework against manually reviewed sets of clinical trials and generated descriptive statistics of the knowledge base. Results and Discussion: The automated cancer treatment trial identification system achieved a high precision of 0.9944. Together with the manual review process, it identified 20 193 cancer treatment trials from ClinicalTrials.gov. The automated gene-alteration extraction system achieved a precision of 0.8300 and a recall of 0.6803. After validation by manual review, we generated a knowledge base of 2024 cancer trials that are labeled with specific genetic alteration information. Analysis of the knowledge base revealed the trend of increased use of targeted therapy for cancer, as well as top frequent gene-alteration pairs of interest. We expect this knowledge base to be a valuable resource for physicians and patients who are seeking information about personalized cancer therapy. PMID:27013523
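
    The precision figures above (0.9944 for trial identification, 0.8300 for pair extraction) and the recall of 0.6803 follow the usual set-based definitions over extracted versus manually reviewed items. A minimal sketch of that evaluation for (gene, alteration) pairs, using hypothetical example pairs rather than entries from the knowledge base, is:

      # Precision and recall of extracted (gene, alteration) pairs against a
      # manually reviewed gold standard; the example pairs are illustrative only.
      def precision_recall(extracted, gold):
          extracted, gold = set(extracted), set(gold)
          tp = len(extracted & gold)
          precision = tp / len(extracted) if extracted else 0.0
          recall = tp / len(gold) if gold else 0.0
          return precision, recall

      extracted = [("EGFR", "L858R"), ("BRAF", "V600E"), ("KRAS", "G12D")]
      gold      = [("EGFR", "L858R"), ("BRAF", "V600E"), ("ALK", "fusion")]
      print(precision_recall(extracted, gold))  # (0.666..., 0.666...)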

  14. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  15. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  16. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The Automated Engineering Design (AED) system is reviewed; it consists of a high-level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem- and user-oriented languages. Software production phases are diagrammed, and factors which inhibit effective documentation are evaluated.

  17. The Automated Office.

    ERIC Educational Resources Information Center

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  18. Exact free vibration of multi-step Timoshenko beam system with several attachments

    NASA Astrophysics Data System (ADS)

    Farghaly, S. H.; El-Sayed, T. A.

    2016-05-01

    This paper deals with the analysis of the natural frequencies and mode shapes of an axially loaded multi-step Timoshenko beam combined system carrying several attachments. The influence of system design and the proposed sub-system non-dimensional parameters on the combined system characteristics is the major focus of this investigation. The effects of material properties, rotary inertia and shear deformation of the beam system for each span are included. The end masses are elastically supported against rotation and translation at an offset point from the point of attachment. A sub-system having two degrees of freedom is located at the beam ends and at any of the intermediate stations and acts as a support and/or a suspension. The boundary conditions of the ordinary differential equation governing the lateral deflections and slope due to bending of the beam system, including the shear force term due to the sub-system, have been formulated. Exact global coefficient matrices for the combined modal frequencies, the modal shape and for the discrete sub-system have been derived. Based on these formulae, detailed parametric studies of the combined system are carried out. The applied mathematical model is valid for a wide range of applications, especially in the mechanical, naval and structural engineering fields.
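
    For reference, the lateral deflection w(x, t) and bending rotation psi(x, t) of a single uniform Timoshenko span are governed, in the standard textbook form (i.e., before the axial-load and sub-system shear terms the paper adds, and with sign conventions that vary between texts), by

      \kappa G A\left(\frac{\partial^{2} w}{\partial x^{2}}-\frac{\partial \psi}{\partial x}\right)=\rho A\,\frac{\partial^{2} w}{\partial t^{2}},\qquad
      E I\,\frac{\partial^{2} \psi}{\partial x^{2}}+\kappa G A\left(\frac{\partial w}{\partial x}-\psi\right)=\rho I\,\frac{\partial^{2} \psi}{\partial t^{2}},

    where EI is the flexural rigidity, kappa G A the shear rigidity, rho the mass density, and A and I the cross-sectional area and second moment of area. In the multi-step, multi-attachment formulation described above, the axial force, the elastically supported end masses, and the two-degree-of-freedom sub-systems enter through additional terms and through the boundary and continuity conditions imposed at each step.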

  19. Automated processing pipeline for neonatal diffusion MRI in the developing Human Connectome Project.

    PubMed

    Bastiani, Matteo; Andersson, Jesper L R; Cordero-Grande, Lucilio; Murgasova, Maria; Hutter, Jana; Price, Anthony N; Makropoulos, Antonios; Fitzgibbon, Sean P; Hughes, Emer; Rueckert, Daniel; Victor, Suresh; Rutherford, Mary; Edwards, A David; Smith, Stephen M; Tournier, Jacques-Donald; Hajnal, Joseph V; Jbabdi, Saad; Sotiropoulos, Stamatios N

    2018-05-28

    The developing Human Connectome Project is set to create and make available to the scientific community a 4-dimensional map of functional and structural cerebral connectivity from 20 to 44 weeks post-menstrual age, to allow exploration of the genetic and environmental influences on brain development, and the relation between connectivity and neurocognitive function. A large set of multi-modal MRI data from fetuses and newborn infants is currently being acquired, along with genetic, clinical and developmental information. In this overview, we describe the neonatal diffusion MRI (dMRI) image processing pipeline and the structural connectivity aspect of the project. Neonatal dMRI data poses specific challenges, and standard analysis techniques used for adult data are not directly applicable. We have developed a processing pipeline that deals directly with neonatal-specific issues, such as severe motion and motion-related artefacts, small brain sizes, high brain water content and reduced anisotropy. This pipeline allows automated analysis of in-vivo dMRI data, probes tissue microstructure, reconstructs a number of major white matter tracts, and includes an automated quality control framework that identifies processing issues or inconsistencies. We here describe the pipeline and present an exemplar analysis of data from 140 infants imaged at 38-44 weeks post-menstrual age. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  1. Complex supramolecular interfacial tessellation through convergent multi-step reaction of a dissymmetric simple organic precursor

    NASA Astrophysics Data System (ADS)

    Zhang, Yi-Qi; Paszkiewicz, Mateusz; Du, Ping; Zhang, Liding; Lin, Tao; Chen, Zhi; Klyatskaya, Svetlana; Ruben, Mario; Seitsonen, Ari P.; Barth, Johannes V.; Klappenberger, Florian

    2018-03-01

    Interfacial supramolecular self-assembly represents a powerful tool for constructing regular and quasicrystalline materials. In particular, complex two-dimensional molecular tessellations, such as semi-regular Archimedean tilings with regular polygons, promise unique properties related to their nontrivial structures. However, their formation is challenging, because current methods are largely limited to the direct assembly of precursors, that is, where structure formation relies on molecular interactions without using chemical transformations. Here, we have chosen ethynyl-iodophenanthrene (which features dissymmetry in both geometry and reactivity) as a single starting precursor to generate the rare semi-regular (3.4.6.4) Archimedean tiling with long-range order on an atomically flat substrate through a multi-step reaction. Intriguingly, the individual chemical transformations converge to form a symmetric alkynyl-Ag-alkynyl complex as the new tecton in high yields. Using a combination of microscopy and X-ray spectroscopy tools, as well as computational modelling, we show that in situ generated catalytic Ag complexes mediate the tecton conversion.

  2. Understanding human management of automation errors.

    PubMed

    McBride, Sara E; Rogers, Wendy A; Fisk, Arthur D

    2014-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance.

  3. An integrated approach to characterize genetic interaction networks in yeast metabolism

    PubMed Central

    Szappanos, Balázs; Kovács, Károly; Szamecz, Béla; Honti, Frantisek; Costanzo, Michael; Baryshnikova, Anastasia; Gelius-Dietrich, Gabriel; Lercher, Martin J.; Jelasity, Márk; Myers, Chad L.; Andrews, Brenda J.; Boone, Charles; Oliver, Stephen G.; Pál, Csaba; Papp, Balázs

    2011-01-01

    Intense experimental and theoretical efforts have been made to globally map genetic interactions, yet we still do not understand how gene-gene interactions arise from the operation of biomolecular networks. To bridge the gap between empirical and computational studies, we: i) quantitatively measure genetic interactions between ~185,000 metabolic gene pairs in Saccharomyces cerevisiae, ii) superpose the data on a detailed systems biology model of metabolism, and iii) introduce a machine-learning method to reconcile empirical interaction data with model predictions. We systematically investigate the relative impacts of functional modularity and metabolic flux coupling on the distribution of negative and positive genetic interactions. We also provide a mechanistic explanation for the link between the degree of genetic interaction, pleiotropy, and gene dispensability. Last, we demonstrate the feasibility of automated metabolic model refinement by correcting misannotations in NAD biosynthesis and confirming them by in vivo experiments. PMID:21623372
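
    The genetic interaction scores underlying such screens are conventionally defined against a multiplicative null model; assuming the study follows that common convention (its exact scoring pipeline may differ), the interaction between genes x and y is

      \varepsilon_{xy} = f_{xy} - f_{x}\, f_{y},

    where f_x, f_y, and f_{xy} are the measured fitness values (e.g., colony-size-based growth relative to wild type) of the single and double mutants; epsilon < 0 marks a negative (aggravating) interaction and epsilon > 0 a positive (alleviating) one.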

  4. A Versatile Microfluidic Device for Automating Synthetic Biology.

    PubMed

    Shih, Steve C C; Goyal, Garima; Kim, Peter W; Koutsoubelis, Nicolas; Keasling, Jay D; Adams, Paul D; Hillson, Nathan J; Singh, Anup K

    2015-10-16

    New microbes are being engineered that contain the genetic circuitry, metabolic pathways, and other cellular functions required for a wide range of applications such as producing biofuels, biobased chemicals, and pharmaceuticals. Although currently available tools are useful in improving the synthetic biology process, further improvements in physical automation would help to lower the barrier of entry into this field. We present an innovative microfluidic platform for assembling DNA fragments with 10× lower volumes (compared to those of current microfluidic platforms) and with integrated region-specific temperature control and on-chip transformation. Integration of these steps minimizes the loss of reagents and products compared to conventional methods, which require multiple pipetting steps. For assembling DNA fragments, we implemented three commonly used DNA assembly protocols on our microfluidic device: Golden Gate assembly, Gibson assembly, and yeast assembly (i.e., TAR cloning, DNA Assembler). We demonstrate the utility of these methods by assembling two combinatorial libraries of 16 plasmids each. Each DNA plasmid is transformed into Escherichia coli or Saccharomyces cerevisiae using on-chip electroporation and further sequenced to verify the assembly. We anticipate that this platform will enable new research that can integrate this automated microfluidic platform to generate large combinatorial libraries of plasmids and will help to expedite the overall synthetic biology process.

  5. Synthesis of Well-Defined Copper "N"-Heterocyclic Carbene Complexes and Their Use as Catalysts for a "Click Reaction": A Multistep Experiment that Emphasizes the Role of Catalysis in Green Chemistry

    ERIC Educational Resources Information Center

    Ison, Elon A.; Ison, Ana

    2012-01-01

    A multistep experiment for an advanced synthesis lab course that incorporates topics in organic-inorganic synthesis and catalysis and highlights green chemistry principles was developed. Students synthesized two "N"-heterocyclic carbene ligands, used them to prepare two well-defined copper(I) complexes and subsequently utilized the complexes as…

  6. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  7. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; ...

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  8. The Automation-by-Expertise-by-Training Interaction.

    PubMed

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  9. Automated Microwave Dielectric Constant Measurement

    DTIC Science & Technology

    1987-03-01

    NSWC TR 86-46 (AD-A184 182): Automated Microwave Dielectric Constant Measurement System, by B. C. Glancy and A. Krall, Research and Technology Department, Silver Spring, Maryland 20903-5000. ...constants as a function of microwave frequency has been simplified using an automated testing apparatus. This automated procedure is based on the use of a

  10. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  11. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. Project goals

  12. An automated metrics system to measure and improve the success of laboratory automation implementation.

    PubMed

    Benn, Neil; Turlais, Fabrice; Clark, Victoria; Jones, Mike; Clulow, Stephen

    2007-03-01

    The authors describe a system for collecting usage metrics from widely distributed automation systems. An application that records and stores usage data centrally, calculates run times, and charts the data was developed. Data were collected over 20 months from at least 28 workstations. The application was used to plot bar charts of date versus run time for individual workstations, the automation in a specific laboratory, or automation of a specified type. The authors show that revised user training, redeployment of equipment, and running complementary processes on one workstation can increase the average number of runs by up to 20-fold and run times by up to 450%. Active monitoring of usage leads to more effective use of automation. Usage data could be used to determine whether purchasing particular automation was a good investment.
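
    A hedged sketch of the kind of aggregation such a metrics application performs, deriving per-workstation run counts and run times from start/stop events in a central usage log, is shown below; the field names and log format are assumptions for illustration, not the authors' schema.

      from collections import defaultdict
      from datetime import datetime

      # Hypothetical central usage log: one entry per automation run.
      log = [
          {"workstation": "WS-01", "start": "2006-03-01 09:00", "stop": "2006-03-01 11:30"},
          {"workstation": "WS-01", "start": "2006-03-02 10:00", "stop": "2006-03-02 10:45"},
          {"workstation": "WS-02", "start": "2006-03-01 13:00", "stop": "2006-03-01 17:00"},
      ]

      fmt = "%Y-%m-%d %H:%M"
      totals = defaultdict(lambda: {"runs": 0, "hours": 0.0})
      for entry in log:
          start = datetime.strptime(entry["start"], fmt)
          stop = datetime.strptime(entry["stop"], fmt)
          totals[entry["workstation"]]["runs"] += 1
          totals[entry["workstation"]]["hours"] += (stop - start).total_seconds() / 3600

      for ws, t in sorted(totals.items()):
          print(f"{ws}: {t['runs']} runs, {t['hours']:.1f} h total run time")

    Aggregates like these, charted by date, workstation, laboratory, or equipment type, are what make under-used instruments and training gaps visible.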

  13. Team-Centered Perspective for Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2003-01-01

    Automation represents a very active area of human factors research. The journal, Human Factors, published a special issue on automation in 1985. Since then, hundreds of scientific studies have been published examining the nature of automation and its interaction with human performance. However, despite a dramatic increase in research investigating human factors issues in aviation automation, there remain areas that need further exploration. This NASA Technical Memorandum describes a new area of automation design and research, called adaptive automation. It discusses the concepts and outlines the human factors issues associated with the new method of adaptive function allocation. The primary focus is on human-centered design, and specifically on ensuring that adaptive automation is from a team-centered perspective. The document shows that adaptive automation has many human factors issues common to traditional automation design. Much like the introduction of other new technologies and paradigm shifts, adaptive automation presents an opportunity to remediate current problems but poses new ones for human-automation interaction in aerospace operations. The review here is intended to communicate the philosophical perspective and direction of adaptive automation research conducted under the Aerospace Operations Systems (AOS), Physiological and Psychological Stressors and Factors (PPSF) project.

  14. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated

  15. Multi-step resistive switching behavior of Li-doped ZnO resistance random access memory device controlled by compliance current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Chun-Cheng; Department of Mathematic and Physical Sciences, R.O.C. Air Force Academy, Kaohsiung 820, Taiwan; Tang, Jian-Fu

    2016-06-28

    The multi-step resistive switching (RS) behavior of a unipolar Pt/Li0.06Zn0.94O/Pt resistive random access memory (RRAM) device is investigated. It is found that the RRAM device exhibits normal, 2-, 3-, and 4-step RESET behaviors under different compliance currents. The transport mechanism within the device is investigated by means of current-voltage curves, in-situ transmission electron microscopy, and electrochemical impedance spectroscopy. It is shown that the ion transport mechanism is dominated by Ohmic behavior under low electric fields and by the Poole-Frenkel emission effect (normal RS behavior) or Li+ ion diffusion (2-, 3-, and 4-step RESET behaviors) under high electric fields.
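
    The low- and high-field regimes named above are usually identified from the field dependence of the current density; in textbook form (the exact fitting expressions used in the study may differ),

      J_{\mathrm{Ohmic}} \propto E, \qquad
      J_{\mathrm{PF}} \propto E\,\exp\!\left[-\frac{q\left(\phi_{B}-\sqrt{qE/(\pi\varepsilon_{r}\varepsilon_{0})}\right)}{k_{B} T}\right],

    so Ohmic conduction appears as a slope of one in a log J versus log E plot, while Poole-Frenkel emission gives a straight line in a ln(J/E) versus sqrt(E) plot.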

  16. The 'problem' with automation - Inappropriate feedback and interaction, not 'over-automation'

    NASA Technical Reports Server (NTRS)

    Norman, D. A.

    1990-01-01

    Automation in high-risk industry is often blamed for causing harm and increasing the chance of human error when failures occur. It is proposed that the problem is not the presence of automation, but rather its inappropriate design. The problem is that the operations are performed appropriately under normal conditions, but there is inadequate feedback and interaction with the humans who must control the overall conduct of the task. The problem is that the automation is at an intermediate level of intelligence, powerful enough to take over control which used to be done by people, but not powerful enough to handle all abnormalities. Moreover, its level of intelligence is insufficient to provide the continual, appropriate feedback that occurs naturally among human operators. To solve this problem, the automation should either be made less intelligent or more so, but the current level is quite inappropriate. The overall message is that it is possible to reduce error through appropriate design considerations.

  17. The Automation of Reserve Processing.

    ERIC Educational Resources Information Center

    Self, James

    1985-01-01

    Describes an automated reserve processing system developed locally at Clemons Library, University of Virginia. Discussion covers developments in the reserve operation at Clemons Library, automation of the processing and circulation functions of reserve collections, and changes in reserve operation performance and staffing needs due to automation.…

  18. A Comparison of Automated and Manual Crater Counting Techniques in Images of Elysium Planitia.

    NASA Astrophysics Data System (ADS)

    Plesko, C. S.; Brumby, S. P.; Asphaug, E.

    2004-11-01

    Surveys of impact craters yield a wealth of information about Martian geology, providing clues to the relative age, local composition and erosional history of the surface. Martian craters are also of intrinsic geophysical interest, given that the processes by which they form are not entirely clear, especially cratering in ice-saturated regoliths (Plesko et al. 2004, AGU), which appear common on Mars (Squyres and Carr 1986). However, the deluge of data over the last decade has made comprehensive manual counts prohibitive, except in select regions. Given that most small craters on Mars may be secondaries from a few very recent impact events (McEwen et al. in press, Icarus 2004), using select regions for age dating introduces considerable potential for sampling error. Automation is thus an enabling planetary science technology. In contrast to machine counts, human counts are prone to human decision making and thus not intrinsically reproducible. One can address human "noise" by averaging over many human counts (Kanefsky et al. 2001), but this multiplies the already laborious effort required. In this study, we test automated crater counting algorithms developed with the Los Alamos National Laboratory genetic programming suite GENIE (Harvey et al. 2002) against established manual counts of craters in Elysium Planitia, using MOC and THEMIS data. We intend to establish the validity of our method against well-regarded hand counts (Hartmann et al. 2000), and then apply it generally to larger regions of Mars. Previous work on automated crater counting used customized algorithms (Bierhaus et al. 2003, Burl et al. 2001). Algorithms generated by genetic programming have the advantage of requiring little time or user effort to generate, so it is relatively easy to generate a suite of algorithms for varied terrain types, or to compare results from multiple algorithms for improved accuracy (Plesko et al. 2003).

  19. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
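
    The relative risk reduction such a model attributes to an inserted technology can be read, in its simplest form, as

      \mathrm{RRR} = \frac{P_{\text{baseline}}(\text{accident}) - P_{\text{with technology}}(\text{accident})}{P_{\text{baseline}}(\text{accident})},

    where the two probabilities would be obtained by querying the Bayesian Belief Network before and after the causal-factor changes that the SMEs judge a given AvSP product to produce. This reading is an illustrative interpretation of the abstract, not a formula taken from the paper.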

  20. Trust in automation: designing for appropriate reliance.

    PubMed

    Lee, John D; See, Katrina A

    2004-01-01

    Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

  1. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  2. Mobile Genome Express (MGE): A comprehensive automatic genetic analyses pipeline with a mobile device.

    PubMed

    Yoon, Jun-Hee; Kim, Thomas W; Mendez, Pedro; Jablons, David M; Kim, Il-Jin

    2017-01-01

    The development of next-generation sequencing (NGS) technology makes it possible to sequence whole exomes or genomes. However, data analysis is still the biggest bottleneck for its wide implementation. Most laboratories still depend on manual procedures for data handling and analyses, which translates into a delay and decreased efficiency in the delivery of NGS results to doctors and patients. Thus, there is high demand for an automatic, easy-to-use NGS data analysis system. We developed a comprehensive, automatic genetic analysis controller named Mobile Genome Express (MGE) that works on smartphones or other mobile devices. MGE can handle all the steps for genetic analyses, such as sample information submission, sequencing run quality checks from the sequencer, secured data transfer, and results review. We sequenced an Actrometrix control DNA containing multiple proven human mutations using a targeted sequencing panel; the whole analysis was managed by MGE and its data review program, ELECTRO. All steps were processed automatically except for the final sequencing review procedure with ELECTRO to confirm mutations. The data analysis process was completed within several hours. The mutations we identified were consistent with our previous results obtained using multi-step, manual pipelines.

  3. Isolation of circulating tumor cells from pancreatic cancer by automated filtration

    PubMed Central

    Brychta, Nora; Drosch, Michael; Driemel, Christiane; Fischer, Johannes C.; Neves, Rui P.; Esposito, Irene; Knoefel, Wolfram; Möhlendick, Birte; Hille, Claudia; Stresemann, Antje; Krahn, Thomas; Kassack, Matthias U.; Stoecklein, Nikolas H.; von Ahsen, Oliver

    2017-01-01

    It is now widely recognized that the isolation of circulating tumor cells based on cell surface markers might be hindered by variability in their protein expression. Especially in pancreatic cancer, isolation based only on EpCAM expression has produced very diverse results. Methods that are independent of surface markers, and therefore independent of phenotypical changes in the circulating cells, might also increase CTC recovery in pancreatic cancer. We compared an EpCAM-dependent (IsoFlux) and a size-dependent (automated Siemens Healthineers filtration device) isolation method for the enrichment of pancreatic cancer CTCs. The recovery rate of the filtration-based approach is dramatically superior to that of the EpCAM-dependent approach, especially for cells with low EpCAM expression (filtration: 52%, EpCAM-dependent: 1%). As storage and shipment of clinical samples is important for centralized analyses, we also evaluated the use of frozen diagnostic leukapheresis (DLA) as a source for isolating CTCs and subsequent genetic analysis, such as KRAS mutation detection. Using frozen DLA samples of pancreatic cancer patients, we detected CTCs in 42% of the samples by automated filtration. PMID:29156783

  4. Isolation of circulating tumor cells from pancreatic cancer by automated filtration.

    PubMed

    Brychta, Nora; Drosch, Michael; Driemel, Christiane; Fischer, Johannes C; Neves, Rui P; Esposito, Irene; Knoefel, Wolfram; Möhlendick, Birte; Hille, Claudia; Stresemann, Antje; Krahn, Thomas; Kassack, Matthias U; Stoecklein, Nikolas H; von Ahsen, Oliver

    2017-10-17

    It is now widely recognized that the isolation of circulating tumor cells based on cell surface markers might be hindered by variability in their protein expression. Especially in pancreatic cancer, isolation based only on EpCAM expression has produced very diverse results. Methods that are independent of surface markers, and therefore independent of phenotypical changes in the circulating cells, might also increase CTC recovery in pancreatic cancer. We compared an EpCAM-dependent (IsoFlux) and a size-dependent (automated Siemens Healthineers filtration device) isolation method for the enrichment of pancreatic cancer CTCs. The recovery rate of the filtration-based approach is dramatically superior to that of the EpCAM-dependent approach, especially for cells with low EpCAM expression (filtration: 52%, EpCAM-dependent: 1%). As storage and shipment of clinical samples is important for centralized analyses, we also evaluated the use of frozen diagnostic leukapheresis (DLA) as a source for isolating CTCs and subsequent genetic analysis, such as KRAS mutation detection. Using frozen DLA samples of pancreatic cancer patients, we detected CTCs in 42% of the samples by automated filtration.

  5. Extracting genetic alteration information for personalized cancer therapy from ClinicalTrials.gov.

    PubMed

    Xu, Jun; Lee, Hee-Jin; Zeng, Jia; Wu, Yonghui; Zhang, Yaoyun; Huang, Liang-Chin; Johnson, Amber; Holla, Vijaykumar; Bailey, Ann M; Cohen, Trevor; Meric-Bernstam, Funda; Bernstam, Elmer V; Xu, Hua

    2016-07-01

    Clinical trials investigating drugs that target specific genetic alterations in tumors are important for promoting personalized cancer therapy. The goal of this project is to create a knowledge base of cancer treatment trials with annotations about genetic alterations from ClinicalTrials.gov. We developed a semi-automatic framework that combines advanced text-processing techniques with manual review to curate genetic alteration information in cancer trials. The framework consists of a document classification system to identify cancer treatment trials from ClinicalTrials.gov and an information extraction system to extract gene and alteration pairs from the Title and Eligibility Criteria sections of clinical trials. By applying the framework to trials at ClinicalTrials.gov, we created a knowledge base of cancer treatment trials with genetic alteration annotations. We then evaluated each component of the framework against manually reviewed sets of clinical trials and generated descriptive statistics of the knowledge base. The automated cancer treatment trial identification system achieved a high precision of 0.9944. Together with the manual review process, it identified 20 193 cancer treatment trials from ClinicalTrials.gov. The automated gene-alteration extraction system achieved a precision of 0.8300 and a recall of 0.6803. After validation by manual review, we generated a knowledge base of 2024 cancer trials that are labeled with specific genetic alteration information. Analysis of the knowledge base revealed the trend of increased use of targeted therapy for cancer, as well as top frequent gene-alteration pairs of interest. We expect this knowledge base to be a valuable resource for physicians and patients who are seeking information about personalized cancer therapy. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Human-centered aircraft automation: A concept and guidelines

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1991-01-01

    Aircraft automation and its effects on flight crews are examined. Generic guidelines are proposed for the design and use of automation in transport aircraft, in the hope of stimulating increased and more effective dialogue among designers of automated cockpits, purchasers of automated aircraft, and the pilots who must fly those aircraft in line operations. The goal is to explore the means whereby automation may be a maximally effective tool or resource for pilots without compromising human authority and with an increase in system safety. After a definition of the domain of the aircraft pilot and a brief discussion of the history of aircraft automation, a concept of human-centered automation is presented and discussed. Automated devices are categorized as control automation, information automation, and management automation. The environment and context of aircraft automation are then considered, followed by thoughts on the likely future of automation of each category.

  7. Genetic Constructor: An Online DNA Design Platform.

    PubMed

    Bates, Maxwell; Lachoff, Joe; Meech, Duncan; Zulkower, Valentin; Moisy, Anaïs; Luo, Yisha; Tekotte, Hille; Franziska Scheitz, Cornelia Johanna; Khilari, Rupal; Mazzoldi, Florencio; Chandran, Deepak; Groban, Eli

    2017-12-15

    Genetic Constructor is a cloud Computer Aided Design (CAD) application developed to support synthetic biologists from design intent through DNA fabrication and experiment iteration. The platform allows users to design, manage, and navigate complex DNA constructs and libraries, using a new visual language that focuses on functional parts abstracted from sequence. Features like combinatorial libraries and automated primer design allow the user to separate design from construction by focusing on functional intent, and design constraints aid iterative refinement of designs. A plugin architecture enables contributions from scientists and coders to leverage existing powerful software and connect to DNA foundries. The software is easily accessible and platform agnostic, free for academics, and available in an open-source community edition. Genetic Constructor seeks to democratize DNA design, manufacture, and access to tools and services from the synthetic biology community.

  8. A multistep damage recognition mechanism for global genomic nucleotide excision repair

    PubMed Central

    Sugasawa, Kaoru; Okamoto, Tomoko; Shimizu, Yuichiro; Masutani, Chikahide; Iwai, Shigenori; Hanaoka, Fumio

    2001-01-01

    A mammalian nucleotide excision repair (NER) factor, the XPC–HR23B complex, can specifically bind to certain DNA lesions and initiate the cell-free repair reaction. Here we describe a detailed analysis of its binding specificity using various DNA substrates, each containing a single defined lesion. A highly sensitive gel mobility shift assay revealed that XPC–HR23B specifically binds a small bubble structure with or without damaged bases, whereas dual incision takes place only when damage is present in the bubble. This is evidence that damage recognition for NER is accomplished through at least two steps; XPC–HR23B first binds to a site that has a DNA helix distortion, and then the presence of injured bases is verified prior to dual incision. Cyclobutane pyrimidine dimers (CPDs) were hardly recognized by XPC–HR23B, suggesting that additional factors may be required for CPD recognition. Although the presence of mismatched bases opposite a CPD potentiated XPC–HR23B binding, probably due to enhancement of the helix distortion, cell-free excision of such compound lesions was much more efficient than expected from the observed affinity for XPC–HR23B. This also suggests that additional factors and steps are required for the recognition of some types of lesions. A multistep mechanism of this sort may provide a molecular basis for ensuring the high level of damage discrimination that is required for global genomic NER. PMID:11238373

  9. A multistep damage recognition mechanism for global genomic nucleotide excision repair.

    PubMed

    Sugasawa, K; Okamoto, T; Shimizu, Y; Masutani, C; Iwai, S; Hanaoka, F

    2001-03-01

    A mammalian nucleotide excision repair (NER) factor, the XPC-HR23B complex, can specifically bind to certain DNA lesions and initiate the cell-free repair reaction. Here we describe a detailed analysis of its binding specificity using various DNA substrates, each containing a single defined lesion. A highly sensitive gel mobility shift assay revealed that XPC-HR23B specifically binds a small bubble structure with or without damaged bases, whereas dual incision takes place only when damage is present in the bubble. This is evidence that damage recognition for NER is accomplished through at least two steps; XPC-HR23B first binds to a site that has a DNA helix distortion, and then the presence of injured bases is verified prior to dual incision. Cyclobutane pyrimidine dimers (CPDs) were hardly recognized by XPC-HR23B, suggesting that additional factors may be required for CPD recognition. Although the presence of mismatched bases opposite a CPD potentiated XPC-HR23B binding, probably due to enhancement of the helix distortion, cell-free excision of such compound lesions was much more efficient than expected from the observed affinity for XPC-HR23B. This also suggests that additional factors and steps are required for the recognition of some types of lesions. A multistep mechanism of this sort may provide a molecular basis for ensuring the high level of damage discrimination that is required for global genomic NER.

  10. Classification of Automated Search Traffic

    NASA Astrophysics Data System (ADS)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: either an interpretation of the physical model of human interactions, or behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
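
    The pipeline described above, behavioral features feeding binary and multiclass classifiers, can be sketched in a few lines; the features, class names, and model choice below are hypothetical stand-ins for illustration, not the paper's system.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      # Per-session features: [queries_per_minute, click_through_rate, distinct_query_ratio]
      X = np.array([
          [0.5, 0.60, 0.90], [0.8, 0.55, 0.80],    # human-like sessions
          [40.0, 0.01, 0.10], [55.0, 0.00, 0.05],  # rank-monitoring bots
          [12.0, 0.90, 0.20], [15.0, 0.95, 0.30],  # click-fraud-like bots
      ] * 10)
      y = np.array(["human", "human", "rank_bot", "rank_bot", "click_bot", "click_bot"] * 10)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
      print(clf.score(X_test, y_test))          # subclass accuracy on held-out sessions
      print(clf.predict([[45.0, 0.0, 0.08]]))   # expected: rank_bot

    An active-learning loop would then rank unlabeled sessions by model uncertainty and send the most ambiguous ones for human labeling, which is also how previously unseen traffic subclasses can surface.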

  11. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  12. Automation pilot

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An important concept of the Action Information Management System (AIMS) approach is to evaluate office automation technology in the context of hands-on use by technical program managers in the conduct of their work, and to assess the human acceptance difficulties which may accompany the transition to a significantly changing work environment. The improved productivity and communications which result from application of office automation technology are already well established for general office environments, but benefits unique to NASA are anticipated and these will be explored in detail.

  13. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. This report outlines a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and conclusions pertaining to this experience. The process of selecting chemistry automation, which included forming a committee, analyzing workflow, submitting a request for proposal, conducting site visits, and making a final decision, took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  14. Multi-channel acoustic recording and automated analysis of Drosophila courtship songs

    PubMed Central

    2013-01-01

    Background Drosophila melanogaster has served as a powerful model system for genetic studies of courtship songs. To accelerate research on the genetic and neural mechanisms underlying courtship song, we have developed a sensitive recording system to simultaneously capture the acoustic signals from 32 separate pairs of courting flies as well as software for automated segmentation of songs. Results Our novel hardware design enables recording of low amplitude sounds in most laboratory environments. We demonstrate the power of this system by collecting, segmenting and analyzing over 18 hours of courtship song from 75 males from five wild-type strains of Drosophila melanogaster. Our analysis reveals previously undetected modulation of courtship song features and extensive natural genetic variation for most components of courtship song. Despite having a large dataset with sufficient power to detect subtle modulations of song, we were unable to identify previously reported periodic rhythms in the inter-pulse interval of song. We provide detailed instructions for assembling the hardware and for using our open-source segmentation software. Conclusions Analysis of a large dataset of acoustic signals from Drosophila melanogaster provides novel insight into the structure and dynamics of species-specific courtship songs. Our new system for recording and analyzing fly acoustic signals should therefore greatly accelerate future studies of the genetics, neurobiology and evolution of courtship song. PMID:23369160
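
    The segmentation software described in this record is the authors' own open-source package; the sketch below is not that code. It only illustrates one generic building block of song segmentation (band-pass filtering followed by amplitude-threshold pulse detection), with the cutoff frequencies, the noise-scaled threshold and the minimum pulse spacing all chosen as assumptions for the example.

```python
# Illustrative sketch of threshold-based pulse detection on an acoustic trace.
# Not the authors' segmentation algorithm; all parameters are assumptions.

import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_pulses(trace, fs, lo=100.0, hi=1000.0, n_std=4.0):
    """Return sample indices of candidate song pulses in a 1-D recording."""
    b, a = butter(3, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trace)
    threshold = n_std * np.std(filtered)              # noise-scaled threshold
    peaks, _ = find_peaks(np.abs(filtered), height=threshold,
                          distance=int(0.01 * fs))    # >= 10 ms between pulses
    return peaks

# Example with synthetic data: low-amplitude noise plus three injected pulses.
fs = 10_000
t = np.arange(fs) / fs
trace = 0.01 * np.random.randn(fs)
for start in (0.2, 0.5, 0.8):
    idx = int(start * fs)
    trace[idx:idx + 50] += 0.2 * np.sin(2 * np.pi * 300 * t[:50])
print(detect_pulses(trace, fs))
```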

  15. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the...

  16. Automated solid-phase subcloning based on beads brought into proximity by magnetic force.

    PubMed

    Hudson, Elton P; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescent-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors and more than 95% correct clones were obtained in a number of various applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high throughput applications.

  17. Automated Solid-Phase Subcloning Based on Beads Brought into Proximity by Magnetic Force

    PubMed Central

    Hudson, Elton P.; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescent-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors and more than 95% correct clones were obtained in a number of various applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high throughput applications. PMID:22624028

  18. Evaluation of BRCA1 and BRCA2 mutation prevalence, risk prediction models and a multistep testing approach in French‐Canadian families with high risk of breast and ovarian cancer

    PubMed Central

    Simard, Jacques; Dumont, Martine; Moisan, Anne‐Marie; Gaborieau, Valérie; Vézina, Hélène; Durocher, Francine; Chiquette, Jocelyne; Plante, Marie; Avard, Denise; Bessette, Paul; Brousseau, Claire; Dorval, Michel; Godard, Béatrice; Houde, Louis; Joly, Yann; Lajoie, Marie‐Andrée; Leblanc, Gilles; Lépine, Jean; Lespérance, Bernard; Malouin, Hélène; Parboosingh, Jillian; Pichette, Roxane; Provencher, Louise; Rhéaume, Josée; Sinnett, Daniel; Samson, Carolle; Simard, Jean‐Claude; Tranchant, Martine; Voyer, Patricia; BRCAs, INHERIT; Easton, Douglas; Tavtigian, Sean V; Knoppers, Bartha‐Maria; Laframboise, Rachel; Bridge, Peter; Goldgar, David

    2007-01-01

    Background and objective In clinical settings with fixed resources allocated to predictive genetic testing for high‐risk cancer predisposition genes, optimal strategies for mutation screening programmes are critically important. These depend on the mutation spectrum found in the population under consideration and the frequency of mutations detected as a function of the personal and family history of cancer, which are both affected by the presence of founder mutations and demographic characteristics of the underlying population. The results of multistep genetic testing for mutations in BRCA1 or BRCA2 in a large series of families with breast cancer in the French‐Canadian population of Quebec, Canada are reported. Methods A total of 256 high‐risk families were ascertained from regional familial cancer clinics throughout the province of Quebec. Initially, families were tested for a panel of specific mutations known to occur in this population. Families in which no mutation was identified were then comprehensively tested. Three algorithms to predict the presence of mutations were evaluated, including the prevalence tables provided by Myriad Genetics Laboratories, the Manchester Scoring System and a logistic regression approach based on the data from this study. Results 8 of the 15 distinct mutations found in 62 BRCA1/BRCA2‐positive families had never been previously reported in this population, whereas 82% carried 1 of the 4 mutations currently observed in ⩾2 families. In the subset of 191 families in which at least 1 affected individual was tested, 29% carried a mutation. Of these 27 BRCA1‐positive and 29 BRCA2‐positive families, 48 (86%) were found to harbour a mutation detected by the initial test. Among the remaining 143 inconclusive families, all 8 families found to have a mutation after complete sequencing had Manchester Scores ⩾18. The logistic regression and Manchester Scores provided equal predictive power, and both were significantly better

  19. Flight-deck automation - Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilising, and improving control and monitoring systems. Investigation into flight-deck automation systems is important as the knowledge gained can be applied to other systems such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.

  20. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    PubMed

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.

  1. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  2. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode that had a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.

  3. Protocols for Automated Protist Analysis

    DTIC Science & Technology

    2011-12-01

    Report No. CG-D-14-13, "Protocols for Automated Protist Analysis," December 2011. Distribution Statement A: approved for public release; distribution is unlimited. B. Nelson, et al., United States Coast Guard Research & Development Center, 1 Chelsea Street, New London, CT 06320.

  4. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging

    PubMed Central

    Patel, Tapan P.; Man, Karen; Firestein, Bonnie L.; Meaney, David F.

    2017-01-01

    Background Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s–1000 +neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high speed fluorescence imaging data is lacking. New method Here we introduce FluoroSNNAP, Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single-cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. Results We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule associated protein tau and wild-type tau. Comparison with existing method(s) We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. Conclusions We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. PMID:25629800
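
    FluoroSNNAP itself is a MATLAB package; the sketch below is an independent illustration of the general idea behind template-based transient detection (correlating a stereotyped fast-rise, slow-decay template against a dF/F trace), not the package's implementation. The template shape and correlation threshold are assumptions.

```python
# Illustrative sketch of template-based calcium-transient detection by local
# correlation against a canonical transient shape. Not FluoroSNNAP's code.

import numpy as np

def calcium_template(rise=5, decay=50):
    """Fast-rise, slow-decay template (arbitrary frame units), unit norm."""
    t = np.arange(rise + decay, dtype=float)
    template = (1 - np.exp(-t / rise)) * np.exp(-t / decay)
    return template / np.linalg.norm(template)

def detect_transients(dff, template, corr_thresh=0.85):
    """Return frame indices where the trace locally correlates with the template."""
    n = len(template)
    hits = []
    for i in range(len(dff) - n):
        window = dff[i:i + n]
        if np.std(window) == 0:          # flat window, nothing to correlate
            continue
        corr = np.corrcoef(window, template)[0, 1]
        if corr > corr_thresh:
            hits.append(i)               # adjacent hits can be merged afterwards
    return hits

# Toy usage: a flat dF/F trace with one injected transient around frame 100.
tmpl = calcium_template()
dff = np.zeros(300)
dff[100:100 + len(tmpl)] += tmpl
print(detect_transients(dff, tmpl))
```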

  5. Multistep continuous-flow synthesis of (R)- and (S)-rolipram using heterogeneous catalysts

    NASA Astrophysics Data System (ADS)

    Tsubogo, Tetsu; Oyamada, Hidekazu; Kobayashi, Shū

    2015-04-01

    Chemical manufacturing is conducted using either batch systems or continuous-flow systems. Flow systems have several advantages over batch systems, particularly in terms of productivity, heat and mixing efficiency, safety, and reproducibility. However, for over half a century, pharmaceutical manufacturing has used batch systems because the synthesis of complex molecules such as drugs has been difficult to achieve with continuous-flow systems. Here we describe the continuous-flow synthesis of drugs using only columns packed with heterogeneous catalysts. Commercially available starting materials were successively passed through four columns containing achiral and chiral heterogeneous catalysts to produce (R)-rolipram, an anti-inflammatory drug and one of the family of γ-aminobutyric acid (GABA) derivatives. In addition, simply by replacing a column packed with a chiral heterogeneous catalyst with another column packed with the opposing enantiomer, we obtained antipole (S)-rolipram. Similarly, we also synthesized (R)-phenibut, another drug belonging to the GABA family. These flow systems are simple and stable with no leaching of metal catalysts. Our results demonstrate that multistep (eight steps in this case) chemical transformations for drug synthesis can proceed smoothly under flow conditions using only heterogeneous catalysts, without the isolation of any intermediates and without the separation of any catalysts, co-products, by-products, and excess reagents. We anticipate that such syntheses will be useful in pharmaceutical manufacturing.

  6. Stochastic online appointment scheduling of multi-step sequential procedures in nuclear medicine.

    PubMed

    Pérez, Eduardo; Ntaimo, Lewis; Malavé, César O; Bailey, Carla; McCormack, Peter

    2013-12-01

    The increased demand for medical diagnosis procedures has been recognized as one of the contributors to the rise of health care costs in the U.S. in the last few years. Nuclear medicine is a subspecialty of radiology that uses advanced technology and radiopharmaceuticals for the diagnosis and treatment of medical conditions. Procedures in nuclear medicine require the use of radiopharmaceuticals, are multi-step, and have to be performed under strict time window constraints. These characteristics make the scheduling of patients and resources in nuclear medicine challenging. In this work, we derive a stochastic online scheduling algorithm for patient and resource scheduling in nuclear medicine departments which take into account the time constraints imposed by the decay of the radiopharmaceuticals and the stochastic nature of the system when scheduling patients. We report on a computational study of the new methodology applied to a real clinic. We use both patient and clinic performance measures in our study. The results show that the new method schedules about 600 more patients per year on average than a scheduling policy that was used in practice by improving the way limited resources are managed at the clinic. The new methodology finds the best start time and resources to be used for each appointment. Furthermore, the new method decreases patient waiting time for an appointment by about two days on average.
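
    The authors' method is a stochastic online scheduling algorithm; the sketch below is only a deterministic illustration of the kind of time-window feasibility check such a scheduler must perform when every step of a procedure is tied to the decay of the radiopharmaceutical. The step durations, offsets and resources are assumptions.

```python
# Minimal sketch of the time-window feasibility check underlying multi-step
# nuclear-medicine scheduling. Deterministic illustration only; not the
# authors' stochastic online algorithm. All numbers are assumptions.

# Each step: (resource, earliest_offset_min, latest_offset_min, duration_min)
PROCEDURE = [
    ("injection_room", 0,  0,  10),   # inject tracer at t = 0
    ("uptake_room",    10, 20, 45),   # uptake must begin 10-20 min after injection
    ("gamma_camera",   60, 90, 30),   # imaging window dictated by tracer decay
]

def feasible_schedule(start, busy):
    """Return (resource, begin, end) triples for an appointment beginning at
    `start` minutes from clinic opening, or None if the decay windows cannot
    be met. `busy` maps resource -> list of (start, end) occupied intervals."""
    schedule = []
    for resource, lo, hi, dur in PROCEDURE:
        t = start + lo                       # earliest allowed start of this step
        while any(s < t + dur and t < e for s, e in busy.get(resource, [])):
            t += 5                           # try the next 5-minute slot
        if t > start + hi:                   # would violate the decay window
            return None
        schedule.append((resource, t, t + dur))
    return schedule

print(feasible_schedule(0, {"gamma_camera": [(60, 80)]}))
```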

  7. Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.

    DTIC Science & Technology

    1983-06-01

    Office micro-computers, positioned throughout the command chain, provide real-time links between LCA and all users. Goals: assist HQDA staff in ... the field, i.e., AirLand Battle 2000. Section V, Concept of Execution, Supply (Retail): A. System description. 1. The Division Logistics Property Book ... 7. Divisional Direct Support Unit Automated Supply System (DDASS)/Direct Support Level Supply Automation (DLSA). DDASS and DLSA are system development ...

  8. Automation: how much is too much?

    PubMed

    Hancock, P A

    2014-01-01

    The headlong rush to automate continues apace. The dominant question still remains whether we can automate, not whether we should automate. However, it is this latter question that is featured and considered explicitly here. The suggestion offered is that unlimited automation of all technical functions will eventually prove anathema to the fundamental quality of human life. Examples of tasks, pursuits and pastimes that should potentially be excused from the automation imperative are discussed. This deliberation leads us back to the question of balance in the cooperation, coordination and potential conflict between humans and the machines they create.

  9. Automation and robotics

    NASA Technical Reports Server (NTRS)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  10. Deterministic multi-step rotation of magnetic single-domain state in Nickel nanodisks using multiferroic magnetoelastic coupling

    NASA Astrophysics Data System (ADS)

    Sohn, Hyunmin; Liang, Cheng-yen; Nowakowski, Mark E.; Hwang, Yongha; Han, Seungoh; Bokor, Jeffrey; Carman, Gregory P.; Candler, Robert N.

    2017-10-01

    We demonstrate deterministic multi-step rotation of a magnetic single-domain (SD) state in Nickel nanodisks using the multiferroic magnetoelastic effect. Ferromagnetic Nickel nanodisks are fabricated on a piezoelectric Lead Zirconate Titanate (PZT) substrate, surrounded by patterned electrodes. With the application of a voltage between opposing electrode pairs, we generate anisotropic in-plane strains that reshape the magnetic energy landscape of the Nickel disks, reorienting magnetization toward a new easy axis. By applying a series of voltages sequentially to adjacent electrode pairs, circulating in-plane anisotropic strains are applied to the Nickel disks, deterministically rotating a SD state in the Nickel disks by increments of 45°. The rotation of the SD state is numerically predicted by a fully-coupled micromagnetic/elastodynamic finite element analysis (FEA) model, and the predictions are experimentally verified with magnetic force microscopy (MFM). This experimental result will provide a new pathway to develop energy efficient magnetic manipulation techniques at the nanoscale.

  11. Genetic and molecular control of Osterix in skeletal formation

    PubMed Central

    Sinha, Krishna M.; Zhou, Xin

    2013-01-01

    Osteoblast differentiation is a multi-step process where mesenchymal cells differentiate into osteoblast lineage cells including osteocytes. Osterix (Osx) is an osteoblast-specific transcription factor which activates a repertoire of genes during differentiation of preosteoblasts into mature osteoblasts and osteocytes. The essential role of Osx in the genetic program of bone formation and in bone homeostasis is well established. Osx mutant embryos do not form bone and fail to express osteoblast-specific marker genes. Inactivation of Osx in mice after birth causes multiple skeletal phenotypes including lack of new bone formation, absence of resorption of mineralized cartilage, and defects in osteocyte maturation and function. Since Osx is a major effector in skeletal formation, studies on Osx gained momentum over the last five-seven years and implicated its important function in tooth formation as well as in healing of bone fractures. This review outlines mouse genetic studies that establish the essential role of Osx in bone and tooth formation as well as in healing of bone fractures. We also discuss the recent advances in regulation of Osx expression which is under control of a transcriptional network, signaling pathways, and epigenetic regulation. Finally we summarize important findings on the positive and negative regulation of Osx’s transcriptional activity through protein-protein interactions in expression of its target genes during osteoblast differentiation. In particular, the identification of the histone demethylase NO66 as an Osx-interacting protein which negatively regulates Osx activity opens further avenues in studying epigenetic control of Osx target genes during differentiation and maturation of osteoblasts. PMID:23225263

  12. Evolution paths for advanced automation

    NASA Technical Reports Server (NTRS)

    Healey, Kathleen J.

    1990-01-01

    As Space Station Freedom (SSF) evolves, increased automation and autonomy will be required to meet Space Station Freedom Program (SSFP) objectives. As a precursor to the use of advanced automation within the SSFP, especially if it is to be used on SSF (e.g., to automate the operation of the flight systems), the underlying technologies will need to be elevated to a high level of readiness to ensure safe and effective operations. Ground facilities supporting the development of these flight systems -- from research and development laboratories through formal hardware and software development environments -- will be responsible for achieving these levels of technology readiness. These facilities will need to evolve to support the general evolution of the SSFP. This evolution will include support for increasing the use of advanced automation. The SSF Advanced Development Program has funded a study to define evolution paths for advanced automation within the SSFP's ground-based facilities which will enable, promote, and accelerate the appropriate use of advanced automation on-board SSF. The current capability of the test beds and facilities, such as the Software Support Environment, with regard to advanced automation, has been assessed and their desired evolutionary capabilities have been defined. Plans and guidelines for achieving this necessary capability have been constructed. The approach taken has combined in-depth interviews of test bed personnel at all SSF Work Package centers with awareness of relevant state-of-the-art technology and technology insertion methodologies. Key recommendations from the study include advocating a NASA-wide task force for advanced automation, and the creation of software prototype transition environments to facilitate the incorporation of advanced automation in the SSFP.

  13. What Is an Automated External Defibrillator?

    MedlinePlus

    An automated external defibrillator (AED) is a lightweight, portable device ... detect a rhythm that should be ...

  14. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    PubMed

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    The aim was to reduce animal usage for discovery-stage PK studies in biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period, with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated data comparable to a previous study that used liquid samples from single mice per time point, while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice in discovery-stage studies of protein therapeutics.

  15. Automating spectral measurements

    NASA Astrophysics Data System (ADS)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.
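
    As a hypothetical illustration of the client side of such an architecture, the sketch below drives a COM/ActiveX automation server from a script using the pywin32 bridge (Windows only). The ProgID "FTG.SpectroServer" and its methods are placeholders invented for this example; a real instrument exposes its own ProgID and interface.

```python
# Hypothetical sketch of driving an ActiveX/COM automation server (such as a
# DAQ application of the kind described above) from a script via pywin32.
# "FTG.SpectroServer" and its method names are placeholders for illustration.

import win32com.client  # pywin32: generic late-bound COM dispatch (Windows only)

def scan_sample(barcode: str, start_nm: float, stop_nm: float):
    app = win32com.client.Dispatch("FTG.SpectroServer")  # placeholder ProgID
    app.LoadSample(barcode)                 # placeholder method names
    spectrum = app.Scan(start_nm, stop_nm)  # server performs the acquisition
    return list(spectrum)

if __name__ == "__main__":
    data = scan_sample("FILTER-0042", 400.0, 700.0)
    print(len(data), "points acquired")
```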

  16. Ask the experts: automation: part I.

    PubMed

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  17. An Automation Survival Guide for Media Centers.

    ERIC Educational Resources Information Center

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  18. Discovery of Novel New Delhi Metallo-β-Lactamases-1 Inhibitors by Multistep Virtual Screening

    PubMed Central

    Wang, Xuequan; Lu, Meiling; Shi, Yang; Ou, Yu; Cheng, Xiaodong

    2015-01-01

    The emergence of NDM-1 containing multi-antibiotic resistant "superbugs" necessitates the development of novel NDM-1 inhibitors. In this study, we report the discovery of novel NDM-1 inhibitors by multi-step virtual screening. From a virtual drug-like compound library of 2,800,000 compounds selected from the ZINC database, we generated a focused NDM-1 inhibitor library containing 298 compounds, of which 44 were purchased and evaluated experimentally for their ability to inhibit NDM-1 in vitro. Three novel NDM-1 inhibitors with micromolar IC50 values were validated. The most potent inhibitor, VNI-41, inhibited NDM-1 with an IC50 of 29.6 ± 1.3 μM. Molecular dynamics simulation revealed that VNI-41 interacted extensively with the active site. In particular, the sulfonamide group of VNI-41 interacts directly with the metal ion Zn1 that is critical for the catalysis. These results demonstrate the feasibility of applying virtual screening methodologies in identifying novel inhibitors for NDM-1, a metallo-β-lactamase with a malleable active site, and provide a mechanistic basis for the rational design of NDM-1 inhibitors using sulfonamide as a functional scaffold. PMID:25734558
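
    As an illustration of what one early stage of a multi-step virtual screen can look like, the sketch below applies a generic Lipinski-style drug-likeness filter to a small SMILES list with RDKit before any docking would be run. The cutoffs and the toy library are assumptions and are not the criteria or compounds used in the study above.

```python
# Illustrative sketch of an early library-filtering stage of a multi-step
# virtual screen: generic drug-likeness rules applied with RDKit. The cutoffs
# and toy compounds are assumptions, not those of the NDM-1 study.

from rdkit import Chem
from rdkit.Chem import Descriptors, Crippen, Lipinski

def passes_drug_like_filter(smiles: str) -> bool:
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:                        # unparsable entry is discarded
        return False
    return (Descriptors.MolWt(mol) <= 500
            and Crippen.MolLogP(mol) <= 5
            and Lipinski.NumHDonors(mol) <= 5
            and Lipinski.NumHAcceptors(mol) <= 10)

library = ["CCO", "CC(=O)Nc1ccc(O)cc1", "C" * 60]   # toy examples only
focused = [s for s in library if passes_drug_like_filter(s)]
print(focused)
```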

  19. Automated detection of nerve fiber layer defects on retinal fundus images using fully convolutional network for early diagnosis of glaucoma

    NASA Astrophysics Data System (ADS)

    Watanabe, Ryusuke; Muramatsu, Chisako; Ishida, Kyoko; Sawada, Akira; Hatanaka, Yuji; Yamamoto, Tetsuya; Fujita, Hiroshi

    2017-03-01

    Early detection of glaucoma is important to slow down progression of the disease and to prevent total vision loss. We have been studying an automated scheme for detection of a retinal nerve fiber layer defect (NFLD), which is one of the earliest signs of glaucoma on retinal fundus images. In our previous study, we proposed a multi-step detection scheme which consists of Gabor filtering, clustering and adaptive thresholding. The problems of the previous method were that the number of false positives (FPs) was still large and that the method included too many rules. In an attempt to solve these problems, we investigated an end-to-end learning system without pre-specified features. A deep convolutional neural network (DCNN) with deconvolutional layers was trained to detect NFLD regions. In this preliminary investigation, we examined effective ways of preparing the input images and compared the detection results. The optimal result was then compared with the result obtained by the previous method. DCNN training was carried out using original images of abnormal cases, original images of both normal and abnormal cases, ellipse-based polar transformed images, and transformed half images. The result showed that use of both normal and abnormal cases increased the sensitivity as well as the number of FPs. Although NFLDs are visualized with the highest contrast in the green plane, the use of color images provided higher sensitivity than the use of the green image only. The free response receiver operating characteristic curve using the transformed color images, which was the best among the seven different sets studied, was comparable to that of the previous method. Use of a DCNN has the potential to improve the generalizability of automated NFLD detection methods and may be useful in assisting glaucoma diagnosis on retinal fundus images.
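
    As an illustration of the general architecture named above (convolutional layers followed by deconvolutional, i.e. transposed-convolution, layers producing a per-pixel map), the sketch below defines a minimal fully convolutional network in PyTorch. The layer sizes, channel counts and input resolution are assumptions, not the network used in the study.

```python
# Minimal sketch of a fully convolutional encoder-decoder producing a
# per-pixel defect probability map. Sizes and channels are assumptions.

import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(   # "deconvolution" = transposed convolution
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return torch.sigmoid(self.decoder(self.encoder(x)))  # map in [0, 1]

model = TinyFCN()
fundus = torch.randn(1, 3, 256, 256)     # dummy color fundus image
print(model(fundus).shape)               # torch.Size([1, 1, 256, 256])
```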

  20. Automation's influence on nuclear power plants: a look at three accidents and how automation played a role.

    PubMed

    Schmitt, Kara

    2012-01-01

    Nuclear power is one of the ways that we can design an efficient, sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPPs). Automation performs tasks such as assessing the status of the plant's operations as well as making real-time, life-critical, situation-specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper will look at the effects of automation within three NPP accidents and incidents and will consider why automation failed in preventing these accidents from occurring. It will also review the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.

  1. Innate immunity and the new forward genetics.

    PubMed

    Beutler, Bruce

    2016-12-01

    As it is a hard-wired system for responses to microbes, innate immunity is particularly susceptible to classical genetic analysis. Mutations led the way to the discovery of many of the molecular elements of innate immune sensing and signaling pathways. In turn, the need for a faster way to find the molecular causes of mutation-induced phenotypes triggered a huge transformation in forward genetics. During the 1980s and 1990s, many heritable phenotypes were ascribed to mutations through positional cloning. In mice, this required three steps. First, a genetic mapping step was used to show that a given phenotype emanated from a circumscribed region of the genome. Second, a physical mapping step was undertaken, in which all of the region was cloned and its gene content determined. Finally, a concerted search for the mutation was performed. Such projects usually lasted for several years, but could produce breakthroughs in our understanding of biological processes. Publication of the annotated mouse genome sequence in 2002 made physical mapping unnecessary. More recently we devised a new technology for automated genetic mapping, which eliminated both genetic mapping and the search for mutations among candidate genes. The cause of phenotype can now be determined instantaneously. We have created more than 100,000 coding/splicing mutations. And by screening for defects of innate and adaptive immunity we have discovered many "new" proteins needed for innate immune function. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Innate immunity and the new forward genetics

    PubMed Central

    Beutler, Bruce

    2016-01-01

    As it is a hard-wired system for responses to microbes, innate immunity is particularly susceptible to classical genetic analysis. Mutations led the way to the discovery of many of the molecular elements of innate immune sensing and signaling pathways. In turn, the need for a faster way to find the molecular causes of mutation-induced phenotypes triggered a huge transformation in forward genetics. During the 1980s and 1990s, many heritable phenotypes were ascribed to mutations through positional cloning. In mice, this required three steps. First, a genetic mapping step was used to show that a given phenotype emanated from a circumscribed region of the genome. Second, a physical mapping step was undertaken, in which all of the region was cloned and its gene content determined. Finally, a concerted search for the mutation was performed. Such projects usually lasted for several years, but could produce breakthroughs in our understanding of biological processes. Publication of the annotated mouse genome sequence in 2002 made physical mapping unnecessary. More recently we devised a new technology for automated genetic mapping, which eliminated both genetic mapping and the search for mutations among candidate genes. The cause of phenotype can now be determined instantaneously. We have created more than 100,000 coding/splicing mutations. And by screening for defects of innate and adaptive immunity we have discovered many “new” proteins needed for innate immune function. PMID:27890263

  3. Genetic Algorithm Calibration of Probabilistic Cellular Automata for Modeling Mining Permit Activity

    USGS Publications Warehouse

    Louis, S.J.; Raines, G.L.

    2003-01-01

    We use a genetic algorithm to calibrate a spatially and temporally resolved cellular automaton to model mining activity on public land in Idaho and western Montana. The genetic algorithm searches through a space of transition rule parameters of a two-dimensional cellular automaton model to find rule parameters that fit observed mining activity data. Previous work by one of the authors in calibrating the cellular automaton took weeks; the genetic algorithm takes a day and produces rules leading to about the same (or better) fit to observed data. These preliminary results indicate that genetic algorithms are a viable tool in calibrating cellular automata for this application. Experience gained during the calibration of this cellular automaton suggests that mineral resource information is a critical factor in the quality of the results. With automated calibration, further refinements of how the mineral-resource information is provided to the cellular automaton will probably improve our model.
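
    The sketch below illustrates the general approach only: a genetic algorithm searching over the parameters of a toy probabilistic cellular-automaton transition rule so that simulated activity matches an observed grid. The rule, fitness function and data are assumptions and bear no relation to the Idaho/Montana mining model.

```python
# Toy sketch of GA calibration of a probabilistic cellular-automaton rule.
# The rule, fitness and "observed" data are assumptions for illustration.

import numpy as np
rng = np.random.default_rng(0)

OBSERVED = (rng.random((20, 20)) < 0.3).astype(int)    # stand-in activity grid

def simulate(params, steps=5, size=20):
    base, neigh_w = params                             # transition-rule parameters
    grid = np.zeros((size, size), int)
    for _ in range(steps):
        n = sum(np.roll(np.roll(grid, i, 0), j, 1)     # 8-neighbour count
                for i in (-1, 0, 1) for j in (-1, 0, 1)) - grid
        p = np.clip(base + neigh_w * n, 0, 1)          # activation probability
        grid = np.maximum(grid, (rng.random(grid.shape) < p).astype(int))
    return grid

def fitness(params):
    return -np.abs(simulate(params) - OBSERVED).sum()  # higher is better

pop = rng.random((30, 2)) * np.array([0.2, 0.3])       # initial rule parameters
for gen in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[-10:]]              # keep the 10 fittest
    mates = elite[rng.integers(0, 10, size=(20, 2))]   # random parent pairs
    children = mates.mean(axis=1) + rng.normal(0, 0.02, (20, 2))  # crossover + mutation
    pop = np.vstack([elite, np.clip(children, 0, 1)])

print("best parameters:", pop[np.argmax([fitness(ind) for ind in pop])])
```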

  4. Automation in School Library Media Centers.

    ERIC Educational Resources Information Center

    Driver, Russell W.; Driver, Mary Anne

    1982-01-01

    Surveys the historical development of automated technical processing in schools and notes the impact of this automation in a number of cases. Speculations about the future involvement of school libraries in automated processing and networking are included. Thirty references are listed. (BBM)

  5. Robust model predictive control for multi-step short range spacecraft rendezvous

    NASA Astrophysics Data System (ADS)

    Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei

    2018-07-01

    This work presents a robust model predictive control (MPC) approach for the multi-step short range spacecraft rendezvous problem. During the specific short range phase concerned, the chaser is supposed to be initially outside the line-of-sight (LOS) cone. Therefore, the rendezvous process naturally includes two steps: the first step is to transfer the chaser into the LOS cone and the second step is to transfer the chaser into the aimed region with its motion confined within the LOS cone. A novel MPC framework named after Mixed MPC (M-MPC) is proposed, which is the combination of the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be implemented jointly rather than to be separated factitiously, and its computation workload is acceptable for the usually low-power processors onboard spacecraft. Then considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and it is attached to the above M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints can be satisfied with a prescribed probability. It improves the robust technique proposed by Gavilan et al., because it eliminates the unnecessary conservativeness by explicitly incorporating known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.
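
    The sketch below is not the authors' M-MPC; it only shows the basic finite-horizon optimization that any such framework builds on, here a single receding-horizon step for a double-integrator relative-motion model solved with cvxpy. The LOS-cone, variable-horizon and chance constraints of the paper are omitted, and all numbers are assumptions.

```python
# Minimal sketch of one receding-horizon MPC step for a double-integrator
# relative-motion model (target at the origin), solved with cvxpy.
# Illustrative only; all weights, limits and states are assumptions.

import numpy as np
import cvxpy as cp

dt, N = 1.0, 20                                   # step [s], horizon length
A = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])     # state = [position; velocity]
B = np.block([[0.5 * dt**2 * np.eye(2)], [dt * np.eye(2)]])

x0 = np.array([100.0, 50.0, 0.0, 0.0])            # initial relative state [m, m/s]
x = cp.Variable((4, N + 1))
u = cp.Variable((2, N))

cost = cp.sum_squares(x[:, 1:]) + 10 * cp.sum_squares(u)
constraints = [x[:, 0] == x0, cp.norm(u, "inf") <= 0.1]   # thrust limit [m/s^2]
constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k] for k in range(N)]

cp.Problem(cp.Minimize(cost), constraints).solve()
print("first control move:", u.value[:, 0])       # apply, then re-plan next step
```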

  6. Progress in the development of paper-based diagnostics for low-resource point-of-care settings

    PubMed Central

    Byrnes, Samantha; Thiessen, Gregory; Fu, Elain

    2014-01-01

    This Review focuses on recent work in the field of paper microfluidics that specifically addresses the goal of translating the multistep processes that are characteristic of gold-standard laboratory tests to low-resource point-of-care settings. A major challenge is to implement multistep processes with the robust fluid control required to achieve the necessary sensitivity and specificity of a given application in a user-friendly package that minimizes equipment. We review key work in the areas of fluidic controls for automation in paper-based devices, readout methods that minimize dedicated equipment, and power and heating methods that are compatible with low-resource point-of-care settings. We also highlight a focused set of recent applications and discuss future challenges. PMID:24256361

  7. CalQuo: automated, simultaneous single-cell and population-level quantification of global intracellular Ca2+ responses.

    PubMed

    Fritzsche, Marco; Fernandes, Ricardo A; Colin-York, Huw; Santos, Ana M; Lee, Steven F; Lagerholm, B Christoffer; Davis, Simon J; Eggeling, Christian

    2015-11-13

    Detecting intracellular calcium signaling with fluorescent calcium indicator dyes is often coupled with microscopy techniques to follow the activation state of non-excitable cells, including lymphocytes. However, the analysis of global intracellular calcium responses both at the single-cell level and in large ensembles simultaneously has yet to be automated. Here, we present a new software package, CalQuo (Calcium Quantification), which allows the automated analysis and simultaneous monitoring of global fluorescent calcium reporter-based signaling responses in up to 1000 single cells per experiment, at temporal resolutions of sub-seconds to seconds. CalQuo quantifies the number and fraction of responding cells, the temporal dependence of calcium signaling and provides global and individual calcium-reporter fluorescence intensity profiles. We demonstrate the utility of the new method by comparing the calcium-based signaling responses of genetically manipulated human lymphocytic cell lines.

  8. "First generation" automated DNA sequencing technology.

    PubMed

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  9. Introduction matters: Manipulating trust in automation and reliance in automated driving.

    PubMed

    Körber, Moritz; Baseler, Eva; Bengler, Klaus

    2018-01-01

    Trust in automation is a key determinant for the adoption of automated systems and their appropriate use. Therefore, it constitutes an essential research area for the introduction of automated vehicles to road traffic. In this study, we investigated the influence of trust promoting (Trust promoted group) and trust lowering (Trust lowered group) introductory information on reported trust, reliance behavior and take-over performance. Forty participants encountered three situations in a 17-min highway drive in a conditionally automated vehicle (SAE Level 3). Situation 1 and Situation 3 were non-critical situations where a take-over was optional. Situation 2 represented a critical situation where a take-over was necessary to avoid a collision. A non-driving-related task (NDRT) was presented between the situations to record the allocation of visual attention. Participants reporting a higher trust level spent less time looking at the road or instrument cluster and more time looking at the NDRT. The manipulation of introductory information resulted in medium differences in reported trust and influenced participants' reliance behavior. Participants of the Trust promoted group looked less at the road or instrument cluster and more at the NDRT. The odds of participants of the Trust promoted group to overrule the automated driving system in the non-critical situations were 3.65 times (Situation 1) to 5 times (Situation 3) higher. In Situation 2, the Trust promoted group's mean take-over time was extended by 1154 ms and the mean minimum time-to-collision was 933 ms shorter. Six participants from the Trust promoted group compared to no participant of the Trust lowered group collided with the obstacle. The results demonstrate that the individual trust level influences how much drivers monitor the environment while performing an NDRT. Introductory information influences this trust level, reliance on an automated driving system, and if a critical take-over situation can be

  10. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  11. Automation in Photogrammetry,

    DTIC Science & Technology

    1980-07-25

    ... digital terrain matrix (DTM) and digital planimetric data, combined and integrated into so-called "data bases." I'll say more about this later. Automation of ... projection with mechanical inversors to maintain the Scheimpflug condition. Some automation has been achieved, with computer control to determine rectifier ... matrix (DTM) form that is not necessarily collected from the same photography as that from which the orthophoto is being produced. Because they are ...

  12. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  13. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  14. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  15. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    § 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...

  16. A theoretical introduction to "combinatory SYBRGreen qPCR screening", a matrix-based approach for the detection of materials derived from genetically modified plants.

    PubMed

    Van den Bulcke, Marc; Lievens, Antoon; Barbau-Piednoir, Elodie; MbongoloMbella, Guillaume; Roosens, Nancy; Sneyers, Myriam; Casi, Amaya Leunda

    2010-03-01

    The detection of genetically modified (GM) materials in food and feed products is a complex multi-step analytical process involving screening, identification, and often quantification of the genetically modified organisms (GMO) present in a sample. "Combinatory SYBRGreen qPCR screening" (CoSYPS) is a matrix-based approach for determining the presence of GM plant materials in products. The CoSYPS decision-support system (DSS) interprets the analytical results of SYBRGreen qPCR analysis based on four values: the Ct and Tm values and the LOD and LOQ for each method. A theoretical explanation of the different concepts applied in CoSYPS analysis is given (GMO Universe, "Prime number tracing", matrix/combinatory approach) and documented using the Roundup Ready soy GTS40-3-2 as an example. By applying a limited set of SYBRGreen qPCR methods and through application of a newly developed "prime number"-based algorithm, the nature of subsets of corresponding GMO in a sample can be determined. Together, these analyses provide guidance for semi-quantitative estimation of GMO presence in a food and feed product.
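
    A simplified sketch of the "prime number tracing" idea as described above: each screening element is assigned a prime, each GM event is encoded as the product of the primes of the elements it contains, and a sample's positive screening results then constrain which events can be present by divisibility. The element-to-event table below is a toy assumption for illustration, not the validated CoSYPS matrix.

```python
# Simplified sketch of prime-number encoding of screening elements and GM
# events. The element lists assigned to each event are toy assumptions.

from math import prod

ELEMENT_PRIMES = {"P-35S": 2, "T-nos": 3, "CP4-EPSPS": 5, "Cry1Ab": 7}

# Hypothetical event compositions (for illustration only).
GMO_EVENTS = {
    "soy GTS40-3-2": ["P-35S", "T-nos", "CP4-EPSPS"],
    "maize event X": ["P-35S", "Cry1Ab"],
}
GMO_CODES = {name: prod(ELEMENT_PRIMES[e] for e in elems)
             for name, elems in GMO_EVENTS.items()}

def candidate_events(positive_elements):
    """Events whose every element was detected (their code divides the sample code)."""
    sample_code = prod(ELEMENT_PRIMES[e] for e in positive_elements)
    return [name for name, code in GMO_CODES.items() if sample_code % code == 0]

print(candidate_events(["P-35S", "T-nos", "CP4-EPSPS"]))  # ['soy GTS40-3-2']
```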

  17. Automation Applications in an Advanced Air Traffic Management System : Volume 4A. Automation Requirements.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  18. The Pros and Cons of Army Automation

    DTIC Science & Technology

    2007-11-13

    The Pros and Cons of Army Automation. Outline: I. Introduction (MSG (P) Dostie); II. Manual skills (MSG (P ...

  19. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    PubMed

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  20. Automation-induced monitoring inefficiency: role of display location.

    PubMed

    Singh, I L; Molloy, R; Parasuraman, R

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six subjects. Each subject completed four 30 min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0 % reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of resource or speed accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  1. Automation-induced monitoring inefficiency: role of display location

    NASA Technical Reports Server (NTRS)

    Singh, I. L.; Molloy, R.; Parasuraman, R.

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six subjects. Each subject completed four 30 min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0 % reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of resource or speed accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  2. Humans: still vital after all these years of automation.

    PubMed

    Parasuraman, Raja; Wickens, Christopher D

    2008-06-01

    The authors discuss empirical studies of human-automation interaction and their implications for automation design. Automation is prevalent in safety-critical systems and increasingly in everyday life. Many studies of human performance in automated systems have been conducted over the past 30 years. Developments in three areas are examined: levels and stages of automation, reliance on and compliance with automation, and adaptive automation. Automation applied to information analysis or decision-making functions leads to differential system performance benefits and costs that must be considered in choosing appropriate levels and stages of automation. Human user dependence on automated alerts and advisories reflects two components of operator trust, reliance and compliance, which are in turn determined by the threshold designers use to balance automation misses and false alarms. Finally, adaptive automation can provide additional benefits in balancing workload and maintaining the user's situation awareness, although more research is required to identify when adaptation should be user controlled or system driven. The past three decades of empirical research on humans and automation has provided a strong science base that can be used to guide the design of automated systems. This research can be applied to most current and future automated systems.

  3. Automation in haemostasis.

    PubMed

    Huber, A R; Méndez, A; Brunner-Agten, S

    2013-01-01

    Automatia, an ancient Greek goddess of luck who makes things happen by themselves, of her own will and without human engagement, is present in our daily life in the medical laboratory. Automation was introduced and perfected by clinical chemistry and has since expanded into other fields such as haematology, immunology, molecular biology and also coagulation testing. The initial small and relatively simple standalone instruments have been replaced by more complex systems that allow for multitasking. Integration of automated coagulation testing into total laboratory automation has become possible in recent years. Automation has many strengths and opportunities if weaknesses and threats are respected. On the positive side, standardization, reduction of errors, reduction of cost and increase of throughput are clearly beneficial. Dependence on manufacturers, high initiation cost and somewhat expensive maintenance are less favourable factors. The modern laboratory, and especially today's laboratory technicians and academic personnel, do not add value for the doctor and the patient by spending large amounts of time behind the machines. In the future the laboratory needs to contribute at the bedside, suggesting laboratory testing and providing support and interpretation of the results obtained. The human factor will continue to play an important role in haemostasis testing, albeit under different circumstances.

  4. Robotics/Automated Systems Technicians.

    ERIC Educational Resources Information Center

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  5. Evaluation of an Automated Keywording System.

    ERIC Educational Resources Information Center

    Malone, Linda C.; And Others

    1990-01-01

    Discussion of automated indexing techniques focuses on ways to statistically document improvements in the development of an automated keywording system over time. The system developed by the Joint Chiefs of Staff to automate the storage, categorization, and retrieval of information from military exercises is explained, and performance measures are…

  6. Slotting optimization of automated storage and retrieval system (AS/RS) for efficient delivery of parts in an assembly shop using genetic algorithm: A case Study

    NASA Astrophysics Data System (ADS)

    Yue, L.; Guan, Z.; He, C.; Luo, D.; Saif, U.

    2017-06-01

    In recent years, competitive pressure has shifted manufacturing companies from mass production to mass customization, requiring the production of a large variety of products. Meeting customized demand on time with a mixed-flow mode of production is a major challenge for companies today. With a large variety of products, the storage system that delivers parts to the production lines strongly influences timely production, as shown by a simulation study of an inefficient storage system at a real company carried out in the current research. The research therefore proposes a slotting optimization model that takes the mixed-model assembly sequence of the final flow lines into consideration in order to optimize the whole automated storage and retrieval system (AS/RS) and distribution system of the case company. The model simultaneously minimizes the vertical height of the centre of gravity of the AS/RS and the total time spent retrieving materials from it. A genetic algorithm is adopted to solve the proposed problem, and computational results show a significant improvement in the stability and efficiency of the AS/RS compared with the existing method used in the case company.
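
    As a loose illustration of the kind of slotting optimization described (not the paper's actual model or operators), the sketch below encodes a rack assignment as a permutation and evolves it with a simple genetic algorithm against a weighted-sum objective combining centre-of-gravity height and retrieval travel. All rack dimensions, item masses, and picking frequencies are invented.

    ```python
    # Toy GA for AS/RS slotting: heavy items should sit low (low centre of gravity),
    # frequently picked items near the I/O corner (short travel). Data are invented.
    import random

    random.seed(1)
    ROWS, COLS = 5, 8                      # rack grid; row 0 is the lowest level
    N = ROWS * COLS
    mass = [random.uniform(1, 50) for _ in range(N)]    # kg per item
    freq = [random.uniform(0, 1) for _ in range(N)]     # relative picking frequency

    def cost(perm):
        """perm[slot] = item stored in that slot; lower cost is better."""
        cog = sum(mass[perm[s]] * (s // COLS) for s in range(N)) / sum(mass)
        travel = sum(freq[perm[s]] * ((s // COLS) + (s % COLS)) for s in range(N))
        return cog + 0.1 * travel          # weighted sum of the two objectives

    def crossover(a, b):                   # order crossover (OX) for permutations
        i, j = sorted(random.sample(range(N), 2))
        child = [None] * N
        child[i:j] = a[i:j]
        fill = [x for x in b if x not in child]
        for k in range(N):
            if child[k] is None:
                child[k] = fill.pop(0)
        return child

    def mutate(p):
        i, j = random.sample(range(N), 2)
        p[i], p[j] = p[j], p[i]

    pop = [random.sample(range(N), N) for _ in range(60)]
    for _ in range(200):
        pop.sort(key=cost)
        parents, children = pop[:20], []
        while len(children) < 40:
            c = crossover(*random.sample(parents, 2))
            if random.random() < 0.3:
                mutate(c)
            children.append(c)
        pop = parents + children
    print("best cost:", round(cost(min(pop, key=cost)), 3))
    ```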

  7. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines

    PubMed Central

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of the ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results show improved assessment accuracy over GP and ANN, together with good generalization capability of the ELM approach. Moreover, the results indicate that the developed ELM model can be used confidently in further work on formulating novel skeletal age assessment strategies. According to the experimental results, the presented method has the capacity to learn many hundreds of times faster than traditional learning methods and has sufficient overall performance in many respects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age. PMID:26402795
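
    An extreme learning machine is a single-hidden-layer network whose input weights are drawn at random and kept fixed, so only the output weights need to be solved, in closed form via a pseudo-inverse; this is what makes training so fast. The sketch below is a generic, minimal ELM regressor in NumPy on synthetic data; it is not the paper's image-retrieval pipeline or its actual feature set.

    ```python
    # Minimal extreme learning machine (ELM) regressor: random fixed hidden layer,
    # closed-form (pseudo-inverse) output weights. Synthetic data for illustration.
    import numpy as np

    rng = np.random.default_rng(42)

    def elm_fit(X, y, n_hidden=100):
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (fixed)
        b = rng.normal(size=n_hidden)                 # random biases (fixed)
        H = np.tanh(X @ W + b)                        # hidden-layer activations
        beta = np.linalg.pinv(H) @ y                  # output weights, closed form
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    # Toy regression problem standing in for "image features -> skeletal age".
    X = rng.uniform(-1, 1, size=(300, 5))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=300)
    W, b, beta = elm_fit(X[:200], y[:200])
    pred = elm_predict(X[200:], W, b, beta)
    print("test RMSE:", np.sqrt(np.mean((pred - y[200:]) ** 2)))
    ```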

  8. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available for download from http://strauto.popgen.org.
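
    The Evanno ΔK statistic that this kind of pipeline automates is derived from the mean and standard deviation of ln P(D) over replicate STRUCTURE runs at each K, commonly computed as ΔK = |mean L(K+1) − 2·mean L(K) + mean L(K−1)| / sd L(K). The sketch below applies that calculation to invented log-likelihood values; it is not StrAuto's own code and omits the parallel dispatch of the runs themselves.

    ```python
    # Evanno delta-K from replicate STRUCTURE log-likelihoods (illustrative values).
    # deltaK(K) = |mean L(K+1) - 2*mean L(K) + mean L(K-1)| / sd L(K)
    import statistics as st

    # ln P(D) for 5 replicate runs at each K (made-up numbers, not real output).
    lnP = {
        1: [-5200, -5198, -5201, -5199, -5202],
        2: [-4800, -4795, -4805, -4799, -4802],
        3: [-4700, -4690, -4710, -4705, -4695],
        4: [-4695, -4660, -4720, -4705, -4680],
    }

    def delta_k(lnP):
        ks = sorted(lnP)
        out = {}
        for k in ks[1:-1]:                   # the end points have no second difference
            mean = {kk: st.mean(lnP[kk]) for kk in (k - 1, k, k + 1)}
            out[k] = abs(mean[k + 1] - 2 * mean[k] + mean[k - 1]) / st.stdev(lnP[k])
        return out

    print(delta_k(lnP))   # the K with the largest delta-K is the Evanno estimate
    ```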

  9. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  10. The BAARA (Biological AutomAted RAdiotracking) System: A New Approach in Ecological Field Studies

    PubMed Central

    Řeřucha, Šimon; Bartonička, Tomáš; Jedlička, Petr; Čížek, Martin; Hlouša, Ondřej; Lučan, Radek; Horáček, Ivan

    2015-01-01

    Radiotracking is an important and often the only possible method to explore specific habits and the behaviour of animals, but it has proven to be very demanding and time-consuming, especially when frequent positioning of a large group is required. Our aim was to address this issue by making the process partially automated, to mitigate the demands and related costs. This paper presents a novel automated tracking system that consists of a network of automated tracking stations deployed within the target area. Each station reads the signals from telemetry transmitters, estimates the bearing and distance of the tagged animals and records their position. The station is capable of tracking a theoretically unlimited number of transmitters on different frequency channels with the period of 5–15 seconds per single channel. An ordinary transmitter that fits within the supported frequency band might be used with BAARA (Biological AutomAted RAdiotracking); an extra option is the use of a custom-programmable transmitter with configurable operational parameters, such as the precise frequency channel or the transmission parameters. This new approach to a tracking system was tested for its applicability in a series of field and laboratory tests. BAARA has been tested within fieldwork explorations of Rousettus aegyptiacus during field trips to Dakhla oasis in Egypt. The results illustrate the novel perspective which automated radiotracking opens for the study of spatial behaviour, particularly in addressing topics in the domain of population ecology. PMID:25714910

  11. Understanding and avoiding potential problems in implementing automation

    NASA Astrophysics Data System (ADS)

    Rouse, W. B.; Morris, N. M.

    1985-11-01

    Technology-driven efforts to implement automation often encounter problems due to lack of acceptance or begrudging acceptance by the personnel involved. It is argued in this paper that the level of automation perceived by an individual heavily influences whether or not the automation is accepted by that individual. The factors that appear to affect perceived level of automation are discussed. Issues considered include the impact of automation on the system and the individual, correlates of acceptance, problems and risks of automation, and factors influencing alienation. Based on an understanding of these issues, a set of eight guidelines is proposed as a possible means of avoiding problems in implementing automation.

  12. Understanding and avoiding potential problems in implementing automation

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.; Morris, N. M.

    1985-01-01

    Technology-driven efforts to implement automation often encounter problems due to lack of acceptance or begrudging acceptance by the personnel involved. It is argued in this paper that the level of automation perceived by an individual heavily influences whether or not the automation is accepted by that individual. The factors that appear to affect perceived level of automation are discussed. Issues considered include the impact of automation on the system and the individual, correlates of acceptance, problems and risks of automation, and factors influencing alienation. Based on an understanding of these issues, a set of eight guidelines is proposed as a possible means of avoiding problems in implementing automation.

  13. You're a What? Automation Technician

    ERIC Educational Resources Information Center

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  14. Does Automated Feedback Improve Writing Quality?

    ERIC Educational Resources Information Center

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  15. Clinical Laboratory Automation: A Case Study

    PubMed Central

    Archetti, Claudia; Montanelli, Alessandro; Finazzi, Dario; Caimi, Luigi; Garrafa, Emirena

    2017-01-01

    Background This paper presents a case study of an automated clinical laboratory in a large urban academic teaching hospital in the North of Italy, the Spedali Civili in Brescia, where four laboratories were merged into a single laboratory through the introduction of laboratory automation. Materials and Methods The analysis compares the preautomation situation and the new setting from a cost perspective, by considering direct and indirect costs. It also presents an analysis of the turnaround time (TAT). The study considers equipment, staff and indirect costs. Results The introduction of automation led to a slight increase in equipment costs, which is highly compensated by a remarkable decrease in staff costs. Consequently, total costs decreased by 12.55%. The analysis of the TAT shows an improvement for nonemergency exams, while emergency exams are still validated within the maximum time imposed by the hospital. Conclusions The strategy adopted by the management, which was based on re-using the available equipment and staff when merging the pre-existing laboratories, has reached its goal: introducing automation while minimizing the costs. Significance for public health Automation is an emerging trend in modern clinical laboratories with a positive impact on service level to patients and on staff safety, as shown by different studies. In fact, it allows process standardization which, in turn, decreases the frequency of outliers and errors. In addition, it induces faster processing times, thus improving the service level. Furthermore, automation decreases staff exposure to accidents, strongly improving staff safety. In this study, we analyse a further potential benefit of automation, namely economic convenience. We study the case of the automated laboratory of one of the biggest hospitals in Italy and compare the costs related to the pre- and post-automation situations. Introducing automation led to a cost decrease without affecting the service level to patients.

  16. System reliability, performance and trust in adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  17. Automation, Manpower, and Education.

    ERIC Educational Resources Information Center

    Rosenberg, Jerry M.

    Each group in our population will be affected by automation and other forms of technological advancement. This book seeks to identify the needs of these various groups, and to present ways in which educators can best meet them. The author corrects certain prevalent misconceptions concerning manpower utilization and automation. Based on the…

  18. Funding for Library Automation.

    ERIC Educational Resources Information Center

    Thompson, Ronelle K. H.

    This paper provides a brief overview of planning and implementing a project to fund library automation. It is suggested that: (1) proposal budgets should include all costs of a project, such as furniture needed for computer terminals, costs for modifying library procedures, initial supplies, or ongoing maintenance; (2) automation does not save…

  19. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry; Riedesel, Joel; Myers, Chris; Miller, William; Jones, Ellen F.; Freeman, Kenneth; Walsh, Richard; Walls, Bryan K.; Weeks, David J.; Bechtel, Robert T.

    1992-01-01

    Autonomous power-distribution system includes power-control equipment and automation equipment. System automatically schedules connection of power to loads and reconfigures itself when it detects fault. Potential terrestrial applications include optimization of consumption of power in homes, power supplies for autonomous land vehicles and vessels, and power supplies for automated industrial processes.

  20. An automated diagnosis system of liver disease using artificial immune and genetic algorithms.

    PubMed

    Liang, Chunlin; Peng, Lingxi

    2013-04-01

    The rise of health care costs is one of the world's most important problems. Disease prediction is also a vibrant research area. Researchers have approached this problem using various techniques such as support vector machines and artificial neural networks. This study exploits the immune system's characteristics of learning and memory to solve the problem of liver disease diagnosis. The proposed system applies a combination of two methods, an artificial immune system and a genetic algorithm, to diagnose liver disease. The system architecture is based on an artificial immune system, and its learning procedure adopts a genetic algorithm to steer the evolution of the antibody population. The experiments use two benchmark datasets acquired from the well-known UCI machine learning repository. The diagnosis accuracies obtained are very promising compared with other diagnosis systems in the literature. These results suggest that this system may be a useful automatic diagnosis tool for liver disease.
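
    As a loose illustration of coupling an artificial immune system with a genetic algorithm for classification (a generic design, not the paper's architecture or parameters), the sketch below evolves small populations of labelled "antibody" prototype vectors with selection, crossover, and mutation, and classifies each sample by its nearest antibody. The data are synthetic 2-D points rather than the UCI liver datasets.

    ```python
    # Toy immune/genetic classifier: antibodies are labelled prototype vectors evolved
    # with GA operators; a sample takes the label of its nearest antibody. Synthetic data.
    import math
    import random

    random.seed(3)

    def make_data(n=200):
        data = []
        for _ in range(n):
            label = random.randint(0, 1)
            cx, cy = (1.0, 1.0) if label else (-1.0, -1.0)
            data.append(((cx + random.gauss(0, 0.8), cy + random.gauss(0, 0.8)), label))
        return data

    train, test = make_data(), make_data()

    def classify(x, antibodies):
        return min(antibodies, key=lambda ab: math.dist(x, ab[0]))[1]

    def fitness(antibodies):
        return sum(classify(x, antibodies) == y for x, y in train) / len(train)

    def random_ab():
        return ((random.uniform(-2, 2), random.uniform(-2, 2)), random.randint(0, 1))

    pop = [[random_ab() for _ in range(6)] for _ in range(30)]   # 30 antibody sets
    for _ in range(40):
        pop.sort(key=fitness, reverse=True)
        elite, children = pop[:10], []
        while len(children) < 20:
            a, b = random.sample(elite, 2)
            cut = random.randint(1, 5)
            child = a[:cut] + b[cut:]                            # one-point crossover
            if random.random() < 0.4:                            # mutate one antibody
                child[random.randrange(len(child))] = random_ab()
            children.append(child)
        pop = elite + children

    best = max(pop, key=fitness)
    print("test accuracy:", round(sum(classify(x, best) == y for x, y in test) / len(test), 3))
    ```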

  1. Decision-Making for Automation: Hebrew and Arabic Script Materials in the Automated Library. Occasional Papers, Number 205.

    ERIC Educational Resources Information Center

    Vernon, Elizabeth

    It is generally accepted in the library world that an automated catalog means more accessible data for patrons, greater productivity for librarians, and an improvement in the sharing of bibliographic data among libraries. While the desirability of automation is not a controversial issue, some aspects of automating remain problematic. This article…

  2. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-29

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image System (DIS) and Simplified Entry (SE); Correction AGENCY: U.S. Customs and Border Protection, Department...

  3. Automated telescope scheduling

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    With the ever increasing level of automation of astronomical telescopes the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on these systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling groundbased telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  4. Unsupervised automated high throughput phenotyping of RNAi time-lapse movies.

    PubMed

    Failmezger, Henrik; Fröhlich, Holger; Tresch, Achim

    2013-10-04

    Gene perturbation experiments in combination with fluorescence time-lapse cell imaging are a powerful tool in reverse genetics. High content applications require tools for the automated processing of large amounts of data. These tools generally include several image processing steps, the extraction of morphological descriptors, and the grouping of cells into phenotype classes according to their descriptors. This phenotyping can be applied in a supervised or an unsupervised manner. Unsupervised methods are suitable for the discovery of formerly unknown phenotypes, which are expected to occur in high-throughput RNAi time-lapse screens. We developed an unsupervised phenotyping approach based on Hidden Markov Models (HMMs) with multivariate Gaussian emissions for the detection of knockdown-specific phenotypes in RNAi time-lapse movies. The automated detection of abnormal cell morphologies allows us to assign a phenotypic fingerprint to each gene knockdown. By applying our method to the Mitocheck database, we show that a phenotypic fingerprint is indicative of a gene's function. Our fully unsupervised HMM-based phenotyping is able to automatically identify cell morphologies that are specific for a certain knockdown. Beyond the identification of genes whose knockdown affects cell morphology, phenotypic fingerprints can be used to find modules of functionally related genes.
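
    A minimal version of unsupervised HMM phenotyping with multivariate Gaussian emissions can be sketched with the hmmlearn package: fit a GaussianHMM to per-cell morphology descriptors over time and read the decoded state occupancy as a crude phenotypic fingerprint. The descriptors and tracks below are invented; this is not the authors' pipeline or the Mitocheck data.

    ```python
    # Unsupervised HMM phenotyping sketch (hmmlearn) on invented morphology descriptors:
    # each row is one cell at one time point, e.g. (area, eccentricity).
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(0)

    def fake_track():
        """One cell's time-lapse: a 'normal' phase followed by a 'rounded' phase."""
        normal = rng.normal([100.0, 0.3], [5.0, 0.05], size=(30, 2))
        rounded = rng.normal([60.0, 0.8], [5.0, 0.05], size=(20, 2))
        return np.vstack([normal, rounded])

    tracks = [fake_track() for _ in range(3)]
    X = np.vstack(tracks)
    lengths = [len(t) for t in tracks]          # frames per cell track

    # Two hidden morphology states with full-covariance Gaussian emissions.
    hmm = GaussianHMM(n_components=2, covariance_type="full", n_iter=100, random_state=0)
    hmm.fit(X, lengths)

    states = hmm.predict(X, lengths)
    # The fraction of frames spent in each state is one crude phenotypic fingerprint.
    print("state occupancy:", np.bincount(states) / len(states))
    ```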

  5. Automated detection of analyzable metaphase chromosome cells depicted on scanned digital microscopic images

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Wang, Xingwei; Chen, Xiaodong; Li, Yuhua; Liu, Hong; Li, Shibo; Zheng, Bin

    2010-02-01

    applying this CAD-guided high-resolution microscopic image scanning system to prescreen and select ROIs that may contain analyzable metaphase chromosome cells. The success and the further improvement of this automated scanning system may have great impact on the future clinical practice in genetic laboratories to detect and diagnose diseases.

  6. Greater Buyer Effectiveness through Automation

    DTIC Science & Technology

    1989-01-01

    assignment to the buyer; Coordination - automated routing of the requirement package to technical, finance, transportation, packaging, small business ... security, data, safety, etc.; Consolidation - automated identification of requirements for identical or similar items for potential consolidation

  7. Order Division Automated System.

    ERIC Educational Resources Information Center

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  8. The Israel DNA database--the establishment of a rapid, semi-automated analysis system.

    PubMed

    Zamir, Ashira; Dell'Ariccia-Carmon, Aviva; Zaken, Neomi; Oz, Carla

    2012-03-01

    The Israel Police DNA database, also known as IPDIS (Israel Police DNA Index System), has been operating since February 2007. During that time more than 135,000 reference samples have been uploaded and more than 2000 hits reported. We have developed an effective semi-automated system that includes two automated punchers, three liquid handler robots and four genetic analyzers. An inhouse LIMS program enables full tracking of every sample through the entire process of registration, pre-PCR handling, analysis of profiles, uploading to the database, hit reports and ultimately storage. The LIMS is also responsible for the future tracking of samples and their profiles to be expunged from the database according to the Israeli DNA legislation. The database is administered by an in-house developed software program, where reference and evidentiary profiles are uploaded, stored, searched and matched. The DNA database has proven to be an effective investigative tool which has gained the confidence of the Israeli public and on which the Israel National Police force has grown to rely. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Proof-of-concept automation of propellant processing

    NASA Technical Reports Server (NTRS)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

    For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production involve some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  10. Human-Centered Aviation Automation: Principles and Guidelines

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1996-01-01

    This document presents principles and guidelines for human-centered automation in aircraft and in the aviation system. Drawing upon operational experience with highly automated aircraft, it describes classes of problems that have occurred in these vehicles, the effects of advanced automation on the human operators of the aviation system, and ways in which these problems may be avoided in the design of future aircraft and air traffic management automation. Many incidents and a few serious accidents suggest that these problems are related to automation complexity, autonomy, coupling, and opacity, or inadequate feedback to operators. An automation philosophy that emphasizes improved communication, coordination and cooperation between the human and machine elements of this complex, distributed system is required to improve the safety and efficiency of aviation operations in the future.

  11. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
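
    A toy version of mixed-integer test assembly can be written with a generic MIP modeller such as PuLP: binary variables select items from the bank, constraints encode the blueprint (test length, content coverage), and the objective maximizes information at a target ability level. The item bank, constraint values, and content areas below are invented, not the authors' specification.

    ```python
    # Toy automated test assembly as a 0/1 integer program (PuLP): maximize summed item
    # information at a target ability, subject to test length and content constraints.
    import random
    import pulp

    random.seed(7)
    n_items = 40
    info = [random.uniform(0.1, 1.0) for _ in range(n_items)]         # info at theta*
    content = [random.choice(["algebra", "geometry"]) for _ in range(n_items)]

    x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(n_items)]
    prob = pulp.LpProblem("test_assembly", pulp.LpMaximize)
    prob += pulp.lpSum(info[i] * x[i] for i in range(n_items))        # objective

    prob += pulp.lpSum(x) == 20                                       # fixed test length
    for area in ("algebra", "geometry"):                              # blueprint coverage
        prob += pulp.lpSum(x[i] for i in range(n_items) if content[i] == area) >= 8

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print("selected items:", [i for i in range(n_items) if x[i].value() == 1])
    ```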

  12. Poly(vinylpyrrolidone)-Free Multistep Synthesis of Silver Nanoplates with Plasmon Resonance in the Near Infrared Range.

    PubMed

    Khan, Assad U; Zhou, Zhengping; Krause, Joseph; Liu, Guoliang

    2017-11-01

    Herein, a poly(vinylpyrrolidone) (PVP)-free method is described for synthesizing Ag nanoplates that have localized surface plasmon resonance in the near-infrared (NIR) range. Citrate-capped Ag spherical nanoparticles are first grown into small Ag nanoplates that resonate in the range of 500-800 nm. The small Ag nanoplates are used as seeds to further grow into large Ag nanoplates with a lateral dimension of 100-600 nm and a plasmon resonance wavelength of 800-1660 nm and above. The number of growth steps can be increased as desired. Without introducing additional citrate into the solutions of small Ag nanoplate seeds, large Ag nanoplates can be synthesized within minutes. The entire synthesis is completely PVP free, which promotes the nanoparticle growth along the lateral direction to form large Ag nanoplates. The multistep growth and the minimum usage of citrate are essential for the fast growth of high-aspect-ratio Ag nanoplates resonating in the NIR range. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ... CBP with authority to conduct limited test programs or procedures designed to evaluate planned... aspects of this test, including the design, conduct and implementation of the test, in order to determine... Environment (ACE); Announcement of National Customs Automation Program Test of Automated Procedures for In...

  14. Evaluation of glycodendron and synthetically-modified dextran clearing agents for multi-step targeting of radioisotopes for molecular imaging and radioimmunotherapy

    PubMed Central

    Cheal, Sarah M.; Yoo, Barney; Boughdad, Sarah; Punzalan, Blesida; Yang, Guangbin; Dilhas, Anna; Torchon, Geralda; Pu, Jun; Axworthy, Don B.; Zanzonico, Pat; Ouerfelli, Ouathek; Larson, Steven M.

    2014-01-01

    A series of N-acetylgalactosamine-dendrons (NAG-dendrons) and dextrans bearing biotin moieties were compared for their ability to complex with and sequester circulating bispecific anti-tumor antibody (scFv4) streptavidin (SA) fusion protein (scFv4-SA) in vivo, to improve tumor to normal tissue concentration ratios for targeted radioimmunotherapy and diagnosis. Specifically, a total of five NAG-dendrons employing a common synthetic scaffold structure containing 4, 8, 16, or 32 carbohydrate residues and a single biotin moiety were prepared (NAGB), and for comparative purposes, a biotinylated-dextran with average molecular weight (MW) of 500 kD was synthesized from amino-dextran (DEXB). One of the NAGB compounds, CA16, has been investigated in humans; our aim was to determine if other NAGB analogs (e.g. CA8 or CA4) were bioequivalent to CA16 and/or better suited as MST reagents. In vivo studies included dynamic positron-emission tomography (PET) imaging of 124I-labelled-scFv4-SA clearance and dual-label biodistribution studies following multi-step targeting (MST) directed at subcutaneous (s.c.) human colon adenocarcinoma xenografts in mice. The MST protocol consists of three injections: first, a bispecific antibody specific for an anti-tumor associated glycoprotein (TAG-72) single chain genetically-fused with SA (scFv4-SA); second, CA16 or other clearing agent; and third, radiolabeled biotin. We observed using PET imaging of 124I-labelled-scFv4-SA clearance that the spatial arrangement of ligands conjugated to NAG (i.e. biotin) can impact the binding to antibody in circulation and subsequent liver uptake of the NAG-antibody complex. Also, NAGB CA32-LC or CA16-LC can be utilized during MST to achieve comparable tumor- to-blood ratios and absolute tumor uptake seen previously with CA16. Finally, DEXB was equally effective as NAGB CA32-LC at lowering scFv4-SA in circulation, but at the expense of reducing absolute tumor uptake of radiolabeled biotin. PMID:24219178

  15. Automation Applications in an Advanced Air Traffic Management System : Volume 4B. Automation Requirements (Concluded)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  16. Genetics-based methods for detection of Salmonella spp. in foods.

    PubMed

    Mozola, Mark A

    2006-01-01

    Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.

  17. Individual differences in the calibration of trust in automation.

    PubMed

    Pop, Vlad L; Shrewsbury, Alex; Durso, Francis T

    2015-06-01

    The objective was to determine whether operators with an expectancy that automation is trustworthy are better at calibrating their trust to changes in the capabilities of automation, and if so, why. Studies suggest that individual differences in automation expectancy may be able to account for why changes in the capabilities of automation lead to a substantial change in trust for some, yet only a small change for others. In a baggage screening task, 225 participants searched for weapons in 200 X-ray images of luggage. Participants were assisted by an automated decision aid exhibiting different levels of reliability. Measures of expectancy that automation is trustworthy were used in conjunction with subjective measures of trust and perceived reliability to identify individual differences in trust calibration. Operators with high expectancy that automation is trustworthy were more sensitive to changes (both increases and decreases) in automation reliability. This difference was eliminated by manipulating the causal attribution of automation errors. Attributing the cause of automation errors to factors external to the automation fosters an understanding of tasks and situations in which automation differs in reliability and may lead to more appropriate trust. The development of interventions can lead to calibrated trust in automation. © 2014, Human Factors and Ergonomics Society.

  18. Comparison of microbial community shifts in two parallel multi-step drinking water treatment processes.

    PubMed

    Xu, Jiajiong; Tang, Wei; Ma, Jun; Wang, Hong

    2017-07-01

    Drinking water treatment processes remove undesirable chemicals and microorganisms from source water, which is vital to public health protection. The purpose of this study was to investigate the effects of treatment processes and configuration on the microbiome by comparing microbial community shifts in two series of different treatment processes operated in parallel within a full-scale drinking water treatment plant (DWTP) in Southeast China. Illumina sequencing of 16S rRNA genes of water samples demonstrated little effect of coagulation/sedimentation and pre-oxidation steps on bacterial communities, in contrast to dramatic and concurrent microbial community shifts during ozonation, granular activated carbon treatment, sand filtration, and disinfection for both series. A large number of unique operational taxonomic units (OTUs) at these four treatment steps further illustrated their strong shaping power towards the drinking water microbial communities. Interestingly, multidimensional scaling analysis revealed tight clustering of biofilm samples collected from different treatment steps, with Nitrospira, the nitrite-oxidizing bacteria, noted at higher relative abundances in biofilm compared to water samples. Overall, this study provides a snapshot of step-to-step microbial evolvement in multi-step drinking water treatment systems, and the results provide insight to control and manipulation of the drinking water microbiome via optimization of DWTP design and operation.
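
    The ordination step mentioned here (clustering samples by community dissimilarity) can be sketched generically: compute Bray-Curtis distances between samples from an OTU count table and embed them with multidimensional scaling. The OTU table below is random stand-in data, not the study's sequencing results.

    ```python
    # Generic ordination sketch: Bray-Curtis distances between samples from an OTU
    # count table, embedded with metric MDS. Random stand-in data, two loose groups.
    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from sklearn.manifold import MDS

    rng = np.random.default_rng(0)
    # 12 samples (rows) x 300 OTUs (columns); the second group gets extra counts.
    counts = np.vstack([
        rng.poisson(5, size=(6, 300)),
        rng.poisson(5, size=(6, 300)) + rng.poisson(3, size=(6, 300)),
    ])

    dist = squareform(pdist(counts, metric="braycurtis"))
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(dist)
    for i, (x, y) in enumerate(coords):
        print(f"sample {i:2d} (group {'A' if i < 6 else 'B'}): {x:+.3f}, {y:+.3f}")
    ```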

  19. Space station automation study-satellite servicing, volume 2

    NASA Technical Reports Server (NTRS)

    Meissinger, H. F.

    1984-01-01

    Technology requirements for automated satellite servicing operations aboard the NASA space station were studied. The three major tasks addressed were: (1) servicing requirements (satellite and space station elements) and the role of automation; (2) assessment of automation technology; and (3) conceptual design of servicing facilities on the space station. It is found that many servicing functions could benefit from automation support and that certain research and development activities on automation technologies for servicing should start as soon as possible. Also, some advanced automation developments for orbital servicing could be effectively applied to U.S. industrial ground-based operations.

  20. Design of a Multistep Phase Mask for High-Energy Terahertz Pulse Generation by Optical Rectification

    NASA Astrophysics Data System (ADS)

    Avetisyan, Y.; Makaryan, A.; Tadevosyan, V.; Tonouchi, M.

    2017-12-01

    A new scheme for generating high-energy terahertz (THz) pulses based on a multistep phase mask (MSPM) is suggested and analyzed. The mask is placed on the entrance surface of the nonlinear optical (NLO) crystal, eliminating the need for imaging optics. In contrast to the contact grating method, the introduction of large amounts of angular dispersion is avoided. The operating principle of the suggested scheme is that the MSPM splits a single input beam into many smaller time-delayed "beamlets," which together form a discretely tilted-front laser pulse in the NLO crystal. The analysis of THz-pulse generation in ZnTe and lithium niobate (LN) crystals shows that the use of ZnTe crystal is preferable, especially when long-wavelength pump sources are used. The dimensions of the mask's steps required for high-energy THz-pulse generation in ZnTe and LN crystals are calculated. The optimal number of steps is estimated, taking into account each beamlet's spatial broadening and issues related to mask fabrication. The proposed method is a promising route to high-energy, monolithic, and alignment-free THz-pulse sources.
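
    The key geometric relation behind such a mask is that each step adds extra optical path, delaying adjacent beamlets relative to one another; for a simple transmissive step of depth d in a material with group index n_g, the added delay is roughly Δt ≈ d·(n_g − 1)/c. The sketch below uses that simplified relation with placeholder numbers to estimate step depth for a desired delay; it is not the paper's full design calculation, which must also satisfy the pulse-front tilt needed for velocity matching in the NLO crystal.

    ```python
    # Back-of-envelope step sizing for a transmissive multistep phase mask.
    # Simplified model: each step of depth d adds a delay dt = d * (n_g - 1) / c.
    # All numbers are placeholders, not values from the paper.
    C = 2.998e8                 # speed of light in vacuum, m/s

    def step_depth(delay_s, n_group):
        """Step depth (m) giving the desired inter-beamlet delay for group index n_group."""
        return delay_s * C / (n_group - 1.0)

    target_delay = 300e-15      # 300 fs between adjacent beamlets (placeholder)
    n_group = 1.51              # fused-silica-like group index (placeholder)
    d = step_depth(target_delay, n_group)
    print(f"required step depth: {d * 1e6:.0f} micrometres")   # ~176 um for these numbers
    ```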

  1. MULTIWAVELENGTH OBSERVATIONS OF A SLOW-RISE, MULTISTEP X1.6 FLARE AND THE ASSOCIATED ERUPTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yurchyshyn, V.; Kumar, P.; Cho, K.-S.

    Using multiwavelength observations, we studied a slow-rise, multistep X1.6 flare that began on 2014 November 7 as a localized eruption of core fields inside a δ-sunspot and later engulfed the entire active region (AR). This flare event was associated with the formation of two systems of post-eruption arcades (PEAs) and several J-shaped flare ribbons showing extremely fine details and irreversible changes in the photospheric magnetic fields, and it was accompanied by a fast and wide coronal mass ejection. Data from the Solar Dynamics Observatory and IRIS spacecraft, along with ground-based data from the New Solar Telescope, present evidence that (i) the flare and the eruption were directly triggered by a flux emergence that occurred inside a δ-sunspot at the boundary between two umbrae; (ii) this event represented an example of the formation of an unstable flux rope observed only in hot AIA channels (131 and 94 Å) and LASCO C2 coronagraph images; and (iii) the global PEA spanned the entire AR and was due to global-scale reconnection occurring at heights of about one solar radius, indicating the global spatial and temporal scale of the eruption.

  2. Multifunctional picoliter droplet manipulation platform and its application in single cell analysis.

    PubMed

    Gu, Shu-Qing; Zhang, Yun-Xia; Zhu, Ying; Du, Wen-Bin; Yao, Bo; Fang, Qun

    2011-10-01

    We developed an automated and multifunctional microfluidic platform based on DropLab to perform flexible generation and complex manipulations of picoliter-scale droplets. Multiple manipulations, including precise droplet generation, sequential reagent merging, and multistep solid-phase extraction, could be achieved on picoliter-scale droplets in the present platform. The system precision in generating picoliter-scale droplets was significantly improved by minimizing the thermally induced fluctuation of flow rate. A novel droplet fusion technique based on the difference in droplet interfacial tensions was developed without the need for special microchannel networks or external devices. It enabled sequential addition of reagents to droplets on demand for multistep reactions. We also developed an effective picoliter-scale droplet splitting technique with magnetic actuation. The difficulty of separating magnetic beads from picoliter-scale droplets, due to the high interfacial tension, was overcome by using ferromagnetic particles to carry the magnetic beads through the phase interface. With this technique, multistep solid-phase extraction was achieved among picoliter-scale droplets. The present platform can perform complex multistep manipulations on picoliter-scale droplets, as is particularly required for single cell analysis. Its utility and potential in single cell analysis were preliminarily demonstrated by achieving high-efficiency single-cell encapsulation, enzyme activity assay at the single cell level, and, especially, single cell DNA purification based on solid-phase extraction.

  3. Cooperative Mapping for Automated Vehicles

    DOT National Transportation Integrated Search

    2017-10-01

    Localization is essential for automated vehicles, even for simple tasks such as lanekeeping. Some automated vehicle systems use their sensors to perceive their surroundings on-the-fly, such as the early variants of the Tesla Autopilot, while others s...

  4. Automated thermometric enzyme immunoassay of human proinsulin produced by Escherichia coli.

    PubMed

    Birnbaum, S; Bülow, L; Hardy, K; Danielsson, B; Mosbach, K

    1986-10-01

    We have determined and monitored the production and release of human proinsulin by genetically engineered Escherichia coli cells. Several M9 media samples were analyzed sequentially after centrifugation with the aid of a rapid automated flow-through thermometric enzyme-linked immunosorbent assay (TELISA) system. The response time was 7 min after sample injection and a single assay was complete after 13 min. Insulin concentrations in the range of 0.1-50 micrograms/ml could be determined. The TELISA method correlated well with conventional radioimmunoassay determinations. Standard curves were reproducible over a period of several days even when the immobilized antibody column was stored at 25 degrees C in the enzyme thermistor unit. Thus, immediate assay start up was possible.

  5. Current genetic methodologies in the identification of disaster victims and in forensic analysis.

    PubMed

    Ziętkiewicz, Ewa; Witt, Magdalena; Daca, Patrycja; Zebracka-Gala, Jadwiga; Goniewicz, Mariusz; Jarząb, Barbara; Witt, Michał

    2012-02-01

    This review presents the basic problems and currently available molecular techniques used for genetic profiling in disaster victim identification (DVI). The environmental conditions of a mass disaster often result in severe fragmentation, decomposition and intermixing of the remains of victims. In such cases, traditional identification based on the anthropological and physical characteristics of the victims is frequently inconclusive. This is the reason why DNA profiling became the gold standard for victim identification in mass-casualty incidents (MCIs) or any forensic cases where human remains are highly fragmented and/or degraded beyond recognition. The review provides general information about the sources of genetic material for DNA profiling, the genetic markers routinely used during genetic profiling (STR markers, mtDNA and single-nucleotide polymorphisms [SNP]) and the basic statistical approaches used in DNA-based disaster victim identification. Automated technological platforms that allow the simultaneous analysis of a multitude of genetic markers used in genetic identification (oligonucleotide microarray techniques and next-generation sequencing) are also presented. Forensic and population databases containing information on human variability, routinely used for statistical analyses, are discussed. The final part of this review is focused on recent developments, which offer particularly promising tools for forensic applications (mRNA analysis, transcriptome variation in individuals/populations and genetic profiling of specific cells separated from mixtures).
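
    One of the basic statistical quantities in DNA-based identification is the random match probability of an STR profile, computed under Hardy-Weinberg assumptions as the product of genotype frequencies across loci (2pq for heterozygotes, p² for homozygotes). The sketch below uses invented allele frequencies to show the arithmetic; real casework relies on population databases and applies corrections (e.g., for substructure) and kinship likelihood ratios that are omitted here.

    ```python
    # Random match probability of an STR profile under simple Hardy-Weinberg assumptions.
    # Allele frequencies and the profile are invented for illustration only.
    allele_freq = {
        "D3S1358": {"15": 0.25, "16": 0.23, "17": 0.21},
        "vWA":     {"16": 0.20, "17": 0.28, "18": 0.22},
        "FGA":     {"21": 0.17, "22": 0.22, "24": 0.14},
    }
    profile = {"D3S1358": ("15", "17"), "vWA": ("17", "17"), "FGA": ("21", "24")}

    rmp = 1.0
    for locus, (a1, a2) in profile.items():
        p, q = allele_freq[locus][a1], allele_freq[locus][a2]
        rmp *= p * p if a1 == a2 else 2 * p * q       # homozygote vs heterozygote
    print(f"random match probability: 1 in {1 / rmp:,.0f}")
    ```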

  6. Multi-step rhodopsin inactivation schemes can account for the size variability of single photon responses in Limulus ventral photoreceptors

    PubMed Central

    1994-01-01

    Limulus ventral photoreceptors generate highly variable responses to the absorption of single photons. We have obtained data on the size distribution of these responses, derived the distribution predicted from simple transduction cascade models and compared the theory and data. In the simplest of models, the active state of the visual pigment (defined by its ability to activate G protein) is turned off in a single reaction. The output of such a cascade is predicted to be highly variable, largely because of stochastic variation in the number of G proteins activated. The exact distribution predicted is exponential, but we find that an exponential does not adequately account for the data. The data agree much better with the predictions of a cascade model in which the active state of the visual pigment is turned off by a multi-step process. PMID:8057085
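
    The modelling contrast described, a single turn-off reaction versus a multi-step shut-off, can be illustrated by simulation: if the pigment's active lifetime is exponential, the downstream response is highly variable, whereas a lifetime that is the sum of several exponential steps (Erlang-distributed) gives a much narrower distribution. The sketch below uses arbitrary rate constants and a crude proportionality between lifetime and response size; it is not the paper's quantitative cascade model.

    ```python
    # Compare response variability for single-step vs multi-step rhodopsin shut-off.
    # Simplification: response size is taken proportional to the pigment's active
    # lifetime (how long it can keep activating G protein). Rates are arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)
    n_photons = 100_000
    rate = 1.0                   # total shut-off rate (arbitrary units)

    # Single-step shut-off: exponential lifetime -> coefficient of variation (CV) = 1.
    life_1 = rng.exponential(1.0 / rate, n_photons)

    # Multi-step shut-off: k sequential steps, each at rate k*rate so the mean lifetime
    # is unchanged. The sum is Erlang/gamma distributed -> CV = 1/sqrt(k).
    k = 6
    life_k = rng.gamma(shape=k, scale=1.0 / (k * rate), size=n_photons)

    for name, life in [("single-step", life_1), (f"{k}-step", life_k)]:
        print(f"{name}: mean={life.mean():.3f}  CV={life.std() / life.mean():.2f}")
    ```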

  7. Toward high-throughput phenotyping: unbiased automated feature extraction and selection from knowledge sources.

    PubMed

    Yu, Sheng; Liao, Katherine P; Shaw, Stanley Y; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2015-09-01

    Analysis of narrative (text) data from electronic health records (EHRs) can improve population-scale phenotyping for clinical and genetic research. Currently, selection of text features for phenotyping algorithms is slow and laborious, requiring extensive and iterative involvement by domain experts. This paper introduces a method to develop phenotyping algorithms in an unbiased manner by automatically extracting and selecting informative features, which can be comparable to expert-curated ones in classification accuracy. Comprehensive medical concepts were collected from publicly available knowledge sources in an automated, unbiased fashion. Natural language processing (NLP) revealed the occurrence patterns of these concepts in EHR narrative notes, which enabled selection of informative features for phenotype classification. When combined with additional codified features, a penalized logistic regression model was trained to classify the target phenotype. We applied this method to develop algorithms to identify patients with rheumatoid arthritis (RA), and coronary artery disease (CAD) cases among those with RA, from a large multi-institutional EHR. The areas under the receiver operating characteristic curve (AUC) for classifying RA and CAD using models trained with automated features were 0.951 and 0.929, respectively, compared with AUCs of 0.938 and 0.929 for models trained with expert-curated features. Models trained with NLP text features selected through an unbiased, automated procedure achieved comparable or slightly higher accuracy than those trained with expert-curated features. The majority of the selected model features were interpretable. The proposed automated feature extraction method, generating highly accurate phenotyping algorithms with improved efficiency, is a significant step toward high-throughput phenotyping. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All
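
    The classification step described, a penalized logistic regression over NLP-derived concept counts plus codified features evaluated by AUC, can be sketched generically with scikit-learn. The feature matrix below is random stand-in data; it is not the EHR data or the authors' feature set.

    ```python
    # Penalized (L1) logistic regression for phenotype classification, evaluated by AUC.
    # Random stand-in data replaces the NLP concept counts and codified features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n, p = 1000, 200                                   # patients x candidate features
    X = rng.poisson(1.0, size=(n, p)).astype(float)    # counts of concepts / codes
    true_w = np.zeros(p)
    true_w[:10] = 0.8                                  # only a few features matter
    y = (X @ true_w + rng.normal(size=n) > 8).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(X_tr, y_tr)

    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"AUC: {auc:.3f}   non-zero coefficients: {(clf.coef_ != 0).sum()}")
    ```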

  8. Human-centered automation: Development of a philosophy

    NASA Technical Reports Server (NTRS)

    Graeber, Curtis; Billings, Charles E.

    1990-01-01

    Information on human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant the human management and control function in civil air transport.

  9. Automation--down to the nuts and bolts.

    PubMed

    Fix, R J; Rowe, J M; McConnell, B C

    2000-01-01

    Laboratories that once viewed automation as an expensive luxury are now looking to automation as a solution to increase sample throughput, to help ensure data integrity and to improve laboratory safety. The question is no longer, 'Should we automate?', but 'How should we approach automation?' A laboratory may choose from three approaches when deciding to automate: (1) contract with a third party vendor to produce a turnkey system, (2) develop and fabricate the system in-house or (3) some combination of approaches (1) and (2). The best approach for a given laboratory depends upon its available resources. The first lesson to be learned in automation is that no matter how straightforward an idea appears in the beginning, the solution will not be realized until many complex problems have been resolved. Issues dealing with sample vessel manipulation, liquid handling and system control must be addressed before a final design can be developed. This requires expertise in engineering, electronics, programming and chemistry. Therefore, the team concept of automation should be employed to help ensure success. This presentation discusses the advantages and disadvantages of the three approaches to automation. The development of an automated sample handling and control system for the STAR System focused microwave will be used to illustrate the complexities encountered in a seemingly simple project, and to highlight the importance of the team concept to automation no matter which approach is taken. The STAR System focused microwave from CEM Corporation is an open vessel digestion system with six microwave cells. This system is used to prepare samples for trace metal determination. The automated sample handling was developed around a XYZ motorized gantry system. Grippers were specially designed to perform several different functions and to provide feedback to the control software. Software was written in Visual Basic 5.0 to control the movement of the samples and the operation and

  10. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

    OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessment of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis for abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  11. The future is now: Technology's impact on the practice of genetic counseling.

    PubMed

    Gordon, Erynn S; Babu, Deepti; Laney, Dawn A

    2018-03-01

    Smartphones, artificial intelligence, automation, digital communication, and other types of technology are playing an increasingly important role in our daily lives. It is no surprise that technology is also shaping the practice of medicine, and more specifically the practice of genetic counseling. While digital tools have been part of the practice of medical genetics for decades, such as internet- or CD-ROM-based tools like Online Mendelian Inheritance in Man and Pictures of Standard Syndromes and Undiagnosed Malformations in the 1980s, the potential for emerging tools to change how we practice and the way patients consume information is startling. Technology has the potential to aid in at-risk patient identification, assist in generating a differential diagnosis, improve efficiency in medical history collection and risk assessment, provide educational support for patients, and streamline follow-up. Here we review the historic and current uses of technology in genetic counseling, identify challenges to integration, and propose future applications of technology that can shape the practice of genetic counseling. © 2018 Wiley Periodicals, Inc.

  12. A system-level approach to automation research

    NASA Technical Reports Server (NTRS)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  13. Automation's Effect on Library Personnel.

    ERIC Educational Resources Information Center

    Dakshinamurti, Ganga

    1985-01-01

    Reports on survey studying the human-machine interface in Canadian university, public, and special libraries. Highlights include position category and educational background of 118 participants, participants' feelings toward automation, physical effects of automation, diffusion in decision making, interpersonal communication, future trends,…

  14. Specimen coordinate automated measuring machine/fiducial automated measuring machine

    DOEpatents

    Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.

    1991-01-01

    The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for: supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.
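
    The temperature compensation mentioned for these measuring machines typically amounts to scaling a raw length reading back to a reference temperature (usually 20 °C) using the material's coefficient of thermal expansion: L_ref ≈ L_meas / (1 + α·(T − 20)). The sketch below applies that standard correction with placeholder values; it is not the SCAMM/FAMM control code.

    ```python
    # Standard linear thermal-expansion correction for a length measurement, referencing
    # the reading back to 20 degrees C. All values are placeholders.
    def correct_to_20c(length_measured_mm, temp_c, alpha_per_c):
        """Length the specimen would have at the 20 C reference temperature (mm)."""
        return length_measured_mm / (1.0 + alpha_per_c * (temp_c - 20.0))

    alpha_steel = 11.7e-6        # per degree C, a typical steel value
    reading = 150.012            # mm, measured with the specimen at 35 C in the hot cell
    print(f"corrected length: {correct_to_20c(reading, 35.0, alpha_steel):.4f} mm")
    ```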

  15. Use of the MicroSeq 500 16S rRNA Gene-Based Sequencing for Identification of Bacterial Isolates That Commercial Automated Systems Failed To Identify Correctly

    PubMed Central

    Fontana, Carla; Favaro, Marco; Pelliccioni, Marco; Pistoia, Enrico Salvatore; Favalli, Cartesio

    2005-01-01

    Reliable automated identification and susceptibility testing of clinically relevant bacteria is an essential routine for microbiology laboratories, thus improving patient care. Examples of automated identification systems include the Phoenix (Becton Dickinson) and the VITEK 2 (bioMérieux). However, more and more frequently, microbiologists must isolate “difficult” strains that automated systems often fail to identify. An alternative approach could be the genetic identification of isolates; this is based on 16S rRNA gene sequencing and analysis. The aim of the present study was to evaluate the possible use of MicroSeq 500 (Applera) for sequencing the 16S rRNA gene to identify isolates whose identification is unobtainable by conventional systems. We analyzed 83 “difficult” clinical isolates: 25 gram-positive and 58 gram-negative strains that were contemporaneously identified by both systems—VITEK 2 and Phoenix—while genetic identification was performed by using the MicroSeq 500 system. The results showed that phenotypic identifications by VITEK 2 and Phoenix were remarkably similar: 74% for gram-negative strains (43 of 58) and 80% for gram-positive strains were concordant by both systems and also concordant with genetic characterization. The exceptions were the 15 gram-negative and 9 gram-positive isolates whose phenotypic identifications were contrasting or inconclusive. For these, the use of MicroSeq 500 was fundamental to achieving species identification. In clinical microbiology the use of MicroSeq 500, particularly for strains with ambiguous biochemical profiles (including slow-growing strains), identifies strains more easily than do conventional systems. Moreover, MicroSeq 500 is easy to use and cost-effective, making it applicable also in the clinical laboratory. PMID:15695654

  16. 3D-Printed Microfluidic Automation

    PubMed Central

    Au, Anthony K.; Bhattacharjee, Nirveek; Horowitz, Lisa F.; Chang, Tim C.; Folch, Albert

    2015-01-01

    Microfluidic automation – the automated routing, dispensing, mixing, and/or separation of fluids through microchannels – generally remains a slowly-spreading technology because device fabrication requires sophisticated facilities and the technology’s use demands expert operators. Integrating microfluidic automation in devices has involved specialized multi-layering and bonding approaches. Stereolithography is an assembly-free, 3D-printing technique that is emerging as an efficient alternative for rapid prototyping of biomedical devices. Here we describe fluidic valves and pumps that can be stereolithographically printed in optically-clear, biocompatible plastic and integrated within microfluidic devices at low cost. User-friendly fluid automation devices can be printed and used by non-engineers as replacement for costly robotic pipettors or tedious manual pipetting. Engineers can manipulate the designs as digital modules into new devices of expanded functionality. Printing these devices only requires the digital file and electronic access to a printer. PMID:25738695

  17. 3D-printed microfluidic automation.

    PubMed

    Au, Anthony K; Bhattacharjee, Nirveek; Horowitz, Lisa F; Chang, Tim C; Folch, Albert

    2015-04-21

    Microfluidic automation - the automated routing, dispensing, mixing, and/or separation of fluids through microchannels - generally remains a slowly-spreading technology because device fabrication requires sophisticated facilities and the technology's use demands expert operators. Integrating microfluidic automation in devices has involved specialized multi-layering and bonding approaches. Stereolithography is an assembly-free, 3D-printing technique that is emerging as an efficient alternative for rapid prototyping of biomedical devices. Here we describe fluidic valves and pumps that can be stereolithographically printed in optically-clear, biocompatible plastic and integrated within microfluidic devices at low cost. User-friendly fluid automation devices can be printed and used by non-engineers as replacement for costly robotic pipettors or tedious manual pipetting. Engineers can manipulate the designs as digital modules into new devices of expanded functionality. Printing these devices only requires the digital file and electronic access to a printer.

  18. Library Automation in the Netherlands and Pica.

    ERIC Educational Resources Information Center

    Bossers, Anton; Van Muyen, Martin

    1984-01-01

    Describes the Pica Library Automation Network (originally the Project for Integrated Catalogue Automation), which is based on a centralized bibliographic database. Highlights include the Pica conception of library automation, online shared cataloging system, circulation control system, acquisition system, and online Dutch union catalog with…

  19. Automation of the longwall mining system

    NASA Technical Reports Server (NTRS)

    Zimmerman, W.; Aster, R. W.; Harris, J.; High, J.

    1982-01-01

    Cost effective, safe, and technologically sound applications of automation technology to underground coal mining were identified. The longwall analysis commenced with a general search for government and industry experience of mining automation technology. A brief industry survey was conducted to identify longwall operational, safety, and design problems. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if system delays associated with all of the above four areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state of the art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system.

  20. Distribution automation applications of fiber optics

    NASA Technical Reports Server (NTRS)

    Kirkham, Harold; Johnston, A.; Friend, H.

    1989-01-01

    Motivations for interest and research in distribution automation are discussed. The communication requirements of distribution automation are examined and shown to exceed the capabilities of power line carrier, radio, and telephone systems. A fiber optic based communication system is described that is co-located with the distribution system and that could satisfy the data rate and reliability requirements. A cost comparison shows that it could be constructed at a cost that is similar to that of a power line carrier system. The requirements for fiber optic sensors for distribution automation are discussed. The design of a data link suitable for optically-powered electronic sensing is presented. Empirical results are given. A modeling technique that was used to understand the reflections of guided light from a variety of surfaces is described. An optical position-indicator design is discussed. Systems aspects of distribution automation are discussed, in particular, the lack of interface, communications, and data standards. The economics of distribution automation are examined.

  1. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design databases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).
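
    As a rough illustration of the idea of deriving a diagnostic knowledge base from design data, the Python sketch below turns a toy component/connection listing into simple fault-propagation rules. The component names and rule format are invented for illustration; this is not the KATE model or the original Flavors implementation.

      # Hypothetical sketch of automated knowledge generation: deriving a simple
      # diagnostic model from a CAD-style component/connection listing.
      # Component names and the rule format are illustrative, not KATE's.
      from dataclasses import dataclass, field

      @dataclass
      class Component:
          name: str
          feeds: list = field(default_factory=list)  # downstream component names

      def build_diagnostic_rules(components):
          """For each component, record which downstream readings its failure would affect."""
          by_name = {c.name: c for c in components}

          def downstream(name, seen=None):
              seen = seen if seen is not None else set()
              for nxt in by_name[name].feeds:
                  if nxt not in seen:
                      seen.add(nxt)
                      downstream(nxt, seen)
              return seen

          return {c.name: sorted(downstream(c.name)) for c in components}

      if __name__ == "__main__":
          design = [
              Component("pump", feeds=["valve"]),
              Component("valve", feeds=["flow_sensor"]),
              Component("flow_sensor", feeds=[]),
          ]
          # A fault in 'pump' is expected to disturb every downstream reading.
          print(build_diagnostic_rules(design))
          # {'flow_sensor': [], 'pump': ['flow_sensor', 'valve'], 'valve': ['flow_sensor']}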

  2. Engineering Design and Automation in the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wantuck, P. J.; Hollen, R. M.

    2002-01-01

    This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling (discrete event simulation), as well as chemical process simulation, has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full-scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the

  3. Automated processing of endoscopic surgical instruments.

    PubMed

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  4. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low-cost automated systems can provide air traffic and aviation weather advisory information at high-density uncontrolled airports. The system was designed to enhance the "see and be seen" rule of flight, and pilots who used the system preferred it over the self-announcement system presently used at uncontrolled airports.

  5. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
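
    The sketch below illustrates the general pattern-scanning idea in Python; the original ASNS used NAWK on a UNIX workstation, and the log format and alert patterns shown here are invented placeholders rather than the actual switch output.

      # Illustrative analogue of a pattern-scanning monitor (the original ASNS used
      # NAWK on a UNIX workstation; the log format and patterns below are invented).
      import re
      import sys

      # Hypothetical patterns that might flag suspicious dial-in activity.
      ALERT_PATTERNS = [
          re.compile(r"LOGIN FAIL"),
          re.compile(r"TRUNK .* AFTER-HOURS"),
      ]

      def scan(stream):
          for line in stream:
              if any(p.search(line) for p in ALERT_PATTERNS):
                  # In a real system this would page or e-mail an operator.
                  print(f"ALERT: {line.rstrip()}")

      if __name__ == "__main__":
          scan(sys.stdin)  # e.g. tail -f switch.log | python monitor.py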

  6. 21 CFR 864.5240 - Automated blood cell diluting apparatus.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated blood cell diluting apparatus. 864.5240... § 864.5240 Automated blood cell diluting apparatus. (a) Identification. An automated blood cell diluting apparatus is a fully automated or semi-automated device used to make appropriate dilutions of a blood sample...

  7. 21 CFR 864.5240 - Automated blood cell diluting apparatus.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated blood cell diluting apparatus. 864.5240... § 864.5240 Automated blood cell diluting apparatus. (a) Identification. An automated blood cell diluting apparatus is a fully automated or semi-automated device used to make appropriate dilutions of a blood sample...

  8. Semi-automated and automated glioma grading using dynamic susceptibility-weighted contrast-enhanced perfusion MRI relative cerebral blood volume measurements.

    PubMed

    Friedman, S N; Bambrough, P J; Kotsarini, C; Khandanpour, N; Hoggard, N

    2012-12-01

    Despite the established role of MRI in the diagnosis of brain tumours, histopathological assessment remains the clinically used technique, especially for the glioma group. Relative cerebral blood volume (rCBV) is a dynamic susceptibility-weighted contrast-enhanced perfusion MRI parameter that has been shown to correlate to tumour grade, but assessment requires a specialist and is time consuming. We developed analysis software to determine glioma gradings from perfusion rCBV scans in a manner that is quick, easy and does not require a specialist operator. MRI perfusion data from 47 patients with different histopathological grades of glioma were analysed with custom-designed software. Semi-automated analysis was performed with a specialist and non-specialist operator separately determining the maximum rCBV value corresponding to the tumour. Automated histogram analysis was performed by calculating the mean, standard deviation, median, mode, skewness and kurtosis of rCBV values. All values were compared with the histopathologically assessed tumour grade. A strong correlation between specialist and non-specialist observer measurements was found. Significantly different values were obtained between tumour grades using both semi-automated and automated techniques, consistent with previous results. The raw (unnormalised) data single-pixel maximum rCBV semi-automated analysis value had the strongest correlation with glioma grade. Standard deviation of the raw data had the strongest correlation of the automated analysis. Semi-automated calculation of raw maximum rCBV value was the best indicator of tumour grade and does not require a specialist operator. Both semi-automated and automated MRI perfusion techniques provide viable non-invasive alternatives to biopsy for glioma tumour grading.
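
    A minimal sketch of the automated histogram step described above, computing the summary statistics that were compared against histopathological grade. Variable names and the synthetic example values are assumptions; this is not the authors' software.

      # Minimal sketch of the automated histogram analysis: summary statistics over
      # rCBV values within a tumour region. The example data are synthetic.
      import numpy as np
      from scipy import stats

      def rcbv_summary(rcbv_values, bins=50):
          """Descriptive statistics of rCBV values in a tumour region."""
          v = np.asarray(rcbv_values, dtype=float)
          counts, edges = np.histogram(v, bins=bins)
          return {
              "max": v.max(),           # the semi-automated analyses used the single-pixel maximum
              "mean": v.mean(),
              "std": v.std(ddof=1),     # std of the raw data correlated best among automated measures
              "median": np.median(v),
              "mode": 0.5 * (edges[:-1] + edges[1:])[np.argmax(counts)],  # histogram mode
              "skewness": stats.skew(v),
              "kurtosis": stats.kurtosis(v),
          }

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          example_roi = rng.gamma(shape=2.0, scale=1.5, size=500)  # synthetic rCBV values
          print(rcbv_summary(example_roi))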

  9. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards). [45 FR 60601, Sept...

  10. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated hemoglobin system. 864.5620 Section 864....5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards). [45 FR 60601, Sept...

  11. Small cities face greater impact from automation.

    PubMed

    Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2018-02-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.

  12. Small cities face greater impact from automation

    PubMed Central

    Sun, Lijun; Cebrian, Manuel; Rahwan, Iyad

    2018-01-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. PMID:29436514

  13. Intelligent Automation Approach for Improving Pilot Situational Awareness

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly

    2004-01-01

    Automation in the aviation domain has been increasing for the past two decades. Pilot reaction to automation varies from highly favorable to highly critical depending on both the pilot's background and how effectively the automation is implemented. We describe a user-centered approach for automation that considers the pilot's tasks and his needs related to accomplishing those tasks. Further, we augment rather than replace how the pilot currently fulfills his goals, relying on redundant displays that offer the pilot an opportunity to build trust in the automation. Our prototype system automates the interpretation of hydraulic system faults of the UH-60 helicopter. We describe the problem with the current system and our methodology for resolving it.

  14. Space power subsystem automation technology

    NASA Technical Reports Server (NTRS)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  15. 21 CFR 864.5850 - Automated slide spinner.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  16. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  17. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  18. 21 CFR 864.5680 - Automated heparin analyzer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  19. 21 CFR 864.5850 - Automated slide spinner.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  20. 21 CFR 864.5680 - Automated heparin analyzer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  1. 21 CFR 864.5850 - Automated slide spinner.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  2. 21 CFR 864.5680 - Automated heparin analyzer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  3. 21 CFR 864.5850 - Automated slide spinner.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  4. 21 CFR 864.5850 - Automated slide spinner.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  5. 21 CFR 864.5680 - Automated heparin analyzer.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  6. 21 CFR 864.5680 - Automated heparin analyzer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  7. 21 CFR 864.5620 - Automated hemoglobin system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  8. Automation literature: A brief review and analysis

    NASA Technical Reports Server (NTRS)

    Smith, D.; Dieterly, D. L.

    1980-01-01

    Current thought and research positions which may allow for an improved capability to understand the impact of introducing automation to an existing system are established. The orientation was toward the type of studies which may provide some general insight into automation; specifically, the impact of automation on human performance and the resulting system performance. While an extensive number of articles were reviewed, only those that addressed the issue of automation and human performance were selected for discussion. The literature is organized along two dimensions: time (pre-1970 versus post-1970) and type of approach (engineering or behavioral science). The conclusions reached are not definitive, but they do provide initial stepping stones toward approaching the concept of automation in a systematic progression.

  9. Integration of enabling methods for the automated flow preparation of piperazine-2-carboxamide.

    PubMed

    Ingham, Richard J; Battilocchio, Claudio; Hawkins, Joel M; Ley, Steven V

    2014-01-01

    Here we describe the use of a new open-source software package and a Raspberry Pi® computer for the simultaneous control of multiple flow chemistry devices and its application to a machine-assisted, multi-step flow preparation of pyrazine-2-carboxamide - a component of Rifater®, used in the treatment of tuberculosis - and its reduced derivative piperazine-2-carboxamide.
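
    A hedged sketch of the kind of multi-device coordination such a setup implies: each step of a recipe pushes set points to several devices and then holds. The device classes, parameters, and serial ports are placeholders and do not reproduce the authors' open-source control package.

      # Hedged sketch of multi-device flow-chemistry coordination. The device
      # classes, serial commands, and port names are placeholders only.
      import time

      class FlowDevice:
          """Stand-in for a pump, heater, or back-pressure regulator on a serial port."""
          def __init__(self, name, port):
              self.name, self.port = name, port  # a real driver would open the port here

          def set(self, parameter, value):
              print(f"{self.name}: set {parameter} = {value}")  # placeholder for a serial write

      def run_step(devices, settings, hold_seconds):
          """Apply one step of a multi-step synthesis recipe, then hold."""
          for device_name, (parameter, value) in settings.items():
              devices[device_name].set(parameter, value)
          time.sleep(hold_seconds)

      if __name__ == "__main__":
          rig = {
              "pump_A": FlowDevice("pump_A", "/dev/ttyUSB0"),
              "reactor": FlowDevice("reactor", "/dev/ttyUSB1"),
          }
          recipe = [
              ({"pump_A": ("flow_mL_min", 0.5), "reactor": ("temp_C", 80)}, 1),
              ({"pump_A": ("flow_mL_min", 1.0), "reactor": ("temp_C", 120)}, 1),
          ]
          for settings, hold in recipe:
              run_step(rig, settings, hold)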

  10. Laboratory systems integration: robotics and automation.

    PubMed

    Felder, R A

    1991-01-01

    Robotic technology is going to have a profound impact on the clinical laboratory of the future. Faced with increased pressure to reduce health care spending yet increase services to patients, many laboratories are looking for alternatives to the inflexible or "fixed" automation found in many clinical analyzers. Robots are being examined by many clinical pathologists as an attractive technology which can adapt to the constant changes in laboratory testing. Already, laboratory designs are being altered to accommodate robotics and automated specimen processors. However, the use of robotics and computer intelligence in the clinical laboratory is still in its infancy. Successful examples of robotic automation exist in several laboratories. Investigators have used robots to automate endocrine testing, high-performance liquid chromatography, and specimen transportation. Large commercial laboratories are investigating the use of specimen processors which combine the use of fixed automation and robotics. Robotics have also reduced the exposure of medical technologists to specimens infected with viral pathogens. The successful examples of clinical robotics applications were a result of the cooperation of clinical chemists, engineers, and medical technologists. At the University of Virginia we have designed and implemented a robotic critical care laboratory. Initial clinical experience suggests that robotic performance is reliable; however, staff acceptance and utilization require continuing education. We are also developing a robotic cyclosporine assay, which promises to greatly reduce the labor costs of this analysis. The future will bring lab-wide automation that will fully integrate computer artificial intelligence and robotics. Specimens will be transported by mobile robots. Specimen processing, aliquotting, and scheduling will be automated. (ABSTRACT TRUNCATED AT 250 WORDS)

  11. Evaluation of automated sample preparation, retention time locked gas chromatography-mass spectrometry and data analysis methods for the metabolomic study of Arabidopsis species.

    PubMed

    Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat

    2011-05-27

    In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for metabolomic studies were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types×2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches were used for data analysis: XCMS followed by principal component analysis (approach 1) and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2) were compared. Several features that were up- or down-regulated in the different types were detected. Copyright © 2011 Elsevier B.V. All rights reserved.
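
    Both data-analysis approaches end in principal component analysis of a samples-by-features peak table. The sketch below shows only that final step on a synthetic matrix; the XCMS and AMDIS feature-extraction stages are not reproduced.

      # Minimal sketch of the final step shared by both data-analysis approaches:
      # principal component analysis of a samples-by-features peak intensity table.
      # The matrix here is synthetic.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      n_samples, n_features = 16, 200           # e.g. 4 sample types x 4 replicates
      intensities = rng.lognormal(mean=5.0, sigma=1.0, size=(n_samples, n_features))

      # Log-transform and autoscale, a common pretreatment for metabolomics tables.
      X = StandardScaler().fit_transform(np.log10(intensities))

      pca = PCA(n_components=2)
      scores = pca.fit_transform(X)
      print("explained variance ratio:", pca.explained_variance_ratio_)
      print("first sample scores:", scores[0])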

  12. Stages and levels of automation in support of space teleoperations.

    PubMed

    Li, Huiyang; Wickens, Christopher D; Sarter, Nadine; Sebok, Angelia

    2014-09-01

    This study examined the impact of stage of automation on the performance and perceived workload during simulated robotic arm control tasks in routine and off-nominal scenarios. Automation varies with respect to the stage of information processing it supports and its assigned level of automation. Making appropriate choices in terms of stages and levels of automation is critical to ensure robust joint system performance. To date, this issue has been empirically studied in domains such as aviation and medicine but not extensively in the context of space operations. A total of 36 participants played the role of a payload specialist and controlled a simulated robotic arm. Participants performed fly-to tasks with two types of automation (camera recommendation and trajectory control automation) of varying stage. Tasks were performed during routine scenarios and in scenarios in which either the trajectory control automation or a hazard avoidance automation failed. Increasing the stage of automation progressively improved performance and lowered workload when the automation was reliable, but incurred severe performance costs when the system failed. The results from this study support concerns about automation-induced complacency and automation bias when later stages of automation are introduced. The benefits of such automation are offset by the risk of catastrophic outcomes when system failures go unnoticed or become difficult to recover from. A medium stage of automation seems preferable as it provides sufficient support during routine operations and helps avoid potentially catastrophic outcomes in circumstances when the automation fails.

  13. Implementation of and experiences with new automation

    PubMed Central

    Mahmud, Ifte; Kim, David

    2000-01-01

    In an environment where cost, timeliness, and quality drive the business, it is essential to look for answers in technology where these challenges can be met. In the Novartis Pharmaceutical Quality Assurance Department, automation and robotics have become just the tools to meet these challenges. Although automation is a relatively new concept in our department, we have fully embraced it within just a few years. As our company went through a merger, there was a significant reduction in the workforce within the Quality Assurance Department through voluntary and involuntary separations. However, the workload remained constant or in some cases actually increased. So even with the reduction in laboratory personnel, we were challenged internally and from the headquarters in Basle to improve productivity while maintaining integrity in quality testing. Benchmark studies indicated the Suffern site to be the choice manufacturing site above other facilities. This is attributed to the Suffern facility employees' commitment to reduce cycle time, improve efficiency, and maintain a high level of regulatory compliance. One of the stronger contributing factors was automation technology in the laboratories, and this technology will continue to help the site's status in the future. The Automation Group was originally formed about 2 years ago to meet the high throughput demands of quality assurance testing and to bring our testing group up to standard with the industry. Automation began with only two people in the group and now we have three people who are the next generation automation scientists. Even with such a small staff, we have made great strides in laboratory automation as we have worked extensively with each piece of equipment brought in. The implementation process of each project was often difficult because the second generation automation group came from the laboratory without much automation experience. However, with the involvement from the users at ‘get-go’, we

  14. Implementation of and experiences with new automation.

    PubMed

    Mahmud, I; Kim, D

    2000-01-01

    In an environment where cost, timeliness, and quality drive the business, it is essential to look for answers in technology where these challenges can be met. In the Novartis Pharmaceutical Quality Assurance Department, automation and robotics have become just the tools to meet these challenges. Although automation is a relatively new concept in our department, we have fully embraced it within just a few years. As our company went through a merger, there was a significant reduction in the workforce within the Quality Assurance Department through voluntary and involuntary separations. However, the workload remained constant or in some cases actually increased. So even with the reduction in laboratory personnel, we were challenged internally and from the headquarters in Basle to improve productivity while maintaining integrity in quality testing. Benchmark studies indicated the Suffern site to be the choice manufacturing site above other facilities. This is attributed to the Suffern facility employees' commitment to reduce cycle time, improve efficiency, and maintain a high level of regulatory compliance. One of the stronger contributing factors was automation technology in the laboratories, and this technology will continue to help the site's status in the future. The Automation Group was originally formed about 2 years ago to meet the high throughput demands of quality assurance testing and to bring our testing group up to standard with the industry. Automation began with only two people in the group and now we have three people who are the next generation automation scientists. Even with such a small staff, we have made great strides in laboratory automation as we have worked extensively with each piece of equipment brought in. The implementation process of each project was often difficult because the second generation automation group came from the laboratory without much automation experience. However, with the involvement from the users at 'get-go', we were

  15. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.
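
    A hedged sketch of the characterisation-study idea: fit a predictive response-surface model of a critical quality attribute over process-parameter ranges, then keep only operating points whose prediction meets the quality target. Factor names, ranges, and the acceptance limit below are invented for illustration.

      # Hedged sketch: fit a predictive model of a critical quality attribute (CQA)
      # over process parameters, then screen candidate set points against a target.
      # Factor names, ranges, and the acceptance limit are invented.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(2)

      # Simulated DOE runs: two parameters (e.g. pH, temperature) and a measured CQA.
      doe_factors = rng.uniform([6.5, 30.0], [7.5, 40.0], size=(20, 2))
      cqa = (100 - 8 * (doe_factors[:, 0] - 7.0) ** 2
             - 0.2 * (doe_factors[:, 1] - 35.0) ** 2
             + rng.normal(0, 0.5, 20))

      # Quadratic response-surface model, as is typical for multivariate DOE data.
      design = PolynomialFeatures(degree=2, include_bias=False)
      model = LinearRegression().fit(design.fit_transform(doe_factors), cqa)

      # Screen a grid of candidate set points against an (illustrative) CQA limit.
      grid = np.array([[ph, t] for ph in np.linspace(6.5, 7.5, 11)
                                for t in np.linspace(30.0, 40.0, 11)])
      predicted = model.predict(design.transform(grid))
      in_space = grid[predicted >= 98.0]
      print(f"{len(in_space)} of {len(grid)} candidate set points fall inside the design space")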

  16. Detection of Heterogeneous Small Inclusions by a Multi-Step MUSIC Method

    NASA Astrophysics Data System (ADS)

    Solimene, Raffaele; Dell'Aversano, Angela; Leone, Giovanni

    2014-05-01

    In this contribution the problem of detecting and localizing scatterers with small (in terms of wavelength) cross sections by collecting their scattered field is addressed. The problem is dealt with for a two-dimensional and scalar configuration where the background is given as a two-layered cylindrical medium. More in detail, while scattered field data are taken in the outermost layer, inclusions are embedded within the inner layer. Moreover, the case of heterogeneous inclusions (i.e., having different scattering coefficients) is addressed. As a pertinent applicative context we identify the problem of diagnosing concrete pillars in order to detect and locate rebars, ducts and other small inhomogeneities that can populate the interior of the pillar. The nature of the inclusions influences the scattering coefficients. For example, the field scattered by rebars is stronger than the one due to ducts. Accordingly, it is expected that the more weakly scattering inclusions can be difficult to detect, as their scattered fields tend to be overwhelmed by those of strong scatterers. In order to circumvent this problem, in this contribution a multi-step MUltiple SIgnal Classification (MUSIC) detection algorithm is adopted [1]. In particular, the first stage aims at detecting rebars. Once rebars have been detected, their positions are exploited to update the Green's function and to subtract the scattered field due to their presence. The procedure is repeated until all the inclusions are detected. The analysis is conducted by numerical experiments for a multi-view/multi-static single-frequency configuration and the synthetic data are generated by a FDTD forward solver. Acknowledgement: This work benefited from networking activities carried out within the EU funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar." [1] R. Solimene, A. Dell'Aversano and G. Leone, "MUSIC algorithms for rebar detection," J. of Geophysics and Engineering, vol. 10, pp. 1
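
    To make the pseudospectrum step concrete, the sketch below runs a simplified free-space 2-D MUSIC localization on synthetic point scatterers. The layered background, the heterogeneous scattering coefficients, and the iterative Green's-function update of the multi-step method are deliberately omitted; this only illustrates the noise-subspace projection that underlies each stage.

      # Simplified free-space illustration of the MUSIC step: locate point
      # scatterers from a multistatic data matrix via the noise-subspace
      # pseudospectrum. Geometry and amplitudes are synthetic.
      import numpy as np

      wavelength = 0.1
      k0 = 2 * np.pi / wavelength

      # Receiver array (also used as sources) and two true scatterers.
      receivers = np.column_stack([np.linspace(-0.5, 0.5, 21), np.zeros(21)])
      scatterers = np.array([[0.1, 0.6], [-0.2, 0.8]])
      amplitudes = np.array([1.0, 0.3])          # a strong and a weaker scatterer

      def greens(points, rx):
          """2-D free-space Green's function (up to constants) from points to receivers."""
          d = np.linalg.norm(rx[None, :, :] - points[:, None, :], axis=2)
          return np.exp(1j * k0 * d) / np.sqrt(d)

      # Multistatic response matrix for point scatterers (Born approximation).
      G = greens(scatterers, receivers)                  # shape (n_scatterers, n_rx)
      K = (G * amplitudes[:, None]).T @ G                # shape (n_rx, n_rx)

      # Noise subspace from the SVD; the signal subspace has one vector per scatterer.
      U, s, _ = np.linalg.svd(K)
      noise = U[:, len(scatterers):]

      # Pseudospectrum over a grid of test points; peaks mark scatterer locations.
      xs, ys = np.meshgrid(np.linspace(-0.5, 0.5, 101), np.linspace(0.3, 1.0, 71))
      test = np.column_stack([xs.ravel(), ys.ravel()])
      g = greens(test, receivers)                        # steering vectors, (n_test, n_rx)
      proj = np.abs(g.conj() @ noise) ** 2               # projection onto noise subspace
      pseudo = 1.0 / proj.sum(axis=1)

      best = test[np.argmax(pseudo)]
      print("strongest pseudospectrum peak near:", best)  # should be close to a scatterer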

  17. Fatigue and voluntary utilization of automation in simulated driving.

    PubMed

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  18. Automated System Marketplace 1987: Maturity and Competition.

    ERIC Educational Resources Information Center

    Walton, Robert A.; Bridge, Frank R.

    1988-01-01

    This annual review of the library automation marketplace presents profiles of 15 major library automation firms and looks at emerging trends. Seventeen charts and tables provide data on market shares, number and size of installations, hardware availability, operating systems, and interfaces. A directory of 49 automation sources is included. (MES)

  19. Automated lattice data generation

    NASA Astrophysics Data System (ADS)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
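
    The sketch below illustrates the general workflow-manager pattern (tasks declare dependencies and are dispatched when ready); it is a hypothetical minimal example and does not use or reproduce the Taxi API.

      # Generic illustration of the workflow-manager idea: tasks (e.g. "generate a
      # gauge configuration", "measure an observable on it") declare dependencies
      # and are dispatched in order. Hypothetical sketch, not the Taxi API.
      from collections import deque

      class Task:
          def __init__(self, name, action, depends_on=()):
              self.name, self.action, self.depends_on = name, action, tuple(depends_on)

      def run_workflow(tasks):
          """Run tasks whose dependencies have completed, until none remain."""
          done, queue = set(), deque(tasks)
          while queue:
              task = queue.popleft()
              if all(dep in done for dep in task.depends_on):
                  task.action()
                  done.add(task.name)
              else:
                  queue.append(task)   # not ready yet; try again later

      if __name__ == "__main__":
          run_workflow([
              Task("measure_0", lambda: print("measure observables on config 0"),
                   depends_on=["config_0"]),
              Task("config_0", lambda: print("generate gauge config 0")),
          ])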

  20. Automated discovery of structural features of the optic nerve head on the basis of image and genetic data

    NASA Astrophysics Data System (ADS)

    Christopher, Mark; Tang, Li; Fingert, John H.; Scheetz, Todd E.; Abramoff, Michael D.

    2014-03-01

    Evaluation of optic nerve head (ONH) structure is a commonly used clinical technique for both diagnosis and monitoring of glaucoma. Glaucoma is associated with characteristic changes in the structure of the ONH. We present a method for computationally identifying ONH structural features using both imaging and genetic data from a large cohort of participants at risk for primary open angle glaucoma (POAG). Using 1054 participants from the Ocular Hypertension Treatment Study, ONH structure was measured by application of a stereo correspondence algorithm to stereo fundus images. In addition, the genotypes of several known POAG genetic risk factors were considered for each participant. ONH structural features were discovered using both a principal component analysis approach to identify the major modes of variance within structural measurements and a linear discriminant analysis approach to capture the relationship between genetic risk factors and ONH structure. The identified ONH structural features were evaluated based on the strength of their associations with genotype and development of POAG by the end of the OHTS study. ONH structural features with strong associations with genotype were identified for each of the genetic loci considered. Several identified ONH structural features were significantly associated (p < 0.05) with the development of POAG after Bonferroni correction. Further, incorporation of genetic risk status was found to substantially increase performance of early POAG prediction. These results suggest incorporating both imaging and genetic data into ONH structural modeling significantly improves the ability to explain POAG-related changes to ONH structure.
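
    A hedged sketch of the two feature-discovery routes: PCA for the major modes of structural variance and LDA for the structural direction most associated with genotype. The data below are synthetic; the stereo-correspondence measurements and OHTS genotypes are not reproduced.

      # Hedged sketch of the two feature-discovery routes described above.
      # Data are synthetic stand-ins for ONH structural measurements and genotype.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(3)
      n_subjects, n_measurements = 300, 50      # e.g. depth measurements over the ONH

      structure = rng.normal(size=(n_subjects, n_measurements))
      genotype = rng.integers(0, 2, size=n_subjects)          # carrier vs non-carrier
      structure[genotype == 1, :10] += 0.5                    # genotype-linked structural shift

      # Route 1: unsupervised modes of variance.
      pca_scores = PCA(n_components=5).fit_transform(structure)

      # Route 2: supervised feature capturing the genotype-structure association.
      lda = LinearDiscriminantAnalysis(n_components=1)
      lda_scores = lda.fit_transform(structure, genotype)

      print("PCA score matrix:", pca_scores.shape)
      print("LDA feature separates genotypes:",
            lda_scores[genotype == 1].mean() - lda_scores[genotype == 0].mean())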

  1. Automated data collection in single particle electron microscopy

    PubMed Central

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  2. Pilot interaction with automated airborne decision making systems

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.; Chu, Y. Y.; Greenstein, J. S.; Walden, R. S.

    1976-01-01

    An investigation was made of interaction between a human pilot and automated on-board decision making systems. Research was initiated on the topic of pilot problem solving in automated and semi-automated flight management systems, and attempts were made to develop a model of human decision making in a multi-task situation. A study was made of the allocation of responsibility between human and computer, and various pilot performance parameters were discussed for varying degrees of automation. Optimal allocation of responsibility between human and computer was considered, and some theoretical results found in the literature were presented. The pilot as a problem solver was discussed. Finally, the design of displays, controls, procedures, and computer aids for problem solving tasks in automated and semi-automated systems was considered.

  3. Identifying Requirements for Effective Human-Automation Teamwork

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; John O'Hara; Heather D. Medema

    Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for the human operators, such as: lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a “team player.” Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed agent teams to identify the limitations of automation agents to conform to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.

  4. Managing the Implementation of Mission Operations Automation

    NASA Technical Reports Server (NTRS)

    Sodano, R.; Crouse, P.; Odendahl, S.; Fatig, M.; McMahon, K.; Lakin, J.

    2006-01-01

    Reducing the cost of mission operations has necessitated a high level of automation both on spacecraft and in ground systems. While automation on spacecraft is implemented during the design phase, ground system automation tends to be implemented during the prime mission operations phase. Experience has shown that such late automation development can be hindered by several factors: additional hardware and software resources may need to be procured; software must be developed and tested on a non-interference basis with primary operations with limited manpower; and established procedures may not be suited for automation, requiring substantial rework. In this paper we review the experience of successfully automating mission operations for seven on-orbit missions: the Compton Gamma Ray Observatory (CGRO), the Rossi X-Ray Timing Explorer (RXTE), the Advanced Composition Explorer (ACE), the Far Ultraviolet Spectroscopic Explorer (FUSE), the Interplanetary Physics Laboratory (WIND), the Polar Plasma Laboratory (POLAR), and the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE). We provide lessons learned in areas such as: spacecraft recorder management, procedure development, lights-out commanding from the ground system versus stored command loads, spacecraft contingency response time, and ground station interfaces. Implementing automation strategies during the mission concept and spacecraft integration and test phases is identified as the most efficient approach and is also discussed.

  5. Advanced automation for in-space vehicle processing

    NASA Technical Reports Server (NTRS)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for the necessary processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  6. Automated Assessment in Massive Open Online Courses

    ERIC Educational Resources Information Center

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessment in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as a learning and solution-validation tool. This approach allows automated individual variant generation and automated solution checks without involving the course…
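
    The sketch below mirrors the idea of per-student variant generation plus automated solution checking. The original approach used Scilab within Open edX; this Python example and its toy problem are purely illustrative.

      # Small illustration of per-student variant generation and automated checking.
      # The original approach used Scilab inside Open edX; this toy problem is invented.
      import random

      def generate_variant(student_id):
          """Deterministic per-student parameters for a simple linear-equation task."""
          rng = random.Random(student_id)         # same student always gets the same variant
          a, b = rng.randint(2, 9), rng.randint(1, 20)
          return {"prompt": f"Solve {a}*x + {b} = 0 for x", "a": a, "b": b}

      def check_solution(variant, submitted_x, tol=1e-6):
          return abs(variant["a"] * submitted_x + variant["b"]) < tol

      if __name__ == "__main__":
          v = generate_variant("student-42")
          print(v["prompt"])
          print("correct submission accepted:", check_solution(v, -v["b"] / v["a"]))
          print("wrong submission accepted:", check_solution(v, 0.0))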

  7. Automation trust and attention allocation in multitasking workspace.

    PubMed

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators with high workload can distrust and then poorly monitor automation, which has been generally inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitor automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. Results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems. Copyright © 2018. Published by Elsevier Ltd.

  8. Aviation Safety/Automation Program Conference

    NASA Technical Reports Server (NTRS)

    Morello, Samuel A. (Compiler)

    1990-01-01

    The Aviation Safety/Automation Program Conference - 1989 was sponsored by the NASA Langley Research Center on 11 to 12 October 1989. The conference, held at the Sheraton Beach Inn and Conference Center, Virginia Beach, Virginia, was chaired by Samuel A. Morello. The primary objective of the conference was to ensure effective communication and technology transfer by providing a forum for technical interchange of current operational problems and program results to date. The Aviation Safety/Automation Program has as its primary goal to improve the safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers.

  9. [Automated analyzer of enzyme immunoassay].

    PubMed

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, detection methods, the number of tests per unit time, and analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe the recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and analytical time and speed per test.

  10. Programmable Automated Welding System (PAWS)

    NASA Technical Reports Server (NTRS)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.
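
    To make the planning/real-time-control split concrete, the sketch below pairs an offline planner with a sensor-driven controller. The weld parameters, the proportional correction, and all names are hypothetical and are not taken from the PAWS system itself.

        import random
        from dataclasses import dataclass

        @dataclass
        class WeldStep:
            travel_speed: float   # mm/s
            current: float        # A

        def plan_weld(joint_length_mm: float, step_mm: float = 50.0) -> list:
            """Offline planning: break the joint into fixed-parameter steps."""
            n = max(1, int(joint_length_mm // step_mm))
            return [WeldStep(travel_speed=5.0, current=150.0) for _ in range(n)]

        def execute(steps, read_arc_voltage, target_v: float = 24.0) -> None:
            """Real-time control: nudge current toward a target arc voltage at each step."""
            for i, step in enumerate(steps):
                error = target_v - read_arc_voltage()        # sensor feedback
                step.current += 2.0 * error                  # simple proportional correction
                print(f"step {i}: speed={step.travel_speed} mm/s, current={step.current:.1f} A")

        if __name__ == "__main__":
            execute(plan_weld(200.0), read_arc_voltage=lambda: random.uniform(22.0, 26.0))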

  11. Nonanalytic Laboratory Automation: A Quarter Century of Progress.

    PubMed

    Hawker, Charles D

    2017-06-01

    Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw automation as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were extended to the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation and for identifying the specific requirements of a laboratory and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.

  12. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned among the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporating all of these task classes, except anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, the need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology readiness date for the space station.
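
    The partition concluded above (everything onboard except anomaly handling) can be summarized in a few lines. The class names follow the abstract, while the routing function itself is only an illustrative assumption.

        from enum import Enum, auto

        class TaskClass(Enum):
            DATA_HANDLING = auto()
            MONITORING = auto()
            ROUTINE_CONTROL = auto()
            FAULT_HANDLING = auto()
            PLANNING_AND_OPERATIONS = auto()
            ANOMALY_HANDLING = auto()

        def assign(task: TaskClass) -> str:
            """Route each task class to onboard hardware/software or to ground support."""
            return "ground flight support" if task is TaskClass.ANOMALY_HANDLING else "onboard power subsystem"

        for task in TaskClass:
            print(f"{task.name:<24} -> {assign(task)}")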

  13. Flight deck automation: Promises and realities

    NASA Technical Reports Server (NTRS)

    Norman, Susan D. (Editor); Orlady, Harry W. (Editor)

    1989-01-01

    Issues of flight deck automation are multifaceted and complex. The rapid introduction of advanced computer-based technology onto the flight deck of transport category aircraft has had considerable impact both on aircraft operations and on the flight crew. As part of NASA's responsibility to facilitate an active exchange of ideas and information among members of the aviation community, a NASA/FAA/Industry workshop devoted to flight deck automation was organized by the Aerospace Human Factors Research Division of NASA Ames Research Center. Participants were invited from industry and from government organizations responsible for design, certification, operation, and accident investigation of transport category, automated aircraft. The goal of the workshop was to clarify the implications of automation, both positive and negative. Workshop panels and working groups identified issues regarding the design, training, and procedural aspects of flight deck automation, as well as the crew's ability to interact and perform effectively with the new technology. The proceedings include the invited papers and the panel and working group reports, as well as the summary and conclusions of the conference.

  14. Opening up Library Automation Software

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  15. Translation: Aids, Robots, and Automation.

    ERIC Educational Resources Information Center

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  16. Automated packing systems: review of industrial implementations

    NASA Astrophysics Data System (ADS)

    Whelan, Paul F.; Batchelor, Bruce G.

    1993-08-01

    A rich theoretical background to the problems that occur in the automation of material handling can be found in the operations research, production engineering, systems engineering, and automation (more specifically, machine vision) literature. This work has contributed to the design of intelligent handling systems. This paper will review the application of these automated material handling and packing techniques to industrial problems. The discussion will also highlight the systems integration issues involved in these applications. One such industrial application, the automated placement of shape templates onto leather hides, is also outlined. The purpose of this system is to arrange shape templates on a leather hide efficiently, so as to minimize leather waste, before the pieces are automatically cut from the hide. These pieces are used in the furniture and car manufacturing industries for the upholstery of high-quality leather chairs and car seats. Currently this type of operation is only semi-automated. The paper will outline the problems involved in the full automation of such a procedure.
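
    As a toy stand-in for the nesting problem described above, the sketch below applies a first-fit shelf heuristic to rectangular templates on a rectangular sheet. Real hide-nesting must handle irregular template and hide outlines plus surface defects, so this only illustrates the flavour of waste-minimizing placement; all dimensions are hypothetical.

        def shelf_pack(sheet_w, sheet_h, pieces):
            """Place (w, h) pieces left-to-right in shelves; return placements and utilisation."""
            placements, x, y, shelf_h = [], 0.0, 0.0, 0.0
            for w, h in sorted(pieces, key=lambda p: p[1], reverse=True):  # tallest first
                if x + w > sheet_w:              # current shelf is full, start a new one
                    x, y, shelf_h = 0.0, y + shelf_h, 0.0
                if y + h > sheet_h:              # no vertical room left on the sheet
                    continue
                placements.append((x, y, w, h))
                x, shelf_h = x + w, max(shelf_h, h)
            used = sum(w * h for _, _, w, h in placements)
            return placements, used / (sheet_w * sheet_h)

        if __name__ == "__main__":
            layout, utilisation = shelf_pack(100, 60, [(30, 20), (25, 20), (40, 15), (20, 10)] * 3)
            print(f"placed {len(layout)} pieces, utilisation {utilisation:.1%}")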

  17. Toward a human-centered aircraft automation philosophy

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1989-01-01

    The evolution of automation in civil aircraft is examined in order to discern trends in the respective roles and functions of automation technology and the humans who operate these aircraft. The effects of advances in automation technology on crew reaction are considered, and it appears that, though automation may well have decreased the frequency of certain types of human errors in flight, it may also have enabled new categories of human errors, some perhaps less obvious and therefore more serious than those it has alleviated. It is suggested that automation could be designed to keep the pilot closer to the control of the vehicle, while providing an array of information management and aiding functions designed to provide the pilot with data regarding flight replanning, degraded system operation, and the operational status and limits of the aircraft, its systems, and the physical and operational environment. The automation would serve as the pilot's assistant, providing and calculating data, watching for the unexpected, and keeping track of resources and their rate of expenditure.

  18. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  19. Automatic Generation of Wide Dynamic Range Image without Pseudo-Edge Using Integration of Multi-Steps Exposure Images

    NASA Astrophysics Data System (ADS)

    Migiyama, Go; Sugimura, Atsuhiko; Osa, Atsushi; Miike, Hidetoshi

    Digital cameras are advancing rapidly, yet a captured image still differs from the scene perceived with the naked eye. Photographs of wide-dynamic-range scenes suffer from blown-out highlights and crushed blacks, problems that rarely occur in human vision; they arise because the dynamic range of image sensors such as CCD and CMOS is narrower than that of the human visual system. To address this, we propose an automatic method that determines an effective exposure range from the superposition of edges and integrates multi-step exposure images accordingly. In addition, pseudo-edges are suppressed by blending exposure values, yielding a pseudo wide dynamic range image automatically.
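
    The abstract does not give enough detail to reproduce the authors' edge-superposition method, so the sketch below instead shows a generic multi-exposure fusion with per-pixel well-exposedness weights, simply to illustrate how multi-step exposure images can be blended into one wide-dynamic-range result; the weighting scheme and all parameters are assumptions.

        import numpy as np

        def fuse_exposures(images, sigma=0.2):
            """Blend grayscale exposures in [0, 1]; pixels near mid-gray get the most weight."""
            stack = np.stack(images).astype(np.float64)                  # (n, H, W)
            weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))   # favour well-exposed pixels
            weights /= weights.sum(axis=0, keepdims=True) + 1e-12        # normalise per pixel
            return (weights * stack).sum(axis=0)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            scene = rng.uniform(0, 4, (64, 64))                              # radiance wider than one exposure
            exposures = [np.clip(scene * g, 0, 1) for g in (0.25, 0.5, 1.0)]  # simulated multi-step shots
            fused = fuse_exposures(exposures)
            print(fused.shape, float(fused.min()), float(fused.max()))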

  20. Multi-step approach to add value to corncob: Production of biomass-degrading enzymes, lignin and fermentable sugars.

    PubMed

    Michelin, Michele; Ruiz, Héctor A; Polizeli, Maria de Lourdes T M; Teixeira, José A

    2018-01-01

    This work presents an integrated, multi-step approach for the recovery and/or application of the lignocellulosic fractions from corncob in the production of high-value-added compounds such as xylo-oligosaccharides, enzymes, fermentable sugars, and lignin within the biorefinery concept. For that, liquid hot water pretreatment followed by enzymatic hydrolysis was used. Liquid hot water pretreatment was performed using different residence times (10-50 min) and holding temperatures (180-200°C), corresponding to severities (log(R0)) of 3.36-4.64. The most severe conditions showed higher xylo-oligosaccharide extraction into the hydrolysates (maximum of 93%) and higher recovery of cellulose in the pretreated solids (maximum of 65%). Subsequently, the hydrolysates and solids were used in the production of xylanases and cellulases, respectively, and the pretreated solids were also subjected to enzymatic hydrolysis for the recovery of lignin and fermentable sugars from cellulose. Maximum glucose yield (100%) was achieved for solids pretreated at log(R0) of 4.42 and 5% solid loading. Copyright © 2017 Elsevier Ltd. All rights reserved.
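
    For reference, the severity values quoted above are consistent with the standard Overend-Chornet severity factor R0 = t * exp((T - 100)/14.75), with t in minutes and T in °C; the short sketch below, which assumes that standard definition rather than quoting the authors' own, reproduces the reported endpoints of 3.36 and 4.64.

        import math

        def log_severity(t_min: float, temp_c: float) -> float:
            """log10 of the severity factor R0 for liquid hot water pretreatment."""
            return math.log10(t_min * math.exp((temp_c - 100.0) / 14.75))

        print(f"{log_severity(10, 180):.2f}")  # mildest reported condition -> 3.36
        print(f"{log_severity(50, 200):.2f}")  # most severe reported one   -> 4.64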